Sample records for tool list model

  1. Data and Tools | NREL

    Science.gov Websites

    Data and Tools: NREL develops data sets, maps, models, and tools for the analysis of ...; models and tools appear in the alphabetical listing. Popular Resources: PVWatts Calculator; Geospatial Data ...

  2. Results from a workshop on research needs for modeling aquifer thermal energy storage systems

    NASA Astrophysics Data System (ADS)

    Drost, M. K.

    1990-08-01

    A workshop on aquifer thermal energy storage (ATES) system modeling was conducted by Pacific Northwest Laboratory (PNL). The goal of the workshop was to develop a list of high-priority research activities that would facilitate the commercial success of ATES. During the workshop, participants reviewed currently available modeling tools for ATES systems and produced a list of significant issues related to modeling ATES systems. Participants assigned a priority to each issue on the list by voting and developed a list of research needs for each of four high-priority research areas: the need for a feasibility study model, the need for engineering design models, the need for aquifer characterization, and the need for an economic model. The workshop participants concluded that ATES commercialization can be accelerated by aggressive development of ATES modeling tools and made specific recommendations for that development.

  3. Requirements for clinical information modelling tools.

    PubMed

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; should a certified tools list one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Great Lakes/St. Lawrence Seaway Regional Transportation Study: Documentation of the Lock Capacity Model Used in the Feasibility Analysis of GL/SLS Capacity Expansion Measures to the Year 2050.

    DTIC Science & Technology

    1981-05-01

    The Appendices contain a listing of the program variables, a program listing, a data file listing, and a sample output listing. ... The specific objectives of this study were to develop a GL/SLS LOCK CAPACITY MODEL to be used as a planning tool to determine if, or ... by the Corps of Engineers as a planning tool to determine when lock capacity is reached for the Soo, Welland, and St. Lawrence River lock systems and ...

  5. NREL: Renewable Resource Data Center - Geothermal Resource Models and Tools

    Science.gov Websites

    ... allow users to determine locations that are favorable to geothermal energy development. The Renewable Resource Data Center (RReDC) features the following geothermal models and tools. The Geothermal Prospector tool provides the information needed to ...

  6. A Predictive Logistic Regression Model of World Conflict Using Open Source Data

    DTIC Science & Technology

    2015-03-26

    Added to the United Nations list are Palestine (West Bank and Gaza) and Kosovo. The total number of modeled nations is 182. Not all of these ... The 26 variables are listed in Table 4. Also listed in Table 4 are the year the dataset was first collected, the data lag, and the number of nation ... state of violent conflict in 2015, seventeen of them are new to conflict since the last published list in 2013. A prediction tool is created to allow ...
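
    A minimal sketch of the technique the thesis applies, a binary logistic regression over country-level indicators, is shown below. The predictors, labels, and data are hypothetical stand-ins, not the study's actual 26 open-source variables.

    ```python
    # Sketch of a conflict-prediction logistic regression. Features and
    # data are hypothetical stand-ins, not the thesis's 26 variables.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 182  # number of modeled nations, per the abstract

    # Hypothetical standardized predictors, e.g. GDP growth, infant
    # mortality, years since last conflict.
    X = rng.normal(size=(n, 3))
    # Hypothetical binary outcome: 1 = violent conflict.
    y = (X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=n) > 0.8).astype(int)

    model = LogisticRegression().fit(X, y)
    probs = model.predict_proba(X)[:, 1]  # predicted conflict probability
    print(probs[:5].round(3))
    ```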

  7. Lives Saved Tool (LiST) costing: a module to examine costs and prioritize interventions.

    PubMed

    Bollinger, Lori A; Sanders, Rachel; Winfrey, William; Adesina, Adebiyi

    2017-11-07

    Achieving the Sustainable Development Goals will require careful allocation of resources in order to achieve the highest impact. The Lives Saved Tool (LiST) has been used widely to calculate the impact of maternal, neonatal and child health (MNCH) interventions for program planning and multi-country estimation in several Lancet Series commissions. As use of the LiST model increases, many have expressed a desire to cost interventions within the model, in order to support budgeting and prioritization of interventions by countries. A limited LiST costing module was introduced several years ago, but with gaps in cost types. Updates to inputs have now been added to make the module fully functional for a range of uses. This paper builds on previous work that developed an initial version of the LiST costing module to provide costs for MNCH interventions using an ingredients-based costing approach. Here, we update the previous econometric estimates from 2013 with newly available data from 2016 and also include above-facility-level costs such as program management. The updated econometric estimates inform percentages of intervention-level costs for some direct costs and indirect costs. These estimates add to existing values for direct cost requirements for items such as drugs and supplies and required provider time, which were already available in LiST Costing. Results generated by the LiST costing module include costs for each intervention, as well as disaggregated costs by intervention, including drug and supply costs, labor costs, other recurrent costs, capital costs, and above-service-delivery costs. These results can be combined with mortality estimates to support prioritization of interventions by countries. The LiST costing module provides an option for countries to identify resource requirements for scaling up a maternal, neonatal, and child health program, and to examine the financial impact of different resource allocation strategies. It can be a useful tool for countries as they seek to identify the best investments for scarce resources. The purpose of the LiST model is to provide a tool to make resource allocation decisions in a strategic planning process through prioritizing interventions based on resulting impact on maternal and child mortality and morbidity.
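
    As a rough illustration of what an ingredients-based costing approach computes, the sketch below totals direct costs (drugs/supplies plus provider time) and applies percentage markups for indirect and above-service-delivery costs. All quantities and percentages are invented placeholders, not LiST Costing defaults.

    ```python
    # Toy ingredients-based intervention costing. All numbers are
    # hypothetical placeholders, not LiST Costing defaults.

    def intervention_cost(drug_supply_cost, provider_minutes,
                          salary_per_minute, indirect_pct=0.15,
                          program_mgmt_pct=0.20):
        labor = provider_minutes * salary_per_minute
        direct = drug_supply_cost + labor
        indirect = direct * indirect_pct     # other recurrent/capital share
        above = direct * program_mgmt_pct    # program management, etc.
        return {
            "drugs_supplies": drug_supply_cost,
            "labor": round(labor, 2),
            "indirect": round(indirect, 2),
            "above_service_delivery": round(above, 2),
            "total": round(direct + indirect + above, 2),
        }

    # e.g. $4.20 in supplies and 45 minutes of provider time at $0.10/min
    print(intervention_cost(4.20, 45, 0.10))
    ```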

  8. Data and Tools - Alphabetical Listing | NREL

    Science.gov Websites

    Climate Action Planning Tool; Community Solar Scenario Tool; Comparative PV Levelized Cost of Energy (LCOE) ... Design Response Toolbox; WEC-Sim: Wave Energy Converter Simulator; West Associates Solar Monitoring Network; Design and Engineering Model

  9. Comparison of Lives Saved Tool model child mortality estimates against measured data from vector control studies in sub-Saharan Africa

    PubMed Central

    2011-01-01

    Background Insecticide-treated mosquito nets (ITNs) and indoor residual spraying have been scaled up across sub-Saharan Africa as part of international efforts to control malaria. These interventions have the potential to significantly impact child survival. The Lives Saved Tool (LiST) was developed to provide national and regional estimates of cause-specific mortality based on the extent of intervention coverage scale-up. We compared the percent reduction in all-cause child mortality estimated by LiST against measured reductions in all-cause child mortality from studies assessing the impact of vector control interventions in Africa. Methods We performed a literature search for appropriate studies and compared reductions in all-cause child mortality estimated by LiST with those of 4 studies that estimated changes in all-cause child mortality following the scale-up of vector control interventions. The following key parameters measured by each study were applied to available country projections: baseline all-cause child mortality rate, proportion of mortality due to malaria, and population coverage of vector control interventions at baseline and follow-up years. Results The percent reduction in all-cause child mortality estimated by the LiST model fell within the confidence intervals around the measured mortality reductions for all 4 studies. Two of the LiST estimates overestimated the mortality reductions by 6.1 and 4.2 percentage points (33% and 35% relative to the measured estimates), while two underestimated the mortality reductions by 4.7 and 6.2 percentage points (22% and 25% relative to the measured estimates). Conclusions The LiST model did not systematically under- or overestimate the impact of ITNs on all-cause child mortality. These results show the LiST model to perform reasonably well at estimating the effect of vector control scale-up on child mortality when compared against measured data from studies across a range of malaria transmission settings. The LiST model appears to be a useful tool for estimating the potential mortality reduction achieved from scaling up malaria control interventions. PMID:21501453

  10. 78 FR 6269 - Amendment to the International Traffic in Arms Regulations: Revision of U.S. Munitions List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... remain subject to USML control are modeling or simulation tools that model or simulate the environments ... USML revision process, the public is asked to provide specific examples of nuclear-related items whose ... Modeling or simulation tools that model or simulate the environments generated by nuclear detonations or ...

  11. Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools

    EPA Science Inventory

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...

  12. Comparison of LiST measles mortality model and WHO/IVB measles model.

    PubMed

    Chen, Wei-Ju

    2011-04-13

    The Lives Saved Tool (LiST) has been developed to estimate the impact of health interventions and can consider multiple interventions simultaneously. Given its increasing usage by donor organizations and national program planners, we compare the LiST measles model to the widely used World Health Organization's Department of Immunization, Vaccines and Biologicals (WHO/IVB) measles model, whose estimates serve as a major indicator for monitoring country measles epidemics and the progress of measles control. We analyzed the WHO/IVB models and the LiST measles model and identified the components and assumptions held in each model. We contrasted the important components, and compared results from the two models by applying historical measles-containing vaccine (MCV) coverage and the default values of all parameters set in the models. We also conducted analyses following a hypothetical scenario to understand how both models performed when the proportion of the population protected by MCV declined to zero percent in a short time period. The WHO/IVB measles model and the LiST measles model structures differ: the former is a mixed model which applies surveillance data adjusted for reporting completeness for countries with good disease surveillance systems and applies a natural history model for countries with poorer disease control programs and surveillance systems, and the latter is a cohort model incorporating country-specific cause-of-death (CoD) profiles among children under five. The trends of the estimates of the two models are similar, but the first-year estimates differ in most of the countries included in the analysis. The two models are comparable if we adjust the measles CoD in LiST to produce the same baseline estimates. In addition, we used the models to estimate the potential impact of stopping measles vaccination over a 7-year period. The WHO/IVB model produced estimates similar to those of the LiST model with adjusted CoD, but the LiST model produced low estimates for countries with very low or eliminated measles infection that may be inappropriate. The study presents methodological and quantitative comparisons between the WHO/IVB and LiST measles models that highlight differences in model structures and may help users to better interpret and contrast estimates of measles deaths from the two models. The major differences result from the use of the case-fatality rate (CFR) in the WHO/IVB model and the CoD profile in LiST. Both models have their own advantages and limitations. Users should be aware of this issue and apply country parameters that are as up to date as possible. Advanced models are expected to validate these policy-planning tools in the future.

  13. Comparison of LiST measles mortality model and WHO/IVB measles model

    PubMed Central

    2011-01-01

    Background The Lives Saved Tool (LiST) has been developed to estimate the impact of health interventions and can consider multiple interventions simultaneously. Given its increasing usage by donor organizations and national program planners, we compare the LiST measles model to the widely used World Health Organization's Department of Immunization, Vaccines and Biologicals (WHO/IVB) measles model, whose estimates serve as a major indicator for monitoring country measles epidemics and the progress of measles control. Methods We analyzed the WHO/IVB models and the LiST measles model and identified the components and assumptions held in each model. We contrasted the important components, and compared results from the two models by applying historical measles-containing vaccine (MCV) coverage and the default values of all parameters set in the models. We also conducted analyses following a hypothetical scenario to understand how both models performed when the proportion of the population protected by MCV declined to zero percent in a short time period. Results The WHO/IVB measles model and the LiST measles model structures differ: the former is a mixed model which applies surveillance data adjusted for reporting completeness for countries with good disease surveillance systems and applies a natural history model for countries with poorer disease control programs and surveillance systems, and the latter is a cohort model incorporating country-specific cause-of-death (CoD) profiles among children under five. The trends of the estimates of the two models are similar, but the first-year estimates differ in most of the countries included in the analysis. The two models are comparable if we adjust the measles CoD in LiST to produce the same baseline estimates. In addition, we used the models to estimate the potential impact of stopping measles vaccination over a 7-year period. The WHO/IVB model produced estimates similar to those of the LiST model with adjusted CoD, but the LiST model produced low estimates for countries with very low or eliminated measles infection that may be inappropriate. Conclusions The study presents methodological and quantitative comparisons between the WHO/IVB and LiST measles models that highlight differences in model structures and may help users to better interpret and contrast estimates of measles deaths from the two models. The major differences result from the use of the case-fatality rate (CFR) in the WHO/IVB model and the CoD profile in LiST. Both models have their own advantages and limitations. Users should be aware of this issue and apply country parameters that are as up to date as possible. Advanced models are expected to validate these policy-planning tools in the future. PMID:21501452

  14. Urban Propagation Modeling for Wireless Systems

    DTIC Science & Technology

    2004-01-30

    (STIR) Grant DAAD19-03-1-0069, William Mark Smith and Donald C. Cox, January 2004. [Snippet contains table-of-contents fragments only: Abstract; List of Publications; List of Personnel; Report of ...; Experiment Configuration; Experiment Objectives; Software Tools; Transmitter.]

  15. 19 CFR Appendix to Part 163 - Interim (a)(1)(A) List

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... for commercial samples, tools, theatrical effects §§ 10.70, 10.71 Purebred breeding certificate § 10..., merchandise (commercial product) description, quantities, values, unit price, trade terms, part, model, style... Access Program (9802/GSP/CBI) § 141.89 CF 5523 Part 141 Corrected Commercial Invoice 141.86(e) Packing List...

  16. 19 CFR Appendix to Part 163 - Interim (a)(1)(A) List

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... for commercial samples, tools, theatrical effects §§ 10.70, 10.71 Purebred breeding certificate § 10..., merchandise (commercial product) description, quantities, values, unit price, trade terms, part, model, style... Access Program (9802/GSP/CBI) § 141.89 CF 5523 Part 141 Corrected Commercial Invoice 141.86(e) Packing List...

  17. 19 CFR Appendix to Part 163 - Interim (a)(1)(A) List

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... for commercial samples, tools, theatrical effects §§ 10.70, 10.71 Purebred breeding certificate § 10..., merchandise (commercial product) description, quantities, values, unit price, trade terms, part, model, style... Access Program (9802/GSP/CBI) § 141.89 CF 5523 Part 141 Corrected Commercial Invoice 141.86(e) Packing List...

  18. 19 CFR Appendix to Part 163 - Interim (a)(1)(A) List

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for commercial samples, tools, theatrical effects §§ 10.70, 10.71 Purebred breeding certificate § 10..., merchandise (commercial product) description, quantities, values, unit price, trade terms, part, model, style... Access Program (9802/GSP/CBI) § 141.89 CF 5523 Part 141 Corrected Commercial Invoice 141.86(e) Packing List...

  19. 19 CFR Appendix to Part 163 - Interim (a)(1)(A) List

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for commercial samples, tools, theatrical effects §§ 10.70, 10.71 Purebred breeding certificate § 10..., merchandise (commercial product) description, quantities, values, unit price, trade terms, part, model, style... Access Program (9802/GSP/CBI) § 141.89 CF 5523 Part 141 Corrected Commercial Invoice 141.86(e) Packing List...

  20. Using the Lives Saved Tool (LiST) to Model mHealth Impact on Neonatal Survival in Resource-Limited Settings

    PubMed Central

    Jo, Youngji; Labrique, Alain B.; Lefevre, Amnesty E.; Mehl, Garrett; Pfaff, Teresa; Walker, Neff; Friberg, Ingrid K.

    2014-01-01

    While the importance of mHealth scale-up has been broadly emphasized in the mHealth community, it is necessary to guide scale-up efforts and investment in ways that help achieve the mortality reduction targets set by global calls to action such as the Millennium Development Goals, not merely to expand programs. We used the Lives Saved Tool (LiST), an evidence-based modeling software package, to identify priority areas for maternal and neonatal health services by formulating six individual and combined intervention scenarios for two countries, Bangladesh and Uganda. Our findings show that skilled birth attendance and increased facility delivery as targets for mHealth strategies are likely to provide the biggest mortality impact relative to other intervention scenarios. Although further validation of this model is desirable, tools such as LiST can help us leverage the benefit of mHealth by articulating the most appropriate delivery points in the continuum of care to save lives. PMID:25014008

  1. The Health Impact Assessment (HIA) Resource and Tool ...

    EPA Pesticide Factsheets

    Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process. The HIA Resource

  2. PANTHER version 11: expanded annotation data from Gene Ontology and Reactome pathways, and data analysis tool enhancements.

    PubMed

    Mi, Huaiyu; Huang, Xiaosong; Muruganujan, Anushya; Tang, Haiming; Mills, Caitlin; Kang, Diane; Thomas, Paul D

    2017-01-04

    The PANTHER database (Protein ANalysis THrough Evolutionary Relationships, http://pantherdb.org) contains comprehensive information on the evolution and function of protein-coding genes from 104 completely sequenced genomes. PANTHER software tools allow users to classify new protein sequences, and to analyze gene lists obtained from large-scale genomics experiments. In the past year, major improvements include a large expansion of classification information available in PANTHER, as well as significant enhancements to the analysis tools. Protein subfamily functional classifications have more than doubled due to progress of the Gene Ontology Phylogenetic Annotation Project. For human genes (as well as a few other organisms), PANTHER now also supports enrichment analysis using pathway classifications from the Reactome resource. The gene list enrichment tools include a new 'hierarchical view' of results, enabling users to leverage the structure of the classifications/ontologies; the tools also allow users to upload genetic variant data directly, rather than requiring prior conversion to a gene list. The updated coding single-nucleotide polymorphisms (SNP) scoring tool uses an improved algorithm. The hidden Markov model (HMM) search tools now use HMMER3, dramatically reducing search times and improving accuracy of E-value statistics. Finally, the PANTHER Tree-Attribute Viewer has been implemented in JavaScript, with new views for exploring protein sequence evolution. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
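
    The statistical core of a gene-list overrepresentation test of this kind is a hypergeometric tail probability; the sketch below illustrates that statistic generically and is not PANTHER's specific implementation or its multiple-testing correction.

    ```python
    # Generic gene-list overrepresentation test: probability that k of
    # the N uploaded genes fall in a category covering n of M genes
    # overall. Illustrative only; not PANTHER's exact algorithm.
    from scipy.stats import hypergeom

    M, n = 20000, 300   # genome size; genes annotated to the category
    N, k = 150, 12      # uploaded list size; list genes in the category

    p = hypergeom.sf(k - 1, M, n, N)  # P(X >= k)
    print(f"enrichment p-value: {p:.2e}")
    ```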

  3. Effectiveness and Adoption of a Drawing-to-Learn Study Tool for Recall and Problem Solving: Minute Sketches with Folded Lists

    ERIC Educational Resources Information Center

    Heideman, Paul D.; Flores, K. Adryan; Sevier, Lu M.; Trouton, Kelsey E.

    2017-01-01

    Drawing by learners can be an effective way to develop memory and generate visual models for higher-order skills in biology, but students are often reluctant to adopt drawing as a study method. We designed a nonclassroom intervention that instructed introductory biology college students in a drawing method, minute sketches in folded lists (MSFL),…

  4. Pathogen Transport and Fate Modeling in the Upper Salem River Watershed using SWAT Model

    EPA Science Inventory

    SWAT (Soil and Water Assessment Tool) is a dynamic watershed model that is applied to simulate the impact of land management practices on water quality over a continuous period. The Upper Salem River, located in Salem County New Jersey, is listed by the New Jersey Department of ...

  5. Pathogen Transport and Fate Modeling in the Upper Salem River Watershed Using SWAT Model

    EPA Science Inventory

    SWAT (Soil and Water Assessment Tool) is a dynamic watershed model that is applied to simulate the impact of land management practices on water quality over a continuous period. The Upper Salem River, located in Salem County New Jersey, is listed by the New Jersey Department of ...

  6. NetList(+): A simple interface language for chip design

    NASA Astrophysics Data System (ADS)

    Wuu, Tzyh-Yung

    1991-04-01

    NetList(+) is a design specification language developed at MOSIS for rapid turn-around cell-based ASIC prototyping. By using NetList(+), a uniform representation is achieved for the specification, simulation, and physical description of a design. The goal is to establish an interfacing methodology between design specification and independent computer-aided design tools. Designers need only specify a system by writing a corresponding netlist. This netlist is used for both functional simulation and timing simulation. The same netlist is also used to drive the low-level physical tools that generate layout. Another goal of using NetList(+) is to generate parts of a design by running it through different kinds of placement and routing (P and R) tools. For example, some parts of a design will be generated by standard cell P and R tools. Other parts may be generated by a layout tiler, i.e., a datapath compiler, RAM/ROM generator, or PLA generator. Finally, all the different parts of a design can be integrated into a single chip by general block P and R tools. The NetList(+) language can actually act as an interface among tools. Section 2 shows a flowchart illustrating the NetList(+) system and its relation to other related design tools. Section 3 shows how to write a NetList(+) description from the block diagram of a circuit. Section 4 discusses how to prepare a cell library or several cell libraries for a design system. Section 5 gives a few designs written in NetList(+) and shows their simulation and layout results.

  7. The Model of Career Anchors as a Tool in the Analysis of Instructional Developers.

    ERIC Educational Resources Information Center

    Miller, Carol

    1981-01-01

    Examines the importance of human systems as a relevant aspect of development processes and looks at the career anchor model proposed by Schein as a possible area in the analysis of the instructional developer/client relationships. Fourteen references are listed. (Author/LLS)

  8. National Centers for Environmental Prediction

    Science.gov Websites

    Reference list and table of contents for NCEP operational model forecast graphics and parallel/experimental model graphics, including Developmental Air Quality Forecasts and Verification, and a parallel/experimental graphics verification (grid vs. obs) web page (NCEP experimental page, internal use only); an interactive web page tool for ...

  9. Interactive Macroeconomics

    NASA Astrophysics Data System (ADS)

    Di Guilmi, Corrado; Gallegati, Mauro; Landini, Simone

    2017-04-01

    Preface; List of tables; List of figures; 1. Introduction; Part I. Methodological Notes and Tools: 2. The state space notion; 3. The master equation; Part II. Applications to HIA-Based Models: 4. Financial fragility and macroeconomic dynamics I: heterogeneity and interaction; 5. Financial fragility and macroeconomic dynamics II: learning; Part III. Conclusions: 6. Conclusive remarks; Part IV. Appendices and Complements: Appendix A: Complements to Chapter 3; Appendix B: Solving the ME to solve the ABM; Appendix C: Specifying transition rates; Index.

  10. The effect of Haemophilus influenzae type b and pneumococcal conjugate vaccines on childhood pneumonia incidence, severe morbidity and mortality.

    PubMed

    Theodoratou, Evropi; Johnson, Sue; Jhass, Arnoupe; Madhi, Shabir A; Clark, Andrew; Boschi-Pinto, Cynthia; Bhopal, Sunil; Rudan, Igor; Campbell, Harry

    2010-04-01

    With the aim of populating the Lives Saved Tool (LiST) with parameters of effectiveness of existing interventions, we conducted a systematic review of the literature assessing the effect of Haemophilus influenzae type b (Hib) and pneumococcal (PC) conjugate vaccines on incidence, severe morbidity and mortality from childhood pneumonia. We summarized cluster randomized controlled trials (cRCTs) and case-control studies of Hib conjugate vaccines and RCTs of 9- and 11-valent PC conjugate vaccines conducted in developing countries across outcome measures using standard meta-analysis methods. We used a set of standardized rules developed for the purpose of populating the LiST tool with required parameters to promote comparability across reviews of interventions against the major causes of childhood mortality. The estimates could be adjusted further to account for factors such as PC vaccine serotype content, PC serotype distribution and human immunodeficiency virus prevalence, but this was not included as part of the LiST model approach. The available evidence from published data points to a summary effect of the Hib conjugate vaccine of 4% on clinical pneumonia, 6% on clinical severe pneumonia and 18% on radiologically confirmed pneumonia. The respective effectiveness estimates for PC vaccines (all valents) are 7% on clinical pneumonia, 7% on clinical severe pneumonia and 26% on radiologically confirmed pneumonia. The findings indicated that radiologically confirmed pneumonia, as a severe morbidity proxy for mortality, provided better estimates for the LiST model of the effect of interventions on mortality reduction than did the other outcomes evaluated. The LiST model will use this to estimate the pneumonia mortality reduction which might be observed when scaling up Hib and PC conjugate vaccination in the context of an overall package of child health interventions.

  11. Classification of processes involved in sharing individual participant data from clinical trials.

    PubMed

    Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena

    2018-01-01

    Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing.

  12. Classification of processes involved in sharing individual participant data from clinical trials

    PubMed Central

    Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena

    2018-01-01

    Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing. PMID:29623192

  13. red - an R package to facilitate species red list assessments according to the IUCN criteria

    PubMed Central

    2017-01-01

    Abstract The International Union for the Conservation of Nature Red List is the most useful database of species that are at risk of extinction worldwide, as it relies on a number of objective criteria and is now widely adopted. The R package red (IUCN Redlisting Tools) performs a number of spatial analyses based on either observed occurrences or estimated ranges. Functions include calculating Extent of Occurrence (EOO), Area of Occupancy (AOO), mapping species ranges, species distribution modelling using climate and land cover and calculating the Red List Index for groups of species. The package allows the calculation of confidence limits for all measures. Spatial data of species occurrences, environmental or land cover variables can be either given by the user or automatically extracted from several online databases. It outputs geographical range, elevation and country values, maps in several formats and vectorial data for visualization in Google Earth. Several examples are shown demonstrating the usefulness of the different methods. The red package constitutes an open platform for further development of new tools to facilitate red list assessments. PMID:29104439
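
    The red package itself is an R library; as a language-neutral illustration of one measure it computes, the sketch below estimates Extent of Occurrence (EOO) as the area of the convex hull around occurrence points. The coordinates are invented, and a real assessment would work in an equal-area projection rather than raw degrees.

    ```python
    # EOO as the convex-hull area of species occurrence points.
    # Illustrative only; the red package (R) implements this and much
    # more. Points are hypothetical planar coordinates in km.
    import numpy as np
    from scipy.spatial import ConvexHull

    occurrences = np.array([
        [0, 0], [10, 2], [4, 9], [12, 11], [7, 5], [2, 12],
    ])
    hull = ConvexHull(occurrences)
    print(f"EOO ~ {hull.volume:.1f} km^2")  # in 2-D, .volume is the area
    ```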

  14. New V and V Tools for Diagnostic Modeling Environment (DME)

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)

    2002-01-01

    The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of the Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment (DME), describe current NASA Ames Research Center tools for V&V of model-based reasoning systems, and discuss the applicability of advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of model-based reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.

  15. A Tool for Model-Based Generation of Scenario-driven Electric Power Load Profiles

    NASA Technical Reports Server (NTRS)

    Rozek, Matthew L.; Donahue, Kenneth M.; Ingham, Michel D.; Kaderka, Justin D.

    2015-01-01

    Power consumption during all phases of spacecraft flight is of great interest to the aerospace community. As a result, significant analysis effort is exerted to understand the rates of electrical energy generation and consumption under many operational scenarios of the system. Previously, no standard tool existed for creating and maintaining a power equipment list (PEL) of spacecraft components that consume power, and no standard tool existed for generating power load profiles based on this PEL information during mission design phases. This paper presents the Scenario Power Load Analysis Tool (SPLAT) as a model-based systems engineering tool aiming to solve those problems. SPLAT is a plugin for MagicDraw (No Magic, Inc.) that aids in creating and maintaining a PEL, and also generates a power and temporal variable constraint set, in Maple language syntax, based on specified operational scenarios. The constraint set can be solved in Maple to show electric load profiles (i.e. power consumption from loads over time). SPLAT creates these load profiles from three modeled inputs: 1) a list of system components and their respective power modes, 2) a decomposition hierarchy of the system into these components, and 3) the specification of at least one scenario, which consists of temporal constraints on component power modes. In order to demonstrate how this information is represented in a system model, a notional example of a spacecraft planetary flyby is introduced. This example is also used to explain the overall functionality of SPLAT, and how this is used to generate electric power load profiles. Lastly, a cursory review of the usage of SPLAT on the Cold Atom Laboratory project is presented to show how the tool was used in an actual space hardware design application.
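
    A much-simplified sketch of the idea behind SPLAT's inputs, components with per-mode power draws plus a scenario of timed mode changes, appears below. The component names, modes, and wattages are hypothetical, and the sketch evaluates the profile directly rather than emitting a Maple constraint set as SPLAT does.

    ```python
    # Scenario-driven load-profile generation: sum per-mode component
    # power draws over scenario time steps. All names and wattages are
    # hypothetical; SPLAT itself generates Maple constraints instead.

    pel = {  # power equipment list: component -> {mode: watts}
        "radio":    {"off": 0.0, "standby": 2.0, "transmit": 45.0},
        "computer": {"idle": 8.0, "science": 20.0},
    }

    scenario = [  # (start_minute, component, mode)
        (0, "radio", "standby"), (0, "computer", "idle"),
        (30, "computer", "science"), (90, "radio", "transmit"),
    ]

    def load_profile(pel, scenario, duration, step=10):
        modes, events, profile = {}, sorted(scenario), []
        for t in range(0, duration, step):
            while events and events[0][0] <= t:
                _, comp, mode = events.pop(0)
                modes[comp] = mode
            profile.append((t, sum(pel[c][m] for c, m in modes.items())))
        return profile

    for t, watts in load_profile(pel, scenario, duration=120):
        print(f"t={t:3d} min  load={watts:5.1f} W")
    ```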

  16. Tool and Task Analysis Guide for Vocational Welding (150 Tasks). Performance Based Vocational Education.

    ERIC Educational Resources Information Center

    John H. Hinds Area Vocational School, Elwood, IN.

    This book contains a task inventory, a task analysis of 150 tasks from that inventory, and a tool list for performance-based welding courses in the state of Indiana. The task inventory and tool list reflect 28 job titles found in Indiana. In the first part of the guide, tasks are listed by these domains: carbon-arc, electron beam, G.M.A.W., gas…

  17. Assessing the risks of pesticides to threatened and endangered species using population modeling: A critical review and recommendations for future work.

    PubMed

    Forbes, Valery E; Galic, Nika; Schmolke, Amelie; Vavra, Janna; Pastorok, Rob; Thorbek, Pernille

    2016-08-01

    United States legislation requires the US Environmental Protection Agency to ensure that pesticide use does not cause unreasonable adverse effects on the environment, including species listed under the Endangered Species Act (ESA; hereafter referred to as listed species). Despite a long history of population models used in conservation biology and resource management and a 2013 report from the US National Research Council recommending their use, application of population models for pesticide risk assessments under the ESA has been minimal. The pertinent literature published from 2004 to 2014 was reviewed to explore the availability of population models and their frequency of use in listed species risk assessments. The models were categorized in terms of structure, taxonomic coverage, purpose, inputs and outputs, and whether the models included density dependence, stochasticity, or risk estimates, or were spatially explicit. Despite the widespread availability of models and an extensive literature documenting their use in other management contexts, only 2 of the approximately 400 studies reviewed used population models to assess the risks of pesticides to listed species. This result suggests that there is an untapped potential to adapt existing models for pesticide risk assessments under the ESA, but also that there are some challenges to do so for listed species. Key conclusions from the analysis are summarized, and priorities are recommended for future work to increase the usefulness of population models as tools for pesticide risk assessments. Environ Toxicol Chem 2016;35:1904-1913. © 2016 SETAC.

  18. Modeling the Impact of Nutrition Interventions on Birth Outcomes in the Lives Saved Tool (LiST).

    PubMed

    Heidkamp, Rebecca; Clermont, Adrienne; Phillips, Erica

    2017-11-01

    Background: Negative birth outcomes [small-for-gestational age (SGA) and preterm birth (PTB)] are common in low- and middle-income countries and have important subsequent health and developmental impacts on children. There are numerous nutritional and non-nutritional interventions that can decrease the risk of negative birth outcomes and reduce subsequent risk of mortality and growth faltering. Objective: The objective of this article was to review the current evidence for the impact of nutritional interventions in pregnancy [calcium supplementation, iron and folic acid supplementation, multiple micronutrient (MMN) supplementation, and balanced energy supplementation (BES)] and risk factors (maternal anemia) on birth outcomes, with the specific goal of determining which intervention-outcome linkages should be included in the Lives Saved Tool (LiST) software. Methods: A literature search was conducted by using the WHO e-Library of Evidence for Nutrition Actions as the starting point. Recent studies, meta-analyses, and systematic reviews were reviewed for inclusion on the basis of their relevance to LiST. Results: On the basis of the available scientific evidence, the following linkages were found to be supported for inclusion in LiST: calcium supplementation on PTB (12% reduction), MMN supplementation on SGA (9% reduction), and BES on SGA (21% reduction among food-insecure women). Conclusions: The inclusion of these linkages in LiST will improve the utility of the model for users who seek to estimate the impact of antenatal nutrition interventions on birth outcomes. Scaling up these interventions should lead to downstream impacts in reducing stunting and child mortality. © 2017 American Society for Nutrition.
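
    As a toy illustration of how a linkage of this kind translates into modeled impact, the sketch below applies an effectiveness estimate to a coverage increase. The formula and coverage values are simplifications for exposition, not LiST's internal computation.

    ```python
    # Toy application of an intervention-outcome linkage: effectiveness,
    # scaled by added coverage, reduces an outcome rate. Simplified for
    # exposition; not LiST's internal math.

    def new_outcome_rate(base_rate, effectiveness, cov_old, cov_new):
        reduction = effectiveness * (cov_new - cov_old)
        return base_rate * (1 - reduction)

    # Calcium supplementation linked to a 12% PTB reduction (per the
    # abstract); baseline rate and coverage levels are hypothetical.
    ptb = new_outcome_rate(base_rate=0.10, effectiveness=0.12,
                           cov_old=0.20, cov_new=0.80)
    print(f"PTB rate: 10.00% -> {ptb:.2%}")
    ```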

  19. Trained student pharmacists' telephonic collection of patient medication information: Evaluation of a structured interview tool.

    PubMed

    Margolis, Amanda R; Martin, Beth A; Mott, David A

    2016-01-01

    To determine the feasibility and fidelity of student pharmacists collecting patient medication list information using a structured interview tool and the accuracy of documenting the information. The medication lists were used by a community pharmacist to provide a targeted medication therapy management (MTM) intervention. Descriptive analysis of patient medication lists collected with telephone interviews. Ten trained student pharmacists collected the medication lists. Trained student pharmacists conducted audio-recorded telephone interviews with 80 English-speaking, community-dwelling older adults using a structured interview tool to collect and document medication lists. Feasibility was measured using the number of completed interviews, the time student pharmacists took to collect the information, and pharmacist feedback. Fidelity to the interview tool was measured by assessing student pharmacists' adherence to asking all scripted questions and probes. Accuracy was measured by comparing the audio-recorded interviews to the medication list information documented in an electronic medical record. On average, it took student pharmacists 26.7 minutes to collect the medication lists. The community pharmacist said the medication lists were complete and that having the medication lists saved time and allowed him to focus on assessment, recommendations, and education during the targeted MTM session. Fidelity was high, with an overall proportion of asked scripted probes of 83.75% (95% confidence interval [CI], 80.62-86.88%). Accuracy was also high for both prescription (95.1%; 95% CI, 94.3-95.8%) and nonprescription (90.5%; 95% CI, 89.4-91.4%) medications. Trained student pharmacists were able to use an interview tool to collect and document medication lists with a high degree of fidelity and accuracy. This study suggests that student pharmacists or trained technicians may be able to collect patient medication lists to facilitate MTM sessions in the community pharmacy setting. Evaluating the sustainability of using student pharmacists or trained technicians to collect medication lists is needed. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  20. Trained student pharmacists’ telephonic collection of patient medication information: Evaluation of a structured interview tool

    PubMed Central

    Margolis, Amanda R.; Martin, Beth A.; Mott, David A.

    2016-01-01

    Objective To determine the feasibility and fidelity of student pharmacists collecting patient medication list information using a structured interview tool and the accuracy of documenting the information. The medication lists were used by a community pharmacist to provide a targeted medication therapy management (MTM) intervention. Design Descriptive analysis of patient medication lists collected via telephone interviews. Participants 10 trained student pharmacists collected the medication lists. Intervention Trained student pharmacists conducted audio-recorded telephone interviews with 80 English-speaking community dwelling older adults using a structured interview tool to collect and document medication lists. Main outcome measures Feasibility was measured using the number of completed interviews, the time student pharmacists took to collect the information, and pharmacist feedback. Fidelity to the interview tool was measured by assessing student pharmacists’ adherence to asking all scripted questions and probes. Accuracy was measured by comparing the audio recorded interviews to the medication list information documented in an electronic medical record. Results On average it took student pharmacists 26.7 minutes to collect the medication lists. The community pharmacist said the medication lists were complete and that having the medication lists saved time and allowed him to focus on assessment, recommendations, and education during the targeted MTM session. Fidelity was high with an overall proportion of asked scripted probes of 83.75% (95%CI: 80.62–86.88%). Accuracy was also high for both prescription (95.1%, 95%CI: 94.3–95.8%) and non-prescription (90.5%, 95%CI: 89.4–91.4%) medications. Conclusion Trained student pharmacists were able to use an interview tool to collect and document medication lists with a high degree of fidelity and accuracy. This study suggests that student pharmacists or trained technicians may be able to collect patient medication lists to facilitate MTM sessions in the community pharmacy setting. Evaluating the sustainability of using student pharmacists or trained technicians to collect medication lists is needed. PMID:27000165

  1. CLASSIFICATION FRAMEWORK FOR DIAGNOSTICS RESEARCH

    EPA Science Inventory

    The goal of Diagnostics Research is to provide tools to simplify diagnosis of the causes of biological impairment, in support of State and Tribe 303(d) impaired waters lists. The Diagnostics Workgroup has developed conceptual models for four major aquatic stressors that cause im...

  2. The Lives Saved Tool (LiST) as a Model for Prevention of Anemia in Women of Reproductive Age.

    PubMed

    Heidkamp, Rebecca; Guida, Renee; Phillips, Erica; Clermont, Adrienne

    2017-11-01

    Background: Anemia in women is a major public health burden worldwide, particularly in low- and middle-income countries (LMICs). It is a complex condition with multiple nutritional and non-nutritional causes, and geographic heterogeneity of burden. The World Health Assembly has set a target of a 50% reduction in anemia among women of reproductive age (WRA) by 2025. Objective: This article seeks to identify the leading causes of anemia among women in LMICs, review the evidence supporting interventions to address anemia in these settings, and ultimately use this information to decide which interventions should be included in the Lives Saved Tool (LiST) model of anemia. It also seeks to examine the link between anemia and cause-specific maternal mortality. Methods: The leading causes of anemia in WRA were inventoried to identify preventive and curative interventions available for implementation at the public health scale. A literature review was then conducted for each identified intervention, as well as for the link between anemia and maternal mortality. Results: The interventions for which data were available fell into the following categories: provision of iron, malaria prevention, and treatment of parasitic infestation. Ultimately, 5 interventions were included in the LiST model for anemia: blanket iron supplementation or fortification, iron and folic acid supplementation in pregnancy, multiple micronutrient supplementation in pregnancy, intermittent preventive treatment of malaria in pregnancy, and household ownership of an insecticide-treated bednet. In addition, anemia was linked in the model with risk of maternal mortality due to hemorrhage. Conclusion: The updated LiST model for anemia reflects the state of the current scientific evidence and should be of use to researchers, program managers, and policymakers who seek to model the impact of scaling up nutrition and health interventions on anemia, and ultimately on maternal mortality. © 2017 American Society for Nutrition.

  3. The impact of eliminating within-country inequality in health coverage on maternal and child mortality: a Lives Saved Tool analysis.

    PubMed

    Clermont, Adrienne

    2017-11-07

    Inequality in healthcare across population groups in low-income countries is a growing topic of interest in global health. The Lives Saved Tool (LiST), which uses health intervention coverage to model maternal, neonatal, and child health outcomes such as mortality rates, can be used to analyze the impact of within-country inequality. Data from nationally representative household surveys (98 surveys conducted between 1998 and 2014), disaggregated by wealth quintile, were used to create a LiST analysis that models the impact of scaling up health intervention coverage for the entire country from the national average to the rate of the top wealth quintile (richest 20% of the population). Interventions for which household survey data are available were used as proxies for other interventions that are not measured in surveys, based on co-delivery of intervention packages. For the 98 countries included in the analysis, 24-32% of child deaths (including 34-47% of neonatal deaths and 16-19% of post-neonatal deaths) could be prevented by scaling up national coverage of key health interventions to the level of the top wealth quintile. On average, the interventions with most unequal coverage rates across wealth quintiles were those related to childbirth in health facilities and to water and sanitation infrastructure; the most equally distributed were those delivered through community-based mass campaigns, such as vaccines, vitamin A supplementation, and bednet distribution. LiST is a powerful tool for exploring the policy and programmatic implications of within-country inequality in low-income, high-mortality-burden countries. An "Equity Tool" app has been developed within the software to make this type of analysis easily accessible to users.
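
    The quantity being scaled in this analysis is the gap between national-average coverage and top-quintile coverage; a minimal sketch with invented quintile values:

    ```python
    # Gap between national-average and top-wealth-quintile coverage,
    # the scale-up target in the analysis. Values are hypothetical.
    coverage_by_quintile = [0.35, 0.45, 0.55, 0.68, 0.85]  # Q1 (poorest)..Q5
    national = sum(coverage_by_quintile) / len(coverage_by_quintile)
    target = coverage_by_quintile[-1]
    print(f"national {national:.0%} -> top-quintile target {target:.0%}")
    ```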

  4. National Nuclear Data Center

    Science.gov Websites

    Site resources include: Sigma retrieval and plotting of nuclear reaction data; nuclear structure and decay data; Nuclear Science References; the Experimental Unevaluated Nuclear Data List; the Evaluated Nuclear Structure Data File; NNDC databases; ground- and isomeric-state properties; the nuclear structure and decay data journal; nuclear reaction model codes; tools and ...

  5. NUTRIENTS IN WATERSHEDS; DEVELOPING ENHANCED MODELING TOOLS

    EPA Science Inventory

    Nutrient enrichment is one of the most detrimental stressors causing water-resource impairment. Of systems surveyed and reported as impaired, 40% of rivers, 51% of lakes, and 57% of estuaries listed nutrients as a primary cause of impairment (USEPA, 1996). In many cases, these ...

  6. Methods and Tools to Align Curriculum to the Skills and Competencies Needed by the Workforce - an Example from Geospatial Science and Technology

    NASA Astrophysics Data System (ADS)

    Johnson, A. B.

    2012-12-01

    Geospatial science and technology (GST), including geographic information systems, remote sensing, global positioning systems and mobile applications, are valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporally and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly and it has been difficult to create effective curriculum as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation supported grant, approved the Geospatial Technology Competency Model (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curriculum. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that helped educators define which competency should be included in a specific course and the depth of instruction for that competency. This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used to create customized curriculum integrating geospatial science and technology into geoscience programs.

  7. Citation Discovery Tools for Conducting Adaptive Meta-analyses to Update Systematic Reviews.

    PubMed

    Bae, Jong-Myon; Kim, Eun Hee

    2016-03-01

    The systematic review (SR) is a research methodology that aims to synthesize related evidence. Updating previously conducted SRs is necessary when new evidence has been produced, but no consensus has yet emerged on the appropriate update methodology. The authors have developed a new SR update method called 'adaptive meta-analysis' (AMA) using the 'cited by', 'similar articles', and 'related articles' citation discovery tools in the PubMed and Scopus databases. This study evaluates the usefulness of these citation discovery tools for updating SRs. Lists were constructed by applying the citation discovery tools in the two databases to the articles analyzed by a published SR. The degree of overlap between the lists and distribution of excluded results were evaluated. The articles ultimately selected for the SR update meta-analysis were found in the lists obtained from the 'cited by' and 'similar' tools in PubMed. Most of the selected articles appeared in both the 'cited by' lists in Scopus and PubMed. The Scopus 'related' tool did not identify the appropriate articles. The AMA, which involves using both citation discovery tools in PubMed, and optionally, the 'related' tool in Scopus, was found to be useful for updating an SR.
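
    The 'cited by' and 'similar articles' lookups behind the AMA method are exposed programmatically through NCBI's E-utilities ELink service; a minimal sketch follows. The endpoint and the pubmed_pubmed / pubmed_pubmed_citedin link names are per current E-utilities documentation, but the response shape should be verified before relying on this.

    ```python
    # 'Cited by' / 'similar articles' lookups via NCBI E-utilities
    # ELink. URL parameters and JSON shape assumed from current
    # E-utilities docs; verify (and add an API key) before real use.
    import json
    from urllib.request import urlopen

    def elink(pmid, linkname):
        url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"
               f"?dbfrom=pubmed&db=pubmed&id={pmid}"
               f"&linkname={linkname}&retmode=json")
        data = json.load(urlopen(url))
        for lsdb in data["linksets"][0].get("linksetdbs", []):
            if lsdb["linkname"] == linkname:
                return lsdb["links"]
        return []

    pmid = "21501453"  # a PMID that appears earlier in this record set
    cited_by = elink(pmid, "pubmed_pubmed_citedin")  # 'cited by'
    similar = elink(pmid, "pubmed_pubmed")           # 'similar articles'
    print(len(cited_by), "citing;", len(similar), "similar")
    ```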

  8. Machine Tool Series. Duty Task List.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    This task list is intended for use in planning and/or evaluating a competency-based course to prepare machine tool, drill press, grinding machine, lathe, mill, and/or power saw operators. The listing is divided into six sections, with each one outlining the tasks required to perform the duties that have been identified for the given occupation.…

  9. Metrics for Identifying Food Security Status and the Population with Potential to Benefit from Nutrition Interventions in the Lives Saved Tool (LiST).

    PubMed

    Jackson, Bianca D; Walker, Neff; Heidkamp, Rebecca

    2017-11-01

    Background: The Lives Saved Tool (LiST) uses the poverty head-count ratio at $1.90/d as a proxy for food security to identify the percentage of the population with the potential to benefit from balanced energy supplementation and complementary feeding (CF) interventions, following the approach used for The Lancet's 2008 series on Maternal and Child Undernutrition. Because much work has been done in the development of food security indicators, a re-evaluation of the use of this indicator was warranted. Objective: The aim was to re-evaluate the use of the poverty head-count ratio at $1.90/d as the food security proxy indicator in LiST. Methods: We carried out a desk review to identify available indicators of food security. We identified 3 indicators and compared them by using scatterplots, Spearman's correlations, and Bland-Altman plot analysis. We generated LiST projections to compare the modeled impact results with the use of the different indicators. Results: There are many food security indicators available, but only 3 additional indicators were identified that meet the data availability requirements to be used as the food security indicator in LiST. As expected, the analyzed food security indicators were significantly positively correlated (P < 0.001), but there was generally poor agreement between them. The disparity between the indicators also increases as the values of the indicators increase. Consequently, the choice of indicator can have a considerable effect on the impact of interventions modeled in LiST, especially in food-insecure contexts. Conclusions: No single indicator was identified that is ideal for measuring the percentage of the population that is food insecure for LiST. Thus, LiST will use the food security indicators that were used in the meta-analyses that produced the effect estimates. These are the poverty head-count ratio at $1.90/d for CF interventions and the prevalence of a low body mass index in women of reproductive age for balanced energy supplementation interventions. © 2017 American Society for Nutrition.
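
    The comparison methods named in the abstract (Spearman's rank correlation plus Bland-Altman agreement analysis) are standard and easy to sketch. The values below are hypothetical country-level indicator pairs, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # hypothetical country-level values (%) for two candidate food security indicators
    poverty_190 = np.array([12.0, 35.5, 60.2, 8.1, 44.0, 71.3, 25.4])
    undernourished = np.array([10.5, 28.0, 49.9, 9.6, 30.2, 52.8, 31.1])

    rho, p = spearmanr(poverty_190, undernourished)
    print(f"Spearman rho = {rho:.2f} (P = {p:.4f})")

    # Bland-Altman analysis: agreement, not just correlation
    diff = poverty_190 - undernourished
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)  # 95% limits of agreement around the bias
    print(f"bias = {bias:.1f} points, "
          f"limits of agreement = [{bias - loa:.1f}, {bias + loa:.1f}]")

    # the abstract's point: indicators can correlate strongly yet still disagree
    # widely, with the disparity growing as indicator values grow
    ```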

  10. Priority List of Research Areas for Radiological Nuclear Threat Countermeasures

    DTIC Science & Technology

    2005-01-01

    promote recovery in animal models (1, 4, 15). G-CSF (filgrastim, Neupogen®), pegylated G-CSF (pegfilgrastim, Neulasta®), GM-CSF (sargramostim, Leukine®)...tools to carefully assess mechanisms of radiation damage, biomarkers for biodosimetry, etc. Primates, dogs, ferrets, mice and non-mammalian species

  11. Forecasting weed distributions using climate data: a GIS early warning tool

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Holcombe, Tracy R.; Barnett, David T.; Stohlgren, Thomas J.; Kartesz, John T.

    2010-01-01

    The number of invasive exotic plant species establishing in the United States is continuing to rise. When prevention of exotic species from entering a country fails at the national level and the species establishes, reproduces, spreads, and becomes invasive, the most successful action at a local level is early detection followed by eradication. We have developed a simple geographic information system (GIS) analysis for developing watch lists for early detection of invasive exotic plants that relies upon currently available species distribution data coupled with environmental data to aid in describing coarse-scale potential distributions. This GIS analysis tool develops environmental envelopes for species based upon the known distribution of a species thought to be invasive, and represents the first approximation of its potential habitat while the necessary data are collected to perform more in-depth analyses. To validate this method we looked at a time series of species distributions for 66 species in Pacific Northwest and northern Rocky Mountain counties. The time series analysis presented here did select counties that the invasive exotic weeds invaded in subsequent years, showing that this technique could be useful in developing watch lists for the spread of particular exotic species. We applied this same habitat-matching model based upon bioclimatic envelopes to 100 invasive exotics with various levels of known distributions within continental U.S. counties. For species with climatically limited distributions, county watch lists describe county-specific vulnerability to invasion. Species with matching habitats in a county would be added to that county's list. These watch lists can inform management decisions for early warning, control prioritization, and targeted research to determine specific locations within vulnerable counties. This tool provides useful information for rapid assessment of the potential distribution of new invasive exotic species based upon climate envelopes of current distributions.
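
    The envelope approach described above amounts to per-variable minimum/maximum bounds computed from the counties a species already occupies. A toy sketch with hypothetical counties and climate values (not the authors' data or code):

    ```python
    import numpy as np

    # per-county climate vectors (e.g., annual precipitation mm, mean temp C);
    # all values are hypothetical
    county_climate = {
        "A": np.array([300.0, 8.5]),
        "B": np.array([450.0, 10.1]),
        "C": np.array([900.0, 15.7]),
        "D": np.array([410.0, 9.2]),
    }
    occupied = ["A", "B"]  # counties where the invasive species is already recorded

    # environmental envelope: per-variable min/max over the known distribution
    env = np.array([county_climate[c] for c in occupied])
    lo, hi = env.min(axis=0), env.max(axis=0)

    # watch list: unoccupied counties whose climate falls inside the envelope
    watch = [c for c, clim in county_climate.items()
             if c not in occupied and np.all(clim >= lo) and np.all(clim <= hi)]
    print("watch list:", watch)  # -> ['D'] for these numbers
    ```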

  12. Natural language processing and inference rules as strategies for updating problem list in an electronic health record.

    PubMed

    Plazzotta, Fernando; Otero, Carlos; Luna, Daniel; de Quiros, Fernan Gonzalez Bernaldo

    2013-01-01

    Physicians do not always keep the problem list accurate, complete and updated. This study analyzed natural language processing (NLP) techniques and inference rules as strategies to maintain the completeness and accuracy of the problem list in EHRs, through a non-systematic literature review in PubMed covering the last 10 years. Strategies to maintain the EHR problem list were analyzed in two directions: entering problems into the problem list and removing problems from it. NLP and inference rules showed acceptable performance for entering problems into the problem list; no studies using these techniques for removing problems had been published. Conclusion: both tools, NLP and inference rules, have had acceptable results for maintaining the completeness and accuracy of the problem list.
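
    As a toy illustration of the inference-rule strategy (the rules and names below are hypothetical, not any system reviewed in the paper), structured EHR facts can be mapped to candidate problems and diffed against the current list:

    ```python
    # hypothetical inference rules: structured EHR facts -> candidate problems
    RULES = {
        ("medication", "metformin"): "diabetes mellitus type 2",
        ("medication", "levothyroxine"): "hypothyroidism",
        ("lab", "HbA1c>=6.5%"): "diabetes mellitus type 2",
    }

    def infer_problems(ehr_facts, problem_list):
        """Suggest problems implied by EHR facts but missing from the problem list."""
        suggested = {RULES[f] for f in ehr_facts if f in RULES}
        return sorted(suggested - set(problem_list))

    facts = [("medication", "metformin"), ("medication", "levothyroxine")]
    print(infer_problems(facts, ["hypothyroidism"]))
    # -> ['diabetes mellitus type 2'] (a candidate for clinician review, not auto-entry)
    ```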

  13. Aligning institutional priorities: engaging house staff in a quality improvement and safety initiative to fulfill Clinical Learning Environment Review objectives and electronic medical record Meaningful Use requirements.

    PubMed

    Flanagan, Meghan R; Foster, Carolyn C; Schleyer, Anneliese; Peterson, Gene N; Mandell, Samuel P; Rudd, Kristina E; Joyner, Byron D; Payne, Thomas H

    2016-02-01

    House staff quality improvement projects are often not aligned with training institution priorities. House staff are the primary users of inpatient problem lists in academic medical centers, and list maintenance has significant patient safety and financial implications. Improvement of the problem list is an important objective for hospitals with electronic health records under the Meaningful Use program. House staff surveys were used to create an electronic problem list manager (PLM) tool enabling efficient problem list updating. Number of new problems added and house staff perceptions of the problem list were compared before and after PLM intervention. The PLM was used by 654 house staff after release. Surveys demonstrated increased problem list updating (P = .002; response rate 47%). Mean new problems added per day increased from 64 pre-PLM to 125 post-PLM (P < .001). This innovative project serves as a model for successful engagement of house staff in institutional quality and safety initiatives with tangible institutional benefits. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Putting the Wheel into Motion: Designing a Career Development Program for University Students

    ERIC Educational Resources Information Center

    Mackie, Barbara; Thomas, Jan

    2005-01-01

    This case study outlines an approach to design a career development program for university students using an adaptation of "the wheel" (Amundson & Poehnell, 2004). Ten elements of the model are listed and some of the key questions, tools and strategies that support each element of the model are highlighted. Its application in a variety of group…

  15. Graphical Tools for Linear Structural Equation Modeling

    DTIC Science & Technology

    2014-06-01

    others. 4Kenny and Milan (2011) write, “Identification is perhaps the most difficult concept for SEM researchers to understand. We have seen SEM...model to using typical SEM software to determine model identifia- bility. Kenny and Milan (2011) list the following drawbacks: (i) If poor starting...the well known recursive and null rules (Bollen, 1989) and the regression rule (Kenny and Milan , 2011). A Simple Criterion for Identifying Individual

  16. Impact of malaria interventions on child mortality in endemic African settings: comparison and alignment between LiST and Spectrum-Malaria model.

    PubMed

    Korenromp, Eline; Hamilton, Matthew; Sanders, Rachel; Mahiané, Guy; Briët, Olivier J T; Smith, Thomas; Winfrey, William; Walker, Neff; Stover, John

    2017-11-07

    In malaria-endemic countries, malaria prevention and treatment are critical for child health. In the context of intervention scale-up and rapid changes in endemicity, projections of intervention impact and optimized program scale-up strategies need to take into account the consequent dynamics of transmission and immunity. The new Spectrum-Malaria program planning tool was used to project the health impacts of Insecticide-Treated mosquito Nets (ITNs) and effective management of uncomplicated malaria cases (CMU), among other interventions, on malaria infection prevalence, case incidence and mortality in children 0-4 years, 5-14 years of age and adults. Spectrum-Malaria uses statistical models fitted to simulations of the dynamic effects of increasing intervention coverage on these burdens as a function of baseline malaria endemicity, seasonality in transmission and malaria intervention coverage levels (estimated for the years 2000 to 2015 by the World Health Organization and Malaria Atlas Project). Spectrum-Malaria projections of proportional reductions in under-five malaria mortality were compared with those of the Lives Saved Tool (LiST) for the Democratic Republic of the Congo and Zambia, for given (standardized) scenarios of ITN and/or CMU scale-up over 2016-2030. Proportional mortality reductions over the first two years following scale-up of ITNs from near-zero baselines to moderately higher coverages align well between LiST and Spectrum-Malaria, as expected, since both models were fitted to cluster-randomized ITN trials of 2-year duration in moderate-to-high-endemic settings. For further scale-up from moderately high ITN coverage to near-universal coverage (as currently relevant for strategic planning in many countries), Spectrum-Malaria predicts smaller additional ITN impacts than LiST, reflecting progressive saturation. For CMU, especially in the longer term (over 2022-2030) and in lower-endemic settings (like Zambia), Spectrum-Malaria projects larger proportional impacts, reflecting onward dynamic effects not fully captured by LiST. Spectrum-Malaria complements LiST by extending the scope of malaria interventions, program packages and health outcomes that can be evaluated for policy making and strategic planning within and beyond the perspective of child survival.

  17. LiST modelling with monitoring data to estimate impact on child mortality of an ORS and zinc programme with public sector providers in Bihar, India.

    PubMed

    Ayyanat, Jayachandran A; Harbour, Catherine; Kumar, Sanjeev; Singh, Manjula

    2018-01-05

    Many interventions have attempted to increase vulnerable and remote populations' access to ORS and zinc to reduce child mortality from diarrhoea. However, the impact of these interventions is difficult to measure. From 2010 to 2015, the Micronutrient Initiative (MI) worked with the public sector in Bihar, India to enable community health workers to treat and report uncomplicated child diarrhoea with ORS and zinc. We describe how we estimated the programme's impact on child mortality with Lives Saved Tool (LiST) modelling and data from MI's management information system (MIS). This study demonstrates that using LiST modelling and MIS data is a viable option for evaluating programmes to reduce child mortality. We used MI's programme monitoring data to estimate coverage rates and LiST modelling software to estimate programme impact on child mortality. Four scenarios estimated the effects of different rates of programme scale-up and programme coverage on estimated child mortality by measuring children's lives saved. The programme saved an estimated 806-975 children under 5 who had diarrhoea during the five-year project phase. Increasing ORS and zinc coverage rates to 19.8% and 18.3%, respectively, under public sector coverage with effective treatment would have increased the programme's impact on child mortality and could have achieved the project goal of saving 4200 children's lives during the five-year programme. Programme monitoring data can be used with LiST modelling software to estimate coverage rates and programme impact on child mortality. This modelling approach may cost less and yield estimates sooner than directly measuring programme impact with population-based surveys. However, users must be cautious about relying on modelled estimates of impact and must ensure that the programme monitoring data used are complete and precise about the programme aspects that are modelled; otherwise, LiST may mis-estimate impact on child mortality. Further, LiST software may require modifications to its built-in assumptions to capture programmatic inputs. LiST assumes that mortality rates and the cause-of-death structure change only in response to changes in programme coverage. In Bihar, overall child mortality has decreased and diarrhoea appears to be less lethal than previously, but at present LiST does not adjust its estimates for these sorts of changes.
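
    LiST's internal engine is more elaborate, but the flavor of the lives-saved arithmetic can be sketched as follows. All numbers (death counts, effectiveness, coverage rates) are placeholders, and the residual-mortality formula is a simplification rather than the actual LiST computation.

    ```python
    # stylized lives-saved arithmetic in the spirit of LiST (not the LiST engine)
    def deaths_averted(cause_deaths, effectiveness, cov_base, cov_new):
        """Deaths averted when treatment coverage rises from cov_base to cov_new.

        Residual mortality is assumed proportional to the fraction of cases not
        effectively treated: 1 - effectiveness * coverage.
        """
        residual_base = 1 - effectiveness * cov_base
        residual_new = 1 - effectiveness * cov_new
        return cause_deaths * (residual_base - residual_new) / residual_base

    # hypothetical inputs: 10,000 child diarrhoea deaths, placeholder effectiveness,
    # ORS coverage rising from 10% to 19.8%
    print(round(deaths_averted(10_000, 0.69, 0.10, 0.198)))  # -> ~726
    ```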

  18. Creating Visual Materials for Multi-Handicapped Deaf Learners.

    ERIC Educational Resources Information Center

    Hack, Carole; Brosmith, Susan

    1980-01-01

    The article describes two groups of visual materials developed for multiply handicapped deaf teenagers. The daily living skills project includes vocabulary lists, visuals, games and a model related to household cleaning, personal grooming, or consumer skills. The occupational information project includes visuals of tools, materials, and clothing…

  19. Assignment: Eco-Friendly Campuses.

    ERIC Educational Resources Information Center

    Calkins, Meg

    2002-01-01

    Discusses how institutions of higher education can use their campus environments as a teaching tool and laboratory for finding solutions to environmental dilemmas and ensure that their campus operations, including the landscape, are exemplary models of environmental practice--even if it means far fewer expanses of lawn. Includes a list of…

  20. Application programs written by using customizing tools of a computer-aided design system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.; Huang, R.; Juricic, D.

    1995-12-31

    Customizing tools of computer-aided design systems have been developed to such a degree that they are equivalent to powerful higher-level programming languages especially suitable for graphics applications. Two examples of application programs written using AutoCAD's customizing tools are given in some detail to illustrate their power. One uses the AutoLISP list-processing language to develop an application program that produces four views of a given solid model. The other uses the AutoCAD Development System, based on program modules written in C, to produce an application program that renders a freehand sketch from a given CAD drawing.

  1. Food Exchange List and Dietary Management of Non-Communicable Diseases in Cultural Perspective.

    PubMed

    Khan, Mahnaz Nasir; Kalsoom, Samia; Khan, Ayyaz Ali

    2017-01-01

    This review focuses on highlighting the importance of the Food Exchange List, in cultural perspective, as an effective dietary tool to help individuals manage dietary modifications related to non-communicable diseases, while specifying measures that can improve the quality of Food Exchange Lists for combating various non-communicable diseases and addressing adherence-related issues with specialized diets. A search was done using PubMed and Google Scholar up to June 2016. Search terms used were food exchange list AND disease, diet AND non-communicable diseases. We included only studies that discussed the Food Exchange List (FEL) in relation to non-communicable diseases, in addition to factors like cultural relevance and adherence. Out of the 837 papers accessed, 57 were identified as relevant to the Food Exchange List, of which 39 focused on the concept and development of the Food Exchange List. Only 18 discussed the FEL in relation to non-communicable diseases and were thus included in the review. The food exchange list is a user-friendly tool for dietary modification due to disease. It may help to customize meals, as it provides information on various food items in different groups. The tool is helpful in reducing blood and plasma glucose levels, maintaining the lipid profile and effectively combating other diet-related diseases, as well as those ailments in which diet plays a significant role in maintenance and prevention of recurrence. However, better management of, and adherence to, modified diets for non-communicable diseases can be ensured by keeping cultural relevance under consideration before using Food Exchange Lists for such diseases.

  2. Machine Tool Technology. Automatic Screw Machine Troubleshooting & Set-Up Training Outlines [and] Basic Operator's Skills Set List.

    ERIC Educational Resources Information Center

    Anoka-Hennepin Technical Coll., Minneapolis, MN.

    This set of two training outlines and one basic skills set list are designed for a machine tool technology program developed during a project to retrain defense industry workers at risk of job loss or dislocation because of conversion of the defense industry. The first troubleshooting training outline lists the categories of problems that develop…

  3. Automation life-cycle cost model

    NASA Technical Reports Server (NTRS)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

    The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and uncertainty is a reality for increasingly complex problems, yet few models explicitly account for its effect on the solution space. The objectives for this effort range from the near term (1-2 years) to the far term (3-5 years). In the near term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: assess desirable capabilities (structured into near- and far-term); identify useful existing models/data; identify parameters for utility analysis; define the tool framework; encode a scenario thread for model validation; and provide a transition path for tool development. This report contains all relevant technical progress made on this contractual effort.

  4. BIM-Based Timber Structures Refurbishment of the Immovable Heritage Listed Buildings

    NASA Astrophysics Data System (ADS)

    Henek, Vladan; Venkrbec, Václav

    2017-12-01

    The use of Building Information Modelling (BIM) design tools is no longer an exception but common practice, and the benefits of BIM for designing new buildings or complex renovations have been published repeatedly. The essence of BIM is to create a multidimensional geometric model of a planned building on a computer, supplemented with the necessary information in advance of the construction process. Refurbishment is a specific process that combines new structures with demolished structures, or structures that must be dismantled, repaired, and then returned to their original position; often these are historically valuable parts of the building. BIM-based repairs and refurbishments, especially complicated repairs of the roof trusses of immovable heritage-listed buildings, have not yet been credibly presented. The use of BIM tools may nevertheless be advantageous in this area, because the user can respond quickly to changes that become necessary during refurbishment, as well as carry out rapid assessment and cost estimation of any unexpected additional works. The paper deals with the use of BIM for repairs and refurbishment of buildings in general, with priority given to heritage-protected elements. The advantage of the proposed approach is demonstrated in a case study of the refurbishment of an immovable heritage-listed truss roof, realized in the Czech Republic. The case study consists of 3D-modelled truss parts and the connected technological workflow, with the project work carried out in one common model environment.

  5. Designing Interference-Robust Wireless Mesh Networks Using a Defender-Attacker-Defender Model

    DTIC Science & Technology

    2015-02-01

    solution does not provide more network flow than the undefended attacker's solution. (However, our tool stores alternate, runner-up solutions that often...approximate real WMNs. List of references: Alderson, D.L., Brown, G.G., & Carlyle, W.M. (2014). Assessing and improving operational resilience

  6. Design of Training Systems, Phase II Report, Volume III; Model Program Descriptions and Operating Procedures. TAEG Report No. 12-2.

    ERIC Educational Resources Information Center

    Naval Training Equipment Center, Orlando, FL. Training Analysis and Evaluation Group.

    The Design of Training Systems (DOTS) project was initiated by the Department of Defense (DOD) to develop tools for the effective management of military training organizations. Volume 3 contains the model and data base program descriptions and operating procedures designed for phase 2 of the project. Flow charts and program listings for the…

  7. Design of an integrated airframe/propulsion control system architecture

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Lee, C. William; Strickland, Michael J.

    1990-01-01

    The design of an integrated airframe/propulsion control system architecture is described. The design is based on a prevalidation methodology that used both reliability and performance tools. An account is given of the motivation for the final design and problems associated with both reliability and performance modeling. The appendices contain a listing of the code for both the reliability and performance model used in the design.

  8. The Chinchilla Research Resource Database: resource for an otolaryngology disease model

    PubMed Central

    Shimoyama, Mary; Smith, Jennifer R.; De Pons, Jeff; Tutaj, Marek; Khampang, Pawjai; Hong, Wenzhou; Erbe, Christy B.; Ehrlich, Garth D.; Bakaletz, Lauren O.; Kerschner, Joseph E.

    2016-01-01

    The long-tailed chinchilla (Chinchilla lanigera) is an established animal model for diseases of the inner and middle ear, among others. In particular, chinchilla is commonly used to study diseases involving viral and bacterial pathogens and polymicrobial infections of the upper respiratory tract and the ear, such as otitis media. The value of the chinchilla as a model for human diseases prompted the sequencing of its genome in 2012 and the more recent development of the Chinchilla Research Resource Database (http://crrd.mcw.edu) to provide investigators with easy access to relevant datasets and software tools to enhance their research. The Chinchilla Research Resource Database contains a complete catalog of genes for chinchilla and, for comparative purposes, human. Chinchilla genes can be viewed in the context of their genomic scaffold positions using the JBrowse genome browser. In contrast to the corresponding records at NCBI, individual gene reports at CRRD include functional annotations for Disease, Gene Ontology (GO) Biological Process, GO Molecular Function, GO Cellular Component and Pathway assigned to chinchilla genes based on annotations from the corresponding human orthologs. Data can be retrieved via keyword and gene-specific searches. Lists of genes with similar functional attributes can be assembled by leveraging the hierarchical structure of the Disease, GO and Pathway vocabularies through the Ontology Search and Browser tool. Such lists can then be further analyzed for commonalities using the Gene Annotator (GA) Tool. All data in the Chinchilla Research Resource Database is freely accessible and downloadable via the CRRD FTP site or using the download functions available in the search and analysis tools. The Chinchilla Research Resource Database is a rich resource for researchers using, or considering the use of, chinchilla as a model for human disease. Database URL: http://crrd.mcw.edu PMID:27173523

  9. Core Journal Lists: Classic Tool, New Relevance

    ERIC Educational Resources Information Center

    Paynter, Robin A.; Jackson, Rose M.; Mullen, Laura Bowering

    2010-01-01

    Reviews the historical context of core journal lists, current uses in collection assessment, and existing methodologies for creating lists. Outlines two next generation core list projects developing new methodologies and integrating novel information/data sources to improve precision: a national-level core psychology list and the other a local…

  10. An Object-Oriented Python Implementation of an Intermediate-Level Atmospheric Model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.

    2008-12-01

    The Neelin-Zeng Quasi-equilibrium Tropical Circulation Model (QTCM1) is a Fortran-based intermediate-level atmospheric model that includes simplified treatments of several physical processes, including a GCM-like convective scheme and a land-surface scheme with representations of different surface types, evaporation, and soil moisture. This model has been used in studies of the Madden-Julian oscillation, ENSO, and vegetation-atmosphere interaction effects on climate. Through the assumption of convective quasi-equilibrium in the troposphere, the QTCM1 is able to include full nonlinearity, resolve baroclinic disturbances, and generate a reasonable climatology, all at low computational cost. One year of simulation on a PC at 5.625 × 3.75 degree longitude-latitude resolution takes under three minutes of wall-clock time. The Python package qtcm implements the QTCM1 in a mixed-language environment that retains the speed of compiled Fortran while providing the benefits of Python's object-oriented framework and robust suite of utilities and datatypes. We describe key programming constructs used to create this modeling environment: the decomposition of model runs into Python objects, providing methods so visualization tools are attached to model runs, and the use of Python's mutable datatypes (lists and dictionaries) to implement the "run list" entity, which enables total runtime control of subroutine execution order and content. The result is an interactive modeling environment where the traditional sequence of "hypothesis → modeling → visualization and analysis" is opened up and made nonlinear and flexible. In this environment, science tasks such as parameter-space exploration and testing alternative parameterizations can be easily automated, without the need for multiple versions of the model code interacting with a bevy of makefiles and shell scripts. The environment also simplifies interfacing of the atmospheric model to other models (e.g., hydrologic models, statistical models) and analysis tools. The tools developed for this package can be adapted to create similar environments for hydrologic models.
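
    The "run list" idea, subroutine execution order and content held in an ordinary mutable Python list, can be sketched in a few lines. This is a minimal illustration of the concept, not the actual qtcm API:

    ```python
    # a "step" executes whatever the run list currently contains, in order
    def advection(state):  state["log"].append("advection")
    def convection(state): state["log"].append("convection")
    def radiation(state):  state["log"].append("radiation")

    class Model:
        def __init__(self, run_list):
            self.run_list = run_list          # plain Python list: editable at runtime

        def step(self, state):
            for subroutine in self.run_list:  # execution order == list order
                subroutine(state)

    state = {"log": []}
    m = Model([advection, convection, radiation])
    m.step(state)

    # runtime control: reorder or swap parameterizations without recompiling
    m.run_list = [convection, advection]      # alternative ordering for an experiment
    m.step(state)
    print(state["log"])
    # -> ['advection', 'convection', 'radiation', 'convection', 'advection']
    ```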

  11. [Elaboration and validation of a tool to measure psychological well-being: WBMMS].

    PubMed

    Massé, R; Poulin, C; Dassa, C; Lambert, J; Bélair, S; Battaglini, M A

    1998-01-01

    Psychological well-being scales used in epidemiologic surveys usually show high construct validity. The content validation, however, is less convincing since these scales rest on lists of items that reflect the theoretical model of the authors. In this study we present results of the construct and criterion validation of a new Well-Being Manifestations Measure Scale (WBMMS) founded on an initial list of manifestations derived from an original content validation in a general population. It is concluded that national and public health epidemiologic surveys should include both measures of positive and negative mental health.

  12. MPI_XSTAR: MPI-based parallelization of XSTAR program

    NASA Astrophysics Data System (ADS)

    Danehkar, A.

    2017-12-01

    MPI_XSTAR parallelizes execution of multiple XSTAR runs using Message Passing Interface (MPI). XSTAR (ascl:9910.008), part of the HEASARC's HEAsoft (ascl:1408.004) package, calculates the physical conditions and emission spectra of ionized gases. MPI_XSTAR invokes XSTINITABLE from HEASoft to generate a job list of XSTAR commands for given physical parameters. The job list is used to make directories in ascending order, where each individual XSTAR is spawned on each processor and outputs are saved. HEASoft's XSTAR2TABLE program is invoked upon the contents of each directory in order to produce table model FITS files for spectroscopy analysis tools.
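
    The dispatch pattern described (a precomputed job list striped across MPI processes, with one output directory per run) might look like the following mpi4py sketch; the commands are placeholder strings, not real XSTAR invocations:

    ```python
    # illustrative round-robin dispatch of a job list across MPI ranks,
    # in the spirit of MPI_XSTAR
    import os
    import subprocess
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    job_list = [f"echo run_{i:03d}" for i in range(16)]  # stand-ins for XSTAR commands

    for i in range(rank, len(job_list), size):           # rank-strided assignment
        workdir = f"job_{i:03d}"                          # one directory per run
        os.makedirs(workdir, exist_ok=True)
        with open(os.path.join(workdir, "log.txt"), "w") as log:
            subprocess.run(job_list[i], shell=True, stdout=log, check=True)

    comm.Barrier()                                        # wait before post-processing
    if rank == 0:
        print("all runs finished; a table-building step would follow here")
    ```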

  13. An Analysis of Physician Assistant LibGuides: A Tool for Collection Development.

    PubMed

    Johnson, Catherine V; Johnson, Scott Y

    2017-01-01

    The Physician Assistant (PA) specialty encompasses many subject areas and requires many types of library resources. An analysis of PA LibGuides was performed to determine the most frequently recommended resources. A sample of LibGuides from U.S. institutions accredited by the Accreditation Review Commission on Education for the Physician Assistant (ARC-PA) was included in this study. Resources presented on the guides were tabulated and organized by resource type. Databases and point-of-care tools were the resource types listed by the most LibGuides. Over 1,000 books were listed on the 45 guides, including over 600 unique titles; journals were fewer, at 163. Overall, while the 45 LibGuides evaluated list many unique resources in each category, a librarian can create an accepted list of the most frequently listed resources from the data gathered.

  14. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model to produce probabilistic estimates, how risk is addressed, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists to detailed WBS-based risk estimates to the Defect Detection and Prevention (DDP) tool.

  15. POWERFUL NEW TOOLS FOR ANALYZING ENVIRONMENTAL CONTAMINATION: MASS PEAK PROFILING FROM SELECTED-ION RECORDING DATA AND A PROFILE GENERATION MODEL

    EPA Science Inventory

    Capillary gas chromatography with mass spectrometric detection is the most commonly used technique for analyzing samples from Superfund sites. While the U.S. EPA has developed target lists of compounds for which library mass spectra are available on most mass spectrometer data s...

  16. Modeling Computer Communication Networks in a Realistic 3D Environment

    DTIC Science & Technology

    2010-03-01

    Front-matter excerpt only: the table of contents (including a section comparing visualization tools), the list of abbreviations (e.g., 2D, two-dimensional), and references such as National Defense and the Canadian Forces, "Joint Fires Support", http://www.cfd-cdf.forces.gc.ca/sites/page...

  17. Comparing estimates of child mortality reduction modelled in LiST with pregnancy history survey data for a community-based NGO project in Mozambique

    PubMed Central

    2011-01-01

    Background There is a growing body of evidence that integrated packages of community-based interventions, a form of programming often implemented by NGOs, can have substantial child mortality impact. More countries may be able to meet Millennium Development Goal (MDG) 4 targets by leveraging such programming. Analysis of the mortality effect of this type of programming is hampered by the cost and complexity of direct mortality measurement. The Lives Saved Tool (LiST) produces an estimate of mortality reduction by modelling the mortality effect of changes in population coverage of individual child health interventions. However, few studies to date have compared the LiST estimates of mortality reduction with those produced by direct measurement. Methods Using results of a recent review of evidence for community-based child health programming, a search was conducted for NGO child health projects implementing community-based interventions that had independently verified child mortality reduction estimates, as well as population coverage data for modelling in LiST. One child survival project fit inclusion criteria. Subsequent searches of the USAID Development Experience Clearinghouse and Child Survival Grants databases and interviews of staff from NGOs identified no additional projects. Eight coverage indicators, covering all the project’s technical interventions were modelled in LiST, along with indicator values for most other non-project interventions in LiST, mainly from DHS data from 1997 and 2003. Results The project studied was implemented by World Relief from 1999 to 2003 in Gaza Province, Mozambique. An independent evaluation collecting pregnancy history data estimated that under-five mortality declined 37% and infant mortality 48%. Using project-collected coverage data, LiST produced estimates of 39% and 34% decline, respectively. Conclusions LiST gives reasonably accurate estimates of infant and child mortality decline in an area where a package of community-based interventions was implemented. This and other validation exercises support use of LiST as an aid for program planning to tailor packages of community-based interventions to the epidemiological context and for project evaluation. Such targeted planning and assessments will be useful to accelerate progress in reaching MDG4 targets. PMID:21501454

  18. Office-Based Tools and Primary Care Visit Communication, Length, and Preventive Service Delivery.

    PubMed

    Lafata, Jennifer Elston; Shay, L Aubree; Brown, Richard; Street, Richard L

    2016-04-01

    The use of physician office-based tools such as electronic health records (EHRs), health risk appraisal (HRA) instruments, and written patient reminder lists is encouraged to support efficient, high-quality, patient-centered care. We evaluate the association of exam room use of EHRs, HRA instruments, and self-generated written patient reminder lists with patient-physician communication behaviors, recommended preventive health service delivery, and visit length. Observational study of 485 office visits with 64 primary care physicians practicing in a health system serving the Detroit metropolitan area. Study data were obtained from patient surveys, direct observation, office visit audio-recordings, and automated health system records. Outcome measures included visit length in minutes, patient use of active communication behaviors, physician use of supportive talk and partnership-building communication behaviors, and the percentage of delivered guideline-recommended preventive health services for which patients are eligible and due. Simultaneous linear regression models were used to evaluate associations between tool use and outcomes. Adjusted models controlled for patient characteristics, physician characteristics, characteristics of the relationship between the patient and physician, and characteristics of the environment in which the visit took place. Prior to adjusting for other factors, visits in which the EHR was used were on average significantly (p < .05) longer (27.6 vs. 23.8 minutes) and contained fewer preventive services for which patients were eligible and due (56.5 percent vs. 62.7 percent) compared to those without EHR use. Written patient reminder lists were also significantly associated with longer visits (30.0 vs. 26.5 minutes) and less use of physician communication behaviors facilitating patient involvement (2.1 vs. 2.6 occurrences), but more use of active patient communication behaviors (4.4 vs. 2.6). Likewise, HRA use was significantly associated with increased preventive services delivery (62.1 percent vs. 57.0 percent). All relationships remained significant (p < .05) in adjusted models, with the exception of that between HRA use and preventive service delivery. Office-based tools intended to facilitate the implementation of desired primary care practice redesign are associated with both positive and negative cost and quality outcomes. Findings highlight the need for monitoring both intended and unintended consequences of office-based tools commonly used in primary care practice redesign. © Health Research and Educational Trust.

  19. The ATS Web Page Provides "Tool Boxes" for: Access Opportunities, Performance, Interfaces, Volume, Environments, "Wish List" Entry and Educational Outreach

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This viewgraph presentation gives an overview of the Access to Space website, including information on the 'tool boxes' available on the website for access opportunities, performance, interfaces, volume, environments, 'wish list' entry, and educational outreach.

  20. Constructing and Modifying Sequence Statistics for relevent Using informR in R

    PubMed Central

    Marcum, Christopher Steven; Butts, Carter T.

    2015-01-01

    The informR package greatly simplifies the analysis of complex event histories in R by providing user-friendly tools to build sufficient statistics for the relevent package. Historically, building sufficient statistics to model event sequences (of the form a→b) using the egocentric generalization of Butts' (2008) relational event framework for modeling social action has been cumbersome. The informR package simplifies the construction of the complex list of arrays needed by the rem() model-fitting function for a variety of cases involving egocentric event data, multiple event types, and/or support constraints. This paper introduces these tools using examples from real data extracted from the American Time Use Survey. PMID:26185488

  1. Power Pack Your Center Brochure--Keys To Developing an Effective Marketing Tool.

    ERIC Educational Resources Information Center

    Wassom, Julie

    2000-01-01

    Suggests that a center's informational brochure can be either a valuable marketing tool or an expensive elimination incentive. Provides a list of five questions center directors should address before beginning the design and development of the center brochure. Lists seven guidelines for brochure development. (SD)

  2. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with simple systems, such as a water phantom alone. Since particle beams undergo transport, nuclear interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained from our optimized parameter list showed different characteristics from the results obtained with a simple system. This led to the conclusion that the physical models, particle transport mechanics and different geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.
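
    One way to carry out the kind of range comparison described is to extract a range metric from each percentage depth dose curve. The sketch below uses toy Gaussian depth-dose curves and the distal 80% depth (R80), one common operational definition of proton range, which may differ from the metric used in the study:

    ```python
    import numpy as np

    # hypothetical PDD curves on a common depth grid (cm); Gaussians are toys,
    # not realistic proton depth-dose shapes
    depth = np.linspace(0, 20, 201)
    pdd_ref = np.exp(-((depth - 15.0) / 3.0) ** 2) * 100   # stand-in "FLUKA" curve
    pdd_test = np.exp(-((depth - 15.2) / 3.0) ** 2) * 100  # stand-in "GATE/PHITS" curve

    def distal_r80(depth, pdd):
        """Depth where the distal falloff crosses 80% of the maximum dose."""
        i_max = pdd.argmax()
        d, p = depth[i_max:], pdd[i_max:]
        j = np.argmax(p <= 80.0)  # first distal sample at or below 80%
        # linear interpolation between the two bracketing samples
        return np.interp(80.0, [p[j], p[j - 1]], [d[j], d[j - 1]])

    shift = distal_r80(depth, pdd_test) - distal_r80(depth, pdd_ref)
    print(f"range shift: {shift:.2f} cm")
    print(f"max PDD difference: {np.max(np.abs(pdd_test - pdd_ref)):.1f} %")
    ```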

  3. Protecting Endangered Species: Do the Main Legislative Tools Work?

    PubMed Central

    Gibbs, Katherine E.; Currie, David J.

    2012-01-01

    It is critical to assess the effectiveness of the tools used to protect endangered species. The main tools enabled under the U.S. Endangered Species Act (ESA) to promote species recovery are funding, recovery plan development and critical habitat designation. Earlier studies sometimes found that statistically significant effects of these tools could be detected, but they have not answered the question of whether the effects were large enough to be biologically meaningful. Here, we ask: how much does the recovery status of ESA-listed species improve with the application of these tools? We used species' status reports to Congress from 1988 to 2006 to quantify two measures of recovery for 1179 species. We related these to the amount of federal funding, years with a recovery plan, years with critical habitat designation, the amount of peer-reviewed scientific information, and time listed. We found that change in recovery status of listed species was, at best, only very weakly related to any of these tools. Recovery was positively related to the number of years listed, years with a recovery plan, and funding; however, these tools combined explain <13% of the variation in recovery status among species. Earlier studies that reported significant effects of these tools did not focus on effect sizes; however, they are in fact similarly small. One must conclude either that these tools are not very effective in promoting species' recovery, or (as we suspect) that species recovery data are so poor that it is impossible to tell whether the tools are effective or not. It is critically important to assess the effectiveness of tools used to promote species recovery; it is therefore also critically important to obtain population status data that are adequate to that task. PMID:22567111

  4. Protecting endangered species: do the main legislative tools work?

    PubMed

    Gibbs, Katherine E; Currie, David J

    2012-01-01

    It is critical to assess the effectiveness of the tools used to protect endangered species. The main tools enabled under the U.S. Endangered Species Act (ESA) to promote species recovery are funding, recovery plan development and critical habitat designation. Earlier studies sometimes found that statistically significant effects of these tools could be detected, but they have not answered the question of whether the effects were large enough to be biologically meaningful. Here, we ask: how much does the recovery status of ESA-listed species improve with the application of these tools? We used species' status reports to Congress from 1988 to 2006 to quantify two measures of recovery for 1179 species. We related these to the amount of federal funding, years with a recovery plan, years with critical habitat designation, the amount of peer-reviewed scientific information, and time listed. We found that change in recovery status of listed species was, at best, only very weakly related to any of these tools. Recovery was positively related to the number of years listed, years with a recovery plan, and funding; however, these tools combined explain <13% of the variation in recovery status among species. Earlier studies that reported significant effects of these tools did not focus on effect sizes; however, they are in fact similarly small. One must conclude either that these tools are not very effective in promoting species' recovery, or (as we suspect) that species recovery data are so poor that it is impossible to tell whether the tools are effective or not. It is critically important to assess the effectiveness of tools used to promote species recovery; it is therefore also critically important to obtain population status data that are adequate to that task.

  5. Federation Development and Execution Process (FEDEP) Tools in Support of NATO Modelling & Simulation (M&S) Programmes (Des outils d’aide au processus de développement et d’exécution de fédérations (FEDEP))

    DTIC Science & Technology

    2004-05-01

    currently contains 79 tools, and others should be added as they become known. Finally, the Task Group has recommended that the tool list be made available...approach and analysis. Conclusions and recommendations are contained in Chapter 5. ...generation, Version 1.5 [A.3-1], was created in December 1999 and contained only minor editorial changes. ...With this

  6. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  7. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Astrophysics Data System (ADS)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  8. An analysis of three levels of scaled-up coverage for 28 interventions to avert stillbirths and maternal, newborn and child mortality in 27 countries in Latin America and the Caribbean with the Lives Saved Tool (LiST).

    PubMed

    Arnesen, Lauren; O'Connell, Thomas; Brumana, Luisa; Durán, Pablo

    2016-07-22

    Action to avert maternal and child mortality was propelled by the Millennium Development Goals (MDGs) in 2000. The Latin American and Caribbean (LAC) region has shown promise in achieving the MDGs in many countries, but preventable maternal, neonatal and child mortality persists. Furthermore, preventable stillbirths are occurring in large numbers in the region. While an effective set of maternal, newborn and child health (MNCH) interventions has been identified, these interventions have not been brought to scale across LAC. Baseline data for select MNCH interventions for 27 LAC countries that are included in the Lives Saved Tool (LiST) were verified and updated with survey data. Three LiST projections were built for each country: baseline, MDG-focused, and All Included, each scaling up a progressively larger set of interventions for 2015-2030. Impact was assessed for 2015-2035, comparing annual and total lives saved, as projected by LiST. Across the 27 countries, 235,532 stillbirths and 752,588 neonatal, 959,393 under-five, and 60,858 maternal deaths would be averted between 2015 and 2035 by implementing the All-Included intervention package, representing 67%, 616%, 807% and 101% more lives saved, respectively, than with the MDG-focused interventions. Of the neonatal deaths averted with the All-Included intervention package, 25% would be due to asphyxia, 42% to prematurity and 24% to sepsis. Our modelling suggests a 337% increase in the number of lives saved, which would have enormous impacts on population health. Further research could help clarify the impacts of a comprehensive scale-up of the full range of essential MNCH interventions we have modelled.

  9. An Observational Study to Evaluate the Usability and Intent to Adopt an Artificial Intelligence-Powered Medication Reconciliation Tool.

    PubMed

    Long, Ju; Yuan, Michael Juntao; Poonawala, Robina

    2016-05-16

    Medication reconciliation (the process of creating an accurate list of all medications a patient is taking) is a widely practiced procedure to reduce medication errors. It is mandated by the Joint Commission and reimbursed by Medicare. Yet, in practice, medication reconciliation is often not effective owing to knowledge gaps in the team. A promising approach to improve medication reconciliation is to incorporate artificial intelligence (AI) decision support tools into the process to engage patients and bridge the knowledge gap. The aim of this study was to improve the accuracy and efficiency of medication reconciliation by engaging the patient, the nurse, and the physician as a team via an iPad tool. With assistance from the AI agent, the patient reviews his or her own medication list from the electronic medical record (EMR) and annotates changes, before reviewing it together with the physician and making decisions on the shared iPad screen. In this study, we developed iPad-based software tools, with AI decision support, to engage patients in "self-service" medication reconciliation and then share the annotated reconciled list with the physician. To evaluate the software tool's user interface and workflow, a small number of patients (10) in a primary care clinic were recruited and observed through the whole process during a pilot study. The patients were surveyed about the tool's usability afterward. All patients were able to complete the medication reconciliation process correctly. Every patient found at least one error or other issue with their EMR medication lists. All of them reported that the tool was easy to use, and 8 of 10 patients reported that they would use the tool in the future. However, few patients interacted with the learning modules in the tool. The physician and nurses reported the tool to be easy to use, easy to integrate into the existing workflow, and potentially time-saving. We have developed a promising tool for a new approach to medication reconciliation. It has the potential to create more accurate medication lists faster, while better informing patients about their medications and reducing the burden on clinicians.
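
    The core reconciliation step, comparing the EMR list against the patient-reported list, can be sketched as a normalize-and-diff pass. This toy code is not the study's AI agent, which would need to handle doses, synonyms, and brand/generic mappings far more robustly:

    ```python
    # toy reconciliation pass: normalize names, then classify each medication
    # as confirmed, EMR-only, or patient-only for clinician review
    def normalize(name: str) -> str:
        return name.strip().lower()

    def reconcile(emr_list, patient_list):
        emr = {normalize(m) for m in emr_list}
        patient = {normalize(m) for m in patient_list}
        return {
            "confirmed": sorted(emr & patient),
            "possibly_discontinued": sorted(emr - patient),  # in EMR, not reported
            "possibly_missing": sorted(patient - emr),       # reported, not in EMR
        }

    result = reconcile(
        ["Metformin 500mg", "Lisinopril 10mg", "Atorvastatin 20mg"],
        ["metformin 500mg", "atorvastatin 20mg", "Aspirin 81mg"],
    )
    for bucket, meds in result.items():
        print(bucket, "->", meds)
    ```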

  10. Three-Dimensional Sensor Common Operating Picture (3-D Sensor COP)

    DTIC Science & Technology

    2017-01-01

    created. Additionally, a 3-D model of the sensor itself can be created. Using these 3-D models, along with emerging virtual and augmented reality tools... (The remainder of the excerpt consists of report form and front matter; the contents list an introduction, the 3-D Sensor COP, virtual sensor placement, conclusions, and references.)

  11. Bayesian networks for satellite payload testing

    NASA Astrophysics Data System (ADS)

    Przytula, Krzysztof W.; Hagen, Frank; Yung, Kar

    1999-11-01

    Satellite payloads are rapidly increasing in complexity, resulting in commensurate growth in the cost of manufacturing and operation. A need exists for a software tool that would assist engineers in the production and operation of satellite systems. We have designed and implemented a software tool that performs part of this task. The tool aids a test engineer in debugging satellite payloads during system testing. At this stage of satellite integration and testing, both the tested payload and the testing equipment are complicated systems consisting of a very large number of components and devices. When an error is detected during execution of a test procedure, the tool presents the engineer with a ranked list of potential sources of the error and a list of recommended further tests. On this basis, the engineer decides whether to perform some of the recommended additional tests or to replace the suspect component. The tool has been installed in a payload testing facility. The tool is based on Bayesian networks, a graphical method of representing uncertainty in terms of probabilistic influences. The Bayesian network was configured using detailed flow diagrams of the testing procedures and block diagrams of the payload and testing hardware. The conditional and prior probability values were initially obtained from experts and refined in later stages of design. The Bayesian network provides a very informative model of the payload and testing equipment and has inspired many new ideas regarding future test procedures and testing equipment configurations. The tool is the first step in developing a family of tools for various phases of satellite integration and operation.
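
    The ranked-list behaviour can be illustrated with a single-fault Bayesian update over a handful of components. The component names, priors, and likelihoods below are invented, and the real payload network is far larger and richer than a one-step Bayes rule:

    ```python
    # tiny diagnosis sketch: posterior over components given a failed test,
    # assuming exactly one of these components is at fault
    priors = {                       # hypothetical prior fault probabilities
        "test_cable": 0.05,
        "payload_receiver": 0.02,
        "ground_transmitter": 0.01,
    }
    likelihood = {                   # P(observed test fails | component faulty)
        "test_cable": 0.9,
        "payload_receiver": 0.8,
        "ground_transmitter": 0.3,
    }

    evidence = sum(priors[c] * likelihood[c] for c in priors)
    posterior = {c: priors[c] * likelihood[c] / evidence for c in priors}

    # ranked list of potential error sources, as presented to the test engineer
    for comp, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
        print(f"{comp}: {p:.2f}")
    ```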

  12. Ascent/Descent Software

    NASA Technical Reports Server (NTRS)

    Brown, Charles; Andrew, Robert; Roe, Scott; Frye, Ronald; Harvey, Michael; Vu, Tuan; Balachandran, Krishnaiyer; Bly, Ben

    2012-01-01

    The Ascent/Descent Software Suite has been used to support a variety of NASA Shuttle Program mission planning and analysis activities, such as range safety, on the Integrated Planning System (IPS) platform. The Ascent/Descent Software Suite, containing Ascent Flight Design (ASC)/Descent Flight Design (DESC) configuration items (CIs), lifecycle documents, and data files used for shuttle ascent and entry modeling analysis and mission design, resides on IPS/Linux workstations. A list of tools in the Navigation (NAV)/Prop Software Suite represents tool versions established during or after the IPS Equipment Rehost-3 project.

  13. Common Core State Standards: Implementation Tools and Resources

    ERIC Educational Resources Information Center

    Council of Chief State School Officers, 2013

    2013-01-01

    The Council of Chief State School Officers (CCSSO or the Council) developed this list of free tools and resources to support state education agencies, districts, and educators during the process of implementing the Common Core State Standards (CCSS). This document primarily lists resources developed by CCSSO and other leading organizations and is…

  14. Application of the Web-based Interspecies Correlation Estimation (Web-ICE) tool to assess risks of national pesticide registrations to federally listed (threatened and endangered) species

    EPA Science Inventory

    The National Academy of Science (NAS) recently recommended exploration of predictive tools, such as interspecies correlation estimation (ICE), to estimate acute toxicity values for listed species and support development of species sensitivity distributions (SSDs). We explored the...

  15. Stationary Engineers Apprenticeship. Related Training Modules. 4.1-4.5 Tools.

    ERIC Educational Resources Information Center

    Lane Community Coll., Eugene, OR.

    This packet of five learning modules on tools is one of 20 such packets developed for apprenticeship training for stationary engineers. Introductory materials are a complete listing of all available modules and a supplementary reference list. Each module contains some or all of these components: a lesson goal, performance indicators, study guide…

  16. Scheduling algorithm for mission planning and logistics evaluation users' guide

    NASA Technical Reports Server (NTRS)

    Chang, H.; Williams, J. M.

    1976-01-01

    The scheduling algorithm for mission planning and logistics evaluation (SAMPLE) program is a mission planning tool composed of three subsystems: the mission payloads subsystem (MPLS), which generates a list of feasible combinations from a payload model for a given calendar year; GREEDY, a heuristic model used to find the best traffic model; and the operations simulation and resources scheduling subsystem (OSARS), which determines traffic model feasibility for available resources. SAMPLE provides the user with options to allow the execution of MPLS, GREEDY, GREEDY-OSARS, or MPLS-GREEDY-OSARS.
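
    GREEDY's internals are not given in the abstract, but a value-density greedy heuristic of the kind commonly used for such selection problems can be sketched as follows; the candidate combinations, values, and resource budget are all invented for illustration and do not reflect SAMPLE's actual heuristic.

        # Hypothetical greedy selection of payload combinations under a single
        # resource budget (illustrative only; not SAMPLE's actual algorithm).
        candidates = [
            ("combo_A", 9.0, 3.0),   # (name, mission value, resource cost)
            ("combo_B", 7.5, 2.0),
            ("combo_C", 4.0, 2.5),
            ("combo_D", 3.0, 1.0),
        ]
        budget = 5.0

        traffic_model = []
        # Consider candidates in order of value per unit of resource consumed.
        for name, value, cost in sorted(candidates, key=lambda c: c[1] / c[2],
                                        reverse=True):
            if cost <= budget:
                traffic_model.append(name)
                budget -= cost

        print(traffic_model)  # ['combo_B', 'combo_A'] with the numbers above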

  17. FTire and puzzling tyre physics: teacher, not student

    NASA Astrophysics Data System (ADS)

    Gipser, Michael

    2016-04-01

    By means of some instructive examples, the contribution shows how even complex phenomena and relations in tyre physics are better understood by using a physics-based tyre simulation model like FTire. In contrast to approximation-based phenomenological models, such an approach will give insight into, rather than requiring description of, the tyre's behaviour. Examples studied here comprise:
    * predicted influence of wheel load, inflation pressure, camber angle, and slow rolling speed on parking torque;
    * predicted influence of inflation pressure on cornering stiffness and pneumatic trail;
    * relaxation length: ramping up and down slip angle and wheel load;
    * handling characteristics on very rough roads;
    * a strange phenomenon: cleats that 'attract' a tyre.
    Related to these studies, user-friendly simulation tools on the basis of FTire are introduced, which help in understanding the above-mentioned complex tyre properties. One of these tools, valuable both in teaching and for vehicle/tyre dynamics experts in industry and research, allows the user to interactively modify tyre geometry, material data, and operating conditions during a running simulation. The impact of these variations on tyre forces and moments as well as on internal tyre states can be seen directly in a running animation and later analysed with a large variety of post-processing tools. Animations for all case studies are available for download on http://www.cosin.eu/animations. All registered trademarks used here are properties of their respective owners.

  18. Modernizing the World Health Organization List of Essential Medicines for Preventing and Controlling Cardiovascular Diseases.

    PubMed

    Kishore, Sandeep P; Blank, Evan; Heller, David J; Patel, Amisha; Peters, Alexander; Price, Matthew; Vidula, Mahesh; Fuster, Valentin; Onuma, Oyere; Huffman, Mark D; Vedanthan, Rajesh

    2018-02-06

    The World Health Organization (WHO) Model List of Essential Medicines (EML) is a key tool for improving global access to medicines for all conditions, including cardiovascular diseases (CVDs). The WHO EML is used by member states to determine their national essential medicine lists and policies and to guide procurement of medicines in the public sector. Here, we describe our efforts to modernize the EML for global CVD prevention and control. We review the recent history of applications to add, delete, and change indications for CVD medicines, with the aim of aligning the list with contemporary clinical practice guidelines. We have identified 4 issues that affect decisions for the EML and may strengthen future applications: 1) cost and cost-effectiveness; 2) presence in clinical practice guidelines; 3) feedback loops; and 4) community engagement. We share our lessons to stimulate others in the global CVD community to embark on similar efforts. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  19. "My Heart Die in Me": Idioms of Distress and the Development of a Screening Tool for Mental Suffering in Southeast Liberia.

    PubMed

    Fabian, Katrin; Fannoh, Josiah; Washington, George G; Geninyan, Wilfred B; Nyachienga, Bethuel; Cyrus, Garmai; Hallowanger, Joyce N; Beste, Jason; Rao, Deepa; Wagenaar, Bradley H

    2018-05-04

    The integration of culturally salient idioms of distress into mental healthcare delivery is essential for effective screening, diagnosis, and treatment. This study systematically explored idioms, explanatory models, and conceptualizations in Maryland County, Liberia to develop a culturally-resonant screening tool for mental distress. We employed a sequential mixed-methods process of: (1) free-lists and semi-structured interviews (n = 20) and patient chart reviews (n = 315); (2) pile-sort exercises (n = 31); and (3) confirmatory focus group discussions (FGDs) (n = 3) from June to December 2017. Free-lists identified 64 idioms of distress, 36 of which were eliminated because they were poorly understood, stigmatizing, irrelevant, or redundant. The remaining 28 terms were used in pile-sort exercises to visualize the interrelatedness of idioms. Confirmatory FGDs occurred before and after the pile-sort exercise to explain findings. Four categories of idioms resulted, the most substantial of which included terms related to the heart and to the brain/mind. The final screening tool took into account 11 idioms and 6 physical symptoms extracted from patient chart reviews. This study provides the framework for culturally resonant mental healthcare by cataloguing language around mental distress and designing an emic screening tool for validation in a clinical setting.

  20. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and in regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. Our aim was to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review on the basis of either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  1. Enrichr: interactive and collaborative HTML5 gene list enrichment analysis tool

    PubMed Central

    2013-01-01

    Background System-wide profiling of genes and proteins in mammalian cells produces lists of differentially expressed genes/proteins that need to be further analyzed for their collective functions in order to extract new knowledge. Once unbiased lists of genes or proteins are generated from such experiments, these lists are used as input for computing enrichment with existing lists created from prior knowledge organized into gene-set libraries. While many enrichment analysis tools and gene-set library databases have been developed, there is still room for improvement. Results Here, we present Enrichr, an integrative web-based and mobile software application that includes new gene-set libraries, an alternative approach to rank enriched terms, and various interactive visualization approaches to display enrichment results using the JavaScript library Data Driven Documents (D3). The software can also be embedded into any tool that performs gene list analysis. We applied Enrichr to analyze nine cancer cell lines by comparing their enrichment signatures to the enrichment signatures of matched normal tissues. We observed a common pattern of upregulation of the polycomb group PRC2 and enrichment for the histone mark H3K27me3 in many cancer cell lines, as well as alterations in Toll-like receptor and interleukin signaling in K562 cells when compared with normal myeloid CD33+ cells. Such analyses provide global visualization of critical differences between normal tissues and cancer cell lines but can be applied to many other scenarios. Conclusions Enrichr is an easy-to-use, intuitive enrichment analysis web-based tool providing various types of visualization summaries of the collective functions of gene lists. Enrichr is open source and freely available online at: http://amp.pharm.mssm.edu/Enrichr. PMID:23586463

  2. Enrichr: interactive and collaborative HTML5 gene list enrichment analysis tool.

    PubMed

    Chen, Edward Y; Tan, Christopher M; Kou, Yan; Duan, Qiaonan; Wang, Zichen; Meirelles, Gabriela Vaz; Clark, Neil R; Ma'ayan, Avi

    2013-04-15

    System-wide profiling of genes and proteins in mammalian cells produces lists of differentially expressed genes/proteins that need to be further analyzed for their collective functions in order to extract new knowledge. Once unbiased lists of genes or proteins are generated from such experiments, these lists are used as input for computing enrichment with existing lists created from prior knowledge organized into gene-set libraries. While many enrichment analysis tools and gene-set library databases have been developed, there is still room for improvement. Here, we present Enrichr, an integrative web-based and mobile software application that includes new gene-set libraries, an alternative approach to rank enriched terms, and various interactive visualization approaches to display enrichment results using the JavaScript library Data Driven Documents (D3). The software can also be embedded into any tool that performs gene list analysis. We applied Enrichr to analyze nine cancer cell lines by comparing their enrichment signatures to the enrichment signatures of matched normal tissues. We observed a common pattern of upregulation of the polycomb group PRC2 and enrichment for the histone mark H3K27me3 in many cancer cell lines, as well as alterations in Toll-like receptor and interleukin signaling in K562 cells when compared with normal myeloid CD33+ cells. Such analyses provide global visualization of critical differences between normal tissues and cancer cell lines but can be applied to many other scenarios. Enrichr is an easy-to-use, intuitive enrichment analysis web-based tool providing various types of visualization summaries of the collective functions of gene lists. Enrichr is open source and freely available online at: http://amp.pharm.mssm.edu/Enrichr.
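
    For context, the classical enrichment calculation that tools of this kind build on is Fisher's exact test on the overlap between a user gene list and a library gene set (the abstract notes that Enrichr also adds an alternative ranking approach). A minimal sketch with made-up gene names and an assumed background size:

        # Fisher's exact test for gene-set overlap (illustrative; the gene
        # names and background size are assumptions, not Enrichr's libraries).
        from scipy.stats import fisher_exact

        background = 20000                                      # assumed genome size
        user_list = {"TP53", "EZH2", "SUZ12", "EED", "MYC"}     # hypothetical input
        gene_set = {"EZH2", "SUZ12", "EED", "RBBP4", "JARID2"}  # e.g. a PRC2 set

        k = len(user_list & gene_set)  # overlap between list and set
        table = [[k, len(user_list) - k],
                 [len(gene_set) - k,
                  background - len(user_list) - len(gene_set) + k]]
        odds, p = fisher_exact(table, alternative="greater")
        print(f"overlap = {k}, one-sided p = {p:.2e}")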

  3. Generic Skills. Secondary School Vocational Model for Craft Trades. Based on Data on the Use of 588 Tool Skills from 1600 Workers and Supervisors in 131 Occupations.

    ERIC Educational Resources Information Center

    Smith, Arthur De W.

    This pamphlet provides a hierarchy of skills, from those most often used in the craft trades to those less frequently used, indicating the extent to which these skills are used by workers in 24 different occupational groups. The pamphlet also provides a secondary school vocational model for craft trades along with lists of the identified skills.…

  4. Modeling, Simulation, and Analysis for State and Local Emergency Planning and Response: Concept of Operations

    DTIC Science & Technology

    2009-01-01

    must be considered for each threat. The Department of Homeland Security (DHS) has defined 15 National Planning Scenarios (NPSs), along with a...considered how to incorporate the 15 NPSs and the Target Capabilities List developed by DHS. Finally, we considered the work being done by Dr. Charles...suite of models and other tools hampers effective planning and response for all hazards, including the NPSs. The ES community has many methods

  5. Computational Tools for Metabolic Engineering

    PubMed Central

    Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.

    2012-01-01

    A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572

  6. School environment assessment tools to address behavioural risk factors of non-communicable diseases: A scoping review.

    PubMed

    Saluja, Kiran; Rawal, Tina; Bassi, Shalini; Bhaumik, Soumyadeep; Singh, Ankur; Park, Min Hae; Kinra, Sanjay; Arora, Monika

    2018-06-01

    We aimed to identify, describe and analyse school environment assessment (SEA) tools that address behavioural risk factors (unhealthy diet, physical inactivity, tobacco and alcohol consumption) for non-communicable diseases (NCD). We searched MEDLINE and Web of Science, hand-searched reference lists and contacted experts. Basic characteristics, measures assessed and measurement properties (validity, reliability, usability) of identified tools were extracted. We narratively synthesized the data and used content analysis to develop a list of measures used in the SEA tools. Twenty-four SEA tools were identified, mostly from developed countries. Of these, 15 were questionnaire based, 8 were checklist or observation-based tools and one tool used a combined checklist/observation-based and telephonic questionnaire approach. Only 1 SEA tool had components related to all four NCD risk factors, 2 SEA tools assessed three NCD risk factors (diet/nutrition, physical activity, tobacco), 10 SEA tools assessed two NCD risk factors (diet/nutrition and physical activity) and 11 SEA tools assessed only one NCD risk factor. Several measures were used in the tools to assess the four NCD risk factors, but tobacco and alcohol were sparingly included. Measurement properties were reported for 14 tools. The review provides a comprehensive list of measures used in SEA tools, which could be a valuable resource to guide future development of such tools. A valid and reliable SEA tool that can simultaneously evaluate all NCD risk factors and that has been tested in different settings with varying resource availability is needed.

  7. SiGe BiCMOS manufacturing platform for mmWave applications

    NASA Astrophysics Data System (ADS)

    Kar-Roy, Arjun; Howard, David; Preisler, Edward; Racanelli, Marco; Chaudhry, Samir; Blaschke, Volker

    2010-10-01

    TowerJazz offers high-volume manufacturable commercial SiGe BiCMOS technology platforms to address the mmWave market. In this paper, first, the SiGe BiCMOS process technology platforms such as SBC18 and SBC13 are described. These manufacturing platforms integrate a 200 GHz fT/fMAX SiGe NPN with deep trench isolation into 0.18μm and 0.13μm node CMOS processes, along with high-density 5.6 fF/μm² stacked MIM capacitors, high-value polysilicon resistors, high-Q metal resistors, lateral PNP transistors, triple-well isolation using deep n-well for mixed-signal integration, and multiple varactors and compact high-Q inductors for RF needs. Second, design enablement tools that maximize performance and lower cost and time to market, such as scalable PSP and HICUM models, statistical and Xsigma models, reliability modeling tools, process control model tools, an inductor toolbox and transmission line models, are described. Finally, demonstrations in silicon for mmWave applications in the areas of optical networking, mobile broadband, phased array radar, collision avoidance radar and W-band imaging are listed.

  8. The Effect of an Adaptive Online Learning Support in an Undergraduate Computer Course: An Exploration of Self-Regulation in Blended Contexts

    ERIC Educational Resources Information Center

    Ko, Chia-Yin

    2013-01-01

    In accordance with Zimmerman's self-regulated learning model, the proposed online learning tool in the current study was designed to support students in learning a challenging subject. The Self-Check List, Formative Self-Assessment, and Structured Online Discussion served goal-setting, self-monitoring, and self-reflective purposes. The…

  9. Specialized Word Lists--Survey of the Literature--Research Perspective

    ERIC Educational Resources Information Center

    Palinkaševic, Radmila

    2017-01-01

    Word lists present an essential tool in vocabulary teaching. Compilation of specific word lists for various fields is one of the most prominent branches of research in this field at the moment. New methodological changes in word list formation have been proposed because of the appearance of the New-GSL (Brezina & Gablasova, 2013) and AVL…

  10. Information Retrieval during Free Listing Is Biased by Memory: Evidence from Medicinal Plants.

    PubMed

    Sousa, Daniel Carvalho Pires de; Soldati, Gustavo Taboada; Monteiro, Julio Marcelino; Araújo, Thiago Antonio de Sousa; Albuquerque, Ulysses Paulino

    2016-01-01

    Free listing is a methodological tool that is widely used in various scientific disciplines. A typical assumption of this approach is that individual lists reflect a subset of total knowledge and that the first items listed are the most culturally important. However, little is known about how cognitive processes influence free lists. In this study, we assess how recent memory of use, autonoetic and anoetic memory, and long-term associative memory can affect the composition and order of items in free lists and evaluate whether free lists indicate the most important items. Based on a model of local knowledge about medicinal plants and their therapeutic targets, which was collected via individual semi-structured interviews, we classify each item recorded in free lists according to the last time that the item was used by the informant (recently or long ago), the type of relevant memory (autonoetic or anoetic memory) and the existing associations between therapeutic targets (similar or random). We find that individuals have a tendency to recall information about medicinal plants used during the preceding year and that the recalled plants were also the most important plants during this period. However, we find no trend in the recall of plants from long-term associative memory, although this phenomenon is well established in studies on cognitive psychology. We suggest that such evidence should be considered in studies that use lists of medicinal plants because this temporal cognitive limit on the retrieval of knowledge affects data interpretation.

  11. Population modeling for pesticide risk assessment of threatened species-A case study of a terrestrial plant, Boltonia decurrens.

    PubMed

    Schmolke, Amelie; Brain, Richard; Thorbek, Pernille; Perkins, Daniel; Forbes, Valery

    2017-02-01

    Although population models are recognized as necessary tools in the ecological risk assessment of pesticides, particularly for species listed under the Endangered Species Act, their application in this context is currently limited to very few cases. The authors developed a detailed, individual-based population model for a threatened plant species, the decurrent false aster (Boltonia decurrens), for application in pesticide risk assessment. Floods and competition with other plant species are known factors that drive the species' population dynamics and were included in the model approach. The authors use the model to compare the population-level effects of 5 toxicity surrogates applied to B. decurrens under varying environmental conditions. The model results suggest that the environmental conditions under which herbicide applications occur may have a higher impact on populations than organism-level sensitivities to an herbicide within a realistic range. Indirect effects may be as important as the direct effects of herbicide applications by shifting competition strength if competing species have different sensitivities to the herbicide. The model approach provides a case study for population-level risk assessments of listed species. Population-level effects of herbicides can be assessed in a realistic and species-specific context, and uncertainties can be addressed explicitly. The authors discuss how their approach can inform the future development and application of modeling for population-level risk assessments of listed species, and ecological risk assessment in general. Environ Toxicol Chem 2017;36:480-491. © 2016 SETAC.

  12. Dietary assessment in minority ethnic groups: a systematic review of instruments for portion-size estimation in the United Kingdom

    PubMed Central

    Almiron-Roig, Eva; Aitken, Amanda; Galloway, Catherine

    2017-01-01

    Context: Dietary assessment in minority ethnic groups is critical for surveillance programs and for implementing effective interventions. A major challenge is the accurate estimation of portion sizes for traditional foods and dishes. Objective: The aim of this systematic review was to assess records published up to 2014 describing a portion-size estimation element (PSEE) applicable to the dietary assessment of UK-residing ethnic minorities. Data sources, selection, and extraction: Electronic databases, internet sites, and theses repositories were searched, generating 5683 titles, from which 57 eligible full-text records were reviewed. Data analysis: Forty-two publications about minority ethnic groups (n = 20) or autochthonous populations (n = 22) were included. The most common PSEEs (47%) were combination tools (eg, food models and portion-size lists), followed by portion-size lists in questionnaires/guides (19%) and image-based and volumetric tools (17% each). Only 17% of PSEEs had been validated against weighed data. Conclusions: When developing ethnic-specific dietary assessment tools, it is important to consider customary portion sizes by sex and age, traditional household utensil usage, and population literacy levels. Combining multiple PSEEs may increase accuracy, but such methods require validation. PMID:28340101

  13. The European Medicines Agency experience with biomarker qualification.

    PubMed

    Manolis, Efthymios; Koch, Armin; Deforce, Dieter; Vamvakas, Spiros

    2015-01-01

    Since the launch of the qualification process in 2009, the CHMP reviewed/is reviewing 48 requests for qualification advice or opinion (as of Sept 2013) related to biomarkers (BM) or other novel drug development tools (e.g. patient-reported outcome measures, modeling, and statistical methods). The qualification opinions are available on the EMA website (Qualification of novel methodologies for medicine development, http://www.ema.europa.eu/ema/index.jsp?curl=pages/regulation/document_listing/document_listing_000319.jsp&mid=WC0b01ac0580022bb0#section2 , 2013). There is also a trend of increasing numbers of qualification requests to the CHMP, indicative of the pace at which targeted drug development and personalized medicine are advancing and of the need to bring new tools from research into drug development and clinical use. This chapter will focus on the regulatory experience gained so far from the CHMP qualification procedure. Basic qualification principles will be presented. Through qualification examples, we will elaborate on common grounds and divergences between the different stakeholders.

  14. CRT--Cascade Routing Tool to define and visualize flow paths for grid-based watershed models

    USGS Publications Warehouse

    Henson, Wesley R.; Medina, Rose L.; Mayers, C. Justin; Niswonger, Richard G.; Regan, R.S.

    2013-01-01

    The U.S. Geological Survey Cascade Routing Tool (CRT) is a computer application for watershed models that include the coupled Groundwater and Surface-water FLOW model, GSFLOW, and the Precipitation-Runoff Modeling System (PRMS). CRT generates output to define cascading surface and shallow subsurface flow paths for grid-based model domains. CRT requires a land-surface elevation for each hydrologic response unit (HRU) of the model grid; these elevations can be derived from a Digital Elevation Model raster data set of the area containing the model domain. Additionally, a list of the HRUs containing streams, swales, lakes, and other cascade termination features is required, along with indices that uniquely define these features. Cascade flow paths are determined from the altitudes of each HRU. Cascade paths can cross any of the four faces of an HRU to a stream or to a lake within or adjacent to an HRU. Cascades can terminate at a stream, lake, or HRU that has been designated as a watershed outflow location.
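
    The core idea, routing each grid cell to its steepest-descent neighbor across one of the four cell faces until a termination feature is reached, can be sketched as follows. The elevations and termination cell are toy values; this is not the USGS CRT code.

        # Toy steepest-descent cascade routing on a 3x3 elevation grid
        # (illustrative only; CRT handles HRUs, swales, and much more).
        import numpy as np

        elev = np.array([
            [9.0, 8.0, 7.0],
            [8.0, 6.0, 4.0],
            [7.0, 5.0, 3.0],
        ])
        terminations = {(2, 2)}  # hypothetical stream cell

        def downhill_neighbor(r, c):
            """Return the 4-neighbor with the steepest descent, or None at a pit."""
            best, best_drop = None, 0.0
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < elev.shape[0] and 0 <= cc < elev.shape[1]:
                    drop = elev[r, c] - elev[rr, cc]
                    if drop > best_drop:
                        best, best_drop = (rr, cc), drop
            return best

        # Trace one cascade path from the upper-left cell to a termination.
        path, cell = [(0, 0)], (0, 0)
        while cell not in terminations:
            cell = downhill_neighbor(*cell)
            if cell is None:
                break  # pit: real tools would fill or flag this
            path.append(cell)
        print(path)  # [(0, 0), (1, 0), (1, 1), (1, 2), (2, 2)]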

  15. General Construction. Instructor Manual.

    ERIC Educational Resources Information Center

    Laborers-AGC Education and Training Fund, Pomfret Center, CT.

    This guide contains materials for a general construction course. Introductory materials include a list of videos, schedule for the 10-day course, and tools and material list. The course is divided into 10 sections. Each section consists of some or all of these components: list of trainee objectives, instructor notes, instructor outline,…

  16. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  17. Development of an Electronic Pediatric All-Cause Harm Measurement Tool Using a Modified Delphi Method.

    PubMed

    Stockwell, David Christopher; Bisarya, Hema; Classen, David C; Kirkendall, Eric S; Lachman, Peter I; Matlow, Anne G; Tham, Eric; Hyman, Dan; Lehman, Samuel M; Searles, Elizabeth; Muething, Stephen E; Sharek, Paul J

    2016-12-01

    To have impact on reducing harm in pediatric inpatients, an efficient and reliable process for harm detection is needed. This work describes the first step toward the development of a pediatric all-cause harm measurement tool by recognized experts in the field. An international group of leaders in pediatric patient safety and informatics were charged with developing a comprehensive pediatric inpatient all-cause harm measurement tool using a modified Delphi technique. The process was conducted in 5 distinct steps: (1) literature review of triggers (elements from a medical record that assist in identifying patient harm) for inclusion; (2) translation of triggers to likely associated harm, improving the ability for expert prioritization; (3) 2 applications of a modified Delphi selection approach with consensus criteria using severity and frequency of harm as well as detectability of the associated trigger as criteria to rate each trigger and associated harm; (4) developing specific trigger logic and relevant values when applicable; and (5) final vetting of the entire trigger list for pilot testing. Literature and expert panel review identified 108 triggers and associated harms suitable for consideration (steps 1 and 2). This list was pared to 64 triggers and their associated harms after the first of the 2 independent expert reviews. The second independent expert review led to further refinement of the trigger package, resulting in 46 items for inclusion (step 3). Adding in specific trigger logic expanded the list. Final review and voting resulted in a list of 51 triggers (steps 4 and 5). Application of a modified Delphi method on an expert-constructed list of 108 triggers, focusing on severity and frequency of harms as well as detectability of triggers in an electronic medical record, resulted in a final list of 51 pediatric triggers. Pilot testing this list of pediatric triggers to identify all-cause harm for pediatric inpatients is the next step to establish the appropriateness of each trigger for inclusion in a global pediatric safety measurement tool.

  18. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    NASA Astrophysics Data System (ADS)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which, when applied to SPF, provides a host of benefits, including accurate prediction of strain levels in a part, of the presence of wrinkles, and of pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform have applied, and continue to apply, to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  19. Updates to the Demographic and Spatial Allocation Models to ...

    EPA Pesticide Factsheets

    EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS), for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change modeling by providing nationwide housing development scenarios up to 2100. ICLUS v2 includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. [2017 UPDATE] Get the latest version of ICLUS and stay up-to-date by signing up to the ICLUS mailing list. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.

  20. APT: Aperture Photometry Tool

    NASA Astrophysics Data System (ADS)

    Laher, Russ

    2012-08-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including an image histogram, aperture slices, a source scatter plot, a sky scatter plot, a sky histogram, a radial profile, a curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
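
    The basic calculation APT automates, summing counts inside a circular aperture and subtracting a sky level estimated from a surrounding annulus, can be sketched in a few lines. The synthetic image below is invented for demonstration; APT itself adds uncertainty estimates, sky models, and much more.

        # Back-of-envelope circular aperture photometry (illustrative only).
        import numpy as np

        def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
            """Sum source counts in a circular aperture after subtracting the
            median sky level estimated in a surrounding annulus."""
            yy, xx = np.indices(image.shape)
            r = np.hypot(xx - x0, yy - y0)
            aperture = r <= r_ap
            annulus = (r >= r_in) & (r <= r_out)
            sky_per_pixel = np.median(image[annulus])
            return image[aperture].sum() - sky_per_pixel * aperture.sum()

        # Synthetic frame: flat sky of 10 counts plus a bright pixel "source".
        img = np.full((50, 50), 10.0)
        img[25, 25] += 500.0
        print(aperture_photometry(img, 25, 25, r_ap=3, r_in=6, r_out=10))  # ~500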

  1. Predicting what helminth parasites a fish species should have using Parasite Co-occurrence Modeler (PaCo)

    USGS Publications Warehouse

    Strona, Giovanni; Lafferty, Kevin D.

    2013-01-01

    Fish pathologists are often interested in which parasites would likely be present in a particular host. Parasite Co-occurrence Modeler (PaCo) is a tool for identifying a list of parasites known from fish species that are similar ecologically, phylogenetically, and geographically to the host of interest. PaCo uses data from FishBase (maximum length, growth rate, life span, age at maturity, trophic level, phylogeny, and biogeography) to estimate compatibility between a target host and parasite species–genera from the major helminth groups (Acanthocephala, Cestoda, Monogenea, Nematoda, and Trematoda). Users can include any combination of host attributes in a model. These unique features make PaCo an innovative tool for addressing both theoretical and applied questions in parasitology. In addition to predicting the occurrence of parasites, PaCo can be used to investigate how host characteristics shape parasite communities. To test the performance of the PaCo algorithm, we created 12,400 parasite lists by applying any possible combination of model parameters (248) to 50 fish hosts. We then measured the relative importance of each parameter by assessing their frequency in the best models for each host. Host phylogeny and host geography were identified as the most important factors, with both present in 88% of the best models. Habitat (64%) was identified in more than half of the best models. Among ecological parameters, trophic level (41%) was the most relevant while life span (34%), growth rate (32%), maximum length (28%), and age at maturity (20%) were less commonly linked to best models. PaCo is free to use at www.purl.oclc.org/fishpest.

  2. Predicting what helminth parasites a fish species should have using Parasite Co-occurrence Modeler (PaCo).

    PubMed

    Strona, Giovanni; Lafferty, Kevin D

    2013-02-01

    Fish pathologists are often interested in which parasites would likely be present in a particular host. Parasite Co-occurrence Modeler (PaCo) is a tool for identifying a list of parasites known from fish species that are similar ecologically, phylogenetically, and geographically to the host of interest. PaCo uses data from FishBase (maximum length, growth rate, life span, age at maturity, trophic level, phylogeny, and biogeography) to estimate compatibility between a target host and parasite species-genera from the major helminth groups (Acanthocephala, Cestoda, Monogenea, Nematoda, and Trematoda). Users can include any combination of host attributes in a model. These unique features make PaCo an innovative tool for addressing both theoretical and applied questions in parasitology. In addition to predicting the occurrence of parasites, PaCo can be used to investigate how host characteristics shape parasite communities. To test the performance of the PaCo algorithm, we created 12,400 parasite lists by applying any possible combination of model parameters (248) to 50 fish hosts. We then measured the relative importance of each parameter by assessing their frequency in the best models for each host. Host phylogeny and host geography were identified as the most important factors, with both present in 88% of the best models. Habitat (64%) was identified in more than half of the best models. Among ecological parameters, trophic level (41%) was the most relevant while life span (34%), growth rate (32%), maximum length (28%), and age at maturity (20%) were less commonly linked to best models. PaCo is free to use at www.purl.oclc.org/fishpest.
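
    The combination experiment described above can be reproduced in outline: enumerate subsets of host attributes, score a model for each (scoring is stubbed out below with a dummy function), and count how often each attribute appears in the best model per host. Note that the paper evaluates 248 parameter combinations, whereas the naive non-empty power set of 8 attributes used here has 255, so this is an approximation of the setup, not a replication.

        # Sketch of the attribute-combination experiment (dummy scoring; the
        # real models are fit to FishBase data, not reproduced here).
        from itertools import combinations
        from collections import Counter

        attributes = ["phylogeny", "geography", "habitat", "trophic_level",
                      "life_span", "growth_rate", "max_length", "age_at_maturity"]
        hosts = [f"host_{i}" for i in range(50)]

        def model_score(subset, host):
            """Stand-in for the real goodness-of-fit measure."""
            return hash((subset, host)) % 1000  # arbitrary, deterministic per run

        subsets = [s for k in range(1, len(attributes) + 1)
                   for s in combinations(attributes, k)]

        freq = Counter()
        for host in hosts:
            best = max(subsets, key=lambda s: model_score(s, host))
            freq.update(best)

        # How often each attribute appears across the 50 best models.
        print(freq.most_common())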

  3. Electronic health record tools' support of nurses' clinical judgment and team communication.

    PubMed

    Kossman, Susan P; Bonney, Leigh Ann; Kim, Myoung Jin

    2013-11-01

    Nurses need to quickly process information to form clinical judgments, communicate with the healthcare team, and guide optimal patient care. Electronic health records not only offer potential for enhanced care but also introduce unintended consequences through changes in workflow, clinical judgment, and communication. We investigated nurses' use of improvised (self-made) and electronic health record-generated cognitive artifacts on clinical judgment and team communication. Tanner's Clinical Judgment Model provided a framework and basis for questions in an online survey and focus group interviews. Findings indicated that (1) nurses rated self-made work lists and medication administration records highest for both clinical judgment and communication, (2) tools aided different dimensions of clinical judgment, and (3) interdisciplinary tools enhance team communication. Implications are that electronic health record tool redesign could better support nursing work.

  4. Lives saved from malaria prevention in Africa--evidence to sustain cost-effective gains.

    PubMed

    Korenromp, Eline L

    2012-03-28

    Lives saved have become a standard metric to express health benefits across interventions and diseases. Recent estimates of malaria-attributable under-five deaths prevented using the Lives Saved Tool (LiST), extrapolating effectiveness estimates from community-randomized trials of scale-up of insecticide-treated nets (ITNs) in the 1990s, confirm the substantial impact and good cost-effectiveness that ITNs have achieved in high-endemic sub-Saharan Africa. An even higher cost-effectiveness would likely have been found if the modelling had included the additional indirect mortality impact of ITNs on preventing deaths from other common child illnesses, to which malaria contributes as a risk factor. As conventional ITNs are being replaced by long-lasting insecticidal nets and scale-up is expanded to target universal coverage for full, all-age populations at risk, enhanced transmission reduction may--above certain thresholds--enhance the mortality impact beyond that observed in the trials of the 1990s. On the other hand, lives saved by ITNs might fall if improved malaria case management with artemisinin-based combination therapy averts the deaths that ITNs would otherwise prevent. Validation and updating of LiST's simple assumption of a universal, fixed coverage-to-mortality-reduction ratio will require enhanced national programme and impact monitoring and evaluation. Key indicators for time trend analysis include malaria-related mortality from population-based surveys and vital registration, vector control and treatment coverage from surveys, and parasitologically-confirmed malaria cases and deaths recorded in health facilities. Indispensable is triangulation with dynamic transmission models, fitted to long-term trend data on vector, parasite and human populations over successive phases of malaria control and elimination. Sound, locally optimized budget allocation, including for monitoring and evaluation priorities, will benefit greatly if policy makers and programme planners use planning tools such as LiST - even when predictions are less certain than often understood. The ultimate success of LiST for supporting malaria prevention may be to prove its linear predictions less and less relevant.

  5. An Observational Study to Evaluate the Usability and Intent to Adopt an Artificial Intelligence–Powered Medication Reconciliation Tool

    PubMed Central

    Yuan, Michael Juntao; Poonawala, Robina

    2016-01-01

    Background Medication reconciliation (the process of creating an accurate list of all medications a patient is taking) is a widely practiced procedure to reduce medication errors. It is mandated by the Joint Commission and reimbursed by Medicare. Yet, in practice, medication reconciliation is often not effective owing to knowledge gaps in the team. A promising approach to improve medication reconciliation is to incorporate artificial intelligence (AI) decision support tools into the process to engage patients and bridge the knowledge gap. Objective The aim of this study was to improve the accuracy and efficiency of medication reconciliation by engaging the patient, the nurse, and the physician as a team via an iPad tool. With assistance from the AI agent, the patient will review his or her own medication list from the electronic medical record (EMR) and annotate changes, before reviewing together with the physician and making decisions on the shared iPad screen. Methods In this study, we developed iPad-based software tools, with AI decision support, to engage patients to “self-service” medication reconciliation and then share the annotated reconciled list with the physician. To evaluate the software tool’s user interface and workflow, a small number of patients (10) in a primary care clinic were recruited, and they were observed through the whole process during a pilot study. The patients are surveyed for the tool’s usability afterward. Results All patients were able to complete the medication reconciliation process correctly. Every patient found at least one error or other issues with their EMR medication lists. All of them reported that the tool was easy to use, and 8 of 10 patients reported that they will use the tool in the future. However, few patients interacted with the learning modules in the tool. The physician and nurses reported the tool to be easy-to-use, easy to integrate into existing workflow, and potentially time-saving. Conclusions We have developed a promising tool for a new approach to medication reconciliation. It has the potential to create more accurate medication lists faster, while better informing the patients about their medications and reducing burden on clinicians. PMID:27185210

  6. A Literature Review of the Effect of Malaria on Stunting.

    PubMed

    Jackson, Bianca D; Black, Robert E

    2017-11-01

    Background: The current version of the Lives Saved Tool (LiST) maternal and child health impact modeling software does not include an effect of malaria on stunting. Objective: This literature review was undertaken to determine whether such a causal link should be included in the LiST model. Methods: The PubMed, Embase, and Scopus databases were searched by using broad search terms. The searches returned a total of 4281 documents. Twelve studies from among the retrieved documents were included in the review according to the inclusion and exclusion criteria. Results: There was mixed evidence for an effect of malaria on stunting among longitudinal observational studies, and none of the randomized controlled trials of malaria interventions found an effect of the interventions on stunting. Conclusions: There is insufficient evidence to include malaria as a determinant of stunting or an effect of malaria interventions on stunting in the LiST model. The paucity and heterogeneity of the available literature were a major limitation. In addition, the studies included in the review consistently fulfilled their ethical responsibility to treat children under observation for malaria, which may have interfered with the natural history of the disease and prevented any observable effect on stunting or linear growth. © 2017 American Society for Nutrition.

  7. Circulation of core collection monographs in an academic medical library.

    PubMed

    Schmidt, C M; Eckerman, N L

    2001-04-01

    Academic medical librarians responsible for monograph acquisition face a challenging task. From the plethora of medical monographs published each year, academic medical librarians must select those most useful to their patrons. Unfortunately, none of the selection tools available to medical librarians are specifically intended to assist academic librarians with medical monograph selection. The few short core collection lists that are available are intended for use in the small hospital or internal medicine department library. As these are the only selection tools available, however, many academic medical librarians spend considerable time reviewing these collection lists and place heavy emphasis on the acquisition of listed books. The study reported here was initiated to determine whether the circulation of listed books in an academic library justified the emphasis placed on the acquisition of these books. Circulation statistics for "listed" and "nonlisted" books in the hematology (WH) section of Indiana University School of Medicine's Ruth Lilly Medical Library were studied. The average circulation figures for listed books were nearly two times as high as the corresponding figures for the WH books in general. These data support the policies of those academic medical libraries that place a high priority on collection of listed books.

  8. [Proposal and preliminary validation of a check-list for the assessment of occupational exposure to repetitive movements of the upper limbs].

    PubMed

    Colombini, D; Occhipinti, E; Cairoli, S; Baracco, A

    2000-01-01

    Over the last few years the authors developed and implemented a specific check-list for a "rapid" assessment of occupational exposure to repetitive movements and exertion of the upper limbs, after verifying that no such tool existed that was also consistent with the latest data in the specialized literature. The check-list model and the relevant application procedures are presented and discussed. The check-list was applied by trained factory technicians in 46 different working tasks where the OCRA method previously proposed by the authors was also applied by independent observers. Since 46 pairs of observation data were available (OCRA index and check-list score), it was possible to verify, via parametric and nonparametric statistical tests, the level of association between the two variables and to find the best simple regression function (exponential in this case) of the OCRA index from the check-list score. By means of this function, which was highly significant (R2 = 0.98, p < 0.0000), the check-list scores that best corresponded to the critical values (for exposure assessment) of the OCRA index were identified. Correspondence values between the OCRA index and the check-list score were then established with a view to classifying exposure levels. The check-list "critical" scores were set considering the need to obtain, in borderline cases, a potential overestimation of the exposure level. On the basis of practical application experience and the preliminary validation results, recommendations are made regarding the caution needed in the use of the check-list.
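
    The exponential regression step can be illustrated with scipy; the study's 46 observation pairs are not reproduced here, so the data below are invented purely to show the fitting mechanics.

        # Fitting an exponential regression of the OCRA index on the
        # check-list score (hypothetical data, not the paper's observations).
        import numpy as np
        from scipy.optimize import curve_fit

        def exp_model(score, a, b):
            return a * np.exp(b * score)

        # Hypothetical (check-list score, OCRA index) pairs.
        scores = np.array([5.0, 8.0, 11.0, 14.0, 17.0, 20.0])
        ocra = np.array([1.1, 1.8, 2.9, 4.6, 7.4, 11.9])

        (a, b), _ = curve_fit(exp_model, scores, ocra, p0=(1.0, 0.1))
        print(f"OCRA ≈ {a:.2f} * exp({b:.3f} * score)")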

  9. 36 CFR 60.2 - Effects of listing under Federal law.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... administered as a planning tool. Federal agencies undertaking a project having an effect on a listed or... buildings may benefit from the investment tax credit provisions of the Revenue Act of 1978. The Economic...

  10. A clinically driven variant prioritization framework outperforms purely computational approaches for the diagnostic analysis of singleton WES data.

    PubMed

    Stark, Zornitza; Dashnow, Harriet; Lunke, Sebastian; Tan, Tiong Y; Yeung, Alison; Sadedin, Simon; Thorne, Natalie; Macciocca, Ivan; Gaff, Clara; Oshlack, Alicia; White, Susan M; James, Paul A

    2017-11-01

    Rapid identification of clinically significant variants is key to the successful application of next generation sequencing technologies in clinical practice. The Melbourne Genomics Health Alliance (MGHA) variant prioritization framework employs a gene prioritization index based on clinician-generated a priori gene lists, and a variant prioritization index (VPI) based on rarity, conservation and protein effect. We used data from 80 patients who underwent singleton whole exome sequencing (WES) to test the ability of the framework to rank causative variants highly, and compared it against the performance of other gene and variant prioritization tools. Causative variants were identified in 59 of the patients. Using the MGHA prioritization framework, the average rank of the causative variant was 2.24, with 76% ranked as the top priority variant and 90% ranked within the top five. Using clinician-generated gene lists resulted in ranking causative variants an average of 8.2 positions higher than prioritization based on variant properties alone. This clinically driven prioritization approach significantly outperformed purely computational tools, ranking a greater proportion of causative variants first or in the top five (permutation P-value = 0.001). Clinicians included 40 of the 49 WES diagnoses in their a priori list of differential diagnoses (81%). The lists generated by PhenoTips and Phenomizer contained 14 (29%) and 18 (37%) of these diagnoses respectively. These results highlight the benefits of clinically led variant prioritization in increasing the efficiency of singleton WES data analysis and have important implications for developing models for the funding and delivery of genomic services.
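
    In outline, the two-index prioritization amounts to sorting variants first by membership in the clinician-generated gene list and then by the computational variant index; the field names, gene names, and values below are invented for illustration and do not reflect the MGHA implementation.

        # Sketch of two-index variant ranking (hypothetical data and fields).
        variants = [
            {"gene": "GENE_A", "on_clinician_list": True,  "vpi": 0.70},
            {"gene": "GENE_B", "on_clinician_list": False, "vpi": 0.95},
            {"gene": "GENE_C", "on_clinician_list": True,  "vpi": 0.40},
        ]

        def rank_key(v):
            # Clinician-listed genes sort ahead of all others; ties broken by
            # the variant index (rarity / conservation / protein effect).
            return (not v["on_clinician_list"], -v["vpi"])

        for v in sorted(variants, key=rank_key):
            print(v["gene"])  # GENE_A, GENE_C, GENE_B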

  11. Application of Linear Mixed-Effect Models for the Analysis of Exam Scores: Online Video Associated with Higher Scores for Undergraduate Students with Lower Grades

    ERIC Educational Resources Information Center

    Dupuis, Josee; Coutu, Josee; Laneuville, Odette

    2013-01-01

    In higher education, many of the new teaching interventions are introduced in the format of audio-visual files distributed through the Internet. A pedagogical tool consisting of questions listed as learning objectives and answers presented using online videos was designed as a supplement for a molecular biology course and made available to a large…

  12. Modeling Computer Communication Networks in a Realistic 3D Environment

    DTIC Science & Technology

    2010-03-01

    (Only front matter was recovered for this record: a table of contents, a list of abbreviations, references, and the report documentation page; no abstract is available.)

  13. American Society for Surgery of the Hand

    MedlinePlus


  14. Child Behavior Check List 1 1/2-5 as a Tool to Identify Toddlers with Autism Spectrum Disorders: A Case-Control Study

    ERIC Educational Resources Information Center

    Narzisi, Antonio; Calderoni, Sara; Maestro, Sandra; Calugi, Simona; Mottes, Emanuela; Muratori, Filippo

    2013-01-01

    Tools to identify toddlers with autism in clinical settings have been recently developed. This study evaluated the sensitivity and specificity of the Child Behavior Check List 1 1/2-5 (CBCL 1 1/2-5) in the detection of toddlers subsequently diagnosed with an Autism Spectrum Disorder (ASD), ages 18-36 months. The CBCL of 47 children with ASD were…

  15. Final Report: Demographic Tools for Climate Change and Environmental Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Neill, Brian

    2017-01-24

    This report summarizes work over the course of a three-year project (2012-2015, with a one-year no-cost extension to 2016). The full proposal detailed six tasks:
    Task 1: Population projection model
    Task 2: Household model
    Task 3: Spatial population model
    Task 4: Integrated model development
    Task 5: Population projections for Shared Socio-economic Pathways (SSPs)
    Task 6: Population exposure to climate extremes
    We report on all six tasks, provide details on papers that have appeared or been submitted as a result of this project, and list selected key presentations that have been made within the university community and at professional meetings.

  16. Satellite broadcasting system study

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The study to develop a system model and computer program representative of broadcasting satellite systems employing community-type receiving terminals is reported. The program provides a user-oriented tool for evaluating performance/cost tradeoffs, synthesizing minimum cost systems for a given set of system requirements, and performing sensitivity analyses to identify critical parameters and technology. The performance/costing philosophy and what is meant by a minimum cost system are shown graphically. Topics discussed include: main line control program, ground segment model, space segment model, cost models and launch vehicle selection. Several examples of minimum cost systems resulting from the computer program are presented. A listing of the computer program is also included.

  17. NASA Space Weather Center Services: Potential for Space Weather Research

    NASA Technical Reports Server (NTRS)

    Zheng, Yihua; Kuznetsova, Masha; Pulkkinen, Antti; Taktakishvili, A.; Mays, M. L.; Chulaki, A.; Lee, H.; Hesse, M.

    2012-01-01

    The NASA Space Weather Center's primary objective is to provide the latest space weather information and forecasting for NASA's robotic missions and its partners and to bring space weather knowledge to the public. At the same time, the tools and services it possesses can be invaluable for research purposes. Here we show how our archive and real-time modeling of space weather events can aid research in a variety of ways, with different classification criteria. We will list and discuss major CME events, major geomagnetic storms, and major SEP events that occurred during the years 2010 - 2012. Highlights of major tools/resources will be provided.

  18. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks were conducted, some observations were obtained, and several possible suggestions were contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. A brief discussion is also given on the role of software quality assurance in software engineering, along with some observations and suggestions. A brief discussion of a training program for software quality assurance engineers is provided. Lists of assurance factors and quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  19. GrayStarServer: Server-side Spectrum Synthesis with a Browser-based Client-side User Interface

    NASA Astrophysics Data System (ADS)

    Short, C. Ian

    2016-10-01

    We present GrayStarServer (GSS), a stellar atmospheric modeling and spectrum synthesis code of pedagogical accuracy that is accessible in any web browser on commonplace computational devices and that runs on a timescale of a few seconds. The addition of spectrum synthesis annotated with line identifications extends the functionality and pedagogical applicability of GSS beyond that of its predecessor, GrayStar3 (GS3). The spectrum synthesis is based on a line list acquired from the NIST atomic spectra database, and the GSS post-processing and user interface client allows the user to inspect the plain text ASCII version of the line list, as well as to apply macroscopic broadening. Unlike GS3, GSS carries out the physical modeling on the server side in Java, and communicates with the JavaScript and HTML client via an asynchronous HTTP request. We also describe other improvements beyond GS3 such as a more physical treatment of background opacity and atmospheric physics, the comparison of key results with those of the Phoenix code, and the use of the HTML <canvas> element for higher quality plotting and rendering of results. We also present LineListServer, a Java code for converting custom ASCII line lists in NIST format to the byte data type file format required by GSS so that users can prepare their own custom line lists. We propose a standard for marking up and packaging model atmosphere and spectrum synthesis output for data transmission and storage that will facilitate a web-based approach to stellar atmospheric modeling and spectrum synthesis. We describe some pedagogical demonstrations and exercises enabled by easily accessible, on-demand, responsive spectrum synthesis. GSS may serve as a research support tool by providing quick spectroscopic reconnaissance. GSS may be found at www.ap.smu.ca/~ishort/OpenStars/GrayStarServer/grayStarServer.html, and source tarballs for local installations of both GSS and LineListServer may be found at www.ap.smu.ca/~ishort/OpenStars/.

  20. Wigner Distribution Functions as a Tool for Studying Gas Phase Alkali Metal Plus Noble Gas Collisions

    DTIC Science & Technology

    2014-03-27

    [Front matter only in the indexed excerpt: table-of-contents leaders and a List of Acronyms, including WDF = Wigner Distribution Function, PES = Potential Energy Surface, and DPAL = Diode Pumped Alkali Laser.]

  1. Development and Validity Testing of an Arthritis Self-Management Assessment Tool.

    PubMed

    Oh, HyunSoo; Han, SunYoung; Kim, SooHyun; Seo, WhaSook

    Because of the chronic, progressive nature of arthritis and the substantial effects it has on quality of life, patients may benefit from self-management. However, no valid, reliable self-management assessment tool has been devised for patients with arthritis. This study was conducted to develop a comprehensive self-management assessment tool for patients with arthritis, that is, the Arthritis Self-Management Assessment Tool (ASMAT). To develop a list of qualified items corresponding to the conceptual definitions and attributes of arthritis self-management, a measurement model was established on the basis of theoretical and empirical foundations. Content validity testing was conducted to evaluate whether listed items were suitable for assessing arthritis self-management. Construct validity and reliability of the ASMAT were tested. Construct validity was examined using confirmatory factor analysis and nomological validity. The 32-item ASMAT was developed with a sample composed of patients in a clinic in South Korea. Content validity testing validated the 32 items, which comprised medical (10 items), behavioral (13 items), and psychoemotional (9 items) management subscales. Construct validity testing of the ASMAT showed that the 32 items properly corresponded with conceptual constructs of arthritis self-management, and were suitable for assessing self-management ability in patients with arthritis. Reliability was also well supported. The ASMAT devised in the present study may aid the evaluation of patient self-management ability and the effectiveness of self-management interventions. The authors believe the developed tool may also aid the identification of problems associated with the adoption of self-management practice, and thus improve symptom management, independence, and quality of life of patients with arthritis.

  2. Design Optimization Tool for Synthetic Jet Actuators Using Lumped Element Modeling

    NASA Technical Reports Server (NTRS)

    Gallas, Quentin; Sheplak, Mark; Cattafesta, Louis N., III; Gorton, Susan A. (Technical Monitor)

    2005-01-01

    The performance specifications of any actuator are quantified in terms of an exhaustive list of parameters such as bandwidth, output control authority, etc. Flow-control applications benefit from a known actuator frequency response function that relates the input voltage to the output property of interest (e.g., maximum velocity, volumetric flow rate, momentum flux, etc.). Clearly, the required performance metrics are application specific, and methods are needed to achieve the optimal design of these devices. Design and optimization studies have been conducted for piezoelectric cantilever-type flow control actuators, but the modeling issues are simpler compared to synthetic jets. Here, lumped element modeling (LEM) is combined with equivalent circuit representations to estimate the nonlinear dynamic response of a synthetic jet as a function of device dimensions, material properties, and external flow conditions. These models provide reasonable agreement between predicted and measured frequency response functions and thus are suitable for use as design tools. In this work, we have developed a Matlab-based design optimization tool for piezoelectric synthetic jet actuators based on the lumped element models mentioned above. Significant improvements were achieved by optimizing the piezoceramic diaphragm dimensions. Synthetic-jet actuators were fabricated and benchtop tested to fully document their behavior and validate a companion optimization effort. It is hoped that the tool developed from this investigation will assist in the design and deployment of these actuators.

  3. Medication Reconciliation: Work Domain Ontology, prototype development, and a predictive model.

    PubMed

    Markowitz, Eliz; Bernstam, Elmer V; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System's and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load.
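
    For readers unfamiliar with the Keystroke-Level Model used in the comparison above, the sketch below shows how a KLM estimate is formed: predicted task time is the sum of standard operator times. The operator values are the classic Card, Moran, and Newell estimates; the operator sequence is a hypothetical medication-reconciliation step, not one taken from the study.

    ```python
    # Minimal Keystroke-Level Model (KLM) sketch: task time is the sum of
    # standard operator times. Values are the classic Card/Moran/Newell
    # estimates; the sequence below is a hypothetical example step.

    KLM_SECONDS = {"K": 0.28, "P": 1.10, "H": 0.40, "M": 1.35}
    # K = keystroke/button press, P = point with mouse,
    # H = home hands on device, M = mental preparation

    def klm_time(operators: str) -> float:
        return sum(KLM_SECONDS[op] for op in operators)

    # e.g. think, point to a medication entry, click, home to keyboard, type 4 keys
    print(f"{klm_time('MPKH' + 'K' * 4):.2f} s")  # ~4.25 s
    ```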

  4. Medication Reconciliation: Work Domain Ontology, Prototype Development, and a Predictive Model

    PubMed Central

    Markowitz, Eliz; Bernstam, Elmer V.; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R.

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System’s and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load. PMID:22195146

  5. Modelling the cost of community interventions to reduce child mortality in South Africa using the Lives Saved Tool (LiST).

    PubMed

    Nkonki, Lungiswa Ll; Chola, Lumbwe L; Tugendhaft, Aviva A; Hofman, Karen K

    2017-08-28

    To estimate the costs and impact on reducing child mortality of scaling up interventions that can be delivered by community health workers at community level, from a provider's perspective. In this study, we used the Lives Saved Tool (LiST), a module in the Spectrum software. Within Spectrum, LiST interacts with other modules, the AIDS Impact Module, Family Planning Module and Demography Projections Module (DemProj), to model the impact of more than 60 interventions that affect cause-specific mortality. DemProj was based on national South African data. A total of nine interventions were modelled, namely breastfeeding promotion, complementary feeding, vitamin supplementation, hand washing with soap, hygienic disposal of children's stools, oral rehydration solution, oral antibiotics for the treatment of pneumonia, therapeutic feeding for wasting and treatment for moderate malnutrition. The outcome was reducing child mortality. These nine interventions can prevent 8891 deaths by 2030. Hand washing with soap (21%) accounts for the highest number of deaths prevented, followed by therapeutic feeding (19%) and oral rehydration therapy (16%). The top 5 interventions account for 77% of all deaths prevented. At scale, an estimated cost of US$169.5 million (US$3 per capita) per year will be required in community health worker costs. The use of community health workers offers enormous opportunities for saving lives. These programmes require appropriate financial investments. Findings from this study show what can be achieved if concerted effort is channelled towards the identified set of life-saving interventions. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
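
    A quick arithmetic check of the quoted cost figures, assuming both refer to the same covered population: US$169.5 million at US$3 per capita implies roughly 56 million people, consistent with South Africa's population.

    ```python
    # Sanity check on the cost figures quoted above (assumption: the total
    # and per-capita figures refer to the same covered population).
    total_cost_usd = 169.5e6
    per_capita_usd = 3.0
    print(f"implied population: {total_cost_usd / per_capita_usd:,.0f}")  # 56,500,000
    ```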

  6. Copper Indium Gallium Diselenide Cluster Tool | Photovoltaic Research |

    Science.gov Websites

    The figure shows where the chambers, numbered in the list above, are physically located in the laboratory space. Samples from the CIGS cluster tool can be transported to these other tools using a mobile unit.

  7. Distributed Object Oriented Programming

    DTIC Science & Technology

    1990-02-01

    ...of the object oriented model of computation. Therefore, object oriented programming can provide the programmer with good conceptual tools to divide his... [the remainder of the indexed excerpt is fragmented Lisp source from the report's message-passing examples]

  8. Inducing Multilingual Text Analysis Tools via Robust Projection across Aligned Corpora

    DTIC Science & Technology

    2001-01-01

    monolingual dictionary-derived list of canonical roots would resolve ambiguity regarding which is the appropriate target. Many of the errors are...system and set of algorithms for automatically inducing stand-alone monolingual part-of-speech taggers, base noun-phrase bracketers, named-entity...corpora has tended to focus on their use in translation model training for MT rather than on monolingual applications. One exception is bilingual parsing

  9. Linear versus quadratic portfolio optimization model with transaction cost

    NASA Astrophysics Data System (ADS)

    Razak, Norhidayah Bt Ab; Kamil, Karmila Hanim; Elias, Siti Masitah

    2014-06-01

    Optimization models have become one of the decision-making tools in investment. Hence, it is always a big challenge for investors to select the best model to fulfill their investment goals with respect to risk and return. In this paper we aim to discuss and compare the portfolio allocation and performance generated by quadratic and linear portfolio optimization models, namely the Markowitz and Maximin models respectively. The application of these models has been proven to be significant and popular among others. However, transaction cost has been debated as one of the important aspects that should be considered for portfolio reallocation, as portfolio return could be significantly reduced when transaction cost is taken into consideration. Therefore, recognizing the importance of considering transaction cost when calculating portfolio return, we formulate this paper using data from Shariah-compliant securities listed in Bursa Malaysia. It is expected that results from this paper will effectively justify the advantage of one model over another and shed some light in the quest to find the best decision-making tool in investment for individual investors.
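
    As a rough sketch of the quadratic side of the comparison, the code below minimizes a Markowitz-style mean-variance objective with a proportional transaction-cost penalty for moving away from current weights. The returns, covariance, cost rate, and risk-aversion parameter are hypothetical placeholders, not the Bursa Malaysia data used in the paper.

    ```python
    # Minimal sketch, assuming proportional transaction costs: mean-variance
    # (Markowitz-style) optimization penalised by the cost of trading from
    # current weights w0 to new weights w. All data are hypothetical.
    import numpy as np
    from scipy.optimize import minimize

    mu = np.array([0.08, 0.12, 0.10])        # expected returns (hypothetical)
    cov = np.diag([0.04, 0.09, 0.06])        # return covariance (hypothetical)
    w0 = np.array([1 / 3] * 3)               # current portfolio weights
    risk_aversion, cost_rate = 3.0, 0.005    # tuning parameters

    def objective(w):
        net_return = mu @ w - cost_rate * np.abs(w - w0).sum()
        return risk_aversion * (w @ cov @ w) - net_return

    res = minimize(objective, w0,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}],
                   bounds=[(0, 1)] * 3)      # fully invested, no short sales
    print(np.round(res.x, 3))
    ```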

  10. Indentured Parts List Maintenance and Part Assembly Capture Tool - IMPACT

    NASA Technical Reports Server (NTRS)

    Jain, Bobby; Morris, Jill; Sharpe, Kelly

    2004-01-01

    Johnson Space Center's (JSC's) indentured parts list (IPL) maintenance and parts assembly capture tool (IMPACT) is an easy-to-use graphical interface for viewing and maintaining the complex assembly hierarchies of large databases. IMPACT, already in use at JSC to support the International Space Station (ISS), queries, updates, modifies, and views data in IPL and associated resource data, functions that it can also perform, with modification, for any large commercial database. By enabling its users to efficiently view and manipulate IPL hierarchical data, IMPACT performs a function unlike that of any other tool. Through IMPACT, users will achieve results quickly, efficiently, and cost effectively.

  11. Circulation of core collection monographs in an academic medical library

    PubMed Central

    Schmidt, Cynthia M.; Eckerman, Nancy L.

    2001-01-01

    Academic medical librarians responsible for monograph acquisition face a challenging task. From the plethora of medical monographs published each year, academic medical librarians must select those most useful to their patrons. Unfortunately, none of the selection tools available to medical librarians are specifically intended to assist academic librarians with medical monograph selection. The few short core collection lists that are available are intended for use in the small hospital or internal medicine department library. As these are the only selection tools available, however, many academic medical librarians spend considerable time reviewing these collection lists and place heavy emphasis on the acquisition of listed books. The study reported here was initiated to determine whether the circulation of listed books in an academic library justified the emphasis placed on the acquisition of these books. Circulation statistics for “listed” and “nonlisted” books in the hematology (WH) section of Indiana University School of Medicine's Ruth Lilly Medical Library were studied. The average circulation figures for listed books were nearly two times as high as the corresponding figures for the WH books in general. These data support the policies of those academic medical libraries that place a high priority on collection of listed books. PMID:11337947

  12. Development of a comprehensive list of criteria for evaluating consumer education materials on colorectal cancer screening

    PubMed Central

    2013-01-01

    Background Appropriate patient information materials may support the consumer’s decision to attend or not to attend colorectal cancer (CRC) screening tests (fecal occult blood test and screening colonoscopy). The aim of this study was to develop a list of criteria to assess whether written health information materials on CRC screening provide balanced, unbiased, quantified, understandable, and evidence-based health information (EBHI) about CRC and CRC screening. Methods The list of criteria was developed based on recommendations and assessment tools for health information in the following steps: (1) Systematic literature search in 13 electronic databases (search period: 2000–2010) and completed by an Internet search (2) Extraction of identified criteria (3) Grouping of criteria into categories and domains (4) Compilation of a manual of adequate answers derived from systematic reviews and S3 guidelines (5) Review by external experts (6) Modification (7) Final discussion with external experts. Results Thirty-one publications on health information tools and recommendations were identified. The final list of criteria includes a total of 230 single criteria in three generic domains (formal issues, presentation and understandability, and neutrality and balance) and one CRC-specific domain. A multi-dimensional rating approach was used whenever appropriate (e.g., rating for the presence, correctness, presentation and level of evidence of information). Free text input was allowed to ensure the transparency of assessment. The answer manual proved to be essential to the rating process. Quantitative analyses can be made depending on the level and dimensions of criteria. Conclusions This comprehensive list of criteria clearly has a wider range of evaluation than previous assessment tools. It is not intended as a final quality assessment tool, but as a first step toward thorough evaluation of specific information materials for their adherence to EBHI requirements. This criteria list may also be used to revise leaflets and to develop evidence-based health information on CRC screening. After adjustment for different procedure-specific criteria, the list of criteria can also be applied to other cancer screening procedures. PMID:24028691

  13. Healthcare provider attitudes towards the problem list in an electronic health record: a mixed-methods qualitative study

    PubMed Central

    2012-01-01

    Background The problem list is a key part of the electronic health record (EHR) that allows practitioners to see a patient’s diagnoses and health issues. Yet, as the content of the problem list largely represents the subjective decisions of those who edit it, patients’ problem lists are often unreliable when shared across practitioners. The lack of standards for how the problem list is compiled in the EHR limits its effectiveness in improving patient care, particularly as a resource for clinical decision support and population management tools. The purpose of this study is to discover practitioner opinions towards the problem list and the logic behind their decisions during clinical situations. Materials and methods An observational cross-sectional study was conducted at two major Boston teaching hospitals. Practitioners’ opinions about the problem list were collected through both in-person interviews and an online questionnaire. Questions were framed using vignettes of clinical scenarios asking practitioners about their preferred actions towards the problem list. Results These data confirmed prior research that practitioners differ in their opinions over managing the problem list, but in most responses to a questionnaire, there was a common approach among the relative majority of respondents. Further, basic demographic characteristics of providers (age, medical experience, etc.) did not appear to strongly affect attitudes towards the problem list. Conclusion The results supported the premise that policies and EHR tools are needed to bring about a common approach. Further, the findings helped identify what issues might benefit the most from a defined policy and the level of restriction a problem list policy should place on the addition of different types of information. PMID:23140312

  14. A conceptual framework for the collection of food products in a Total Diet Study.

    PubMed

    Turrini, Aida; Lombardi-Boccia, Ginevra; Aureli, Federica; Cubadda, Francesco; D'Addezio, Laura; D'Amato, Marilena; D'Evoli, Laura; Darnerud, PerOla; Devlin, Niamh; Dias, Maria Graça; Jurković, Marina; Kelleher, Cecily; Le Donne, Cinzia; López Esteban, Maite; Lucarini, Massimo; Martinez Burgos, Maria Alba; Martínez-Victoria, Emilio; McNulty, Breige; Mistura, Lorenza; Nugent, Anne; Oktay Basegmez, Hatice Imge; Oliveira, Luisa; Ozer, Hayrettin; Perelló, Gemma; Pite, Marina; Presser, Karl; Sokolić, Darja; Vasco, Elsa; Volatier, Jean-Luc

    2018-02-01

    A total diet study (TDS) provides representative and realistic data for assessing the dietary intake of chemicals, such as contaminants and residues, and nutrients, at a population level. Reproducing the diet through collection of customarily consumed foods and their preparation as habitually eaten is crucial to ensure representativeness, i.e., all relevant foods are included and all potential dietary sources of the substances investigated are captured. Having this in mind, a conceptual framework for building a relevant food-shopping list was developed as a research task in the European Union's 7th Framework Program project, 'Total Diet Study Exposure' (TDS-Exposure), aimed at standardising methods for food sampling, analyses, exposure assessment calculations and modelling, priority foods, and selection of chemical contaminants. A stepwise approach following the knowledge translation (KT) model for concept analysis is proposed to set up a general protocol for the collection of food products in a TDS in terms of steps (characterisation of the food list, development of the food-shopping list, food products collection) and pillars (background documentation, procedures, and tools). A simple model for structuring the information in a way to support the implementation of the process, by presenting relevant datasets, forms to store inherent information, and folders to record the results is also proposed. Reproducibility of the process and possibility to exploit the gathered information are two main features of such a system for future applications.

  15. GlycoWorkbench: a tool for the computer-assisted annotation of mass spectra of glycans.

    PubMed

    Ceroni, Alessio; Maass, Kai; Geyer, Hildegard; Geyer, Rudolf; Dell, Anne; Haslam, Stuart M

    2008-04-01

    Mass spectrometry is the main analytical technique currently used to address the challenges of glycomics as it offers unrivalled levels of sensitivity and the ability to handle complex mixtures of different glycan variations. Determination of glycan structures from analysis of MS data is a major bottleneck in high-throughput glycomics projects, and robust solutions to this problem are of critical importance. However, all the approaches currently available have inherent restrictions to the type of glycans they can identify, and none of them have proved to be a definitive tool for glycomics. GlycoWorkbench is a software tool developed by the EUROCarbDB initiative to assist the manual interpretation of MS data. The main task of GlycoWorkbench is to evaluate a set of structures proposed by the user by matching the corresponding theoretical list of fragment masses against the list of peaks derived from the spectrum. The tool provides an easy to use graphical interface, a comprehensive and increasing set of structural constituents, an exhaustive collection of fragmentation types, and a broad list of annotation options. The aim of GlycoWorkbench is to offer complete support for the routine interpretation of MS data. The software is available for download from: http://www.eurocarbdb.org/applications/ms-tools.
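
    The core matching step described above (scoring a candidate structure by comparing its theoretical fragment masses against observed peaks within a tolerance) can be sketched as follows. The masses and tolerance are illustrative, not GlycoWorkbench internals.

    ```python
    # Minimal sketch of the matching step such a tool automates: compare a
    # theoretical fragment-mass list for a candidate structure against
    # observed spectrum peaks within a mass tolerance. Masses are hypothetical.

    def match_fragments(theoretical, peaks, tol_da=0.5):
        """Return (fragment_mass, peak_mass) pairs agreeing within tol_da."""
        matches = []
        for frag in theoretical:
            for peak in peaks:
                if abs(frag - peak) <= tol_da:
                    matches.append((frag, peak))
        return matches

    theoretical = [204.09, 366.14, 528.19]   # candidate fragment masses
    observed = [204.11, 365.90, 700.00]      # peaks picked from the spectrum
    print(match_fragments(theoretical, observed))
    ```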

  16. MaRiMba: a software application for spectral library-based MRM transition list assembly.

    PubMed

    Sherwood, Carly A; Eastham, Ashley; Lee, Lik Wee; Peterson, Amelia; Eng, Jimmy K; Shteynberg, David; Mendoza, Luis; Deutsch, Eric W; Risler, Jenni; Tasman, Natalie; Aebersold, Ruedi; Lam, Henry; Martin, Daniel B

    2009-10-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is a targeted analysis method that has been increasingly viewed as an avenue to explore proteomes with unprecedented sensitivity and throughput. We have developed a software tool, called MaRiMba, to automate the creation of explicitly defined MRM transition lists required to program triple quadrupole mass spectrometers in such analyses. MaRiMba creates MRM transition lists from downloaded or custom-built spectral libraries, restricts output to specified proteins or peptides, and filters based on precursor peptide and product ion properties. MaRiMba can also create MRM lists containing corresponding transitions for isotopically heavy peptides, for which the precursor and product ions are adjusted according to user specifications. This open-source application is operated through a graphical user interface incorporated into the Trans-Proteomic Pipeline, and it outputs the final MRM list to a text file for upload to MS instruments. To illustrate the use of MaRiMba, we used the tool to design and execute an MRM-MS experiment in which we targeted the proteins of a well-defined and previously published standard mixture.
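
    One concrete piece of this workflow, adjusting transitions for an isotopically heavy peptide, can be sketched as below: precursor and product m/z shift by the label mass divided by charge. The mass shift shown is the standard 13C6,15N2-lysine ("K+8") label; the m/z values are hypothetical, and the real tool derives label placement from the peptide sequence rather than a flag.

    ```python
    # Minimal sketch of heavy-transition adjustment: shift precursor and
    # product m/z by label mass / charge. Values are illustrative.

    LABEL_MASS = 8.0142  # 13C6,15N2-lysine ("K+8") mass shift in Da

    def heavy_transition(prec_mz, prec_z, prod_mz, prod_z, label_in_product=True):
        heavy_prec = prec_mz + LABEL_MASS / prec_z
        heavy_prod = prod_mz + (LABEL_MASS / prod_z if label_in_product else 0.0)
        return heavy_prec, heavy_prod

    # C-terminal-lysine peptide: y-ions carry the label, b-ions do not
    print(heavy_transition(523.77, 2, 724.38, 1, label_in_product=True))
    ```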

  17. The Knowledge Structure in Amarakośa

    NASA Astrophysics Data System (ADS)

    Nair, Sivaja S.; Kulkarni, Amba

    Amarakośa is the most celebrated and authoritative ancient thesaurus of Sanskrit. It is one of the books that an Indian child learning through the traditional Indian educational system memorizes as early as the first year of formal learning. Though it appears to be a linear list of words, close inspection shows a rich organisation of words expressing the various relations a word bears to other words. Thus when a child studies Amarakośa further, the linear list of words unfolds into a knowledge web. In this paper we describe our effort to make the implicit knowledge in Amarakośa explicit. A model for storing such structure is discussed, and a web tool is described that answers queries by dynamically reconstructing the links among words from the structured tables.
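
    A minimal sketch of the storage idea, keeping word relations in a structured table and reconstructing a word's links at query time, might look as follows. The transliterated entries and relation labels are illustrative placeholders, not data or schema from the actual tool.

    ```python
    # Minimal sketch: a thesaurus stored as a relation table, with links
    # reconstructed per word at query time. Entries are illustrative.

    RELATIONS = [
        ("vanam", "aranyam", "synonym"),      # forest
        ("simhah", "mrgendrah", "synonym"),   # lion
        ("simhah", "pasuh", "is_a"),          # lion is an animal
    ]

    def links(word):
        """Return every relation row in which the word participates."""
        return [(a, b, rel) for a, b, rel in RELATIONS if word in (a, b)]

    print(links("simhah"))
    ```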

  18. An English-French-German-Spanish Word Frequency Dictionary: A Correlation of the First Six Thousand Words in Four Single-Language Frequency Lists.

    ERIC Educational Resources Information Center

    Eaton, Helen S., Comp.

    This semantic frequency list for English, French, German, and Spanish correlates 6,474 concepts represented by individual words in an order of diminishing occurrence. Designed as a research tool, the work is segmented into seven comparative "Thousand Concepts" lists with 115 sectional subdivisions, each of which begins with the key English word…

  19. Building Scientific Data's list of recommended data repositories

    NASA Astrophysics Data System (ADS)

    Hufton, A. L.; Khodiyar, V.; Hrynaszkiewicz, I.

    2016-12-01

    When Scientific Data launched in 2014 we provided our authors with a list of recommended data repositories to help them identify data hosting options that were likely to meet the journal's requirements. This list has grown in size and scope, and is now a central resource for authors across the Nature-titled journals. It has also been used in the development of data deposition policies and recommended repository lists across Springer Nature and at other publishers. Each new addition to the list is assessed according to a series of criteria that emphasize the stability of the resource, its commitment to principles of open science and its implementation of relevant community standards and reporting guidelines. A preference is expressed for repositories that issue digital object identifiers (DOIs) through the DataCite system and that share data under the Creative Commons CC0 waiver. Scientific Data currently lists fourteen repositories that focus on specific areas within the Earth and environmental sciences, as well as the broad scope repositories, Dryad and figshare. Readers can browse and filter datasets published at the journal by the host repository using ISA-explorer, a demo tool built by the ISA-tools team at Oxford University [1]. We believe that well-maintained lists like this one help publishers build a network of trust with community data repositories and provide an important complement to more comprehensive data repository indices and more formal certification efforts. In parallel, Scientific Data has also improved its policies to better support submissions from authors using institutional and project-specific repositories, without requiring each to apply for listing individually. Online resources: Journal homepage: http://www.nature.com/scientificdata; Data repository criteria: http://www.nature.com/sdata/policies/data-policies#repo-criteria; Recommended data repositories: http://www.nature.com/sdata/policies/repositories; Archived copies of the list: https://dx.doi.org/10.6084/m9.figshare.1434640.v6. Reference: [1] Gonzalez-Beltran, A. ISA-explorer: A demo tool for discovering and exploring Scientific Data's ISA-tab metadata. Scientific Data Updates http://blogs.nature.com/scientificdata/2015/12/17/isa-explorer/ (2015).

  20. Variable stars around selected open clusters in the VVV area: Young Stellar Objects

    NASA Astrophysics Data System (ADS)

    Medina, Nicolas; Borissova, Jura; Bayo, Amelia; Kurtev, Radostin; Lucas, Philip

    2017-09-01

    Time-varying phenomena are one of the most substantial sources of astrophysical information and have led to many fundamental discoveries in modern astronomy. We have developed an automated tool to search for and analyze variable sources in the near-infrared Ks band, using data from the Vista Variables in the Vía Láctea (VVV) ESO Public Survey ([5, 8]). One of our main goals is to investigate the Young Stellar Objects (YSOs) in Galactic star-forming regions, looking for variability and for new pre-main-sequence star clusters. Here we present the newly discovered YSOs within selected stellar clusters in our Galaxy.

  1. From data to function: functional modeling of poultry genomics data.

    PubMed

    McCarthy, F M; Lyons, E

    2013-09-01

    One of the challenges of functional genomics is to create a better understanding of the biological system being studied so that the data produced are leveraged to provide gains for agriculture, human health, and the environment. Functional modeling enables researchers to make sense of these data as it reframes a long list of genes or gene products (mRNA, ncRNA, and proteins) by grouping based upon function, be it individual molecular functions or interactions between these molecules or broader biological processes, including metabolic and signaling pathways. However, poultry researchers have been hampered by a lack of functional annotation data, tools, and training to use these data and tools. Moreover, this lack is becoming more critical as new sequencing technologies enable us to generate data not only for an increasingly diverse range of species but also individual genomes and populations of individuals. We discuss the impact of these new sequencing technologies on poultry research, with a specific focus on what functional modeling resources are available for poultry researchers. We also describe key strategies for researchers who wish to functionally model their own data, providing background information about functional modeling approaches, the data and tools to support these approaches, and the strengths and limitations of each. Specifically, we describe methods for functional analysis using Gene Ontology (GO) functional summaries, functional enrichment analysis, and pathways and network modeling. As annotation efforts begin to provide the fundamental data that underpin poultry functional modeling (such as improved gene identification, standardized gene nomenclature, temporal and spatial expression data and gene product function), tool developers are incorporating these data into new and existing tools that are used for functional modeling, and cyberinfrastructure is being developed to provide the necessary extendibility and scalability for storing and analyzing these data. This process will support the efforts of poultry researchers to make sense of their functional genomics data sets, and we provide here a starting point for researchers who wish to take advantage of these tools.
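
    Of the approaches named above, functional enrichment analysis is the most mechanical: it asks whether a GO term appears in a gene list more often than chance alone predicts. A minimal sketch using the hypergeometric test follows; all counts are hypothetical.

    ```python
    # Minimal sketch of GO-term enrichment: test over-representation of a
    # term in a gene list with the hypergeometric distribution.
    # All counts below are hypothetical.
    from scipy.stats import hypergeom

    N = 15000   # annotated genes in the genome background
    K = 300     # background genes annotated to the GO term
    n = 200     # genes in the experimental list
    k = 12      # list genes annotated to the term

    # P(X >= k): probability of seeing at least k hits by chance
    p_value = hypergeom.sf(k - 1, N, K, n)
    print(f"enrichment p = {p_value:.3g}")
    ```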

  2. Calibration of the Software Architecture Sizing and Estimation Tool (SASET).

    DTIC Science & Technology

    1995-09-01

    model is of more value than the uncalibrated one. Also, as will be discussed in Chapters 3 and 4, there are quite a few manual (and undocumented) steps...complexity, normalized effective size, and normalized effort. One other field ("development phases included") was extracted manually since it was not listed...Bowden, R.G., Cheadle, W.G., & Ratliff, R.W. SASET 3.0 Technical Reference Manual. Publication S-3730-93-2. Denver: Martin Marietta Astronautics

  3. GIDEP Batching Tool

    NASA Technical Reports Server (NTRS)

    Fong, Danny; Odell, Dorice; Barry, Peter; Abrahamian, Tomik

    2008-01-01

    This software provides internal, automated search mechanics of GIDEP (Government- Industry Data Exchange Program) Alert data imported from the GIDEP government Web site. The batching tool allows the import of a single parts list in tab-delimited text format into the local JPL GIDEP database. Delimiters from every part number are removed. The original part numbers with delimiters are compared, as well as the newly generated list without the delimiters. The two lists run against the GIDEP imports, and output any matches. This feature only works with Netscape 2.0 or greater, or Internet Explorer 4.0 or greater. The user selects the browser button to choose a text file to import. When the submit button is pressed, this script will import alerts from the text file into the local JPL GIDEP database. This batch tool provides complete in-house control over exported material and data for automated batch match abilities. The batching tool has the ability to match capabilities of the parts list to tables, and yields results that aid further research and analysis. This provides more control over GIDEP information for metrics and reports information not provided by the government site. This software yields results quickly and gives more control over external data from the government site in order to generate other reports not available from the external source. There is enough space to store years of data. The program relates to risk identification and management with regard to projects and GIDEP alert information encompassing flight parts for space exploration.
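
    The delimiter-stripping and matching logic described above can be sketched as follows. The regular expression, part numbers, and alert list are illustrative assumptions, not the actual JPL implementation.

    ```python
    # Minimal sketch of the batching tool's core logic: strip delimiters
    # from imported part numbers, then check both forms against stored
    # alert part numbers. Data are hypothetical.
    import re

    def normalize(part: str) -> str:
        """Remove common delimiters from a part number."""
        return re.sub(r"[-_./ ]", "", part.upper())

    def match_alerts(parts_list, alert_parts):
        alerts = {normalize(a): a for a in alert_parts}
        hits = []
        for part in parts_list:
            key = normalize(part)
            if key in alerts:
                hits.append((part, alerts[key]))  # (imported, alert) match
        return hits

    print(match_alerts(["SN-7400/N", "LM317T"], ["SN7400N", "NE555"]))
    ```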

  4. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  5. List of Publications, Tools and Resources

    EPA Pesticide Factsheets

    View free technical, promotional, and informational tools and services to assist with the development of landfill gas energy projects. This page contains the publications, brochures, and fact sheets referenced throughout the site.

  6. Detecting extinction risk from climate change by IUCN Red List criteria.

    PubMed

    Keith, David A; Mahony, Michael; Hines, Harry; Elith, Jane; Regan, Tracey J; Baumgartner, John B; Hunter, David; Heard, Geoffrey W; Mitchell, Nicola J; Parris, Kirsten M; Penman, Trent; Scheele, Ben; Simpson, Christopher C; Tingley, Reid; Tracy, Christopher R; West, Matt; Akçakaya, H Resit

    2014-06-01

    Anthropogenic climate change is a key threat to global biodiversity. To inform strategic actions aimed at conserving biodiversity as climate changes, conservation planners need early warning of the risks faced by different species. The IUCN Red List criteria for threatened species are widely acknowledged as useful risk assessment tools for informing conservation under constraints imposed by limited data. However, doubts have been expressed about the ability of the criteria to detect risks imposed by potentially slow-acting threats such as climate change, particularly because criteria addressing rates of population decline are assessed over time scales as short as 10 years. We used spatially explicit stochastic population models and dynamic species distribution models projected to future climates to determine how long before extinction a species would become eligible for listing as threatened based on the IUCN Red List criteria. We focused on a short-lived frog species (Assa darlingtoni) chosen specifically to represent potential weaknesses in the criteria to allow detailed consideration of the analytical issues and to develop an approach for wider application. The criteria were more sensitive to climate change than previously anticipated; lead times between initial listing in a threatened category and predicted extinction varied from 40 to 80 years, depending on data availability. We attributed this sensitivity primarily to the ensemble properties of the criteria that assess contrasting symptoms of extinction risk. Nevertheless, we recommend the robustness of the criteria warrants further investigation across species with contrasting life histories and patterns of decline. The adequacy of these lead times for early warning depends on practicalities of environmental policy and management, bureaucratic or political inertia, and the anticipated species response times to management actions. © 2014 Society for Conservation Biology.
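
    The decline-based assessment at the heart of this analysis can be sketched as a simple screen of a projected population trajectory against the published criterion A thresholds (30/50/80% decline over the longer of 10 years or three generations). The thresholds are from the IUCN criteria; the population numbers below are hypothetical.

    ```python
    # Minimal sketch: screen a projected decline against IUCN Red List
    # criterion A thresholds. The trajectory values are hypothetical.

    THRESHOLDS = [(0.80, "Critically Endangered"),
                  (0.50, "Endangered"),
                  (0.30, "Vulnerable")]

    def red_list_category(pop_start, pop_end):
        decline = 1.0 - pop_end / pop_start   # fractional decline over window
        for cutoff, category in THRESHOLDS:
            if decline >= cutoff:
                return category
        return "Not threatened under criterion A"

    print(red_list_category(pop_start=10000, pop_end=4200))  # 58% -> Endangered
    ```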

  7. Scale Insects, edition 2, a tool for the identification of potential pest scales at U.S.A. ports-of-entry (Hemiptera, Sternorrhyncha, Coccoidea)

    PubMed Central

    Miller, Douglass R.; Rung, Alessandra; Parikh, Grishma

    2014-01-01

    We provide a general overview of features and technical specifications of an online, interactive tool for the identification of scale insects of concern at U.S.A. ports-of-entry. Full lists of the terminal taxa included in the keys (of which there are four), a list of features used in them, and a discussion of the structure of the tool are provided. We also briefly discuss the advantages of interactive keys for the identification of potential scale insect pests. The interactive key is freely accessible at http://idtools.org/id/scales/index.php PMID:25152668

  8. Fundamentals of Welding. Teacher Edition.

    ERIC Educational Resources Information Center

    Fortney, Clarence; And Others

    These instructional materials assist teachers in improving instruction on the fundamentals of welding. The following introductory information is included: use of this publication; competency profile; instructional/task analysis; related academic and workplace skills list; tools, materials, and equipment list; and 27 references. Seven units of…

  9. Pending studies at hospital discharge: a pre-post analysis of an electronic medical record tool to improve communication at hospital discharge.

    PubMed

    Kantor, Molly A; Evans, Kambria H; Shieh, Lisa

    2015-03-01

    Achieving safe transitions of care at hospital discharge requires accurate and timely communication. Both the presence of and follow-up plan for diagnostic studies that are pending at hospital discharge are expected to be accurately conveyed during these transitions, but this remains a challenge. To determine the prevalence, characteristics, and communication of studies pending at hospital discharge before and after the implementation of an electronic medical record (EMR) tool that automatically generates a list of pending studies. Pre-post analysis. 260 consecutive patients discharged from inpatient general medicine services from July to August 2013. Development of an EMR-based tool that automatically generates a list of studies pending at discharge. The main outcomes were prevalence and characteristics of pending studies and communication of studies pending at hospital discharge. We also surveyed internal medicine house staff on their attitudes about communication of pending studies. Pre-intervention, 70% of patients had at least one pending study at discharge, but only 18% of these were communicated in the discharge summary. Most studies were microbiology cultures (68%), laboratory studies (16%), or microbiology serologies (10%). The majority of study results were ultimately normal (83%), but 9% were newly abnormal. Post-intervention, communication of studies pending increased to 43% (p < 0.001). Most patients are discharged from the hospital with pending studies, but in usual practice, the presence of these studies has rarely been communicated to outpatient providers in the discharge summary. Communication significantly increased with the implementation of an EMR-based tool that automatically generated a list of pending studies from the EMR and allowed users to import this list into the discharge summary. This is the first study to our knowledge to introduce an automated EMR-based tool to communicate pending studies.
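
    Functionally, the EMR tool automates a simple filter: collect the patient's ordered studies that remain unresulted at discharge and emit them for the discharge summary. A minimal sketch, with hypothetical field names and orders, follows.

    ```python
    # Minimal sketch of what such an EMR tool automates: list the orders
    # still unresulted at discharge. Field names and data are hypothetical.
    from datetime import datetime

    orders = [
        {"test": "Blood culture", "resulted": None},
        {"test": "Basic metabolic panel", "resulted": datetime(2013, 7, 2, 9, 0)},
        {"test": "HIV serology", "resulted": None},
    ]

    def pending_at_discharge(orders):
        """Return the names of studies without a result timestamp."""
        return [o["test"] for o in orders if o["resulted"] is None]

    print(pending_at_discharge(orders))  # ['Blood culture', 'HIV serology']
    ```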

  10. Using the missed opportunity tool as an application of the Lives Saved Tool (LiST) for intervention prioritization.

    PubMed

    Tam, Yvonne; Pearson, Luwei

    2017-11-07

    The Missed Opportunity tool was developed as an application in the Lives Saved Tool (LiST) to allow users to quickly compare the relative impact of interventions. Global Financing Facility (GFF) investment cases have been identified as a potential application of the Missed Opportunity analyses in Democratic Republic of the Congo (DRC), Ethiopia, Kenya, and Tanzania, to use 'lives saved' as a normative factor to set priorities. The Missed Opportunity analysis draws on data and methods in LiST to project maternal, stillbirth, and child deaths averted based on changes in interventions' coverage. Coverage of each individual intervention in LiST was automated to be scaled up from current coverage to 90% in the next year, to simulate a scenario where almost every mother and child receive proven interventions that they need. The main outcome of the Missed Opportunity analysis is deaths averted due to each intervention. When reducing unmet need for contraception is included in the analysis, it ranks as the top missed opportunity across the four countries. When it is not included in the analysis, top interventions with the most total deaths averted are hospital-based interventions such as labor and delivery management in the CEmOC and BEmOC level, and full treatment and supportive care for premature babies, and for sepsis/pneumonia. The Missed Opportunity tool can be used to provide a quick, first look at missed opportunities in a country or geographic region, and help identify interventions for prioritization. While it is a useful advocate for evidence-based priority setting, decision makers need to consider other factors that influence decision making, and also discuss how to implement, deliver, and sustain programs to achieve high coverage.
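
    In highly simplified form, the impact calculation behind a Missed Opportunity run multiplies cause-specific deaths by intervention effectiveness and the coverage gain. The sketch below shows that skeleton only; real LiST projections handle multiple causes, affected fractions, and cohort dynamics, and all numbers here are hypothetical.

    ```python
    # Highly simplified sketch of a LiST-style impact calculation: deaths
    # averted when an intervention's coverage is scaled up. Real LiST is far
    # richer; numbers below are hypothetical.

    def deaths_averted(cause_deaths, effectiveness, cov_now, cov_target):
        """Deaths prevented when coverage rises from cov_now to cov_target."""
        return cause_deaths * effectiveness * (cov_target - cov_now)

    # e.g. diarrhoea deaths, an oral-rehydration-type intervention,
    # coverage scaled from 50% to 90% (the tool's standard scale-up)
    print(deaths_averted(10000, 0.69, 0.50, 0.90))  # 2760.0
    ```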

  11. Grohar: Automated Visualization of Genome-Scale Metabolic Models and Their Pathways.

    PubMed

    Moškon, Miha; Zimic, Nikolaj; Mraz, Miha

    2018-05-01

    Genome-scale metabolic models (GEMs) have become a powerful tool for investigating the entire metabolism of an organism in silico. These models are, however, often extremely hard to reconstruct and also difficult to apply to a selected problem. Visualization of a GEM allows us to comprehend the model more easily, to perform graphical analysis, to find and correct faulty relations, to identify the parts of the system with a designated function, etc. Even though several approaches for the automatic visualization of GEMs have been proposed, metabolic maps are still manually drawn or at least require a large amount of manual curation. We present Grohar, a computational tool for automatic identification and visualization of GEM (sub)networks and their metabolic fluxes. These (sub)networks can be specified directly by listing the metabolites of interest or indirectly by providing reference metabolic pathways from different sources, such as KEGG, SBML, or Matlab files. These pathways are identified within the GEM using three different pathway alignment algorithms. Grohar also supports the visualization of model adjustments (e.g., activation or inhibition of metabolic reactions) after perturbations are induced.
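
    The subnetwork-extraction idea, selecting from a bipartite reaction-metabolite graph the reactions that touch a user-supplied list of metabolites, can be sketched with networkx as below. The toy model is hypothetical, and the real tool additionally aligns reference pathways and renders fluxes.

    ```python
    # Minimal sketch: extract a GEM subnetwork around metabolites of
    # interest from a bipartite reaction-metabolite graph (networkx).
    # The tiny model below is hypothetical.
    import networkx as nx

    gem = nx.Graph()
    gem.add_edges_from([
        ("R_PGI", "g6p"), ("R_PGI", "f6p"),   # reaction -- metabolite edges
        ("R_PFK", "f6p"), ("R_PFK", "fdp"),
        ("R_CS",  "oaa"), ("R_CS",  "cit"),
    ])

    def subnetwork(graph, metabolites):
        """Keep reactions touching the metabolites, plus their neighbors."""
        reactions = {r for m in metabolites for r in graph.neighbors(m)}
        nodes = reactions | {m for r in reactions for m in graph.neighbors(r)}
        return graph.subgraph(nodes)

    sub = subnetwork(gem, ["f6p"])
    print(sorted(sub.nodes()))  # ['R_PFK', 'R_PGI', 'f6p', 'fdp', 'g6p']
    ```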

  12. Community-based benchmarking of the CMIP DECK experiments

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2015-12-01

    A diversity of community-based efforts are independently developing "diagnostic packages" with little or no coordination between them. A short list of examples includes NCAR's Climate Variability Diagnostics Package (CVDP), ORNL's International Land Model Benchmarking (ILAMB), LBNL's Toolkit for Extreme Climate Analysis (TECA), PCMDI's Metrics Package (PMP), the EU EMBRACE ESMValTool, the WGNE MJO diagnostics package, and CFMIP diagnostics. The full value of these efforts cannot be realized without some coordination. As a first step, a WCRP effort has initiated a catalog to document candidate packages that could potentially be applied in a "repeat-use" fashion to all simulations contributed to the CMIP DECK (Diagnostic, Evaluation and Characterization of Klima) experiments. Some coordination of community-based diagnostics has the additional potential to improve how CMIP modeling groups analyze their simulations during model development. The fact that most modeling groups now maintain a "CMIP compliant" data stream means that in principle, without much effort, they could readily adopt a set of well-organized diagnostic capabilities specifically designed to operate on CMIP DECK experiments. Ultimately, a detailed listing of and access to analysis codes that are demonstrated to work "out of the box" with CMIP data could enable model developers (and others) to select those codes they wish to implement in-house, potentially enabling more systematic evaluation during the model development process.

  13. Title III List of Lists -- Data Tool

    EPA Pesticide Factsheets

    This list was prepared to help firms handling chemicals determine whether they need to submit reports under sections 302, 304, or 313 of the Emergency Planning and Community Right-to-Know Act of 1986 (EPCRA) and, for a specific chemical, what reports may need to be submitted. It will also help firms determine whether they will be subject to accident prevention regulations under Clean Air Act (CAA) section 112(r).

  14. FRAMEWORK FOR RESPONSIBLE DECISION-MAKING (FRED): A TOOL FOR ENVIRONMENTALLY PREFERABLE PRODUCTS

    EPA Science Inventory

    In support of the Environmentally Preferable Purchasing Program of the USEPA, a decision-making tool based on life cycle assessment has been developed. This tool, the Framework for Responsible Environmental Decision-making or FRED, streamlines LCA by choosing a minimum list of im...

  15. Using the WHO Essential Medicines List to Assess the Appropriateness of Insurance Coverage Decisions: A Case Study of the Croatian National Medicine Reimbursement List

    PubMed Central

    Jeličić Kadić, Antonia; Žanić, Maja; Škaričić, Nataša; Marušić, Ana

    2014-01-01

    Purpose: To investigate the use of the WHO EML as a tool with which to evaluate the evidence base for the medicines on the national insurance coverage list of the Croatian Institute of Health Insurance (CIHI). Methods: Medicines from 9 ATC categories with highest expenditures from the 2012 CIHI Basic List (n = 509) were compared with the 2011 WHO EML for adults (n = 359). For medicines with a specific indication listed only in the CIHI Basic List we assessed whether there was evidence in the Cochrane Database of Systematic Reviews questioning their efficacy and safety. Results: The two lists shared 188 medicines (52.4% of WHO EML and 32.0% of CIHI list). The CIHI Basic List had 254 medicines and 33 combinations of these medicines which were not on the WHO EML, plus 14 medicines rejected and 20 deleted from the WHO EML by its Evaluation Committee. For deleted medicines, we could obtain data that showed 2,965,378 prescriptions issued to 617,684 insured patients, at a cost of approximately €41.2 million for 2012 and the first half of 2013, when the CIHI Basic List was in effect. For CIHI List-only medicines with a specific indication (n = 164 or 57.1% of the analyzed set), fewer benefits or more serious side effects than other medicines were found for 17 (10.4%), and not enough evidence for recommendations for the specific indication for 21 (12.8%) medicines in Cochrane systematic reviews. Conclusions: National health care policy should use high-quality evidence in deciding on adding new medicines and reassessing those already present on national medicines lists, in order to rationalize expenditures and ensure wider and better access to medicines. The WHO EML and recommendations from its Evaluation Committee may be useful tools in this quality assurance process. PMID:25337860

  16. Using the WHO essential medicines list to assess the appropriateness of insurance coverage decisions: a case study of the Croatian national medicine reimbursement list.

    PubMed

    Jeličić Kadić, Antonia; Žanić, Maja; Škaričić, Nataša; Marušić, Ana

    2014-01-01

    To investigate the use of the WHO EML as a tool with which to evaluate the evidence base for the medicines on the national insurance coverage list of the Croatian Institute of Health Insurance (CIHI). Medicines from 9 ATC categories with highest expenditures from 2012 CIHI Basic List (n = 509) were compared with 2011 WHO EML for adults (n = 359). For medicines with specific indication listed only in CIHI Basic List we assessed whether there was evidence in Cochrane Database of Systematic Reviews questioning their efficacy and safety. The two lists shared 188 medicines (52.4% of WHO EML and 32.0% of CIHI list). CIHI Basic List had 254 medicines and 33 combinations of these medicines which were not on the WHO EML, plus 14 medicines rejected and 20 deleted from WHO EML by its Evaluation Committee. For deleted medicines, we could obtain data that showed 2,965,378 prescriptions issued to 617,684 insured patients, and the cost of approximately € 41.2 million for 2012 and the first half of 2013, when the CIHI Basic List was in effect. For CIHI List-only medicines with a specific indication (n = 164 or 57.1% of the analyzed set), fewer benefits or more serious side-effects than other medicines were found for 17 (10.4%) and not enough evidence for recommendations for specific indication for 21 (12.8%) medicines in Cochrane systematic reviews. National health care policy should use high-quality evidence in deciding on adding new medicines and reassessing those already present on national medicines lists, in order to rationalize expenditures and ensure wider and better access to medicines. The WHO EML and recommendations from its Evaluation Committee may be useful tools in this quality assurance process.

  17. 48 CFR 225.7001 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...

  18. 48 CFR 225.7001 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...

  19. 48 CFR 225.7001 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...

  20. 48 CFR 225.7001 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...

  1. 48 CFR 225.7001 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Mooring Chain. (c) “End product” is defined in the clause at 252.225-7012, Preference for Certain Domestic Commodities. (d) Hand or measuring tools means those tools listed in Federal supply classifications 51 and 52...

  2. Brownfields Environmental Insurance and Risk Management Tools Glossary of Terms

    EPA Pesticide Factsheets

    This document provides a list of terms that are typically used by the environmental insurance industry, transactional specialists, and other parties involved in using environmental insurance or risk management tools.

  3. Idea Notebook: Wilderness Food Planning in the Computer Age.

    ERIC Educational Resources Information Center

    Drury, Jack K.

    1986-01-01

    Explains the use of a computer as a planning and teaching tool in wilderness trip food planning. Details use of master food list and spreadsheet software such as VisiCalc to provide shopping lists for food purchasing, cost analysis, and diet analysis. (NEC)
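
    The spreadsheet-style calculation described here, scaling a master food list by group size and trip length and totalling cost, reduces to a few lines. The quantities and prices below are hypothetical placeholders.

    ```python
    # Minimal sketch of the spreadsheet calculation: scale a master food
    # list by people and days, then total cost. Data are hypothetical.
    MASTER = {  # food: (lb per person-day, $ per lb)
        "oats":   (0.25, 1.20),
        "rice":   (0.30, 0.90),
        "cheese": (0.15, 4.50),
    }

    def shopping_list(people, days):
        rows = []
        for food, (lb_ppd, price) in MASTER.items():
            lb = lb_ppd * people * days
            rows.append((food, round(lb, 1), round(lb * price, 2)))
        return rows  # (food, pounds to buy, cost in $)

    for row in shopping_list(people=8, days=5):
        print(row)
    ```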

  4. CFD Multiphysics Tool

    NASA Technical Reports Server (NTRS)

    Perrell, Eric R.

    2005-01-01

    The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts. A short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts addressing both the governing equations and practical solution methods for, e.g., electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved relative to complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses relative to candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house. A complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof-of-concept, and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles. This report describes results of collaboration between ACAO and Embry-Riddle Aeronautical University (ERAU) to begin building a set of level-two design tools for PARSEC. The "CFD Multiphysics Tool" will be the propulsive element of the tool set. The name acknowledges that space propulsion performance assessment is primarily a fluid mechanics problem. At the core of the CFD Multiphysics Tool is an open-source CFD code, HYP, under development at ERAU. ERAU is renowned for its undergraduate degree program in Aerospace Engineering, the largest in the nation. The strength of the program is its applications-oriented curriculum, which culminates in one of three two-course Engineering Design sequences: Aerospace Propulsion, Spacecraft, or Aircraft. This same philosophy applies to the HYP Project, albeit with fluid physics modeling commensurate with graduate research. HYP's purpose, like the Multiphysics Tool's, is to enable calculations of real (three-dimensional; geometrically complex; intended for hardware development) applications of high-speed and propulsive fluid flows.

  5. Locality-Conscious Lock-Free Linked Lists

    NASA Astrophysics Data System (ADS)

    Braginsky, Anastasia; Petrank, Erez

    We extend state-of-the-art lock-free linked lists by building linked lists with special care for locality of traversals. These linked lists are built of sequences of entries that reside on consecutive chunks of memory. When traversing such lists, subsequent entries typically reside on the same chunk and are thus close to each other, e.g., in the same cache line or on the same virtual memory page. Such cache-conscious implementations of linked lists are frequently used in practice, but making them lock-free requires care. The basic component of this construction is a chunk of entries in the list that maintains a minimum and a maximum number of entries. This basic chunk component is an interesting tool on its own and may be used to build other lock-free data structures as well.
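
    A rough sketch of the chunk idea (layout and splitting only; the actual lock-free synchronization via compare-and-swap, which is the paper's contribution, is not shown, and the MIN/MAX bounds and class names are invented):

        # Entries for a key range live in one contiguous block, split when full.
        # Real lock-freedom needs CAS primitives that this sketch omits.
        MIN_ENTRIES, MAX_ENTRIES = 4, 8

        class Chunk:
            def __init__(self, entries=None):
                self.entries = sorted(entries or [])   # contiguous, cache-friendly
                self.next = None                       # next chunk in the list

            def insert(self, key):
                if len(self.entries) < MAX_ENTRIES:
                    self.entries.append(key)
                    self.entries.sort()
                    return self
                # Split a full chunk in two so both halves respect MIN_ENTRIES.
                mid = len(self.entries) // 2
                right = Chunk(self.entries[mid:])
                right.next = self.next
                self.entries = self.entries[:mid]
                self.next = right
                (self if key < right.entries[0] else right).insert(key)
                return self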

  6. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis result that support aerospace design reliability studies. The focus of these activities is on propulsion systems mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided-one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  7. List-mode reconstruction for the Biograph mCT with physics modeling and event-by-event motion correction

    NASA Astrophysics Data System (ADS)

    Jin, Xiao; Chan, Chung; Mulnix, Tim; Panin, Vladimir; Casey, Michael E.; Liu, Chi; Carson, Richard E.

    2013-08-01

    Whole-body PET/CT scanners are important clinical and research tools to study tracer distribution throughout the body. In whole-body studies, respiratory motion results in image artifacts. We have previously demonstrated for brain imaging that, when provided with accurate motion data, event-by-event correction has better accuracy than frame-based methods. Therefore, the goal of this work was to develop a list-mode reconstruction with novel physics modeling for the Siemens Biograph mCT with event-by-event motion correction, based on the MOLAR platform (Motion-compensation OSEM List-mode Algorithm for Resolution-Recovery Reconstruction). Application of MOLAR for the mCT required two algorithmic developments. First, in routine studies, the mCT collects list-mode data in 32 bit packets, where averaging of lines-of-response (LORs) by axial span and angular mashing reduced the number of LORs so that 32 bits are sufficient to address all sinogram bins. This degrades spatial resolution. In this work, we proposed a probabilistic LOR (pLOR) positioning technique that addresses axial and transaxial LOR grouping in 32 bit data. Second, two simplified approaches for 3D time-of-flight (TOF) scatter estimation were developed to accelerate the computationally intensive calculation without compromising accuracy. The proposed list-mode reconstruction algorithm was compared to the manufacturer's point spread function + TOF (PSF+TOF) algorithm. Phantom, animal, and human studies demonstrated that MOLAR with pLOR gives slightly faster contrast recovery than the PSF+TOF algorithm that uses the average 32 bit LOR sinogram positioning. A moving-phantom study and a whole-body human study suggested that event-by-event motion correction reduces image blurring caused by respiratory motion. We conclude that list-mode reconstruction with pLOR positioning provides a platform to generate high quality images for the mCT, and to recover fine structures in whole-body PET scans through event-by-event motion correction.
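
    The event-by-event correction concept can be sketched as follows: each coincidence event's LOR endpoints are mapped back to a reference pose using the rigid-body motion sample nearest the event's timestamp. This is a conceptual illustration with an assumed motion-data format, not MOLAR's implementation:

        import numpy as np

        def correct_event(p1, p2, t, motion):
            """Move one LOR's endpoints into the reference frame using the
            rigid transform (R, tvec) measured nearest the event time t.
            `motion` is a hypothetical list of (timestamp, R 3x3, tvec 3) samples."""
            times = np.array([m[0] for m in motion])
            _, R, tvec = motion[int(np.argmin(np.abs(times - t)))]
            # Inverse rigid transform maps measured positions back to the reference pose.
            return R.T @ (np.asarray(p1) - tvec), R.T @ (np.asarray(p2) - tvec)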

  8. List-mode Reconstruction for the Biograph mCT with Physics Modeling and Event-by-Event Motion Correction

    PubMed Central

    Jin, Xiao; Chan, Chung; Mulnix, Tim; Panin, Vladimir; Casey, Michael E.; Liu, Chi; Carson, Richard E.

    2013-01-01

    Whole-body PET/CT scanners are important clinical and research tools to study tracer distribution throughout the body. In whole-body studies, respiratory motion results in image artifacts. We have previously demonstrated for brain imaging that, when provided accurate motion data, event-by-event correction has better accuracy than frame-based methods. Therefore, the goal of this work was to develop a list-mode reconstruction with novel physics modeling for the Siemens Biograph mCT with event-by-event motion correction, based on the MOLAR platform (Motion-compensation OSEM List-mode Algorithm for Resolution-Recovery Reconstruction). Application of MOLAR for the mCT required two algorithmic developments. First, in routine studies, the mCT collects list-mode data in 32-bit packets, where averaging of lines of response (LORs) by axial span and angular mashing reduced the number of LORs so that 32 bits are sufficient to address all sinogram bins. This degrades spatial resolution. In this work, we proposed a probabilistic assignment of LOR positions (pLOR) that addresses axial and transaxial LOR grouping in 32-bit data. Second, two simplified approaches for 3D TOF scatter estimation were developed to accelerate the computationally intensive calculation without compromising accuracy. The proposed list-mode reconstruction algorithm was compared to the manufacturer's point spread function + time-of-flight (PSF+TOF) algorithm. Phantom, animal, and human studies demonstrated that MOLAR with pLOR gives slightly faster contrast recovery than the PSF+TOF algorithm that uses the average 32-bit LOR sinogram positioning. A moving-phantom study and a whole-body human study suggested that event-by-event motion correction reduces image blurring caused by respiratory motion. We conclude that list-mode reconstruction with pLOR positioning provides a platform to generate high quality images for the mCT, and to recover fine structures in whole-body PET scans through event-by-event motion correction. PMID:23892635

  9. Animal models for osteoporosis

    NASA Technical Reports Server (NTRS)

    Turner, R. T.; Maran, A.; Lotinun, S.; Hefferan, T.; Evans, G. L.; Zhang, M.; Sibonga, J. D.

    2001-01-01

    Animal models will continue to be important tools in the quest to understand the contribution of specific genes to establishment of peak bone mass and optimal bone architecture, as well as the genetic basis for a predisposition toward accelerated bone loss in the presence of co-morbidity factors such as estrogen deficiency. Existing animal models will continue to be useful for modeling changes in bone metabolism and architecture induced by well-defined local and systemic factors. However, there is a critical unfulfilled need to develop and validate better animal models to allow fruitful investigation of the interaction of the multitude of factors which precipitate senile osteoporosis. Well characterized and validated animal models that can be recommended for investigation of the etiology, prevention and treatment of several forms of osteoporosis have been listed in Table 1. Also listed are models which are provisionally recommended. These latter models have potential but are inadequately characterized, deviate significantly from the human response, require careful choice of strain or age, or are not practical for most investigators to adopt. It cannot be stressed strongly enough that the enormous potential of laboratory animals as models for osteoporosis can only be realized if great care is taken in the choice of an appropriate species, age, experimental design, and measurements. Poor choices will result in misinterpretation of results, which ultimately can bring harm to patients who suffer from osteoporosis by delaying advancement of knowledge.

  10. OntoMate: a text-mining tool aiding curation at the Rat Genome Database

    PubMed Central

    Liu, Weisong; Laulederkind, Stanley J. F.; Hayman, G. Thomas; Wang, Shur-Jen; Nigam, Rajni; Smith, Jennifer R.; De Pons, Jeff; Dwinell, Melinda R.; Shimoyama, Mary

    2015-01-01

    The Rat Genome Database (RGD) is the premier repository of rat genomic, genetic and physiologic data. Converting data from free text in the scientific literature to a structured format is one of the main tasks of all model organism databases. RGD spends considerable effort manually curating gene, Quantitative Trait Locus (QTL) and strain information. The rapidly growing volume of biomedical literature and the active research in the biological natural language processing (bioNLP) community have given RGD the impetus to adopt text-mining tools to improve curation efficiency. Recently, RGD has initiated a project to use OntoMate, an ontology-driven, concept-based literature search engine developed at RGD, as a replacement for the PubMed (http://www.ncbi.nlm.nih.gov/pubmed) search engine in the gene curation workflow. OntoMate tags abstracts with gene names, gene mutations, organism name and most of the 16 ontologies/vocabularies used at RGD. All terms/entities tagged to an abstract are listed with the abstract in the search results. All listed terms are linked both to data entry boxes and a term browser in the curation tool. OntoMate also provides user-activated filters for species, date and other parameters relevant to the literature search. Using the system for literature search and import has streamlined the process compared to using PubMed. The system was built with a scalable and open architecture, including features specifically designed to accelerate the RGD gene curation process. With the use of bioNLP tools, RGD has added more automation to its curation workflow. Database URL: http://rgd.mcw.edu PMID:25619558

  11. Developing a MATLAB(registered)-Based Tool for Visualization and Transformation

    NASA Technical Reports Server (NTRS)

    Anderton, Blake J.

    2003-01-01

    An important step in the structural design and development of spacecraft is the experimental identification of a structure's modal characteristics, such as its natural frequencies and modes of vibration. These characteristics are vital to developing a representative model of any given structure or analyzing the range of input frequencies that can be handled by a particular structure. When setting up such a representative model of a structure, careful measurements using precision equipment (such as accelerometers and instrumented hammers) must be made on many individual points of the structure in question. The coordinate location of each data point is used to construct a wireframe geometric model of the structure. Response measurements obtained from the accelerometers are used to generate the modal shapes of the particular structure. Graphically, this is displayed as a combination of the ways a structure will ideally respond to a specified force input. Two types of models of the tested structure are often used in modal analysis: an analytic model showing expected behavior of the structure, and an experimental model showing measured results due to observed phenomena. To evaluate the results from the experimental model, a comparison of analytic and experimental results must be made between the two models. However, comparisons between these two models become difficult when the two coordinate orientations differ in a manner such that results are displayed in an unclear fashion. Such a problem points to the need for a tool that not only communicates a graphical image of a structure's wireframe geometry based on various measurement locations (called nodes), but also allows for a transformation of the image's coordinate geometry so that a model's coordinate orientation is made to match the orientation of another model. Such a tool should also be designed so that it is able to construct coordinate geometry based on many different listings of node locations and is able to transform the wireframe coordinate orientation to match almost any possible orientation (i.e., it should not be a problem-specific application) if it is to be of much value in modal analysis. Also, since universal files are used to store modal parameters and wireframe geometry, the tool must be able to read and extract information from universal files and use these files to exchange model data. The purpose of this project is to develop such a tool as a computer graphical user interface (GUI) capable of performing the following tasks: 1) Browsing for a particular universal file within the computer directory and displaying the name of this file to the screen; 2) Plotting each of the nodes within the universal file in a useful, descriptive, and easily understood figure; 3) Reading the node numbers from the selected file and listing these node numbers to the user for selection in an easily accessible format; 4) Allowing for user selection of a new model orientation defined by three selected nodes; and 5) Allowing the user to specify a directory to which the transformed model's node locations will be saved, and saving the transformed node locations to the specified file.
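
    The transformation step described in task 4 amounts to building an orthonormal frame from three selected nodes and re-expressing all node coordinates in it; a minimal sketch (function and argument names are illustrative):

        import numpy as np

        def reorient(nodes, i, j, k):
            """Re-express wireframe `nodes` (N x 3) in a frame defined by three of
            them: node i becomes the origin, node j fixes the +x axis, and node k
            pins the xy-plane. Index arguments are illustrative."""
            origin = nodes[i]
            x = nodes[j] - origin
            x /= np.linalg.norm(x)
            v = nodes[k] - origin
            z = np.cross(x, v)                 # normal to the plane of the 3 nodes
            z /= np.linalg.norm(z)
            y = np.cross(z, x)                 # completes a right-handed frame
            R = np.vstack([x, y, z])           # rows are the new basis vectors
            return (nodes - origin) @ R.T      # coordinates in the new frame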

  12. Knowledge-based compact disease models identify new molecular players contributing to early-stage Alzheimer’s disease

    PubMed Central

    2013-01-01

    Background High-throughput profiling of human tissues typically yields gene lists comprising a mix of relevant molecular entities and multiple false positives that obstruct the translation of such results into mechanistic hypotheses. From general probabilistic considerations, gene lists distilled for their mechanistically relevant components can be far more useful for subsequent experimental design or data interpretation. Results The input candidate gene lists were processed into different tiers of evidence consistency established by enrichment analysis across subsets of the same experiments and across different experiments and platforms. The cut-offs were established empirically through ontological and semantic enrichment; the resultant shortened gene list was re-expanded using the Ingenuity Pathway Assistant tool. The resulting sub-networks provided the basis for generating mechanistic hypotheses that were partially validated by literature search. This approach differs from previous consistency-based studies in that the cut-off on the Receiver Operating Characteristic of the true-false separation process is optimized by flexible selection of the consistency building procedure. The gene list distilled by this analytic technique and its network representation were termed the Compact Disease Model (CDM). Here we present the CDM signature for the study of early-stage Alzheimer's disease. The integrated analysis of this gene signature allowed us to identify protein traffic vesicles as prominent players in the pathogenesis of Alzheimer's. Considering the distances and complexity of protein trafficking in neurons, it is plausible that spontaneous protein misfolding along with a shortage of growth stimulation results in neurodegeneration. Several potentially overlapping scenarios of early-stage Alzheimer pathogenesis are discussed, with an emphasis on the protective effects of the AT-1 mediated antihypertensive response on cytoskeleton remodeling, along with neuronal activation of oncogenes, luteinizing hormone signaling, and insulin-related growth regulation, forming a pleiotropic model of its early stages. Alignment with emerging literature confirmed many predictions derived from the early-stage Alzheimer's disease CDM. Conclusions A flexible approach for high-throughput data analysis, Compact Disease Model generation allows extraction of meaningful, mechanism-centered gene sets compatible with instant translation of the results into testable hypotheses. PMID:24196233

  13. Reviews of Instructional Software in Scholarly Journals: A Selected Bibliography.

    ERIC Educational Resources Information Center

    Bantz, David A.; And Others

    This bibliography lists reviews of more than 100 instructional software packages, which are arranged alphabetically by discipline. Information provided for each entry includes the topical emphasis, type of software (i.e., simulation, tutorial, analysis tool, test generator, database, writing tool, drill, plotting tool, videodisc), the journal…

  14. Bibliographic Projects and Tools in Israel.

    ERIC Educational Resources Information Center

    Kedar, Rochelle

    This paper presents several of the most prominent bibliographic tools and projects current in Israel, as well as a few specialized and less well-known projects. Bibliographic tools include the Israel Union Catalog and the Israel Union List of Serials. The following are the major bibliographic projects described: the National Jewish Bibliography…

  15. Embedded turbulence model in numerical methods for hyperbolic conservation laws

    (Research article in the special issue "Very large eddy simulation," edited by Dimitris Drikakis. Volume 39, Issue 9, Pages 763-864, 30 July 2002. Copyright © 2002 John Wiley & Sons, Ltd.)

    NASA Astrophysics Data System (ADS)

    Drikakis, D.

    2002-07-01

    The paper describes the use of numerical methods for hyperbolic conservation laws as an embedded turbulence modelling approach. Different Godunov-type schemes are utilized in computations of Burgers' turbulence and a two-dimensional mixing layer. The schemes include a total variation diminishing, characteristic-based scheme which is developed in this paper using the flux limiter approach. The embedded turbulence modelling property of the above methods is demonstrated through coarsely resolved large eddy simulations with and without subgrid scale models.
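
    To make the embedded-modelling idea concrete, here is a minimal first-order Godunov solver for the inviscid Burgers equation; the upwind dissipation of the exact Riemann flux is what plays the subgrid role in this class of methods. Grid size, initial data, and CFL number are arbitrary choices:

        import numpy as np

        def godunov_flux(ul, ur):
            """Exact Godunov flux for Burgers' equation, f(u) = u^2 / 2."""
            if ul > ur:                                  # shock
                s = 0.5 * (ul + ur)
                return 0.5 * ul**2 if s > 0 else 0.5 * ur**2
            if ul > 0:                                   # right-moving rarefaction
                return 0.5 * ul**2
            if ur < 0:                                   # left-moving rarefaction
                return 0.5 * ur**2
            return 0.0                                   # sonic point inside the fan

        N, cfl = 256, 0.8
        x = np.linspace(0, 2 * np.pi, N, endpoint=False)
        u = np.sin(x) + 0.5                              # arbitrary smooth initial data
        dx = x[1] - x[0]
        t, t_end = 0.0, 2.0
        while t < t_end:
            dt = cfl * dx / max(abs(u).max(), 1e-12)
            up = np.roll(u, -1)                          # periodic right neighbors
            F = np.array([godunov_flux(a, b) for a, b in zip(u, up)])
            u = u - dt / dx * (F - np.roll(F, 1))        # conservative update
            t += dt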

  16. Publishing and sharing of hydrologic models through WaterHUB

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.; Song, C.; Zhao, L.; Kim, J.; Assi, A.

    2011-12-01

    Most hydrologists use hydrologic models to simulate hydrologic processes in order to understand hydrologic pathways and fluxes for research, decision making, and engineering design. Once these tasks, including publication of results, are complete, the models are generally not published or made available to the public for further use and improvement. Although publication or sharing of models is not required for journal publications, sharing of models may open doors for new collaborations and avoids duplication of effort if other researchers are interested in simulating a particular watershed for which a model already exists. For researchers who are interested in sharing models, there are limited avenues for publishing their models to the wider community. Toward filling this gap, a prototype cyberinfrastructure (CI), called WaterHUB, is developed for sharing hydrologic data and modeling tools in an interactive environment. To test the utility of WaterHUB for sharing hydrologic models, a system to publish and share SWAT (Soil Water Assessment Tool) models is developed. Users can utilize WaterHUB to search and download existing SWAT models, and also to upload new SWAT models. Metadata such as the name of the watershed, the name of the person or agency who developed the model, the simulation period, the time step, and the list of calibrated parameters are also published with each model.

  17. Software on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Development Tools: view the list of tools for build automation, version control, and high-level or specialized scripting. Toolchains: learn about the available toolchains to build applications from source code.

  18. The use of artificial neural networks in experimental data acquisition and aerodynamic design

    NASA Technical Reports Server (NTRS)

    Meade, Andrew J., Jr.

    1991-01-01

    It is proposed that an artificial neural network be used to construct an intelligent data acquisition system. The artificial neural network (ANN) model has potential for replacing traditional procedures as well as for use in computational fluid dynamics validation. Potential advantages of the ANN model are listed. As a proof of concept, the author modeled a NACA 0012 airfoil at specific conditions, using the neural network simulator NETS, developed by James Baffes of the NASA Johnson Space Center. The neural network predictions were compared to the actual data. It is concluded that artificial neural networks can provide an elegant and valuable class of mathematical tools for data analysis.
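
    A minimal sketch of the kind of fit involved (a small feed-forward network trained by gradient descent on a smooth aerodynamic-style curve); the target function and network size are invented, and this does not reproduce NETS:

        import numpy as np

        rng = np.random.default_rng(0)
        # Invented surrogate: lift-like coefficient vs. angle of attack (radians).
        alpha = np.linspace(-0.2, 0.2, 64)[:, None]
        target = 2 * np.pi * alpha + 0.1 * np.sin(25 * alpha)   # stand-in "data"

        # One hidden layer, tanh activation, trained by plain gradient descent.
        W1, b1 = rng.normal(0, 1, (1, 16)), np.zeros(16)
        W2, b2 = rng.normal(0, 1, (16, 1)), np.zeros(1)
        lr = 0.05
        for _ in range(5000):
            h = np.tanh(alpha @ W1 + b1)
            pred = h @ W2 + b2
            err = pred - target
            # Backpropagation of the mean-squared error.
            gW2 = h.T @ err / len(alpha); gb2 = err.mean(0)
            dh = (err @ W2.T) * (1 - h**2)
            gW1 = alpha.T @ dh / len(alpha); gb1 = dh.mean(0)
            W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
        print("RMS fit error:", np.sqrt((err**2).mean()))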

  19. Supporting Red List threat assessments with GeoCAT: geospatial conservation assessment tool.

    PubMed

    Bachman, Steven; Moat, Justin; Hill, Andrew W; de Torre, Javier; Scott, Ben

    2011-01-01

    GeoCAT is an open-source, browser-based tool that performs rapid geospatial analysis to ease the process of Red Listing taxa. Developed to utilise spatially referenced primary occurrence data, the analysis focuses on two aspects of the geographic range of a taxon: the extent of occurrence (EOO) and the area of occupancy (AOO). These metrics form part of the IUCN Red List categories and criteria and have often proved challenging to obtain in an accurate, consistent and repeatable way. Within a familiar Google Maps environment, GeoCAT users can quickly and easily combine data from multiple sources such as GBIF, Flickr and Scratchpads, as well as user-generated occurrence data. Analysis is done with the click of a button and is visualised instantly, providing an indication of the Red List threat rating, subject to meeting the full requirements of the criteria. Outputs including the results, data and parameters used for analysis are stored in a GeoCAT file that can be easily reloaded or shared with collaborators. GeoCAT is a first step toward automating the data handling process of Red List assessment and provides a valuable hub from which further developments and enhancements can be spawned.
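
    The two range metrics are straightforward to sketch under a planar approximation: EOO as the convex-hull area of the occurrence points and AOO as the count of occupied cells on a 2 km grid times the cell area. This is an illustration in projected kilometre coordinates, not GeoCAT's code:

        def convex_hull(pts):
            """Andrew's monotone chain; pts are (x, y) tuples in projected km."""
            pts = sorted(set(pts))
            if len(pts) <= 2:
                return pts
            def half(points):
                chain = []
                for p in points:
                    while len(chain) >= 2 and (
                        (chain[-1][0]-chain[-2][0])*(p[1]-chain[-2][1])
                        - (chain[-1][1]-chain[-2][1])*(p[0]-chain[-2][0])) <= 0:
                        chain.pop()
                    chain.append(p)
                return chain[:-1]
            return half(pts) + half(pts[::-1])

        def eoo_km2(pts):
            """Extent of occurrence: shoelace area of the convex hull."""
            h = convex_hull(pts)
            return 0.5 * abs(sum(h[i][0]*h[(i+1) % len(h)][1]
                                 - h[(i+1) % len(h)][0]*h[i][1]
                                 for i in range(len(h))))

        def aoo_km2(pts, cell=2.0):
            """Area of occupancy: occupied cells on a `cell`-km grid."""
            return len({(int(x // cell), int(y // cell)) for x, y in pts}) * cell**2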

  20. MaRiMba: A Software Application for Spectral Library-Based MRM Transition List Assembly

    PubMed Central

    Sherwood, Carly A.; Eastham, Ashley; Lee, Lik Wee; Peterson, Amelia; Eng, Jimmy K.; Shteynberg, David; Mendoza, Luis; Deutsch, Eric W.; Risler, Jenni; Tasman, Natalie; Aebersold, Ruedi; Lam, Henry; Martin, Daniel B.

    2009-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is a targeted analysis method that has been increasingly viewed as an avenue to explore proteomes with unprecedented sensitivity and throughput. We have developed a software tool, called MaRiMba, to automate the creation of explicitly defined MRM transition lists required to program triple quadrupole mass spectrometers in such analyses. MaRiMba creates MRM transition lists from downloaded or custom-built spectral libraries, restricts output to specified proteins or peptides, and filters based on precursor peptide and product ion properties. MaRiMba can also create MRM lists containing corresponding transitions for isotopically heavy peptides, for which the precursor and product ions are adjusted according to user specifications. This open-source application is operated through a graphical user interface incorporated into the Trans-Proteomic Pipeline, and it outputs the final MRM list to a text file for upload to MS instruments. To illustrate the use of MaRiMba, we used the tool to design and execute an MRM-MS experiment in which we targeted the proteins of a well-defined and previously published standard mixture. PMID:19603829
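
    The core arithmetic of a transition list is m/z bookkeeping; a sketch that builds (Q1, Q3) pairs from neutral monoisotopic masses, with an optional heavy-label mass shift, where the masses shown are placeholders rather than real library values:

        PROTON = 1.007276  # Da

        def mz(neutral_mass, z):
            return (neutral_mass + z * PROTON) / z

        def transitions(precursor_mass, fragment_masses, pre_z=2, frag_z=1,
                        heavy_delta=0.0):
            """Build (Q1, Q3) pairs from neutral masses taken from a spectral
            library entry. `heavy_delta` is a label mass shift applied to both
            precursor and C-terminal fragments (e.g. ~8.0142 Da for 13C6,15N2-Lys)."""
            return [(mz(precursor_mass + heavy_delta, pre_z),
                     mz(f + heavy_delta, frag_z)) for f in fragment_masses]

        # Placeholder masses, not real peptide values:
        print(transitions(1200.6, [500.3, 700.4], heavy_delta=8.0142))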

  1. The Plant Genome Integrative Explorer Resource: PlantGenIE.org.

    PubMed

    Sundell, David; Mannapperuma, Chanaka; Netotea, Sergiu; Delhomme, Nicolas; Lin, Yao-Cheng; Sjödin, Andreas; Van de Peer, Yves; Jansson, Stefan; Hvidsten, Torgeir R; Street, Nathaniel R

    2015-12-01

    Accessing and exploring large-scale genomics data sets remains a significant challenge to researchers without specialist bioinformatics training. We present the integrated PlantGenIE.org platform for exploration of Populus, conifer and Arabidopsis genomics data, which includes expression networks and associated visualization tools. Standard features of a model organism database are provided, including genome browsers, gene list annotation, Blast homology searches and gene information pages. Community annotation updating is supported via integration of WebApollo. We have produced an RNA-sequencing (RNA-Seq) expression atlas for Populus tremula and have integrated these data within the expression tools. An updated version of the ComPlEx resource for performing comparative plant expression analyses of gene coexpression network conservation between species has also been integrated. The PlantGenIE.org platform provides intuitive access to large-scale and genome-wide genomics data from model forest tree species, facilitating both community contributions to annotation improvement and tools supporting use of the included data resources to inform biological insight. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  2. Adapting Web content for low-literacy readers by using lexical elaboration and named entities labeling

    NASA Astrophysics Data System (ADS)

    Watanabe, W. M.; Candido, A.; Amâncio, M. A.; De Oliveira, M.; Pardo, T. A. S.; Fortes, R. P. M.; Aluísio, S. M.

    2010-12-01

    This paper presents an approach for assisting low-literacy readers in accessing Web online information. The "Educational FACILITA" tool is a Web content adaptation tool that provides innovative features and follows more intuitive interaction models regarding accessibility concerns. In particular, we propose an interaction model and a Web application that explore the natural language processing tasks of lexical elaboration and named entity labeling for improving Web accessibility. We report on the results obtained from a pilot study on usability analysis carried out with low-literacy users. The preliminary results show that "Educational FACILITA" improves the comprehension of text elements, although the assistance mechanisms might also confuse users when word sense ambiguity is introduced, by gathering, for a complex word, a list of synonyms with multiple meanings. This fact evokes a future solution in which the correct sense for a complex word in a sentence is identified, solving this pervasive characteristic of natural languages. The pilot study also identified that experienced computer users find the tool to be more useful than novice computer users do.

  3. Evaluation of in silico tools to predict the skin sensitization potential of chemicals.

    PubMed

    Verheyen, G R; Braeken, E; Van Deun, K; Van Miert, S

    2017-01-01

    Public domain and commercial in silico tools were compared for their performance in predicting the skin sensitization potential of chemicals. The packages were either statistics-based (Vega, CASE Ultra) or rule-based (OECD Toolbox, Toxtree, Derek Nexus). In practice, several of these in silico tools are used in gap filling and read-across, but here their use was limited to making predictions based on the presence/absence of structural features associated with sensitization. The top 400 ranking substances of the ATSDR 2011 Priority List of Hazardous Substances were selected as a starting point. Experimental information was identified for 160 chemically diverse substances (82 positive and 78 negative). The prediction for skin sensitization potential was compared with the experimental data. Rule-based tools perform slightly better, with accuracies ranging from 0.6 (OECD Toolbox) to 0.78 (Derek Nexus), compared with statistical tools that had accuracies ranging from 0.48 (Vega) to 0.73 (CASE Ultra - LLNA weak model). Combining models increased the performance, with positive and negative predictive values up to 80% and 84%, respectively. However, the number of substances that were predicted positive or negative for skin sensitization in both models was low. Adding more substances to the dataset will increase the confidence in the conclusions reached. The insights obtained in this evaluation are incorporated in a web database www.asopus.weebly.com that provides a potential end user context for the scope and performance of different in silico tools with respect to a common dataset of curated skin sensitization data.
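
    The quoted figures (accuracy, positive and negative predictive value) all derive from the 2x2 confusion matrix; a short sketch with invented counts for a set of 82 positives and 78 negatives:

        def performance(tp, fp, tn, fn):
            return {
                "accuracy": (tp + tn) / (tp + fp + tn + fn),
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "PPV": tp / (tp + fp),   # positive predictive value
                "NPV": tn / (tn + fn),   # negative predictive value
            }

        # Invented counts for a 160-substance set (82 positive, 78 negative):
        print(performance(tp=65, fp=12, tn=66, fn=17))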

  4. TLS from fundamentals to practice

    PubMed Central

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Adams, Paul D.

    2014-01-01

    The Translation-Libration-Screw-rotation (TLS) model of rigid-body harmonic displacements introduced in crystallography by Schomaker & Trueblood (1968) is now a routine tool in macromolecular studies and is a feature of most modern crystallographic structure refinement packages. In this review we consider a number of simple examples that illustrate important features of the TLS model. Based on these examples simplified formulae are given for several special cases that may occur in structure modeling and refinement. The derivation of general TLS formulae from basic principles is also provided. This manuscript describes the principles of TLS modeling, as well as some select algorithmic details for practical application. An extensive list of applications references as examples of TLS in macromolecular crystallography refinement is provided. PMID:25249713
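
    For reference, the TLS expression in one common convention (sign conventions for the antisymmetric matrix A vary between treatments; the review derives the general formulae):

        % Displacement u of an atom at r = (x, y, z) under a small rigid-body
        % motion: translation t plus libration \lambda about the origin.
        u = t + \lambda \times r = t + A\,\lambda,
        \qquad
        A = \begin{pmatrix} 0 & z & -y \\ -z & 0 & x \\ y & -x & 0 \end{pmatrix}
        % Averaging u u^T over the harmonic ensemble gives the ADP matrix:
        U = \langle u\,u^{T} \rangle
          = T + A\,L\,A^{T} + A\,S + S^{T} A^{T},
        \quad
        T = \langle t\,t^{T} \rangle,\;
        L = \langle \lambda\,\lambda^{T} \rangle,\;
        S = \langle \lambda\,t^{T} \rangle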

  5. Multivariable regression analysis of list experiment data on abortion: results from a large, randomly-selected population based study in Liberia.

    PubMed

    Moseson, Heidi; Gerdts, Caitlin; Dehlendorf, Christine; Hiatt, Robert A; Vittinghoff, Eric

    2017-12-21

    The list experiment is a promising measurement tool for eliciting truthful responses to stigmatized or sensitive health behaviors. However, investigators may be hesitant to adopt the method due to previously untestable assumptions and the perceived inability to conduct multivariable analysis. With a recently developed statistical test that can detect the presence of a design effect - the absence of which is a central assumption of the list experiment method - we sought to test the validity of a list experiment conducted on self-reported abortion in Liberia. We also aim to introduce recently developed multivariable regression estimators for the analysis of list experiment data, to explore relationships between respondent characteristics and having had an abortion - an important component of understanding the experiences of women who have abortions. To test the null hypothesis of no design effect in the Liberian list experiment data, we calculated the percentage of each respondent "type," characterized by response to the control items, and compared these percentages across treatment and control groups with a Bonferroni-adjusted alpha criterion. We then implemented two least squares and two maximum likelihood models (four total), each representing different bias-variance trade-offs, to estimate the association between respondent characteristics and abortion. We find no clear evidence of a design effect in list experiment data from Liberia (p = 0.18), affirming the first key assumption of the method. Multivariable analyses suggest a negative association between education and history of abortion. The retrospective nature of measuring lifetime experience of abortion, however, complicates interpretation of results, as the timing and safety of a respondent's abortion may have influenced her ability to pursue an education. Our work demonstrates that multivariable analyses, as well as statistical testing of a key design assumption, are possible with list experiment data, although with important limitations when considering lifetime measures. We outline how to implement this methodology with list experiment data in future research.
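
    The basic list experiment estimator underlying such analyses is the treatment-control difference in mean item counts; a sketch on simulated data (the paper's regression estimators generalize this to covariates):

        import numpy as np

        rng = np.random.default_rng(1)
        n = 2000
        treat = rng.integers(0, 2, n).astype(bool)      # random assignment
        control_items = rng.binomial(4, 0.5, n)         # 4 innocuous items
        sensitive = rng.random(n) < 0.15                # true prevalence 15%
        counts = control_items + np.where(treat, sensitive, 0)

        # Difference-in-means estimate of the sensitive behavior's prevalence.
        est = counts[treat].mean() - counts[~treat].mean()
        se = np.sqrt(counts[treat].var(ddof=1) / treat.sum()
                     + counts[~treat].var(ddof=1) / (~treat).sum())
        print(f"estimated prevalence: {est:.3f} +/- {1.96 * se:.3f}")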

  6. A Survey of Security Tools for the Industrial Control System Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurd, Carl M.; McCarty, Michael V.

    This report details the results of a survey conducted by Idaho National Laboratory (INL) to identify existing tools which could be used to prevent, detect, mitigate, or investigate a cyber-attack in an industrial control system (ICS) environment. This report compiles a list of potentially applicable tools and shows the coverage of the tools in an ICS architecture.

  7. Distributed File System Utilities to Manage Large DatasetsVersion 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-05-21

    FileUtils provides a suite of tools to manage large datasets typically created by large parallel MPI applications. They are written in C and use standard POSIX I/O calls. The current suite consists of tools to copy, compare, remove, and list. The tools provide dramatic speedup over existing Linux tools, which often run as a single process.

  8. Dynamics of list-server discussion on genetically modified foods.

    PubMed

    Triunfol, Marcia L; Hines, Pamela J

    2004-04-01

    Computer-mediated discussion lists, or list-servers, are popular tools in settings ranging from professional to personal to educational. A discussion list on genetically modified food (GMF) was created in September 2000 as part of the Forum on Genetically Modified Food developed by Science Controversies: Online Partnerships in Education (SCOPE), an educational project that uses computer resources to aid research and learning around unresolved scientific questions. The discussion list "GMF-Science" was actively supported from January 2001 to May 2002. The GMF-Science list welcomed anyone interested in discussing the controversies surrounding GMF. Here, we analyze the dynamics of the discussions and how the GMF-Science list may contribute to learning. Activity on the GMF-Science discussion list reflected some but not all the controversies that were appearing in more traditional publication formats, broached other topics not well represented in the published literature, and tended to leave undiscussed the more technical research developments.

  9. Pizza.py Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3D, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.

  10. Index of Workplace & Adult Basic Skills Software.

    ERIC Educational Resources Information Center

    Askov, Eunice N.; Clark, Cindy Jo

    This index of workplace and adult basic skills computer software includes 108 listings. Each listing is described according to the following classifications: (1) teacher/tutor tools (customizable or mini-authoring systems); (2) assessment and skills; (3) content; (4) instruction method; (5) system requirements; and (6) name, address, and phone…

  11. Food Marketing: Cashier-Checker. Teacher's Guide. Competency Based Curriculum.

    ERIC Educational Resources Information Center

    Froelich, Larry; And Others

    This teacher's guide is designed to accompany the Competency Based Cashier-Checker Curriculum student materials--see note. Contents include a listing of reference materials, tool and equipment lists, copy of the table of contents for student competency sheets, teacher's suggestions, and answer keys for information sheets and exercises.…

  12. WVR-EMAP A SMALL WATERSHED CHARACTERIZATION, CLASSIFICATION, AND ASSESSMENT FOR WEST VIRGINIA UTILIZING EMAP DESIGN AND TOOLS

    EPA Science Inventory

    Nationwide, there is a strong need to streamline methods for assessing impairment of surface waters (305b listings), diagnosing cause of biological impairment (303d listings), estimating total maximum daily loads (TMDLs), and/or prioritizing watershed restoration activities (Unif...

  13. Government Publications; a Guide to Bibliographic Tools. Fourth Edition.

    ERIC Educational Resources Information Center

    Palic, Vladimir M.

    Current and retrospective bibliographic aids are listed for official publications issued by the United States, foreign countries, and international governmental organizations. The material is arranged by geographic area, with U.S. federal, state, and local government publications listed separately. A short history of each U.S. government agency is…

  14. Food Production, Management, and Services. Baking. Teacher Edition. Second Edition.

    ERIC Educational Resources Information Center

    Gibson, LeRoy

    These instructional materials are intended for a course on food production, management, and services involved in baking. The following introductory information is included: use of this publication; competency profile; instructional/task analysis; related academic and workplace skills list; tools, materials, and equipment list; 13 references; and a…

  15. Nucleic Acid Database (NDB)

    Science.gov Websites

    Search the NDB archive or the non-redundant list. Advanced Search: search for structures based on structural features, chemical features, binding modes, citation, and experimental information. Featured tools: the RNA 3D Motif Atlas, a representative collection of RNA 3D internal and hairpin loop motifs, and the non-redundant lists.

  16. 78 FR 20236 - Self-Regulation of Class II Gaming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-04

    ... commenters also submitted that this requirement is redundant, because tribal internal control systems (TICS... tribe should have readily available a list of internal gaming controls, which is a useful tool in.... 518.4(c)(vii), which requires petitioning tribes to submit a list of internal controls used at the...

  17. Task Lists for Industrial Occupations. Education for Employment Task Lists.

    ERIC Educational Resources Information Center

    Dimmlich, David

    These cluster matrices provide duties and tasks that form the basis of instructional content for secondary, postsecondary, and adult occupational training programs for industrial occupations. Duties and skills are presented for the following: (1) electric home appliance and power tool repairers; (2) office machine/cash register repairer; (3)…

  18. In Sync with Science Teaching

    ERIC Educational Resources Information Center

    Scribner-MacLean, Michelle; Nikonchuk, Andrew; Kaplo, Patrick; Wall, Michael

    2006-01-01

    Science educators are often among the first to use emerging technologies in the classroom and laboratory. For the technologically savvy science teacher, the handheld computer is a terrific tool. A handheld computer is a portable electronic device that helps organize (via calendars, contact lists, to-do lists) and integrate electronic data…

  19. Build Your Own Solar Air Heater.

    ERIC Educational Resources Information Center

    Conservation and Renewable Energy Inquiry and Referral Service (DOE), Silver Spring, MD.

    The solar air heater is a simple device for catching some of the sun's energy to heat a home. Procedures for making and installing such a heater are presented. Included is a materials list, including tools needed for constructing the heater, sources for obtaining further details, and a list of material specifications. (JN)

  20. ASSIST user manual

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Boerschlein, David P.

    1995-01-01

    Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.
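
    The rule-based generation idea can be sketched in a few lines: states are discovered by breadth-first exploration under user-supplied transition rules instead of being listed by hand. The rules below (a three-processor system with repair) are invented and are not ASSIST's input language:

        from collections import deque

        # Invented rules for a 3-processor system: state = processors working.
        LAMBDA, MU = 1e-4, 1e-1   # failure and repair rates (per hour), illustrative
        def rules(working):
            if working > 0:
                yield (working - 1, working * LAMBDA, "failure")
            if working < 3:
                yield (working + 1, MU, "repair")

        # Generate the full state space and transition list automatically.
        start, states, transitions = 3, set(), []
        queue = deque([start])
        while queue:
            s = queue.popleft()
            if s in states:
                continue
            states.add(s)
            for dest, rate, label in rules(s):
                transitions.append((s, dest, rate, label))
                queue.append(dest)
        print(sorted(states), transitions)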

  1. Systematic identification of high crash locations

    DOT National Transportation Integrated Search

    2001-05-01

    The objective of this project is to develop tools and procedures by which Iowa engineers can identify potentially hazardous roadway locations and designs, and to demonstrate the utility of these tools by developing candidate lists of high crash locat...

  2. Learning From Experience: Qualitative Analysis to Develop a Cognitive Task List for Kielland Forceps Deliveries.

    PubMed

    Simpson, Andrea N; Hodges, Ryan; Snelgrove, John; Gurau, David; Secter, Michael; Mocarski, Eva; Pittini, Richard; Windrim, Rory; Higgins, Mary

    2015-05-01

    Fetal malposition is a common indication for Caesarean section in the second stage of labour. Rotational (Kielland) forceps are a valuable tool in select situations for successful vaginal delivery; however, learning opportunities are scarce. Our aim was to identify the verbal and non-verbal components of performing a safe Kielland forceps delivery through filmed demonstrations by expert practitioners on models to develop a task list for training purposes. Labour and delivery nurses at three university-affiliated hospitals identified clinicians whom they considered skilled in Kielland forceps deliveries. These physicians gave consent and were filmed performing Kielland forceps deliveries on a model, describing their assessment and technique and sharing clinical pearls based on their experience. Two clinicians reviewed the videos independently and recorded verbal and non-verbal components of the assessment; thematic analysis was performed and a core task list was developed. The algorithm was circulated to participants to ensure consensus. Eleven clinicians were identified; eight participated. Common themes were prevention of persistent malposition where possible, a thorough assessment to determine suitability for forceps delivery, roles of the multidisciplinary team, description of the Kielland forceps and technical aspects related to their use, the importance of communication with the parents and the team (including consent, debriefing, and documentation), and "red flags" that indicate the need to stop when safety criteria cannot be met. Development of a cognitive task list, derived from years of experience with Kielland forceps deliveries by expert clinicians, provides an inclusive algorithm that may facilitate standardized resident training to enhance education in rotational forceps deliveries.

  3. Sorting protein lists with nwCompare: a simple and fast algorithm for n-way comparison of proteomic data files.

    PubMed

    Pont, Frédéric; Fournié, Jean Jacques

    2010-03-01

    MS, the reference technology for proteomics, routinely produces large numbers of protein lists whose fast comparison would prove very useful. Unfortunately, most software packages only allow comparisons of two to three lists at once. We introduce here nwCompare, a simple tool for n-way comparison of several protein lists without any query language, and exemplify its use with differential and shared cancer cell proteomes. As the software compares character strings, it can be applied to any type of data mining, such as genomic or metabolomic data lists.
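
    An n-way comparison reduces to tagging each protein identifier with the set of lists that contain it; a sketch of that idea (not nwCompare's implementation):

        from collections import defaultdict

        def nway_compare(named_lists):
            """Map each protein to the set of list names containing it.
            `named_lists` is a dict of {list_name: iterable of protein IDs}."""
            membership = defaultdict(set)
            for name, proteins in named_lists.items():
                for p in proteins:
                    membership[p].add(name)
            return membership

        lists = {"ctrl": {"P01", "P02", "P03"},
                 "drugA": {"P02", "P03", "P04"},
                 "drugB": {"P03", "P05"}}
        for protein, found_in in sorted(nway_compare(lists).items()):
            print(protein, sorted(found_in))
        # Proteins shared by all lists are those with len(found_in) == len(lists).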

  4. Tailoring periodical collections to meet institutional needs.

    PubMed Central

    Delman, B S

    1984-01-01

    A system for tailoring journal collections to meet institutional needs is described. The approach is based on the view that reference work and collection development are variant and complementary forms of the same library function; both tasks have as their objective a literature response to information problems. Utilizing the tools and procedures of the reference search in response to a specific collection development problem topic, the author created a model ranked list of relevant journals. Finally, by linking the model to certain operational and environmental factors in three different health care organizations, he tailored the collection to meet the institutions' respective information needs. PMID:6375775

  5. Clean Cities Tools: Tools to Help You Save Money, Use Less Petroleum, and Reduce Emissions (Brochure)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    The Clean Cities Alternative Fuels and Advanced Vehicles Data Center (AFDC) features a wide range of Web-based tools to help vehicle fleets and individual consumers reduce their petroleum use. This brochure lists and describes Clean Cities online tools related to vehicles, alternative fueling stations, electric vehicle charging stations, fuel conservation, emissions reduction, fuel economy, and more.

  6. Tissue tropisms, infection kinetics, histologic lesions, and antibody response of the MR766 strain of Zika virus in a murine model.

    PubMed

    Kawiecki, Anna B; Mayton, E Handly; Dutuze, M Fausta; Goupil, Brad A; Langohr, Ingeborg M; Del Piero, Fabio; Christofferson, Rebecca C

    2017-04-18

    The appearance of severe Zika virus (ZIKV) disease in the most recent outbreak has prompted researchers to respond through the development of tools to quickly characterize transmission and pathology. We describe here another such tool, a mouse model of ZIKV infection and pathogenesis using the MR766 strain of virus that adds to the growing body of knowledge regarding ZIKV kinetics in small animal models. We infected mice with the MR766 strain of ZIKV to determine infection kinetics via serum viremia. We further evaluated infection-induced lesions via histopathology and visualized viral antigen via immunohistochemical labeling. We also investigated the antibody response of recovered animals to both the MR766 and a strain from the current outbreak (PRVABC59). We demonstrate that the IRF3/7 DKO mouse is a susceptible, mostly non-lethal model well suited for the study of infection kinetics, pathological progression, and antibody response. Infected mice presented lesions in tissues that have been associated with ZIKV infection in the human population, such as the eyes, male gonads, and central nervous system. In addition, we demonstrate that infection with the MR766 strain produces cross-neutralizing antibodies to the PRVABC59 strain of the Asian lineage. This model provides an additional tool for future studies into the transmission routes of ZIKV, as well as for the development of antivirals and other therapeutics, and should be included in the growing list of available tools for investigations of ZIKV infection and pathogenesis.

  7. Using multiple lines of evidence to assess the risk of ecosystem collapse

    PubMed Central

    Regan, Tracey J.; Dinh, Minh Ngoc; Ferrari, Renata; Keith, David A.; Lester, Rebecca; Mouillot, David; Murray, Nicholas J.; Nguyen, Hoang Anh; Nicholson, Emily

    2017-01-01

    Effective ecosystem risk assessment relies on a conceptual understanding of ecosystem dynamics and the synthesis of multiple lines of evidence. Risk assessment protocols and ecosystem models integrate limited observational data with threat scenarios, making them valuable tools for monitoring ecosystem status and diagnosing key mechanisms of decline to be addressed by management. We applied the IUCN Red List of Ecosystems criteria to quantify the risk of collapse of the Meso-American Reef, a unique ecosystem containing the second longest barrier reef in the world. We collated a wide array of empirical data (field and remotely sensed), and used a stochastic ecosystem model to backcast past ecosystem dynamics, as well as forecast future ecosystem dynamics under 11 scenarios of threat. The ecosystem is at high risk from mass bleaching in the coming decades, with compounding effects of ocean acidification, hurricanes, pollution and fishing. The overall status of the ecosystem is Critically Endangered (plausibly Vulnerable to Critically Endangered), with notable differences among Red List criteria and data types in detecting the most severe symptoms of risk. Our case study provides a template for assessing risks to coral reefs and for further application of ecosystem models in risk assessment. PMID:28931744

  8. Using multiple lines of evidence to assess the risk of ecosystem collapse.

    PubMed

    Bland, Lucie M; Regan, Tracey J; Dinh, Minh Ngoc; Ferrari, Renata; Keith, David A; Lester, Rebecca; Mouillot, David; Murray, Nicholas J; Nguyen, Hoang Anh; Nicholson, Emily

    2017-09-27

    Effective ecosystem risk assessment relies on a conceptual understanding of ecosystem dynamics and the synthesis of multiple lines of evidence. Risk assessment protocols and ecosystem models integrate limited observational data with threat scenarios, making them valuable tools for monitoring ecosystem status and diagnosing key mechanisms of decline to be addressed by management. We applied the IUCN Red List of Ecosystems criteria to quantify the risk of collapse of the Meso-American Reef, a unique ecosystem containing the second longest barrier reef in the world. We collated a wide array of empirical data (field and remotely sensed), and used a stochastic ecosystem model to backcast past ecosystem dynamics, as well as forecast future ecosystem dynamics under 11 scenarios of threat. The ecosystem is at high risk from mass bleaching in the coming decades, with compounding effects of ocean acidification, hurricanes, pollution and fishing. The overall status of the ecosystem is Critically Endangered (plausibly Vulnerable to Critically Endangered), with notable differences among Red List criteria and data types in detecting the most severe symptoms of risk. Our case study provides a template for assessing risks to coral reefs and for further application of ecosystem models in risk assessment. © 2017 The Authors.

  9. Detecting and accounting for multiple sources of positional variance in peak list registration analysis and spin system grouping.

    PubMed

    Smelter, Andrey; Rouchka, Eric C; Moseley, Hunter N B

    2017-08-01

    Peak lists derived from nuclear magnetic resonance (NMR) spectra are commonly used as input data for a variety of computer assisted and automated analyses. These include automated protein resonance assignment and protein structure calculation software tools. Prior to these analyses, peak lists must be aligned to each other and sets of related peaks must be grouped based on common chemical shift dimensions. Even when programs can perform peak grouping, they require the user to provide uniform match tolerances or use default values. However, peak grouping is further complicated by multiple sources of variance in peak position limiting the effectiveness of grouping methods that utilize uniform match tolerances. In addition, no method currently exists for deriving peak positional variances from single peak lists for grouping peaks into spin systems, i.e. spin system grouping within a single peak list. Therefore, we developed a complementary pair of peak list registration analysis and spin system grouping algorithms designed to overcome these limitations. We have implemented these algorithms into an approach that can identify multiple dimension-specific positional variances that exist in a single peak list and group peaks from a single peak list into spin systems. The resulting software tools generate a variety of useful statistics on both a single peak list and pairwise peak list alignment, especially for quality assessment of peak list datasets. We used a range of low and high quality experimental solution NMR and solid-state NMR peak lists to assess performance of our registration analysis and grouping algorithms. Analyses show that an algorithm using a single iteration and uniform match tolerances approach is only able to recover from 50 to 80% of the spin systems due to the presence of multiple sources of variance. Our algorithm recovers additional spin systems by reevaluating match tolerances in multiple iterations. To facilitate evaluation of the algorithms, we developed a peak list simulator within our nmrstarlib package that generates user-defined assigned peak lists from a given BMRB entry or database of entries. In addition, over 100,000 simulated peak lists with one or two sources of variance were generated to evaluate the performance and robustness of these new registration analysis and peak grouping algorithms.
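
    The grouping step can be illustrated as clustering peaks whose shifts match a group's running mean within dimension-specific tolerances; the tolerances and peak values below are invented, and the paper's iterative re-evaluation of tolerances is not shown:

        def group_peaks(peaks, dims=(0, 1), tol=(0.04, 0.4)):
            """Greedy grouping of peaks (tuples of shifts in ppm) into spin systems:
            a peak joins a group if it matches the group mean within the per-
            dimension tolerances `tol` (invented values for 1H, 15N)."""
            groups = []   # each group is a list of peaks
            for peak in peaks:
                for g in groups:
                    means = [sum(p[d] for p in g) / len(g) for d in dims]
                    if all(abs(peak[d] - m) <= t
                           for d, m, t in zip(dims, means, tol)):
                        g.append(peak)
                        break
                else:
                    groups.append([peak])
            return groups

        peaks = [(8.21, 118.3, 56.1), (8.23, 118.5, 30.2), (7.95, 121.0, 55.0)]
        print([len(g) for g in group_peaks(peaks)])   # -> [2, 1]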

  10. A Tool for the Automated Design and Evaluation of Habitat Interior Layouts

    NASA Technical Reports Server (NTRS)

    Simon, Matthew A.; Wilhite, Alan W.

    2013-01-01

    The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.
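
    The multi-objective value function can be sketched as a weighted sum of normalized criteria applied after hard constraints (such as collision-free placement) are screened; the criteria names and weights below are invented stand-ins:

        # Illustrative criteria, each normalized to [0, 1] (1 = best); the weights
        # and the hard-constraint screen stand in for the paper's systems-
        # engineering-derived values.
        WEIGHTS = {"habitable_volume": 0.4, "task_performance": 0.35, "safety": 0.25}

        def layout_value(scores, overlaps):
            """Score one candidate layout; reject any with colliding equipment."""
            if overlaps:                       # hard constraint from collision check
                return 0.0
            return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

        candidates = [
            ({"habitable_volume": 0.8, "task_performance": 0.6, "safety": 0.9}, False),
            ({"habitable_volume": 0.9, "task_performance": 0.9, "safety": 0.5}, True),
        ]
        best = max(candidates, key=lambda c: layout_value(*c))
        print(layout_value(*best))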

  11. Gas Metal Arc Welding and Flux-Cored Arc Welding. Teacher Edition. Second Edition.

    ERIC Educational Resources Information Center

    Fortney, Clarence; Gregory, Mike

    These instructional materials are designed to improve instruction in Gas Metal Arc Welding (GMAW) and Flux-Cored Arc Welding (FCAW). The following introductory information is included: use of this publication; competency profile; instructional/task analysis; related academic and workplace skills list; tools, materials, and equipment list; and…

  12. Health Occupations Education II. Instructor's Manual.

    ERIC Educational Resources Information Center

    Day, Nancy; And Others

    This instructor's manual accompanies the 46 modules in Health Occupations Education II, the second course of a two-year course of study. Contents include a list of the modules and the performance skills covered in each module, a listing of tools and supplies required for learning activities in the modules cited by module title, an instructional…

  13. The National Association of College Stores College Store Personnel Placement Service Is a Valuable Tool for Members.

    ERIC Educational Resources Information Center

    College Store Journal, 1980

    1980-01-01

    The NACS Placement Service, which consists of two activities--resume referral and advertisements in a special "positions wanted/available" supplement sheet distributed with the weekly NACS College Stores Confidential Bulletin--is described. The position available listing, position wanted listing, and the resume are discussed. (MLW)

  14. IBM Applications and Techniques of Operations Research. A Selected Bibliography.

    ERIC Educational Resources Information Center

    International Business Machines Corp., White Plains, NY. Data Processing Div.

    This bibliography on the tools and applications of operations research, management science, industrial engineering, and systems engineering lists many entries which appeared between 1961 and 1966 in 186 periodicals and trade journals. Twenty-six texts in operations research are also listed along with an indication as to which of 37 techniques or…

  15. Summer Reading Lists: Research and Recommendations

    ERIC Educational Resources Information Center

    Lindley, Sarah; Giles, Rebecca M.; Tunks, Karyn

    2016-01-01

    Decades of research have focused on the impact of summer learning loss and effective tools in stemming the flow of knowledge lost during summer break. While reading lists have become a standard practice for addressing students' needs to maintain learning levels over the summer months, very little research has been conducted on the book lists…

  16. Selected Research Tools in Economics, Labor and Industrial Relations.

    ERIC Educational Resources Information Center

    Kaye, Ronald J.

    Twenty-two indexing and abstracting services and general reference sources in the areas of labor and industrial relations are listed in this selective bibliography for users of State University of New York at Albany Libraries. Classification numbers are included for each source and most have annotations. Materials are listed under four…

  17. U.S. Military Operations Within the Electromagnetic Spectrum: Operational Critical Weakness

    DTIC Science & Technology

    2008-04-23

    The primary tool used to coordinate friendly use of the spectrum with ES and EA is the Joint Restricted Frequency List (JRFL). Frequencies that are deemed "necessary for friendly forces to accomplish objectives" are listed and classified as guarded, protected…

  18. Development of Milestone Schedules for Selected Logistics Support Directorate Programs. Appendix A. Part 2. Task Summaries.

    DTIC Science & Technology

    1987-09-15

    Excerpt from the task summaries: preparation of the Repair Parts and Special Tools List (RPSTL) (ILS responsibility; duration 32.00 work days) and review and validation of the Preliminary Provisioning Parts List (ILS responsibility; duration 22.00 work days).

  19. Spectacle and SpecViz: New Spectral Analysis and Visualization Tools

    NASA Astrophysics Data System (ADS)

    Earl, Nicholas; Peeples, Molly; JDADF Developers

    2018-01-01

    A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations and real COS data with an aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full-width half-max are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels and uncertainties and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user-created plugins that add new functionality. This work was supported in part by HST AR #13919, HST GO #14268, and HST AR #14560.
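
    As a hedged illustration of least-squares Voigt-profile fitting of the kind described (using astropy.modeling rather than Spectacle's own API, which is not shown here), the wavelengths, line parameters, and noise below are invented:

      # Fit one absorption feature with a Voigt profile on a unit continuum.
      import numpy as np
      from astropy.modeling import models, fitting

      wave = np.linspace(1215.0, 1219.0, 400)          # Angstroms (illustrative)
      true = models.Const1D(1.0) - models.Voigt1D(x_0=1217.0, amplitude_L=0.6,
                                                  fwhm_L=0.2, fwhm_G=0.3)
      rng = np.random.default_rng(0)
      flux = true(wave) + rng.normal(0.0, 0.01, wave.size)  # noisy synthetic spectrum

      init = models.Const1D(1.0, fixed={"amplitude": True}) - \
             models.Voigt1D(x_0=1217.1, amplitude_L=0.4, fwhm_L=0.1, fwhm_G=0.2)
      fit = fitting.LevMarLSQFitter()
      best = fit(init, wave, flux)
      print(best)  # fitted centroid, Lorentzian/Gaussian widths, depth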

  20. Update on Small Modular Reactors Dynamic System Modeling Tool: Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.

    Previous reports focused on the development of component and system models as well as end-to-end system models using Modelica and Dymola for two advanced reactor architectures: (1) Advanced Liquid Metal Reactor and (2) fluoride high-temperature reactor (FHR). The focus of this report is the release of the first beta version of the web-based application for model use and collaboration, as well as an update on the FHR model. The web-based application allows novice users to configure end-to-end system models from preconfigured choices to investigate the instrumentation and controls implications of these designs and allows for the collaborative development of individual component models that can be benchmarked against test systems for potential inclusion in the model library. A description of this application is provided along with examples of its use and a listing and discussion of all the models that currently exist in the library.

  1. The Multi-Intelligence Tools Suite - Supporting Research and Development in Information and Knowledge Exploitation

    DTIC Science & Technology

    2011-06-01

    The atom definition also defines the precise order of the pieces; each argument has a label and a type. Figure 2 shows the inference rule editor, with fields for the rule name, rule premises, and rule conclusions. Each premise in the rule premises list represents a list of fact conditions that must be found in the fact…

  2. School Turnaround Leaders: Selection Toolkit. Part of the School Turnaround Collection from Public Impact

    ERIC Educational Resources Information Center

    Public Impact, 2008

    2008-01-01

    This toolkit includes the following separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides a list of competencies that would…

  3. Identification of Threshold Concepts for Biochemistry

    PubMed Central

    Green, David; Lewis, Jennifer E.; Lin, Sara; Minderhout, Vicky

    2014-01-01

    Threshold concepts (TCs) are concepts that, when mastered, represent a transformed understanding of a discipline without which the learner cannot progress. We have undertaken a process involving more than 75 faculty members and 50 undergraduate students to identify a working list of TCs for biochemistry. The process of identifying TCs for biochemistry was modeled on extensive work related to TCs across a range of disciplines and included faculty workshops and student interviews. Using an iterative process, we prioritized five concepts on which to focus future development of instructional materials. Broadly defined, the concepts are steady state, biochemical pathway dynamics and regulation, the physical basis of interactions, thermodynamics of macromolecular structure formation, and free energy. The working list presented here is not intended to be exhaustive, but rather is meant to identify a subset of TCs for biochemistry for which instructional and assessment tools for undergraduate biochemistry will be developed. PMID:25185234

  4. BitPredator: A Discovery Algorithm for BitTorrent Initial Seeders and Peers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borges, Raymond; Patton, Robert M; Kettani, Houssain

    2011-01-01

    There is a large amount of illegal content being replicated through peer-to-peer (P2P) networks where BitTorrent is dominant; therefore, a framework to profile and police it is needed. The goal of this work is to explore the behavior of initial seeds and highly active peers to develop techniques to correctly identify them. We intend to establish a new methodology and software framework for profiling BitTorrent peers. This involves three steps: crawling torrent indexers for keywords in recently added torrents using the Really Simple Syndication (RSS) protocol, querying torrent trackers for peer list data, and verifying Internet Protocol (IP) addresses from peer lists. We verify IPs using active monitoring methods. Peer behavior is evaluated and modeled using bitfield message responses. We also design a tool to profile worldwide file distribution by mapping IP-to-geolocation and linking to WHOIS server information in Google Earth.
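
    One concrete piece of the peer-verification step can be sketched as follows: decoding the "compact" peer list a tracker returns (BEP 23), in which each peer is 4 bytes of IPv4 address plus a 2-byte big-endian port. The byte string here is hard-coded for illustration.

      import struct
      import socket

      def decode_compact_peers(blob):
          """blob: bytes whose length is a multiple of 6 -> list of (ip, port)."""
          peers = []
          for off in range(0, len(blob), 6):
              ip = socket.inet_ntoa(blob[off:off + 4])
              (port,) = struct.unpack(">H", blob[off + 4:off + 6])
              peers.append((ip, port))
          return peers

      sample = bytes([192, 0, 2, 10, 0x1A, 0xE1,     # 192.0.2.10:6881
                      198, 51, 100, 7, 0x1A, 0xE9])  # 198.51.100.7:6889
      print(decode_compact_peers(sample))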

  5. Causal criteria and counterfactuals; nothing more (or less) than scientific common sense.

    PubMed

    Phillips, Carl V; Goodman, Karen J

    2006-05-26

    Two persistent myths in epidemiology are that we can use a list of "causal criteria" to provide an algorithmic approach to inferring causation and that a modern "counterfactual model" can assist in the same endeavor. We argue that these are neither criteria nor a model, but that lists of causal considerations and formalizations of the counterfactual definition of causation are nevertheless useful tools for promoting scientific thinking. They set us on the path to the common sense of scientific inquiry, including testing hypotheses (really putting them to a test, not just calculating simplistic statistics), responding to the Duhem-Quine problem, and avoiding many common errors. Austin Bradford Hill's famous considerations are thus both over-interpreted by those who would use them as criteria and under-appreciated by those who dismiss them as flawed. Similarly, formalizations of counterfactuals are under-appreciated as lessons in basic scientific thinking. The need for lessons in scientific common sense is great in epidemiology, which is taught largely as an engineering discipline and practiced largely as technical tasks, making attention to core principles of scientific inquiry woefully rare.

  6. A human rights approach to the WHO Model List of Essential Medicines.

    PubMed Central

    Seuba, Xavier

    2006-01-01

    Since the first WHO Model List of Essential Medicines was adopted in 1977, it has become a popular tool among health professionals and Member States. WHO's joint effort with the United Nations Committee on Economic, Social and Cultural Rights has resulted in the inclusion of access to essential medicines in the core content of the right to health. The Committee states that the right to health contains a series of elements, such as availability, accessibility, acceptability and quality of health goods, services and programmes, which are in line with the WHO statement that essential medicines are intended to be available within the context of health systems in adequate amounts at all times, in the appropriate dosage forms, with assured quality and information, and at a price that the individual and the community can afford. The author considers another perspective by looking at the obligations to respect, protect and fulfil the right to health undertaken by the states adhering to the International Covenant of Economic, Social and Cultural Rights (ICESCR) and explores the relationship between access to medicines, the protection of intellectual property, and human rights. PMID:16710552

  7. Sea-level rise modeling handbook: Resource guide for coastal land managers, engineers, and scientists

    USGS Publications Warehouse

    Doyle, Thomas W.; Chivoiu, Bogdan; Enwright, Nicholas M.

    2015-08-24

    Global sea level is rising and may accelerate with continued fossil fuel consumption from industrial and population growth. In 2012, the U.S. Geological Survey conducted more than 30 training and feedback sessions with Federal, State, and nongovernmental organization (NGO) coastal managers and planners across the northern Gulf of Mexico coast to evaluate user needs, potential benefits, current scientific understanding, and utilization of resource aids and modeling tools focused on sea-level rise. In response to the findings from the sessions, this sea-level rise modeling handbook has been designed as a guide to the science and simulation models for understanding the dynamics and impacts of sea-level rise on coastal ecosystems. The review herein of decision-support tools and predictive models was compiled from the training sessions, from online research, and from publications. The purpose of this guide is to describe and categorize the suite of data, methods, and models and their design, structure, and application for hindcasting and forecasting the potential impacts of sea-level rise in coastal ecosystems. The data and models cover a broad spectrum of disciplines involving different designs and scales of spatial and temporal complexity for predicting environmental change and ecosystem response. These data and models have not heretofore been synthesized, nor have appraisals been made of their utility or limitations. Some models are demonstration tools for non-experts, whereas others require more expert capacity to apply for any given park, refuge, or regional application. A simplified tabular context has been developed to list and contrast a host of decision-support tools and models from the ecological, geological, and hydrological perspectives. Criteria were established to distinguish the source, scale, and quality of information input and geographic datasets; physical and biological constraints and relations; datum characteristics of water and land components; utility options for setting sea-level rise and climate change scenarios; and ease or difficulty of storing, displaying, or interpreting model output. Coastal land managers, engineers, and scientists can benefit from this synthesis of tools and models that have been developed for projecting causes and consequences of sea-level change on the landscape and seascape.

  8. Desiderata for Healthcare Integrated Data Repositories Based on Architectural Comparison of Three Public Repositories

    PubMed Central

    Huser, Vojtech; Cimino, James J.

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network’s Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management. PMID:24551366

  9. Desiderata for healthcare integrated data repositories based on architectural comparison of three public repositories.

    PubMed

    Huser, Vojtech; Cimino, James J

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network's Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management.

  10. Seeking unique and common biological themes in multiple gene lists or datasets: pathway pattern extraction pipeline for pathway-level comparative analysis.

    PubMed

    Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M

    2009-06-29

    One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
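
    The enrichment measurement at the core of such a pipeline can be sketched as a per-(gene list, pathway) hypergeometric test; the gene identifiers and pathway contents below are fabricated, and PPEP's actual statistics may differ.

      from scipy.stats import hypergeom

      universe = {f"g{i}" for i in range(1000)}          # all assayed genes
      pathway = {f"g{i}" for i in range(40)}             # one pathway's members
      gene_lists = {"studyA": {f"g{i}" for i in range(25)} | {"g500"},
                    "studyB": {f"g{i}" for i in range(700, 760)}}

      for name, genes in gene_lists.items():
          k = len(genes & pathway)                       # list members in pathway
          # P(X >= k) when drawing len(genes) genes from the universe
          p = hypergeom.sf(k - 1, len(universe), len(pathway), len(genes))
          print(f"{name}: overlap={k}, p={p:.3g}")

    Repeating this over many pathways for each list yields the pathway-level enrichment pattern that the pipeline then compares across lists.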

  11. Development of a VOR/DME model for an advanced concepts simulator

    NASA Technical Reports Server (NTRS)

    Steinmetz, G. G.; Bowles, R. L.

    1984-01-01

    The report presents a definition of a VOR/DME airborne and ground systems simulation model. This description was drafted in response to a need in the creation of an advanced concepts simulation in which flight station designs for the 1980s era could be postulated and examined. The simulation model described herein provides a reasonable representation of VOR/DME stations in the continental United States, including area coverage by type and noise errors. The detail in which the model has been cast provides the interested researcher with a moderate-fidelity simulation tool for conducting research and evaluation of navigation algorithms. Assumptions made within the development are listed and place certain responsibilities (data bases, communication with other simulation modules, uniform round earth, etc.) upon the researcher.
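
    A toy flavor of such a station model: return a noise-corrupted VOR bearing and DME slant range on a flat-earth approximation. The geometry and noise values are illustrative, not the report's error model.

      import math, random

      def vor_dme(ac_xy, ac_alt_m, st_xy, bearing_sigma_deg=1.0, range_sigma_m=100.0):
          """Aircraft/station positions in metres (x east, y north)."""
          dx, dy = st_xy[0] - ac_xy[0], st_xy[1] - ac_xy[1]
          bearing = math.degrees(math.atan2(dx, dy)) % 360.0  # degrees from north
          slant = math.hypot(math.hypot(dx, dy), ac_alt_m)    # DME measures slant range
          noisy_bearing = (bearing + random.gauss(0.0, bearing_sigma_deg)) % 360.0
          noisy_range = slant + random.gauss(0.0, range_sigma_m)
          return noisy_bearing, noisy_range

      print(vor_dme((0.0, 0.0), 3000.0, (20000.0, 10000.0)))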

  12. Systems biology of embryonic development: Prospects for a complete understanding of the Caenorhabditis elegans embryo.

    PubMed

    Murray, John Isaac

    2018-05-01

    The convergence of developmental biology and modern genomics tools brings the potential for a comprehensive understanding of developmental systems. This is especially true for the Caenorhabditis elegans embryo because its small size, invariant developmental lineage, and powerful genetic and genomic tools provide the prospect of a cellular resolution understanding of messenger RNA (mRNA) expression and regulation across the organism. We describe here how a systems biology framework might allow large-scale determination of the embryonic regulatory relationships encoded in the C. elegans genome. This framework consists of two broad steps: (a) defining the "parts list"-all genes expressed in all cells at each time during development and (b) iterative steps of computational modeling and refinement of these models by experimental perturbation. Substantial progress has been made towards defining the parts list through imaging methods such as large-scale green fluorescent protein (GFP) reporter analysis. Imaging results are now being augmented by high-resolution transcriptome methods such as single-cell RNA sequencing, and it is likely the complete expression patterns of all genes across the embryo will be known within the next few years. In contrast, the modeling and perturbation experiments performed so far have focused largely on individual cell types or genes, and improved methods will be needed to expand them to the full genome and organism. This emerging comprehensive map of embryonic expression and regulatory function will provide a powerful resource for developmental biologists, and would also allow scientists to ask questions not accessible without a comprehensive picture. This article is categorized under: Invertebrate Organogenesis > Worms Technologies > Analysis of the Transcriptome Gene Expression and Transcriptional Hierarchies > Gene Networks and Genomics. © 2018 Wiley Periodicals, Inc.

  13. Multicenter Validation of a Customizable Scoring Tool for Selection of Trainees for a Residency or Fellowship Program. The EAST-IST Study.

    PubMed

    Bosslet, Gabriel T; Carlos, W Graham; Tybor, David J; McCallister, Jennifer; Huebert, Candace; Henderson, Ashley; Miles, Matthew C; Twigg, Homer; Sears, Catherine R; Brown, Cynthia; Farber, Mark O; Lahm, Tim; Buckley, John D

    2017-04-01

    Few data have been published regarding scoring tools for selection of postgraduate medical trainee candidates that have wide applicability. The authors present a novel scoring tool developed to assist postgraduate programs in generating an institution-specific rank list derived from selected elements of the U.S. Electronic Residency Application System (ERAS) application. The authors developed and validated an ERAS and interview day scoring tool at five pulmonary and critical care fellowship programs: the ERAS Application Scoring Tool-Interview Scoring Tool. This scoring tool was then tested for intrarater correlation versus subjective rankings of ERAS applications. The process for development of the tool was performed at four other institutions, and it was performed alongside and compared with the "traditional" ranking methods at the five programs and compared with the submitted National Residency Match Program rank list. The ERAS Application Scoring Tool correlated highly with subjective faculty rankings at the primary institution (average Spearman's r = 0.77). The ERAS Application Scoring Tool-Interview Scoring Tool method correlated well with traditional ranking methodology at all five institutions (Spearman's r = 0.54, 0.65, 0.72, 0.77, and 0.84). This study validates a process for selecting and weighting components of the ERAS application and interview day to create a customizable, institution-specific tool for ranking candidates to postgraduate medical education programs. This scoring system can be used in future studies to compare the outcomes of fellowship training.
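
    The reported validation statistic can be illustrated with a short sketch computing Spearman's rank correlation between tool scores and subjective ranks; the numbers below are fabricated.

      from scipy.stats import spearmanr

      tool_scores = [87, 82, 78, 75, 70, 66, 61, 58]  # hypothetical tool totals
      faculty_rank = [1, 3, 2, 4, 6, 5, 7, 8]         # hypothetical subjective ranks

      # negate ranks so a better rank (1) aligns with a higher score
      rho, pval = spearmanr(tool_scores, [-r for r in faculty_rank])
      print(f"Spearman's r = {rho:.2f} (p = {pval:.3f})")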

  14. Development of materials for the rapid manufacture of die cast tooling

    NASA Astrophysics Data System (ADS)

    Hardro, Peter Jason

    The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process. These rapidly produced tools are intended to be superior to traditionally produced tooling by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling might be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was highlighted. Physical testing was conducted in order to grade the processability of each of the material systems and to optimize the manufacturing process for the downselected material system. Sample specimens were produced, and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected and die casting dies were produced from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored, and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best. The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely, reducing tooling cost, shortening tooling creation time, and reducing the man-hours needed for tool creation, though identifying the appropriate time to use RP tooling appears to be the most important aspect of achieving successful implementation.
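
    A toy version of the break-even comparison such an economic model supports is sketched below; all dollar figures, tool lives, and volumes are hypothetical.

      # Compare total cost of traditional vs. RP tooling at a given volume.
      def total_cost(volume, tool_cost, tool_life_shots, cost_per_shot):
          tools_needed = -(-volume // tool_life_shots)  # ceiling division
          return tools_needed * tool_cost + volume * cost_per_shot

      for volume in (5_000, 50_000, 500_000):
          trad = total_cost(volume, tool_cost=60_000, tool_life_shots=200_000,
                            cost_per_shot=1.00)
          rp = total_cost(volume, tool_cost=25_000, tool_life_shots=50_000,
                          cost_per_shot=1.05)
          better = "RP" if rp < trad else "traditional"
          print(f"{volume:>7} parts: traditional ${trad:,.0f}, RP ${rp:,.0f} -> {better}")

    With these assumed numbers, RP tooling wins at low volumes and traditional tooling wins once its longer tool life amortizes, which is the crossover such a model is meant to locate.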

  15. GOMA: functional enrichment analysis tool based on GO modules

    PubMed Central

    Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213

  16. Submarine pipeline on-bottom stability. Volume 2: Software and manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-01

    The state of the art in pipeline stability design has been changing very rapidly in recent years. The physics governing on-bottom stability are much better understood now than they were eight years ago, due largely to research and large-scale model tests sponsored by PRCI. Analysis tools utilizing this new knowledge have been developed. These tools provide the design engineer with a rational approach for weight coating design, which can be used with confidence because the tools have been developed based on full-scale and near-full-scale model tests. These tools represent the state of the art in stability design and model the complex behavior of pipes subjected to both wave and current loads. This includes: hydrodynamic forces that account for the effect of the wake (generated by flow over the pipe) washing back and forth over the pipe in oscillatory flow, and the embedment (digging) that occurs as a pipe resting on the seabed is exposed to oscillatory loadings and small oscillatory deflections. This report has been developed as a reference handbook for use in on-bottom pipeline stability analysis. It consists of two volumes. Volume one is devoted to descriptions of the various aspects of the problem: the pipeline design process; ocean physics, wave mechanics, hydrodynamic forces, and meteorological data determination; geotechnical data collection and soil mechanics; and stability design procedures. Volume two describes, lists, and illustrates the analysis software. Diskettes containing the software and examples of its use are also included in Volume two.
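
    For orientation, the textbook baseline these calibrated tools improve upon is a Morison-type force estimate per unit length of pipe; the sketch below uses assumed coefficients and flow values and deliberately omits the wake effects the report emphasizes.

      import math

      rho = 1025.0        # seawater density, kg/m^3
      D = 0.5             # pipe outer diameter, m
      Cd, Cm = 0.7, 3.29  # drag and inertia coefficients (assumed)

      def morison_force(u, dudt):
          """u: velocity at the pipe (wave + current), m/s; dudt: acceleration."""
          drag = 0.5 * rho * Cd * D * u * abs(u)
          inertia = rho * Cm * math.pi * D**2 / 4.0 * dudt
          return drag + inertia  # N per metre of pipe

      # peak of a 1.0 m/s oscillatory flow (period 10 s) plus a 0.3 m/s current
      omega = 2.0 * math.pi / 10.0
      print(round(morison_force(1.3, 0.0), 1), "N/m at peak velocity")
      print(round(morison_force(0.3, 1.0 * omega), 1), "N/m at peak acceleration")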

  17. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  18. International Bibliography of Computer-Assisted Terminology.

    ERIC Educational Resources Information Center

    Krommer-Benz, Magdalena, Comp.

    Because of the need for adequate reference tools in the new area of data banks for the field of terminology (there are currently 25 term banks in existence and more are planned), this bibliography lists some 350 references selected from a large number of both primary and secondary sources. It includes some entries selected from a list of…

  19. Canada’s Patented Medicines (Notice of Compliance) Proceedings and Intellectual Property

    PubMed Central

    Bian, Henry; McCourt, Conor

    2015-01-01

    Canada’s Patent Register is a tool created by the Patented Medicines (Notice of Compliance) Regulations to help innovators protect their inventions relating to pharmaceuticals. This tool exists at the intersection between the intellectual property and drug approval regimes. By listing a patent on the Patent Register, an innovator can prevent a generic manufacturer from entering the marketplace rather than having to wait for his or her patent to be infringed. This article provides information on the requirements for listing a patent on the Patent Register and an overview of how the Patented Medicines (Notice of Compliance) Regulations affect the drug approval process. PMID:25573772

  20. TOTAL user manual

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Boerschlein, David P.

    1994-01-01

    Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in the model of a complex system can be devastatingly tedious and error-prone. Even with tools such as the Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST), the user must describe a system by specifying the rules governing the behavior of the system in order to generate the model. With the Table Oriented Translator to the ASSIST Language (TOTAL), the user can specify the components of a typical system and their attributes in the form of a table. The conditions that lead to system failure are also listed in a tabular form. The user can also abstractly specify dependencies with causes and effects. The level of information required is appropriate for system designers with little or no background in the details of reliability calculations. A menu-driven interface guides the user through the system description process, and the program updates the tables as new information is entered. The TOTAL program automatically generates an ASSIST input description to match the system description.
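
    A toy continuous-time Markov sketch (not semi-Markov, so only a simplification of what these tools generate) conveys how a state/transition model yields failure probability over time; the three-state duplex system and its rates are invented.

      import numpy as np
      from scipy.linalg import expm

      lam, mu = 1e-4, 1e-2  # failure and repair rates per hour (illustrative)
      # generator matrix Q over states [2 up, 1 up, failed]
      Q = np.array([[-2 * lam, 2 * lam, 0.0],
                    [mu, -(mu + lam), lam],
                    [0.0, 0.0, 0.0]])  # failure state is absorbing
      p0 = np.array([1.0, 0.0, 0.0])

      for t in (10.0, 100.0, 1000.0):
          p = p0 @ expm(Q * t)  # state probabilities at time t
          print(f"t = {t:6.0f} h: P(system failed) = {p[2]:.3e}")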

  1. Opportunities for Breakthroughs in Large-Scale Computational Simulation and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Alter, Stephen J.; Atkins, Harold L.; Bey, Kim S.; Bibb, Karen L.; Biedron, Robert T.; Carpenter, Mark H.; Cheatwood, F. McNeil; Drummond, Philip J.; Gnoffo, Peter A.

    2002-01-01

    Opportunities for breakthroughs in the large-scale computational simulation and design of aerospace vehicles are presented. Computational fluid dynamics tools to be used within multidisciplinary analysis and design methods are emphasized. The opportunities stem from speedups and robustness improvements in the underlying unit operations associated with simulation (geometry modeling, grid generation, physical modeling, analysis, etc.). Further, an improved programming environment can synergistically integrate these unit operations to leverage the gains. The speedups result from reducing the problem setup time through geometry modeling and grid generation operations, and reducing the solution time through the operation counts associated with solving the discretized equations to a sufficient accuracy. The opportunities are addressed only at a general level here, but an extensive list of references containing further details is included. The opportunities discussed are being addressed through the Fast Adaptive Aerospace Tools (FAAST) element of the Advanced Systems Concept to Test (ASCoT) and the third Generation Reusable Launch Vehicles (RLV) projects at NASA Langley Research Center. The overall goal is to enable greater inroads into the design process with large-scale simulations.

  2. Identifying Core Competencies of Infection Control Nurse Specialists in Hong Kong.

    PubMed

    Chan, Wai Fong; Bond, Trevor G; Adamson, Bob; Chow, Meyrick

    2016-01-01

    To confirm a core competency scale for Hong Kong infection control nurses at the advanced nursing practice level from the core competency items proposed in a previous phase of this study. This would serve as the foundation of competency assurance in Hong Kong hospitals. A cross-sectional survey design was used. All public and private hospitals in Hong Kong. All infection control nurses in hospitals of Hong Kong. The 83-item proposed core competency list established in an earlier study was transformed into a questionnaire and sent to 112 infection control nurses in 48 hospitals in Hong Kong. They were asked to rate the importance of each infection prevention and control item using Likert-style response categories. Data were analyzed using the Rasch model. A response rate of 81.25% was achieved. Seven items were removed from the proposed core competency list, leaving a scale of 76 items that fit the measurement requirements of the unidimensional Rasch model. Essential core competency items of advanced practice for infection control nurses in Hong Kong were identified based on the measurement criteria of the Rasch model. Several items of the scale that reflect local Hong Kong contextual characteristics are distinguished from the overseas standards. This locally specific competency list could serve as the foundation for education and for certification of infection control nurse specialists in Hong Kong. Rasch measurement is an appropriate analytical tool for identifying core competencies of advanced practice nurses in other specialties and in other locations in a manner that incorporates practitioner judgment and expertise.
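
    The dichotomous Rasch model underlying such an analysis can be written in a few lines (the study used rating-scale data, so this is a simplification, and all values are invented):

      import math

      def rasch_p(theta, b):
          """P(endorse) for person ability theta and item difficulty b."""
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      for theta in (-1.0, 0.0, 1.0):
          row = [f"{rasch_p(theta, b):.2f}" for b in (-1.0, 0.0, 1.0)]
          print(f"ability {theta:+.1f}:", row)

    Items whose observed response patterns depart from these model-implied probabilities are the ones flagged as misfitting and removed, which is how the 83-item list was trimmed to 76.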

  3. The Distress Thermometer for screening for severe fatigue in newly diagnosed breast and colorectal cancer patients.

    PubMed

    Abrahams, H J G; Gielissen, M F M; de Lugt, M; Kleijer, E F W; de Roos, W K; Balk, E; Verhagen, C A H H V M; Knoop, H

    2017-05-01

    Internationally, the Distress Thermometer and associated Problem List are increasingly used in oncology as screening tools for psychological distress. Cancer-related fatigue is common but often overlooked in clinical practice. We examined if severe fatigue in cancer patients can be identified with the fatigue item of the Problem List. Newly diagnosed breast (N = 334) and colorectal (N = 179) cancer patients were screened for severe fatigue, which was defined as having a positive score on the fatigue item of the Problem List. The Fatigue Severity subscale of the Checklist Individual Strength was used as the gold standard measure for severe fatigue. In total, 78% of breast cancer patients and 81% of colorectal cancer patients were correctly identified with the fatigue item. The sensitivity was 89% in breast cancer patients and 91% in colorectal cancer patients. The specificity was 75% in breast cancer patients and 77% in colorectal cancer patients. The positive predictive value was 53% in breast cancer patients and 64% in colorectal cancer patients, whereas the negative predictive value was 95% in both tumor types. The fatigue item of the Problem List performs satisfactorily as a quick screening tool for severe fatigue. However, a positive screen should be followed up with a more thorough assessment of fatigue, i.e., a questionnaire with a validated cutoff point. Given the time pressure on clinicians, this already implemented and brief screening tool may prevent severe fatigue from going undetected in clinical practice. Copyright © 2016 John Wiley & Sons, Ltd.
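
    The reported screening statistics all follow from a standard 2x2 confusion matrix; the sketch below uses invented counts chosen only to be consistent with the breast cancer percentages quoted above.

      def screening_stats(tp, fp, fn, tn):
          return {"sensitivity": tp / (tp + fn),
                  "specificity": tn / (tn + fp),
                  "PPV": tp / (tp + fp),
                  "NPV": tn / (tn + fn)}

      # hypothetical counts for a cohort screened with the fatigue item
      for name, value in screening_stats(tp=89, fp=78, fn=11, tn=235).items():
          print(f"{name}: {value:.2f}")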

  4. Software to Compare NPP HDF5 Data Files

    NASA Technical Reports Server (NTRS)

    Wiegand, Chiu P.; LeMoigne-Stewart, Jacqueline; Ruley, LaMont T.

    2013-01-01

    This software was developed for the NPOESS (National Polar-orbiting Operational Environmental Satellite System) Preparatory Project (NPP) Science Data Segment. The purpose of this software is to compare HDF5 (Hierarchical Data Format) files specific to NPP and report whether the HDF5 files are identical. If the HDF5 files are different, users have the option of printing out the list of differences in the HDF5 data files. The user provides paths to two directories containing a list of HDF5 files to compare. The tool would select matching HDF5 file names from the two directories and run the comparison on each file. The user can also select from three levels of detail. Level 0 is the basic level, which simply states whether the files match or not. Level 1 is the intermediate level, which lists the differences between the files. Level 2 lists all the details regarding the comparison, such as which objects were compared, and how and where they are different. The HDF5 tool is written specifically for the NPP project. As such, it ignores certain attributes (such as creation_date, creation_time, etc.) in the HDF5 files. This is because even though two HDF5 files could represent exactly the same granule, if they are created at different times, the creation date and time would be different. This tool is smart enough to ignore differences that are not relevant to NPP users.
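
    A minimal sketch of the core comparison, assuming the h5py library (this is not the NPP tool's actual code); the file paths and the ignored-attribute names are placeholders.

      import h5py
      import numpy as np

      IGNORED_ATTRS = {"creation_date", "creation_time"}  # assumed names

      def diff_files(path_a, path_b):
          """Walk file A's objects and report objects/values/attrs that differ."""
          diffs = []
          with h5py.File(path_a, "r") as fa, h5py.File(path_b, "r") as fb:
              def visit(name, obj):
                  if name not in fb:
                      diffs.append(f"missing in B: {name}")
                      return
                  other = fb[name]
                  # compare attributes, skipping the time-of-creation metadata
                  for key in set(obj.attrs) - IGNORED_ATTRS:
                      if key not in other.attrs or not np.array_equal(
                              obj.attrs[key], other.attrs[key]):
                          diffs.append(f"attribute differs: {name}@{key}")
                  if isinstance(obj, h5py.Dataset) and not np.array_equal(
                          obj[()], other[()]):
                      diffs.append(f"values differ: {name}")
              fa.visititems(visit)
          return diffs

      for line in diff_files("granule_a.h5", "granule_b.h5"):
          print(line)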

  5. A Beginner's Sequence of Programming Activities.

    ERIC Educational Resources Information Center

    Slesnick, Twila

    1984-01-01

    Presents various programing activities using the BASIC and LOGO programing languages. Activities are included in separate sections with a title indicating the nature of the activities and the "tools" (commands) needed. For example, "Old-fashioned drawing" requires several tools (PRINT, LIST, RUN, GOTO) to make drawings using…

  6. WHAT DEGRADED THIS STREAM? TOOLS TO DETERMINE THE CAUSES OF ECOLOGICAL IMPAIRMENT

    EPA Science Inventory

    The identification of causes of impairment for waterbodies listed as biologically impaired is required as part of many federal, state and tribal regulations. The Office of Research and Development is developing a suite of tools that facilitates the identification and characteriz...

  7. Report: The EPA Should Assess the Utility of the Watch List as a Management Tool

    EPA Pesticide Factsheets

    Report #13-P-0435, September 30, 2013 . The agency runs the risk of maintaining a management tool that does not assist in tracking facilities with long-standing significant violations and has limited transparency and utility to the public.

  8. Small Business Management Training Tools Directory.

    ERIC Educational Resources Information Center

    American Association of Community and Junior Colleges, Washington, DC. National Small Business Training Network.

    This directory is designed to assist in the identification of supplementary materials to support program development for small businesses. Following introductory comments and an overview of small business management training, section I lists training tools available from the Small Business Administration (SBA). Section II provides descriptions and…

  9. Verification of voltage/frequency requirement for emergency diesel generator in nuclear power plant using dynamic modeling

    NASA Astrophysics Data System (ADS)

    Hur, Jin-Suk; Roh, Myung-Sub

    2014-02-01

    One major cause of plant shutdown is the loss of electrical power. The study aims to understand the coping actions against station blackout, including the emergency diesel generator and the sequential loading of safety systems, and to ensure that the emergency diesel generator meets its requirements, especially the voltage and frequency criteria, using a modeling tool. This paper also considers changes to the sequencing time and load capacity, but only for the purpose of finding the electrical design margin; any revision of the load list must be verified with a safety analysis. From this study, it is found that a new load calculation is a key factor in EDG localization and in increasing in-house capability.
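
    The acceptance check such a verification implies can be sketched as a simple tolerance-band test on simulated transient samples; the band values and recovery deadline below are assumptions, not the plant's actual criteria.

      def recovers(samples, nominal, band, deadline):
          """samples: list of (t_seconds, value); True if every sample at or
          after `deadline` lies within +/-band (fraction) of nominal."""
          return all(abs(v - nominal) <= band * nominal
                     for t, v in samples if t >= deadline)

      volts = [(0.1, 3740.0), (1.0, 4050.0), (2.5, 4140.0), (4.0, 4160.0)]
      freq = [(0.1, 58.1), (1.0, 59.2), (2.5, 59.7), (4.0, 60.0)]
      print("voltage OK:", recovers(volts, nominal=4160.0, band=0.10, deadline=2.0))
      print("frequency OK:", recovers(freq, nominal=60.0, band=0.02, deadline=2.0))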

  10. The EcoCyc database: reflecting new knowledge about Escherichia coli K-12.

    PubMed

    Keseler, Ingrid M; Mackie, Amanda; Santos-Zavaleta, Alberto; Billington, Richard; Bonavides-Martínez, César; Caspi, Ron; Fulcher, Carol; Gama-Castro, Socorro; Kothari, Anamika; Krummenacker, Markus; Latendresse, Mario; Muñiz-Rascado, Luis; Ong, Quang; Paley, Suzanne; Peralta-Gil, Martin; Subhraveti, Pallavi; Velázquez-Ramírez, David A; Weaver, Daniel; Collado-Vides, Julio; Paulsen, Ian; Karp, Peter D

    2017-01-04

    EcoCyc (EcoCyc.org) is a freely accessible, comprehensive database that collects and summarizes experimental data for Escherichia coli K-12, the best-studied bacterial model organism. New experimental discoveries about gene products, their function and regulation, new metabolic pathways, enzymes and cofactors are regularly added to EcoCyc. New SmartTable tools allow users to browse collections of related EcoCyc content. SmartTables can also serve as repositories for user- or curator-generated lists. EcoCyc now supports running and modifying E. coli metabolic models directly on the EcoCyc website. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Polimedication: applicability of a computer tool to reduce polypharmacy in nursing homes.

    PubMed

    García-Caballero, Tomás M; Lojo, Juan; Menéndez, Carlos; Fernández-Álvarez, Roberto; Mateos, Raimundo; Garcia-Caballero, Alejandro

    2018-05-11

    The risks of polypharmacy can be far greater than the benefits, especially in the elderly. Comorbidity makes polypharmacy very prevalent in this population, thus increasing the occurrence of adverse effects. To solve this problem, the most common strategy is to use lists of potentially inappropriate medications. However, this strategy is time-consuming. In order to minimize the expenditure of time, our group devised a pilot computer tool (Polimedication) that automatically processes medication lists, providing the corresponding Screening Tool of Older Persons' potentially inappropriate Prescriptions (STOPP) alerts and facilitating standardized reports. The drug lists for 115 residents in Santa Marta Nursing Home (Fundación San Rosendo, Ourense, Spain) were processed. The program detected 10.04 alerts/patient, of which 74.29% were not repeated. After reviewing these alerts, 12.12% of the total (1.30 alerts/patient) were considered relevant. The largest number of alerts (41.48%) involved neuroleptic drugs. Finally, the patient's family physician or psychiatrist accepted the alert and made medication changes in 62.86% of the relevant alerts. The largest number of changes (38.64%) also involved neuroleptic drugs. The mean time spent in the generation and review of the warnings was 6.26 minutes/patient. Total changes represented a saving of 32.77 € per resident/year in medication. The application of the Polimedication tool detected a high proportion of potentially inappropriate prescriptions in institutionalized elderly patients. The use of the computerized tool achieved significant savings in pharmaceutical expenditure, as well as a reduction in the time taken for medication review.
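
    The alert-matching step can be sketched as a lookup of each resident's drugs against a criteria table; the two rules shown are paraphrased STOPP-style examples, not the tool's real rule base.

      criteria = {
          "neuroleptic": "review long-term neuroleptic use (fall/sedation risk)",
          "benzodiazepine": "review benzodiazepine use beyond 4 weeks",
      }
      drug_classes = {"haloperidol": "neuroleptic", "risperidone": "neuroleptic",
                      "lorazepam": "benzodiazepine", "metformin": "antidiabetic"}

      def alerts_for(med_list):
          """Return (drug, rule) pairs for every drug that triggers a rule."""
          found = []
          for drug in med_list:
              rule = criteria.get(drug_classes.get(drug, ""))
              if rule:
                  found.append((drug, rule))
          return found

      resident_meds = ["risperidone", "metformin", "lorazepam"]
      for drug, rule in alerts_for(resident_meds):
          print(f"{drug}: {rule}")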

  12. Rapid Development of Specialty Population Registries and Quality Measures from Electronic Health Record Data*. An Agile Framework.

    PubMed

    Kannan, Vaishnavi; Fish, Jason S; Mutz, Jacqueline M; Carrington, Angela R; Lai, Ki; Davis, Lisa S; Youngblood, Josh E; Rauschuber, Mark R; Flores, Kathryn A; Sara, Evan J; Bhat, Deepa G; Willett, DuWayne L

    2017-06-14

    Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. We adopted as guiding principles to (a) capture data as a byproduct of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed - either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM) - were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined "grains" from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week "sprints" for rapid-cycle feedback and refinement. Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often.
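
    The single shared fact table idea can be sketched with an in-memory SQLite example; the schema and rows below are illustrative, not the authors' actual EDW design.

      import sqlite3

      con = sqlite3.connect(":memory:")
      # one fact table holds every calculated CQM value across all registries
      con.execute("""CREATE TABLE cqm_fact (
          registry_key TEXT, measure_key TEXT, patient_key TEXT,
          measure_date TEXT, numerator_flag INTEGER)""")
      rows = [("diabetes", "a1c_tested_12mo", "p001", "2015-06-01", 1),
              ("diabetes", "a1c_tested_12mo", "p002", "2015-06-01", 0),
              ("ckd", "bp_controlled", "p001", "2015-06-01", 1)]
      con.executemany("INSERT INTO cqm_fact VALUES (?,?,?,?,?)", rows)

      # one query pattern then serves every registry's dashboard
      for reg, meas, pct in con.execute("""
              SELECT registry_key, measure_key, 100.0 * AVG(numerator_flag)
              FROM cqm_fact GROUP BY registry_key, measure_key"""):
          print(f"{reg}/{meas}: {pct:.0f}% in numerator")

    Keeping every registry's measures in one table is what lets a new registry reuse the same ETL and reporting pattern instead of starting from scratch.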

  13. Rapid Development of Specialty Population Registries and Quality Measures from Electronic Health Record Data.

    PubMed

    Kannan, Vaishnavi; Fish, Jason S; Mutz, Jacqueline M; Carrington, Angela R; Lai, Ki; Davis, Lisa S; Youngblood, Josh E; Rauschuber, Mark R; Flores, Kathryn A; Sara, Evan J; Bhat, Deepa G; Willett, DuWayne L

    2017-01-01

    Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. We adopted as guiding principles to (a) capture data as a byproduct of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed - either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM) - were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined "grains" from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week "sprints" for rapid-cycle feedback and refinement. Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often. Schattauer GmbH.

  14. Aperture Photometry Tool

    NASA Astrophysics Data System (ADS)

    Laher, Russ R.; Gorjian, Varoujan; Rebull, Luisa M.; Masci, Frank J.; Fowler, John W.; Helou, George; Kulkarni, Shrinivas R.; Law, Nicholas M.

    2012-07-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It is a graphical user interface (GUI) designed to allow the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. The finely tuned layout of the GUI, along with judicious use of color-coding and alerting, is intended to give maximal user utility and convenience. Simply mouse-clicking on a source in the displayed image will instantly draw a circular or elliptical aperture and sky annulus around the source and will compute the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs with just the push of a button, including image histogram, x and y aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has many functions for customizing the calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, which is accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
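
    The underlying calculation APT visualizes can be sketched with photutils (APT itself is a separate GUI application, so this only mirrors the arithmetic); the image here is synthetic.

      import numpy as np
      from photutils.aperture import (CircularAperture, CircularAnnulus,
                                      aperture_photometry)
      from astropy.stats import sigma_clipped_stats

      rng = np.random.default_rng(1)
      image = rng.normal(100.0, 5.0, (101, 101))  # flat sky + noise
      yy, xx = np.mgrid[0:101, 0:101]
      image += 500.0 * np.exp(-((xx - 50)**2 + (yy - 50)**2) / (2 * 2.0**2))  # fake star

      aper = CircularAperture((50.0, 50.0), r=5.0)
      annulus = CircularAnnulus((50.0, 50.0), r_in=8.0, r_out=12.0)

      # sigma-clipped median of the sky annulus pixels
      sky_vals = image[annulus.to_mask().to_image(image.shape) > 0]
      _, sky_median, _ = sigma_clipped_stats(sky_vals)

      phot = aperture_photometry(image, aper)
      net = phot["aperture_sum"][0] - sky_median * aper.area
      print(f"net source counts: {net:.0f}")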

  15. Cutting costs: the impact of price lists on the cost development at the emergency department.

    PubMed

    Schilling, Ulf Martin

    2010-12-01

    It was shown that physicians working at the Swedish emergency department (ED) are unaware of the costs for investigations performed. This study evaluated the possible impact of price lists on the overall laboratory and radiology costs at the ED of a Swedish university hospital. Price lists including the most common laboratory analyses and radiological investigations at the ED were created. The lists were distributed to all internal medicine physicians by e-mail and exposed above their working stations continually. No lists were provided for the orthopaedic control group. The average costs for laboratory and radiological investigations during the months of June and July 2007 and 2008 were calculated. Neither clinical nor admission procedures were changed. The physicians were blinded towards the study. Statistical analysis was performed using the Student's t-test. A total of 1442 orthopaedic and 1585 medical patients were attended to in 2007. In 2008, 1467 orthopaedic and 1637 medical patients required emergency service. The average costs per patient were 980.27 SKR (98€)/999.41 SKR (100€, +1.95%) for orthopaedic and 1081.36 SKR (108€)/877.3 SKR (88€, -18.8%) for medical patients. Laboratory costs decreased by 9% in orthopaedic and 21.4% in medical patients. Radiology costs changed +5.4% in orthopaedic and -20.59% in medical patients. The distribution and promotion of price lists as a tool at the ED to heighten cost awareness resulted in a major decrease in the investigation costs. A significant decrease in radiological costs could be observed. It can be concluded that price lists are an effective tool to cut costs in public healthcare.

  16. Multimedia Tools for Teaching Economics.

    ERIC Educational Resources Information Center

    Pereira-Ford, Clara V.

    1998-01-01

    Describes one professor's experience in researching the use of multimedia tools for teaching principles of economics. Provides a list of resources consulted, including universities and colleges, books, software, laserdiscs and VHS tapes, Web sites, and journal sources. Found the students generally to be receptive to the introduction of new tools…

  17. An Assessment of IMPAC - Integrated Methodology for Propulsion and Airframe Controls

    NASA Technical Reports Server (NTRS)

    Walker, G. P.; Wagner, E. A.; Bodden, D. S.

    1996-01-01

    This report documents work done under a NASA-sponsored contract to transition technologies developed under the NASA Lewis Research Center IMPAC (Integrated Methodology for Propulsion and Airframe Control) program to industry. The critical steps in IMPAC are exercised on an example integrated flight/propulsion control design for linear airframe/engine models of a conceptual STOVL (Short Take-Off and Vertical Landing) aircraft, and MATRIXX (TM) executive files to implement each step are developed. The results from the example study are analyzed, and lessons learned are listed along with recommendations to improve the application of each design step. The end product of this research is a set of software requirements for a user-friendly control design tool that will automate the steps in the IMPAC methodology. Prototypes for a graphical user interface (GUI) are sketched to specify how the tool will interact with the user, and it is recommended that the tool be built around existing computer-aided control design software packages.

  18. Border Lookout: Enhancing Tuberculosis Control on the United States-Mexico Border.

    PubMed

    DeSisto, Carla; Broussard, Kelly; Escobedo, Miguel; Borntrager, Denise; Alvarado-Ramy, Francisco; Waterman, Stephen

    2015-10-01

    We evaluated the use of the federal public health intervention tools known as Do Not Board and Border Lookout (BL) for detecting infectious or potentially infectious land-border travelers with tuberculosis (TB) and referring them back to treatment. We used data about the issuance of BLs from April 2007 to September 2013 to examine demographics and TB laboratory results for persons on the list (N = 66) and time on the list before being located and achieving noninfectious status. The majority of case-patients were Hispanic and male, with a median age of 39 years. Most were citizens of the United States or Mexico, and 30.3% were undocumented migrants. One-fifth had multidrug-resistant TB. Nearly two-thirds of case-patients were located and treated as a result of being placed on the list. However, 25.8% of case-patients, primarily undocumented migrants, remain lost to follow-up and therefore remain on the list. For this highly mobile patient population, this novel federal travel intervention tool facilitated the detection and treatment of infectious TB cases that had been lost to follow-up. © The American Society of Tropical Medicine and Hygiene.

  19. Propulsion Technology Lifecycle Operational Analysis

    NASA Technical Reports Server (NTRS)

    Robinson, John W.; Rhodes, Russell E.

    2010-01-01

    The paper presents the results of a focused effort by members of the Space Propulsion Synergy Team (SPST) Functional Requirements Sub-team to develop propulsion data supporting the Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet application for analyzing the impact of technology decisions at a system-of-systems level. Results are summarized in an Excel workbook called the Technology Tool Box (TTB). The TTB provides data on technology performance, operations, and programmatic parameters in the form of a library of technical information to support analysis tools and/or models. The lifecycle of technologies can be analyzed from these data, which is particularly useful for system operations involving long-running missions. The propulsion technologies in this paper are listed against chemical rocket engines in a Work Breakdown Structure (WBS) format. The overall effort involved establishing four elements: (1) a general-purpose Functional System Breakdown Structure (FSBS); (2) operational requirements for rocket engines; (3) technology metric values associated with operating systems; and (4) a Work Breakdown Structure (WBS) of chemical rocket engines. The list of chemical rocket engines identified in the WBS is by no means complete. It is planned to update the TTB with a more complete list of available United States (US) engines and to add the foreign rocket engines available to NASA and the aerospace industry. The operational technology metric values derived by the SPST Sub-team in the form of the TTB establish a database to help users evaluate the technology level of each chemical rocket engine, and will serve as a guide in deciding which rocket engines to invest technology funding in for future development.

  20. An efficient scheme for automatic web pages categorization using the support vector machine

    NASA Astrophysics Data System (ADS)

    Bhalla, Vinod Kumar; Kumar, Neeraj

    2016-07-01

    In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages within a fraction of a second. Achieving this goal requires efficient categorization of web page contents, and manual categorization of billions of web pages with high accuracy is not feasible. Most of the existing techniques reported in the literature are semi-automatic and cannot achieve a high level of accuracy. This paper therefore proposes automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific, relevant features of the web pages: features are first extracted and evaluated, and the feature set is then filtered for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed. Feature extraction and weight assignment are based on a domain-specific keyword list compiled from a variety of domain pages; the keyword list is then reduced on the basis of keyword ids, and stemming of keywords and tag text is applied to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a support vector machine kernel as the classification tool, in combination with feature extraction and statistical analysis. The results obtained confirm the effectiveness of the proposed scheme in terms of accuracy across different categories of web pages.
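
    As an illustration of the classification stage, a toy text-classification pipeline with a support vector machine (scikit-learn shown for brevity; the paper's scheme builds DOM-based features and domain keyword lists rather than plain TF-IDF, and the training snippets below are invented):

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Tiny stand-in corpus of extracted web-page text, labeled by domain.
    pages = [
        "stock market shares trading investment portfolio",
        "quarterly earnings revenue profit dividend",
        "football match goal league tournament score",
        "olympic sprint record medal athletics",
    ]
    labels = ["finance", "finance", "sports", "sports"]

    clf = make_pipeline(TfidfVectorizer(), LinearSVC())
    clf.fit(pages, labels)
    print(clf.predict(["championship final score"]))  # expected: ['sports']
    ```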

  1. Preferences of Turkish Language Teachers for the Assessment-Evaluation Tools and Methods

    ERIC Educational Resources Information Center

    Guney, Nail

    2013-01-01

    The aim of this study is to determine the rate of teachers' use of the assessment and evaluation tools given in the 2005 curriculum of Turkish language teaching. To this end, we presented a list of assessment and evaluation tools, on the basis of random sampling, to 216 teachers of Turkish who work in Ordu, Samsun, Ankara, Trabzon and Istanbul provinces.…

  2. Acute toxicity prediction to threatened and endangered ...

    EPA Pesticide Factsheets

    Evaluating contaminant sensitivity of threatened and endangered (listed) species and the protectiveness of chemical regulations often depends on toxicity data for commonly tested surrogate species. The U.S. EPA's Internet application Web-ICE is a suite of Interspecies Correlation Estimation (ICE) models that extrapolate species sensitivity to listed taxa using least-squares regressions of the sensitivity of a surrogate species against that of a predicted taxon (species, genus, or family). Web-ICE was expanded with new models that can predict toxicity to over 250 listed species. A case study was used to assess the protectiveness of genus and family model estimates derived from either geometric mean or minimum taxa toxicity values for listed species. Models developed from the most sensitive value for each chemical were generally protective of the most sensitive species within predicted taxa, including listed species, and were more protective than geometric mean models. ICE model estimates were compared to HC5 values derived from species sensitivity distributions for the case study chemicals to assess the protectiveness of the two approaches. ICE models provide robust toxicity predictions and can generate protective toxicity estimates for assessing contaminant risk to listed species. The work reports on the development and optimization of ICE models for toxicity estimation in listed species.
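
    The core of an ICE model is a least-squares regression on log10-transformed toxicity values; a minimal sketch (the paired LC50 values below are invented):

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical acute toxicity values (LC50, mg/L) for a surrogate species
    # and a predicted taxon across six chemicals.
    surrogate = np.array([0.5, 1.2, 4.8, 10.0, 35.0, 120.0])
    predicted = np.array([0.8, 1.0, 6.1, 8.5, 40.0, 90.0])

    # ICE models regress log10 toxicity of the predicted taxon on the surrogate.
    slope, intercept, r, p, se = stats.linregress(np.log10(surrogate),
                                                  np.log10(predicted))

    def predict_lc50(surrogate_lc50):
        """Extrapolate the predicted taxon's sensitivity from a surrogate value."""
        return 10 ** (intercept + slope * np.log10(surrogate_lc50))

    print(f"r = {r:.2f}, predicted LC50 = {predict_lc50(5.0):.2f} mg/L")
    ```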

  3. Tool & Die Technician.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This document contains 23 units to consider for use in a tech prep competency profile for the occupation of tool and die technician. All the units listed will not necessarily apply to every situation or tech prep consortium, nor will all the competencies within each unit be appropriate. Several units appear within each specific occupation and…

  4. Educational Aspirations of Twentieth-Century American Females: A Bibliographic Essay.

    ERIC Educational Resources Information Center

    Miyamoto, Mary Huston

    1979-01-01

    Identifies research tools comprising a wide variety of materials from many disciplines which are available for exploring changes in educational aspirations among twentieth century females in the United States. A comprehensive list of these tools is provided and problems involved in accessing and using them are discussed. (EJS)

  5. Assessing and Managing Caregiver Stress: Development of a Teaching Tool for Medical Residents

    ERIC Educational Resources Information Center

    Famakinwa, Abisola; Fabiny, Anne

    2008-01-01

    Forty medical residents from major teaching hospitals in Boston, Massachusetts, participated in small group teaching sessions about caregiver stress. A teaching tool was developed that included a teaching handout, interactive cases, standard instruments for assessing caregiver stress, peer-reviewed articles about caregiving, and a list of…

  6. Life Cycle Assessment as an Environmental Management Tool

    EPA Science Inventory

    Listed by Time Magazine as the method behind calculating “Ecological Intelligence,” one of “10 Ideas Changing the World Right Now” (March 23, 2009), Life Cycle Assessment (LCA) is the tool that is used to understand the environmental impacts of the products we make and sell. Jo...

  7. Basic Technology Tools for Administrators: Preparing for the New Millennium.

    ERIC Educational Resources Information Center

    Aguilera, Raymond; Hendricks, Joen M.

    This paper suggests activities for school administrators to learn basic technology tools. Step-by-step instructions are provided for browsing and using the Internet, organizing favorite World Wide Web sites, and organizing Internet bookmarks. Interesting job search, legal, and professional organization Web sites for administrators are listed. A…

  8. A cascading failure analysis tool for post processing TRANSCARE simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This is a MATLAB-based tool to post-process simulation results from the EPRI software TRANSCARE for massive cascading failure analysis following severe disturbances. Key modules in this tool: (1) automatically create a contingency list for TRANSCARE simulations, including substation outages above a certain kV threshold, N-k (1, 2, or 3) generator outages, and branch outages; (2) read in and analyze a CKO file of PCG definitions, an initiating event list, and a CDN file; (3) post-process all simulation results saved in a CDN file and perform critical event corridor analysis; (4) provide a summary of TRANSCARE simulations; (5) identify the most frequently occurring event corridors in the system; and (6) rank contingencies using a user-defined security index that quantifies consequences in terms of total load loss, total number of cascades, etc.
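
    A hedged sketch of the ranking step (module 6), ordering contingencies by a user-defined security index (the field names, weights, and rows below are illustrative, not TRANSCARE's):

    ```python
    # Each record summarizes the cascading consequences of one contingency.
    contingencies = [
        {"name": "N-2 generators A+B", "load_loss_mw": 850.0, "num_cascades": 12},
        {"name": "substation X outage", "load_loss_mw": 300.0, "num_cascades": 4},
        {"name": "N-1 branch Y", "load_loss_mw": 40.0, "num_cascades": 1},
    ]

    def security_index(c, w_loss=1.0, w_cascades=50.0):
        """User-defined weighted combination of load loss and cascade count."""
        return w_loss * c["load_loss_mw"] + w_cascades * c["num_cascades"]

    for c in sorted(contingencies, key=security_index, reverse=True):
        print(f"{c['name']}: index = {security_index(c):.0f}")
    ```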

  9. [Work-related stress and psychological distress assessment in urban and suburban public transportation companies].

    PubMed

    Romeo, L; Lazzarini, G; Farisè, E; Quintarelli, E; Riolfi, A; Perbellini, L

    2012-01-01

    The risk of work-related stress was assessed in bus drivers and workers employed in the service department of two urban and suburban public transportation companies, using the INAIL evaluation method (check list and HSE indicator tool). The GHQ-12 questionnaire, widely used to assess the level of psychological distress, was also employed. Of the workers involved in the survey, 81.9% answered both the HSE indicator tool and the GHQ-12 questionnaire. The check list evaluation showed an increase in quantifiable company stress indicators, while closer examination using the HSE indicator tool demonstrated critical situations for all the subscales, with the control subscale more problematic in bus drivers. The demand, manager's support, relationships and change subscales were most associated with psychological distress in bus drivers, while the relationships, role, change and demand subscales were negatively related in workers of the service department.

  10. Effects of deworming on child and maternal health: a literature review and meta-analysis.

    PubMed

    Thayer, Winter Maxwell; Clermont, Adrienne; Walker, Neff

    2017-11-07

    Soil-transmitted helminth infections are widespread, and many studies have been published on the topic of deworming. The Lives Saved Tool (LiST) is a software package that uses a deterministic mathematical model to estimate the effect of scaling up interventions on maternal and child health outcomes. This review investigates the scope of available evidence for the benefits of deworming treatments in order to inform a decision about the possible inclusion of deworming as an intervention in LiST. We searched PubMed, the Cochrane Library, and Google Scholar. We included studies that reported pre/post data in children younger than 5 years or pregnant women for outcomes related to mortality and growth. We excluded studies that compared different anthelminthic treatments without a placebo or non-treatment group, and those that did not report post-intervention outcomes. We categorized articles by treated population (children younger than 5 years and pregnant women), experimental versus observational design, mass drug administration (MDA) versus treatment, and reported outcome. We identified 58 relevant trials; 27 investigated children younger than 5 years, 11 investigated pregnant women, and one reported on both. We conducted meta-analyses of relevant outcomes in children younger than 5 years. Deworming did not show consistent benefits for indicators of mortality, anemia, or growth in children younger than 5 years or women of reproductive age. We do not recommend including the effect of deworming in the LiST model.

  11. Electronic problem lists: a thematic analysis of a systematic literature review to identify aspects critical to success.

    PubMed

    Hodge, Chad M; Narus, Scott P

    2018-05-01

    Problem list data are a driving force for many beneficial clinical tools, yet these data remain underutilized. We performed a systematic literature review, pulling insights from previous research, aggregating those insights into themes, and distilling the themes into actionable advice. We sought to learn what changes we could make to existing applications, to the clinical workflow, and to clinicians' perceptions that would improve problem list utilization and increase the prevalence of problem data in the electronic medical record. We followed Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines to systematically curate a corpus of pertinent articles, then performed a thematic analysis, looking for salient excerpts and ideas. By aggregating excerpts from many authors, we gained broader, more inclusive insights into what makes a good problem list and what factors are conducive to its success. The analysis yielded a list of 7 benefits of using the problem list, 15 aspects critical to problem list success, and knowledge to inform policy development, such as consensus on what belongs on the problem list, who should maintain it, and when. Suggestions are made on ways in which the problem list can be improved to increase utilization by clinicians. There is also a need for standard measurements of the problem list, so that lists can be measured, compared, and discussed with rigor and a common vocabulary.

  12. Canada's Patented Medicines (Notice of Compliance) Proceedings and Intellectual Property.

    PubMed

    Bian, Henry; McCourt, Conor

    2015-01-08

    Canada's Patent Register is a tool created by the Patented Medicines (Notice of Compliance) Regulations to help innovators protect their inventions relating to pharmaceuticals. This tool exists at the intersection between the intellectual property and drug approval regimes. By listing a patent on the Patent Register, an innovator can prevent a generic manufacturer from entering the marketplace rather than having to wait for his or her patent to be infringed. This article provides information on the requirements for listing a patent on the Patent Register and an overview of how the Patented Medicines (Notice of Compliance) Regulations affect the drug approval process. Copyright © 2015 Cold Spring Harbor Laboratory Press; all rights reserved.

  13. Development of a Tool to Identify Problems Related to Medication Adherence in Home Healthcare Patients.

    PubMed

    Mahan, Kathryn R; Clark, Jeffrey A; Anderson, Kurt D; Koller, Nolan J; Gates, Brian J

    2017-05-01

    In the home healthcare setting, clinicians are required to evaluate patients' medication therapy, including adherence. To facilitate this conversation, a pilot question list to help uncover potential medication nonadherence was created after a review of the literature to ascertain common reasons why patients are nonadherent to their medication therapies. Pharmacy personnel who provide onsite consultations in a home healthcare setting used the question list to identify medication-related problems that could contribute to nonadherence and to document potential solutions. Through pharmacist-patient interactions after admission to the home healthcare agency, pharmacy personnel found an average of 2.3 issues per patient that could affect medication adherence; side effects were the most common problem identified. After the tool was tested in 65 patient interviews, the questions were analyzed and condensed into a shorter list more specific to the identification of medication-related problems for use by home care clinicians.

  14. Toward a Responsibility-Catering Prioritarian Ethical Theory of Risk.

    PubMed

    Wikman-Svahn, Per; Lindblom, Lars

    2018-03-05

    Standard tools used in societal risk management such as probabilistic risk analysis or cost-benefit analysis typically define risks in terms of only probabilities and consequences and assume a utilitarian approach to ethics that aims to maximize expected utility. The philosopher Carl F. Cranor has argued against this view by devising a list of plausible aspects of the acceptability of risks that points towards a non-consequentialist ethical theory of societal risk management. This paper revisits Cranor's list to argue that the alternative ethical theory responsibility-catering prioritarianism can accommodate the aspects identified by Cranor and that the elements in the list can be used to inform the details of how to view risks within this theory. An approach towards operationalizing the theory is proposed based on a prioritarian social welfare function that operates on responsibility-adjusted utilities. A responsibility-catering prioritarian ethical approach towards managing risks is a promising alternative to standard tools such as cost-benefit analysis.
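
    One way to make this concrete (our illustration, not the authors' formalization): a prioritarian social welfare function applies a strictly concave transform to responsibility-adjusted utilities, so that benefits to the worse off carry more weight:

    ```latex
    W = \sum_{i=1}^{n} g\!\left(u_i^{R}\right),
    \qquad
    g(u) = \frac{u^{1-\gamma}}{1-\gamma}, \quad \gamma > 0,
    ```

    where $u_i^{R}$ is individual $i$'s responsibility-adjusted utility and the concavity parameter $\gamma$ sets how much priority the worse off receive.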

  15. Listmania. How lists can open up fresh possibilities for research in the history of science.

    PubMed

    Delbourgo, James; Müller-Wille, Staffan

    2012-12-01

    Anthropologists, linguists, cultural historians, and literary scholars have long emphasized the value of examining writing as a material practice and have often invoked the list as a paradigmatic example thereof. This Focus section explores how lists can open up fresh possibilities for research in the history of science. Drawing on examples from the early modern period, the contributors argue that attention to practices of list making reveals important relations between mercantile, administrative, and scientific attempts to organize the contents of the world. Early modern lists projected both spatial and temporal visions of nature: they inventoried objects in the process of exchange and collection; they projected possible trajectories for future endeavor; they publicized the social identities of scientific practitioners; and they became research tools that transformed understandings of the natural order.

  16. Towards a systems approach for understanding honeybee decline: a stocktaking and synthesis of existing models

    PubMed Central

    Becher, Matthias A; Osborne, Juliet L; Thorbek, Pernille; Kennedy, Peter J; Grimm, Volker

    2013-01-01

    The health of managed and wild honeybee colonies appears to have declined substantially in Europe and the United States over the last decade. Sustainability of honeybee colonies is important not only for honey production, but also for pollination of crops and wild plants alongside other insect pollinators. A combination of causal factors, including parasites, pathogens, land use changes and pesticide usage, are cited as responsible for the increased colony mortality. However, despite detailed knowledge of the behaviour of honeybees and their colonies, there are no suitable tools to explore the resilience mechanisms of this complex system under stress. Empirically testing all combinations of stressors in a systematic fashion is not feasible. We therefore suggest a cross-level systems approach, based on mechanistic modelling, to investigate the impacts of (and interactions between) colony and land management. We review existing honeybee models that are relevant to examining the effects of different stressors on colony growth and survival. Most of these models describe honeybee colony dynamics, foraging behaviour or honeybee – varroa mite – virus interactions. We found that many, but not all, processes within honeybee colonies, epidemiology and foraging are well understood and described in the models, but there is no model that couples in-hive dynamics and pathology with foraging dynamics in realistic landscapes. Synthesis and applications. We describe how a new integrated model could be built to simulate multifactorial impacts on the honeybee colony system, using building blocks from the reviewed models. The development of such a tool would not only highlight empirical research priorities but also provide an important forecasting tool for policy makers and beekeepers, and we list examples of relevant applications to bee disease and landscape management decisions. PMID:24223431

  17. Towards a systems approach for understanding honeybee decline: a stocktaking and synthesis of existing models.

    PubMed

    Becher, Matthias A; Osborne, Juliet L; Thorbek, Pernille; Kennedy, Peter J; Grimm, Volker

    2013-08-01

    The health of managed and wild honeybee colonies appears to have declined substantially in Europe and the United States over the last decade. Sustainability of honeybee colonies is important not only for honey production, but also for pollination of crops and wild plants alongside other insect pollinators. A combination of causal factors, including parasites, pathogens, land use changes and pesticide usage, are cited as responsible for the increased colony mortality. However, despite detailed knowledge of the behaviour of honeybees and their colonies, there are no suitable tools to explore the resilience mechanisms of this complex system under stress. Empirically testing all combinations of stressors in a systematic fashion is not feasible. We therefore suggest a cross-level systems approach, based on mechanistic modelling, to investigate the impacts of (and interactions between) colony and land management. We review existing honeybee models that are relevant to examining the effects of different stressors on colony growth and survival. Most of these models describe honeybee colony dynamics, foraging behaviour or honeybee - varroa mite - virus interactions. We found that many, but not all, processes within honeybee colonies, epidemiology and foraging are well understood and described in the models, but there is no model that couples in-hive dynamics and pathology with foraging dynamics in realistic landscapes. Synthesis and applications. We describe how a new integrated model could be built to simulate multifactorial impacts on the honeybee colony system, using building blocks from the reviewed models. The development of such a tool would not only highlight empirical research priorities but also provide an important forecasting tool for policy makers and beekeepers, and we list examples of relevant applications to bee disease and landscape management decisions.

  18. New and Promising: Software Worth a Look. A MicroSIFT Survey of Educational Software Preview Center Coordinators. Volume II, No. 2.

    ERIC Educational Resources Information Center

    Podany, Zita

    This guide lists 19 software packages considered by a group of 17 computer coordinators from educational software preview centers and evaluation agencies to be worthy of further consideration by other reviewing agencies and schools. The following software is listed: (1) ASK-IT, an authoring tool; (2) Balance of the Planet, an environmental…

  19. Comparing Traditional Journal Writing with Journal Writing Shared over E-mail List Serves as Tools for Facilitating Reflective Thinking: A Study of Preservice Teachers

    ERIC Educational Resources Information Center

    Kaplan, Diane S.; Rupley, William H.; Sparks, Joanne; Holcomb, Angelia

    2007-01-01

    To determine the conditions that would best encourage reflection in journal writing of preservice teachers in field-based reading internships, the degree of reflective content found in self-contained traditional journals was compared to the reflective content found in journal entries shared over e-mail list serves. Participants were 56 preservice…

  20. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  1. Parallelizing Timed Petri Net simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1993-01-01

    The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPNs) was studied. It was recognized that complex system development tools often transform system descriptions into TPNs or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it was important that the parallelization of TPNs be as automatic as possible, to admit the possibility of the parallelization being embedded in the system design tool. Later years of the grant were devoted to the problem of joint performance and reliability analysis, exploring whether both types of analysis could be accomplished within a single framework. In this final report, the results of our studies are summarized. We believe that the problem of automatically parallelizing TPNs for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis were two-fold: we showed that Monte Carlo simulation, with importance sampling, offers promise of joint analysis in the context of a single tool, and we developed methods for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast. However, much more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.

  2. An Annotated Reading List for Concurrent Engineering

    DTIC Science & Technology

    1989-07-01

    (The seven tools are sometimes referred to as the seven old tools.) Ishikawa, Kaoru, What is Total Quality Control? The Japanese Way, Prentice-Hall. Ishikawa (1982) presents a practical guide, with easy-to-use tools, for implementing quality control at the working level. Ishikawa, Kaoru, Guide to Quality Control, Kraus International Publications, White Plains, NY, 1982.

  3. msBiodat analysis tool, big data analysis for high-throughput experiments.

    PubMed

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, typically presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information, and the filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring big data analysis closer to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments through a simple web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that gives researchers, whatever their programming experience, the advantages of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  4. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL and SWMF-SC+IH for the heliosphere; SWMF-GM, OpenGGCM, LFM and GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with the respective observational data and computes a suite of metrics, such as prediction efficiency, root mean square error, probability of detection, probability of false detection, and Heidke Skill Score, for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We build on past experience with model-data comparisons of magnetosphere and ionosphere model outputs in the GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). The framework can also be applied to solar-heliosphere and radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
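
    For reference, the contingency-table skill scores named above can be computed as follows (a generic implementation, not the CCMC's code):

    ```python
    def forecast_skill(hits, misses, false_alarms, correct_negatives):
        """Probability of detection, probability of false detection, and
        Heidke Skill Score from a 2x2 forecast contingency table."""
        h, m, f, c = hits, misses, false_alarms, correct_negatives
        n = h + m + f + c
        pod = h / (h + m)                     # probability of detection
        pofd = f / (f + c)                    # probability of false detection
        # Heidke Skill Score: fraction correct relative to random chance.
        expected = ((h + m) * (h + f) + (c + m) * (c + f)) / n
        hss = (h + c - expected) / (n - expected)
        return pod, pofd, hss

    print(forecast_skill(hits=42, misses=8, false_alarms=15, correct_negatives=135))
    ```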

  5. ISAAC - InterSpecies Analysing Application using Containers.

    PubMed

    Baier, Herbert; Schultz, Jörg

    2014-01-15

    Information about genes, transcripts and proteins is spread over a wide variety of databases, and different tools have been developed that use these databases to identify biological signals in gene lists from large-scale analyses. Mostly, they search for enrichments of specific features, but they do not allow an explorative walk through different views or changes to the gene lists as new questions arise. To fill this niche, we have developed ISAAC, the InterSpecies Analysing Application using Containers. The central idea of this web-based tool is to enable the analysis of sets of genes, transcripts and proteins under different biological viewpoints and to interactively modify these sets at any point of the analysis. Detailed history and snapshot information allows tracing of each action, and one can easily switch back to previous states and perform new analyses. Currently, sets can be viewed in the context of genomes, protein functions, protein interactions, pathways, regulation, diseases and drugs. Additionally, users can switch between species with an automatic, orthology-based translation of existing gene sets. As today's research is usually performed in larger teams and consortia, ISAAC provides group-based functionality: sets as well as results of analyses can be exchanged between members of groups. ISAAC fills the gap between primary databases and tools for the analysis of large gene lists. With its highly modular, JavaEE-based design, the implementation of new modules is straightforward. Furthermore, ISAAC comes with an extensive web-based administration interface, including tools for the integration of third-party data, so a local installation is easily feasible. In summary, ISAAC is tailor-made for highly explorative, interactive analyses of gene, transcript and protein sets in a collaborative environment.

  6. Integrated software environment based on COMKAT for analyzing tracer pharmacokinetics with molecular imaging.

    PubMed

    Fang, Yu-Hua Dean; Asthana, Pravesh; Salinas, Cristian; Huang, Hsuan-Ming; Muzic, Raymond F

    2010-01-01

    An integrated software package, Compartment Model Kinetic Analysis Tool (COMKAT), is presented in this report. COMKAT is an open-source software package with many functions for incorporating pharmacokinetic analysis in molecular imaging research and has both command-line and graphical user interfaces. With COMKAT, users may load and display images, draw regions of interest, load input functions, select kinetic models from a predefined list, or create a novel model and perform parameter estimation, all without having to write any computer code. For image analysis, COMKAT image tool supports multiple image file formats, including the Digital Imaging and Communications in Medicine (DICOM) standard. Image contrast, zoom, reslicing, display color table, and frame summation can be adjusted in COMKAT image tool. It also displays and automatically registers images from 2 modalities. Parametric imaging capability is provided and can be combined with the distributed computing support to enhance computation speeds. For users without MATLAB licenses, a compiled, executable version of COMKAT is available, although it currently has only a subset of the full COMKAT capability. Both the compiled and the noncompiled versions of COMKAT are free for academic research use. Extensive documentation, examples, and COMKAT itself are available on its wiki-based Web site, http://comkat.case.edu. Users are encouraged to contribute, sharing their experience, examples, and extensions of COMKAT. With integrated functionality specifically designed for imaging and kinetic modeling analysis, COMKAT can be used as a software environment for molecular imaging and pharmacokinetic analysis.
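
    As an illustration of the kind of predefined kinetic model COMKAT offers, a one-tissue compartment model solved in Python (COMKAT itself is MATLAB-based; the rate constants and input function below are invented):

    ```python
    import numpy as np
    from scipy.integrate import odeint

    def plasma(t):
        """Hypothetical plasma input function Cp(t), arbitrary units."""
        return np.exp(-0.1 * t)

    def one_tissue(ct, t, k1, k2):
        """dCt/dt = K1*Cp(t) - k2*Ct."""
        return k1 * plasma(t) - k2 * ct

    t = np.linspace(0, 60, 121)                        # minutes
    ct = odeint(one_tissue, 0.0, t, args=(0.3, 0.1))   # K1=0.3, k2=0.1
    print(f"tissue concentration at 60 min: {ct[-1, 0]:.3f}")
    ```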

  7. Popularity and Novelty Dynamics in Evolving Networks.

    PubMed

    Abbas, Khushnood; Shang, Mingsheng; Abbasi, Alireza; Luo, Xin; Xu, Jian Jun; Zhang, Yu-Xia

    2018-04-20

    Network science plays a big role in the representation of real-world phenomena such as the user-item bipartite networks found in e-commerce or social media platforms. It provides researchers with tools and techniques to solve complex real-world problems. Identifying and predicting the future popularity and importance of items in e-commerce or social media platforms is a challenging task: some items gain popularity repeatedly over time, while some become popular and novel only once. This work aims to identify the key factors, popularity and novelty. To do so, we consider two types of novelty prediction: items appearing in the popular ranking list for the first time, and items which were not in the popular list in the most recent time window but might have been popular before it. Identifying popular items requires careful macro-level analysis. We propose a model that exploits item-level information over a span of time to rank the importance of items, considering an ageing (decay) effect along with the items' recent link gain. We test the proposed model on four real-world datasets using four information-retrieval-based metrics.
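
    A minimal sketch of the key ingredients as we read them, ranking items by recent link gain with an exponential ageing effect (parameters and data invented):

    ```python
    import math

    def popularity_score(link_times, t_now, decay=0.05):
        """Each past link contributes exp(-decay * age): older links count less."""
        return sum(math.exp(-decay * (t_now - t)) for t in link_times)

    items = {"item_a": [1, 5, 9, 10], "item_b": [9, 10, 10, 10]}
    ranked = sorted(items, key=lambda i: popularity_score(items[i], t_now=10),
                    reverse=True)
    print(ranked)  # item_b ranks first: all of its links are recent
    ```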

  8. Continual improvement: A bibliography with indexes, 1992-1993

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This bibliography lists 606 references to reports and journal articles entered into the NASA Scientific and Technical Information Database during 1992 to 1993. Topics cover the philosophy and history of Continual Improvement (CI), basic approaches and strategies for implementation, and lessons learned from public and private sector models. Entries are arranged according to the following categories: Leadership for Quality, Information and Analysis, Strategic Planning for CI, Human Resources Utilization, Management of Process Quality, Supplier Quality, Assessing Results, Customer Focus and Satisfaction, TQM Tools and Philosophies, and Applications. Indexes include subject, personal author, corporate source, contract number, report number, and accession number.

  9. Validity of a quantitative clinical measurement tool of trunk posture in idiopathic scoliosis.

    PubMed

    Fortin, Carole; Feldman, Debbie E; Cheriet, Farida; Labelle, Hubert

    2010-09-01

    Concurrent validity between postural indices obtained from digital photographs (two-dimensional [2D]), surface topography imaging (three-dimensional [3D]), and radiographs. The objective was to assess the validity of a quantitative clinical postural assessment tool of the trunk based on photographs (2D), as compared to a surface topography system (3D) and indices calculated from radiographs. To monitor progression of scoliosis or change in posture over time in young persons with idiopathic scoliosis (IS), noninvasive and nonionizing methods are recommended; in a clinical setting, posture can be quite easily assessed by calculating key postural indices from photographs. Quantitative postural indices of 70 subjects aged 10 to 20 years old with IS (Cobb angle 15-60 degrees) were measured from photographs and from 3D trunk surface images taken in the standing position. Shoulder, scapula, trunk list, pelvis, scoliosis, and waist angle indices were calculated with specially designed software. Frontal and sagittal Cobb angles and trunk list were also calculated on radiographs. The Pearson correlation coefficient (r) was used to estimate the concurrent validity of the 2D clinical postural tool of the trunk with indices extracted from the 3D system and with those obtained from radiographs. The correlation between 2D and 3D indices was good to excellent for shoulder, pelvis, trunk list, and thoracic scoliosis (r = 0.81 to 0.97; P < 0.01) but fair to moderate for thoracic kyphosis, lumbar lordosis, and thoracolumbar or lumbar scoliosis (r = 0.30 to 0.56; P < 0.05). The correlation between 2D and radiographic spinal indices was fair to good (-0.33 to -0.80 for Cobb angles and 0.76 for trunk list; P < 0.05). This tool will facilitate clinical practice by monitoring trunk posture among persons with IS, and may contribute to a reduction in the use of radiographs to monitor scoliosis progression.
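
    The validity estimates are plain Pearson correlations; a short sketch with invented paired 2D/3D shoulder-angle measurements:

    ```python
    from scipy import stats

    # Hypothetical shoulder-angle indices (degrees) for the same subjects,
    # measured from photographs (2D) and surface topography (3D).
    angle_2d = [2.1, 4.3, 1.0, 5.6, 3.2, 0.4, 4.9]
    angle_3d = [2.4, 4.0, 1.3, 5.9, 2.8, 0.7, 5.2]

    r, p = stats.pearsonr(angle_2d, angle_3d)
    print(f"r = {r:.2f}, P = {p:.4f}")  # r > 0.8 would indicate good validity
    ```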

  10. The Use of Climatic Niches in Screening Procedures for Introduced Species to Evaluate Risk of Spread: A Case with the American Eastern Grey Squirrel

    PubMed Central

    Di Febbraro, Mirko; Lurz, Peter W. W.; Genovesi, Piero; Maiorano, Luigi; Girardello, Marco; Bertolino, Sandro

    2013-01-01

    Species introduction represents one of the most serious threats to biodiversity. The realized climatic niche of an invasive species can be used to predict its potential distribution in new areas, providing a basis for screening procedures in the compilation of black and white lists to prevent new introductions. We tested this assertion by modeling the realized climatic niche of the Eastern grey squirrel Sciurus carolinensis. Maxent was used to develop three models: one considering only records from the native range (NRM), a second including records from the native and invasive ranges (NIRM), and a third calibrated with invasive occurrences and projected onto the native range (RCM). Niche conservatism was tested with both a niche equivalency and a niche similarity test. NRM failed to predict suitable parts of the currently invaded range in Europe, while RCM underestimated suitability in the native range; NIRM accurately predicted both the native and invasive ranges. The niche equivalency hypothesis was rejected due to a significant difference between the grey squirrel's niche in the native and invasive ranges, while the niche similarity test yielded no significant results. Our analyses support the hypothesis of a shift in the species' climatic niche in the area of introduction. Species distribution models (SDMs) appear to be a useful tool in the compilation of black lists, allowing identification of areas vulnerable to invasion. We advise caution in the use of SDMs based only on the native range of a species for the compilation of white lists for other geographic areas, due to the significant risk of underestimating its potential invasive range. PMID:23843957
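
    Niche equivalency and similarity tests typically compare an overlap statistic such as Schoener's D between suitability surfaces; a minimal sketch (the two four-cell surfaces below are invented):

    ```python
    import numpy as np

    def schoeners_d(p1, p2):
        """Niche overlap between two suitability surfaces normalized to sum
        to 1: D = 1 means identical niches, D = 0 means no overlap."""
        p1 = np.asarray(p1, dtype=float)
        p2 = np.asarray(p2, dtype=float)
        p1 /= p1.sum()
        p2 /= p2.sum()
        return 1.0 - 0.5 * np.abs(p1 - p2).sum()

    native = [0.8, 0.1, 0.05, 0.05]    # suitability across climate bins (native)
    invasive = [0.2, 0.4, 0.3, 0.1]    # suitability across climate bins (invaded)
    print(schoeners_d(native, invasive))  # low D is evidence of a niche shift
    ```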

  11. GALEN: a third generation terminology tool to support a multipurpose national coding system for surgical procedures.

    PubMed

    Trombert-Paviot, B; Rodrigues, J M; Rogers, J E; Baud, R; van der Haring, E; Rassinoux, A M; Abrial, V; Clavel, L; Idir, H

    2000-09-01

    The Generalised Architecture for Languages, Encyclopaedias and Nomenclatures in medicine (GALEN) project has developed a new generation of terminology tools based on a language-independent model that describes the semantics of the domain, allows computer processing and multiple reuse, and supports natural language understanding applications, in order to facilitate the sharing and maintenance of consistent medical knowledge. During the European Union 4th Framework Programme project GALEN-IN-USE, and later within two contracts with the national health authorities, we applied the modelling and the tools to the development of a new multipurpose coding system for surgical procedures, named CCAM, in a minority-language country, France. On the one hand, we contributed to a language-independent knowledge repository and multilingual semantic dictionaries for a multicultural Europe. On the other hand, we supported the traditional process of creating a new coding system in medicine, which is very labour-intensive, with artificial intelligence tools using a medically oriented recursive ontology and natural language processing. We used integrated software named CLAW (classification workbench) to process French professional medical language rubrics, produced by domain experts from the national colleges of surgeons, into intermediate dissections and then into the GRAIL reference ontology model representation. From this language-independent concept model representation, we generated, with the LNAT natural language generator, controlled French natural language to support the finalization of the linguistic labels (first generation) in relation to the meanings of the conceptual system structure. In addition, the CLAW classification manager proved very powerful for retrieving the initial domain experts' rubric list with different categories of concepts (second generation) within a semantically structured representation (third generation) that bridges to the detailed terminology of the electronic patient record.

  12. Effect of preventive zinc supplementation on linear growth in children under 5 years of age in developing countries: a meta-analysis of studies for input to the lives saved tool

    PubMed Central

    2011-01-01

    Introduction: Zinc plays an important role in cellular growth, cellular differentiation and metabolism, but the results of previous meta-analyses evaluating the effect of zinc supplementation on linear growth are inconsistent. We have updated and evaluated the available evidence according to Grading of Recommendations, Assessment, Development and Evaluation (GRADE) criteria and tried to explain the differences in the results of the previous reviews. Methods: A literature search was done on PubMed, the Cochrane Library, the IZiNCG database and WHO regional databases using different terms for zinc and linear growth (height). Data were abstracted in a standardized form and analyzed in two ways: as weighted mean difference (effect size) and as pooled mean difference for the absolute increment in length in centimeters, using random-effects models for the pooled estimates. We give our recommendations for the effectiveness of zinc supplementation in the form of the absolute increment in length (cm) in the zinc-supplemented group compared to control, for input to the Lives Saved Tool (LiST). Results: Thirty-six studies assessed the effect of zinc supplementation on linear growth in children < 5 years from developing countries; in eleven of these, zinc was given in combination with other micronutrients (iron, vitamin A, etc.). The final effect size after pooling all the data sets (zinc ± iron, etc.) showed a significant positive effect of zinc supplementation on linear growth [effect size: 0.13 (95% CI 0.04, 0.21), random model] in the developing countries. A subgroup analysis excluding the data sets where zinc was supplemented in combination with iron showed a more pronounced effect [weighted mean difference 0.19 (95% CI 0.08, 0.30), random model]. A subgroup analysis of studies that reported the actual increase in length (cm) showed that a dose of 10 mg zinc/day for a duration of 24 weeks led to a net gain of 0.37 (±0.25) cm in the zinc-supplemented group compared to placebo. This estimate is recommended for inclusion in the LiST model. Conclusions: Zinc supplementation has a significant positive effect on linear growth, especially when administered alone, and should be included in national strategies to reduce stunting in children < 5 years of age in developing countries. PMID:21501440
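
    The random-effects pooling behind such estimates can be sketched with the DerSimonian-Laird method (the study effects and variances below are invented, not the review's data):

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Pool per-study mean differences (cm) under a random-effects model."""
        e, v = np.asarray(effects), np.asarray(variances)
        w = 1.0 / v                                   # fixed-effect weights
        fe_mean = np.sum(w * e) / w.sum()
        q = np.sum(w * (e - fe_mean) ** 2)            # heterogeneity statistic
        c = w.sum() - np.sum(w ** 2) / w.sum()
        tau2 = max(0.0, (q - (len(e) - 1)) / c)       # between-study variance
        w_star = 1.0 / (v + tau2)                     # random-effects weights
        pooled = np.sum(w_star * e) / w_star.sum()
        se = np.sqrt(1.0 / w_star.sum())
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    print(dersimonian_laird([0.25, 0.40, 0.05, 0.31], [0.010, 0.020, 0.015, 0.030]))
    ```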

  13. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation

    PubMed Central

    Sherfey, Jason S.; Soplata, Austin E.; Ardid, Salva; Roberts, Erik A.; Stanley, David A.; Pittman-Polletta, Benjamin R.; Kopell, Nancy J.

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community. PMID:29599715

  14. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation.

    PubMed

    Sherfey, Jason S; Soplata, Austin E; Ardid, Salva; Roberts, Erik A; Stanley, David A; Pittman-Polletta, Benjamin R; Kopell, Nancy J

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community.

  15. NetActivism: How Citizens Use the Internet. First Edition.

    ERIC Educational Resources Information Center

    Schwartz, Edward

    This book guides citizens in using the Internet for community, social, and political action. Following an in-depth introduction, chapters include: Chapter 1, "Getting Connected" and Chapter 2, "Tools," explain the two Internet tools central to organizing for activism--electronic mail lists and the World Wide Web, and the hardware and software…

  16. Welding. Module 8 of the Vocational Education Readiness Test (VERT).

    ERIC Educational Resources Information Center

    Thomas, Edward L., Comp.

    Focusing on welding, this module is one of eight included in the Vocational Education Readiness Tests (VERT). The module begins by listing the objectives of the module and describing tools and equipment needed. The remainder of the module contains sections on manipulative skills, trade vocabulary, tool identification, trade computational skills,…

  17. Basic Wiring. Module 2 of the Vocational Education Readiness Test (VERT).

    ERIC Educational Resources Information Center

    Thomas, Edward L., Comp.

    Focusing on basic wiring, this module is one of eight included in the Vocational Education Readiness Test (VERT). The module begins by listing the objectives of the module and describing tools and equipment needed. The remainder of the module contains sections on manipulative skills, trade vocabulary, tool identification, trade computation…

  18. The Use of Hand Tools in Agricultural Mechanics.

    ERIC Educational Resources Information Center

    Montana State Univ., Bozeman. Dept. of Agricultural and Industrial Education.

    This document contains a unit for teaching the use of hand tools in agricultural mechanics in Montana. It consists of an outline of the unit and seven lesson plans. The unit outline contains the following components: situation, aims and goals, list of lessons, student activities, teacher activities, special equipment needed, and references. The…

  19. Masonry. Module 5 of the Vocational Education Readiness Test (VERT).

    ERIC Educational Resources Information Center

    Thomas, Edward L., Comp.

    Focusing on masonry, this module is one of eight included in the Vocational Education Readiness Tests (VERT). The module begins by listing the objectives of the module and describing tools and equipment needed. The remainder of the module contains sections on manipulative skills, trade vocabulary, tool identification, trade computational skills,…

  20. An Empirical Derivation of the Run Time of the Bubble Sort Algorithm.

    ERIC Educational Resources Information Center

    Gonzales, Michael G.

    1984-01-01

    Suggests a moving pictorial tool to help teach principles of the bubble sort algorithm. Develops such a tool applied to an unsorted list of numbers and describes a method to derive the run time of the algorithm; the method can be modified to derive the run times of various other algorithms. (JN)
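
    A short companion to the article's idea, counting comparisons on a reversed (worst-case) list to observe the quadratic growth empirically:

    ```python
    def bubble_sort_count(items):
        """Bubble sort that returns the sorted list and the comparison count."""
        a = list(items)
        comparisons = 0
        for i in range(len(a) - 1):
            for j in range(len(a) - 1 - i):
                comparisons += 1
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
        return a, comparisons

    for n in (10, 20, 40):
        _, c = bubble_sort_count(range(n, 0, -1))  # reversed list: worst case
        print(n, c)  # comparisons = n(n-1)/2, roughly quadrupling as n doubles
    ```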

  1. Clinical Application of Electrocardiography.

    ERIC Educational Resources Information Center

    Brammell, H. L.; Orr, William

    The scalar electrocardiogram (ECG) is one of the most important and commonly used clinical tools in medicine. A detailed description of the recordings of cardiac electrical activity made by the ECG is presented, and the many uses of the data provided by this diagnostic tool are cited. Clinical applications of the ECG are listed.…

  2. PubFinder: a tool for improving retrieval rate of relevant PubMed abstracts.

    PubMed

    Goetz, Thomas; von der Lieth, Claus-Wilhelm

    2005-07-01

    Since it is becoming increasingly laborious to manually extract useful information embedded in the ever-growing volumes of literature, automated intelligent text analysis tools are becoming more and more essential to assist in this task. PubFinder (www.glycosciences.de/tools/PubFinder) is a publicly available web tool designed to improve the retrieval rate of scientific abstracts relevant to a specific scientific topic. All that is required is the selection of a representative set of abstracts central to the topic; no special knowledge of query syntax is necessary. Based on the selected abstracts, a list of discriminating words is automatically calculated and subsequently used to score PubMed abstracts for their probability of belonging to the defined scientific topic. This results in a hit list of references in descending order of likelihood score. The algorithms and procedures implemented in PubFinder facilitate the perpetual task every scientist faces of staying up to date with current publications dealing with a specific subject in biomedicine.
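
    A toy version of the underlying idea: derive discriminating words from a seed set of abstracts, then score new abstracts by how many of those words they contain (PubFinder's actual word scoring is more elaborate; all text below is invented):

    ```python
    from collections import Counter

    seed = ["glycan binding protein structure",
            "protein glycosylation pathway analysis"]
    background = ["clinical trial patient outcome",
                  "survey of hospital costs"]

    def word_freqs(docs):
        counts = Counter(w for d in docs for w in d.lower().split())
        total = sum(counts.values())
        return {w: c / total for w, c in counts.items()}

    seed_f, bg_f = word_freqs(seed), word_freqs(background)
    # Discriminating words: far more frequent in the seed set than in background.
    discriminating = {w for w, f in seed_f.items() if f > 10 * bg_f.get(w, 1e-4)}

    def score(abstract):
        return sum(w in discriminating for w in abstract.lower().split())

    print(score("structure of a glycan binding domain"))  # higher = more relevant
    ```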

  3. Tools for evaluating Veterinary Services: an external auditing model for the quality assurance process.

    PubMed

    Melo, E Correa

    2003-08-01

    The author describes the reasons why evaluation processes should be applied to the Veterinary Services of Member Countries, either for trade in animals and animal products and by-products between two countries, or for establishing essential measures to improve the Veterinary Service concerned. The author also describes the basic elements involved in conducting an evaluation process, including the instruments for doing so. These basic elements centre on the following: designing a model, or desirable image, against which a comparison can be made; establishing a list of processes to be analysed and defining the qualitative and quantitative mechanisms for this analysis; and establishing a multidisciplinary evaluation team and developing a process for standardising the evaluation criteria.

  4. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archiveable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  5. dSED: A database tool for modeling sediment early diagenesis

    NASA Astrophysics Data System (ADS)

    Katsev, S.; Rancourt, D. G.; L'Heureux, I.

    2003-04-01

    Sediment early diagenesis reaction transport models (RTMs) are becoming powerful tools in providing kinetic descriptions of the metal and nutrient diagenetic cycling in marine, lacustrine, estuarine, and other aquatic sediments, as well as of exchanges with the water column. Whereas there exist several good database/program combinations for thermodynamic equilibrium calculations in aqueous systems, at present there exist no database tools for classification and analysis of the kinetic data essential to RTM development. We present a database tool that is intended to serve as an online resource for information about chemical reactions, solid phase and solute reactants, sorption reactions, transport mechanisms, and kinetic and equilibrium parameters that are relevant to sediment diagenesis processes. The list of reactive substances includes but is not limited to organic matter, Fe and Mn oxides and oxyhydroxides, sulfides and sulfates, calcium, iron, and manganese carbonates, phosphorus-bearing minerals, and silicates. Aqueous phases include dissolved carbon dioxide, oxygen, methane, hydrogen sulfide, sulfate, nitrate, phosphate, some organic compounds, and dissolved metal species. A number of filters allow extracting information according to user-specified criteria, e.g., about a class of substances contributing to the cycling of iron. The database also includes bibliographic information about published diagenetic models and the reactions and processes that they consider. At the time of preparing this abstract, dSED contained 128 reactions and 12 pre-defined filters. dSED is maintained by the Lake Sediment Structure and Evolution (LSSE) group at the University of Ottawa (www.science.uottawa.ca/LSSE/dSED) and we invite input from the geochemical community.
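
    A filter of the kind dSED pre-defines (e.g., all reactions contributing to iron cycling) is, in effect, a predicate over reaction records; a minimal Python sketch with an invented record layout:

        # Invented miniature record layout; dSED's real schema is richer.
        reactions = [
            {"name": "Fe(OH)3 reduction by organic matter", "elements": {"Fe", "C"}},
            {"name": "sulfate reduction", "elements": {"S", "C"}},
            {"name": "FeS precipitation", "elements": {"Fe", "S"}},
        ]

        def filter_reactions(records, element):
            # Return all reactions involving a given element,
            # analogous to a pre-defined dSED filter.
            return [r for r in records if element in r["elements"]]

        print([r["name"] for r in filter_reactions(reactions, "Fe")])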

  6. Skyline: an open source document editor for creating and analyzing targeted proteomics experiments.

    PubMed

    MacLean, Brendan; Tomazela, Daniela M; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L; Frewen, Barbara; Kern, Randall; Tabb, David L; Liebler, Daniel C; MacCoss, Michael J

    2010-04-01

    Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project.

  7. An Investigation of Two Finite Element Modeling Solutions for Biomechanical Simulation Using a Case Study of a Mandibular Bone.

    PubMed

    Liu, Yun-Feng; Fan, Ying-Ying; Dong, Hui-Yue; Zhang, Jian-Xing

    2017-12-01

    The method used in biomechanical modeling for finite element method (FEM) analysis needs to deliver accurate results. There are currently two solutions used in FEM modeling for biomedical models of human bone from computerized tomography (CT) images: one is based on a triangular mesh, and the other, more popular in practice, is based on a parametric surface model. The outline and modeling procedures for the two solutions are compared and analyzed. Using a mandibular bone as an example, several key modeling steps are then discussed in detail, and the FEM calculation was conducted. Numerical calculation results based on the models derived from the two methods, including stress, strain, and displacement, are compared and evaluated in relation to accuracy and validity. Moreover, a comprehensive comparison of the two solutions is listed. The parametric surface based method is more helpful when using powerful design tools in computer-aided design (CAD) software, but the triangular mesh based method is more robust and efficient.

  8. A basic list of recommended books and journals for support of clinical dentistry in a nondental library.

    PubMed Central

    Johnson, R C; Mason, F O; Sims, R H

    1997-01-01

    A basic list of 133 book and journal titles in dentistry is presented. The list is intended as a bibliographic selection tool for those libraries and health institutions that support clinical dentistry programs and services in the nondental school environment in the United States and Canada. The book and journal titles were selected by the membership of the Dental Section of the Medical Library Association (MLA). The Dental Section membership represents dental and other health sciences libraries and dental research institutions from the United States and Canada, as well as from other countries. The list was compiled and edited by the Ad Hoc Publications Committee of the Dental Section of MLA. The final list was reviewed and subsequently was approved for publication and distribution by the Dental Section of MLA during the section's 1996 annual meeting in Kansas City, Missouri. PMID:9285122

  9. Science responses to IUCN Red Listing.

    PubMed

    Jarić, Ivan; Roberts, David L; Gessner, Jörn; Solow, Andrew R; Courchamp, Franck

    2017-01-01

    The IUCN Red List of Threatened Species is often advocated as a tool to assist decision-making in conservation investment and research focus. It is frequently suggested that research efforts should prioritize species in higher threat categories and those that are Data Deficient (DD). We assessed the linkage between IUCN listing and research effort in DD and Critically Endangered (CR) species, two groups generally advocated as research priorities. The analysis of the change in research output following species classification indicated a listing effect in DD species, while such an effect was observed in only a minority of CR species groups. DD species, while chronically understudied, seem to be recognized as research priorities, whereas research effort for endangered species appears to be driven by various factors other than IUCN listing. Optimized conservation research focus would require international science planning efforts, harmonized through international mechanisms and promoted by financial and other incentives.

  10. Species richness and variety of life in Arizona’s ponderosa pine forest type

    Treesearch

    David R. Patton; Richard W. Hofstetter; John D. Bailey; Mary Ann Benoit

    2014-01-01

    Species richness (SR) is a tool that managers can use to include diversity in planning and decision-making and is a convenient and useful way to characterize the first level of biological diversity. A richness list derived from existing inventories enhances a manager’s understanding of the complexity of the plant and animal communities they manage. Without a list of...

  11. A review of question prompt lists used in the oncology setting with comparison to the Patient Concerns Inventory.

    PubMed

    Miller, N; Rogers, S N

    2018-01-01

    A question prompt list (QPL) is a simple and inexpensive communication tool used to facilitate patient participation in medical consultations. The QPL is composed of a structured list of questions and has been shown to be an effective way of helping ensure patients' individual information needs are appropriately met. This intervention has been investigated in a variety of settings but not specifically in head and neck cancer (HNC). The aim of this paper was to perform a narrative review of the literature reporting the use of a QPL for oncology patients and to draw comparison with the Patient Concerns Inventory (PCI-HN). The databases Scopus, PubMed and MEDLINE were searched using the key terms 'question prompt list', 'question prompt sheet', 'cancer' and 'oncology'. Of 98 articles hand-searched, 30 were found to meet all inclusion criteria and are described in a tabulated summary. The studies concluded that the QPL was an effective intervention, enabling active patient participation in medical consultations. The PCI-HN is specific to HNC and differs from many QPLs, which are more general cancer tools. The QPL approach should prove to be a useful intervention for HNC sufferers; however, further research into its clinical utility is required. © 2016 John Wiley & Sons Ltd.

  12. Educational Services Officer

    DTIC Science & Technology

    1988-01-01

    This publication is a valuable tool to an ESO when it is available; it includes a recommended reading list, and a brief description of the subject matter of each title is given. The remaining recoverable fragments are course objectives and body text, e.g.: identify the factors that must be considered in the arrangement of an office; identify the basic office products; identify the tools necessary for the … of the office; personnel served by this office judge it by the measure of … the tools of the trade; however, if the office does not provide …

  13. Job title of recent bachelor's degree recipients

    NASA Astrophysics Data System (ADS)

    White, Susan C.

    2015-05-01

    Physics bachelor's degree recipients work in all kinds of professions—science writing, medicine, law, history of science, acting, music, healthcare and more. Since very few of these employees have the word "physics" in their job titles, it can be hard for new graduates to know where to look for jobs and how to find other recent physics graduates in the workforce. The American Institute of Physics and the Society of Physics Students joined forces on an NSF-funded grant to create career tools for undergraduate physics students. One of the tools available to students in the Careers Toolbox is a listing of common job titles of physics bachelor's degree recipients working in various fields; some of the job titles are listed below.

  14. Towards decision support for waiting lists: an operations management view.

    PubMed

    Vissers, J M; Van Der Bij, J D; Kusters, R J

    2001-06-01

    This paper considers the phenomenon of waiting lists in a healthcare setting characterised by limits on national expenditure, to explore the potential of an operations management perspective. A reference framework for waiting list management is described, distinguishing different levels of planning in healthcare--national, regional, hospital and process--each of which contributes to the existence of waiting lists through managerial decision making. In addition, different underlying mechanisms in demand and supply are distinguished, which together explain the development of waiting lists. It is our contention that within this framework a series of situation-specific models should be designed to support communication and decision making. This is illustrated by the modelling of the demand for cataract treatment in a regional setting in the south-eastern part of the Netherlands. An input-output model was developed to support decisions regarding waiting lists. The model projects the demand for treatment at a regional level and makes it possible to evaluate waiting list impacts for different scenarios to meet this demand.
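
    The input-output logic of such a model can be sketched as a simple stock-and-flow recursion; the numbers below are illustrative, not the Dutch cataract data:

        def project_waiting_list(initial, referrals, capacity):
            # Stock-and-flow recursion: next list = current list
            # + inflow (referrals) - outflow (treatments), floored at zero.
            waiting, trajectory = initial, [initial]
            for inflow, treated in zip(referrals, capacity):
                waiting = max(0, waiting + inflow - treated)
                trajectory.append(waiting)
            return trajectory

        # Two capacity scenarios against the same projected demand.
        print(project_waiting_list(1200, [400] * 8, [380] * 8))  # list grows
        print(project_waiting_list(1200, [400] * 8, [450] * 8))  # list shrinks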

  15. Educational and Scientific Applications of Climate Model Diagnostic Analyzer

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.

    2016-12-01

    Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences for three years, beginning in 2014. In the summer school, the students work on group research projects for which CMDA provides datasets and analysis tools. Each student is assigned to a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA was developed to keep track of students' usage of CMDA, and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case is described in terms of a scientific goal, the datasets used, the analysis tools used, the scientific results discovered, an analysis output such as plots and data files, and a link to the exact analysis service call with all the input arguments filled in. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. The analysis service used is the Difference Plot Service of Two Variables, and the datasets used are NCAR CAM total cloud fraction and MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job at simulating total cloud cover, though it simulates too few clouds, especially near and offshore of the eastern ocean basins where low clouds are dominant.
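
    The Difference Plot Service of Two Variables use case reduces, at its core, to differencing two gridded fields on a common grid; a minimal Python/numpy sketch with synthetic stand-ins for the CAM5 and MODIS cloud-fraction data:

        import numpy as np

        def difference_map(model_field, obs_field):
            # Model-minus-observation difference on a shared lat/lon grid.
            if model_field.shape != obs_field.shape:
                raise ValueError("fields must share a grid (regrid first)")
            return model_field - obs_field

        # Synthetic stand-ins for CAM5 and MODIS total cloud fraction.
        cam5 = np.random.default_rng(0).uniform(0.3, 0.9, size=(90, 180))
        modis = np.random.default_rng(1).uniform(0.4, 1.0, size=(90, 180))
        diff = difference_map(cam5, modis)
        print("mean bias: %+.3f cloud fraction" % diff.mean())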

  16. DOT2: Macromolecular Docking With Improved Biophysical Models

    PubMed Central

    Roberts, Victoria A.; Thompson, Elaine E.; Pique, Michael E.; Perez, Martin S.; Eyck, Lynn Ten

    2015-01-01

    Computational docking is a useful tool for predicting macromolecular complexes, which are often difficult to determine experimentally. Here we present the DOT2 software suite, an updated version of the DOT intermolecular docking program. DOT2 provides straightforward, automated construction of improved biophysical models based on molecular coordinates, offering checkpoints that guide the user to include critical features. DOT has been updated to run more quickly, allow flexibility in grid size and spacing, and generate a complete list of favorable candidate configurations. Output can be filtered by experimental data and rescored by the sum of electrostatic and atomic desolvation energies. We show that this rescoring method improves the ranking of correct complexes for a wide range of macromolecular interactions, and demonstrate that biologically relevant models are essential for biologically relevant results. The flexibility and versatility of DOT2 accommodate realistic models of complex biological systems, improving the likelihood of a successful docking outcome. PMID:23695987

  17. Resource materials for a GIS spatial analysis course

    USGS Publications Warehouse

    Raines, Gary L.

    2001-01-01

    This report consists of materials prepared for a GIS spatial analysis course offered as part of the Geography curriculum at the University of Nevada, Reno and the University of California at Santa Barbara in the spring of 2000. The report is intended to share information with instructors preparing spatial-modeling training and scientists with advanced GIS expertise. The students taking this class had completed each university's GIS curriculum and had a foundation in statistics as part of a science major. This report is organized into chapters that contain the following: slides used during lectures; guidance on the use of ArcView; an introduction to filtering in ArcView; conventional and spatial correlation in ArcView; tools for fuzzification in ArcView; data and instructions for using ArcSDM to create simple weights-of-evidence, fuzzy logic, and neural network models for Carlin-type gold deposits in central Nevada; a reading list on spatial modeling; and selected student spatial-modeling posters from the laboratory exercises.

  18. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (that contains spatial data, socio-economic and environmental data, and analytic data), a middle layer (that handles data processing, model management, and GIS operations), and an application layer (that provides climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
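
    The per-cell metric computation can be pictured as a weighted score over grid cells; the metric names and weights below are invented for illustration and are not Urban-CAT's actual model:

        def vulnerability_score(cell_metrics, weights):
            # Weighted sum of normalized per-cell metrics (higher = more vulnerable).
            return sum(weights[name] * cell_metrics[name] for name in weights)

        # Hypothetical metrics per grid cell, all scaled to [0, 1].
        weights = {"impervious_fraction": 0.5, "low_elevation": 0.3, "pop_density": 0.2}
        cells = [
            {"impervious_fraction": 0.8, "low_elevation": 1.0, "pop_density": 0.6},
            {"impervious_fraction": 0.2, "low_elevation": 0.0, "pop_density": 0.3},
        ]
        print([round(vulnerability_score(c, weights), 2) for c in cells])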

  19. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.

  20. WebGIVI: a web-based gene enrichment analysis and visualization tool.

    PubMed

    Sun, Liang; Zhu, Yongnan; Mahmood, A S M Ashique; Tudor, Catalina O; Ren, Jia; Vijay-Shanker, K; Chen, Jian; Schmidt, Carl J

    2017-05-04

    A major challenge of high throughput transcriptome studies is presenting the data to researchers in an interpretable format. In many cases, the outputs of such studies are gene lists which are then examined for enriched biological concepts. One approach to help the researcher interpret large gene datasets is to associate genes and informative terms (iTerms) that are obtained from the biomedical literature using the eGIFT text-mining system. However, examining large lists of iTerm and gene pairs is a daunting task. We have developed WebGIVI, an interactive web-based visualization tool ( http://raven.anr.udel.edu/webgivi/ ) to explore gene:iTerm pairs. WebGIVI was built with the Cytoscape and Data-Driven Documents (D3) JavaScript libraries and can be used to relate genes to iTerms and then visualize gene and iTerm pairs. WebGIVI accepts a gene list that is used to retrieve the gene symbols and corresponding iTerm list. This list can be submitted to visualize the gene:iTerm pairs using two distinct methods: a Concept Map or a Cytoscape Network Map. In addition, WebGIVI also supports uploading and visualization of any two-column tab-separated data. WebGIVI provides an interactive and integrated network graph of genes and iTerms that allows filtering, sorting, and grouping, which can aid biologists in developing hypotheses based on the input gene lists. In addition, WebGIVI can visualize hundreds of nodes and generate a high-resolution image that is important for most research publications. The source code can be freely downloaded at https://github.com/sunliang3361/WebGIVI . The WebGIVI tutorial is available at http://raven.anr.udel.edu/webgivi/tutorial.php .

  1. Development of equally intelligible Telugu sentence-lists to test speech recognition in noise.

    PubMed

    Tanniru, Kishore; Narne, Vijaya Kumar; Jain, Chandni; Konadath, Sreeraj; Singh, Niraj Kumar; Sreenivas, K J Ramadevi; K, Anusha

    2017-09-01

    To develop sentence lists in the Telugu language for the assessment of speech recognition threshold (SRT) in the presence of background noise, through identification of the mean signal-to-noise ratio required to attain a 50% sentence recognition score (SRTn). This study was conducted in three phases. The first phase involved the selection and recording of Telugu sentences. In the second phase, 20 lists, each consisting of 10 sentences with equal intelligibility, were formulated using a numerical optimisation procedure. In the third phase, the SRTn of the developed lists was estimated using adaptive procedures on individuals with normal hearing. A total of 68 native Telugu speakers with normal hearing participated in the study. Of these, 18 (including the speakers) performed various subjective measures in the first phase, 20 performed sentence/word recognition in noise in the second phase, and 30 participated in the list equivalency procedures in the third phase. In all, 15 lists of comparable difficulty were formulated as test material. The mean SRTn across these lists corresponded to -2.74 dB (SD = 0.21). The developed sentence lists provided a valid and reliable tool to measure SRTn in native Telugu speakers.

  2. Acute Toxicity Prediction to Threatened and Endangered Species Using Interspecies Correlation Estimation (ICE) Models.

    PubMed

    Willming, Morgan M; Lilavois, Crystal R; Barron, Mace G; Raimondo, Sandy

    2016-10-04

    Evaluating contaminant sensitivity of threatened and endangered (listed) species and protectiveness of chemical regulations often depends on toxicity data for commonly tested surrogate species. The U.S. EPA's Internet application Web-ICE is a suite of Interspecies Correlation Estimation (ICE) models that can extrapolate species sensitivity to listed taxa using least-squares regressions of the sensitivity of a surrogate species and a predicted taxon (species, genus, or family). Web-ICE was expanded with new models that can predict toxicity to over 250 listed species. A case study was used to assess protectiveness of genus and family model estimates derived from either geometric mean or minimum taxa toxicity values for listed species. Models developed from the most sensitive value for each chemical were generally protective of the most sensitive species within predicted taxa, including listed species, and were more protective than geometric means models. ICE model estimates were compared to HC5 values derived from Species Sensitivity Distributions for the case study chemicals to assess protectiveness of the two approaches. ICE models provide robust toxicity predictions and can generate protective toxicity estimates for assessing contaminant risk to listed species.
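
    At its core, an ICE model is a least-squares regression of log toxicity of a predicted taxon on log toxicity of a surrogate species; a minimal Python sketch with made-up LC50 pairs (not Web-ICE's fitted models):

        import math

        def fit_ice(surrogate_lc50, predicted_lc50):
            # Least-squares fit of log10(predicted) = a + b * log10(surrogate).
            xs = [math.log10(v) for v in surrogate_lc50]
            ys = [math.log10(v) for v in predicted_lc50]
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
            return my - b * mx, b

        def predict_lc50(a, b, surrogate_value):
            return 10 ** (a + b * math.log10(surrogate_value))

        a, b = fit_ice([1.2, 10.0, 55.0, 200.0], [0.8, 6.0, 40.0, 150.0])
        print(predict_lc50(a, b, 25.0))  # estimated LC50 for the predicted taxon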

  3. PhytoCRISP-Ex: a web-based and stand-alone application to find specific target sequences for CRISPR/CAS editing.

    PubMed

    Rastogi, Achal; Murik, Omer; Bowler, Chris; Tirichine, Leila

    2016-07-01

    With the emerging interest in phytoplankton research, the need to establish genetic tools for the functional characterization of genes is indispensable. The CRISPR/Cas9 system is now well recognized as an efficient and accurate reverse genetic tool for genome editing. Several computational tools have been published allowing researchers to find candidate target sequences for the engineering of CRISPR vectors, while searching possible off-targets for the predicted candidates. These tools provide built-in genome databases of common model organisms that are used for CRISPR target prediction. Although their predictions are highly sensitive, their design is inadequate for non-model genomes, most notably protists. This motivated us to design a new CRISPR target finding tool, PhytoCRISP-Ex. Our software offers CRISPR target predictions using an extended list of phytoplankton genomes and also delivers a user-friendly standalone application that can be used for any genome. The software attempts to integrate, for the first time, most available phytoplankton genome information and provide a web-based platform for Cas9 target prediction within them with high sensitivity. By offering a standalone version, PhytoCRISP-Ex remains independent enough to be used with any organism, widening its applicability in high-throughput pipelines. PhytoCRISP-Ex outperforms all the existing tools by computing the availability of restriction sites over the most probable Cas9 cleavage sites, which can be ideal for mutant screens. PhytoCRISP-Ex is a simple, fast and accurate web interface with 13 pre-indexed and continually updated phytoplankton genomes. The software was also designed as a UNIX-based standalone application that allows the user to search for target sequences in the genomes of a variety of other species.
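
    Candidate Cas9 target search of the kind described reduces to scanning for 20-nt protospacers followed by an NGG PAM; a minimal Python sketch (forward strand only, with no off-target or restriction-site check, unlike PhytoCRISP-Ex):

        import re

        def find_cas9_targets(sequence, guide_len=20):
            # Yield (start, protospacer, PAM) for every NGG PAM whose
            # upstream 20 nt fit within the sequence.
            seq = sequence.upper()
            for m in re.finditer(r"(?=([ACGT]GG))", seq):
                pam_start = m.start()
                if pam_start >= guide_len:
                    yield (pam_start - guide_len,
                           seq[pam_start - guide_len:pam_start], m.group(1))

        demo = "ATGCGTACCGGTTAGCTAGGACTGATCGGATCCAGGTTTACG"  # invented sequence
        for start, protospacer, pam in find_cas9_targets(demo):
            print(start, protospacer, pam)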

  4. Recent advances in applying decision science to managing national forests

    USGS Publications Warehouse

    Marcot, Bruce G.; Thompson, Matthew P.; Runge, Michael C.; Thompson, Frank R.; McNulty, Steven; Cleaves, David; Tomosy, Monica; Fisher, Larry A.; Andrew, Bliss

    2012-01-01

    Management of federal public forests to meet sustainability goals and multiple use regulations is an immense challenge. To succeed, we suggest use of formal decision science procedures and tools in the context of structured decision making (SDM). SDM entails four stages: problem structuring (framing the problem and defining objectives and evaluation criteria), problem analysis (defining alternatives, evaluating likely consequences, identifying key uncertainties, and analyzing tradeoffs), decision point (identifying the preferred alternative), and implementation and monitoring the preferred alternative with adaptive management feedbacks. We list a wide array of models, techniques, and tools available for each stage, and provide three case studies of their selected use in National Forest land management and project plans. Successful use of SDM involves participation by decision-makers, analysts, scientists, and stakeholders. We suggest specific areas for training and instituting SDM to foster transparency, rigor, clarity, and inclusiveness in formal decision processes regarding management of national forests.

  5. Climate Change Risk Management Consulting: The opportunity for an independent business practice

    NASA Astrophysics Data System (ADS)

    Ciccozzi, R.

    2009-04-01

    The Paper outlines the main questions to be addressed with reference to the actual demand for climate change risk management consulting in the financial services sector. Moreover, the Project shall also investigate whether the catastrophe modelling industry can start and manage a business practice specialised in climate change risk exposures. In this context, the Paper aims at testing the possibility of building a sound business case, based upon typical MBA course analysis tools, such as PEST(LE), SWOT, etc. Specific references to the tools to be used and to other contributions from the academic literature and general documentation are also discussed in the body of the Paper and listed at the end. The analysis shall also focus on the core competencies required for an independent climate change risk management consulting business practice, with the purpose of outlining a valid definition of how to achieve competitive advantage in climate change risk management consulting.

  6. Rapid Development of Specialty Population Registries and Quality Measures from Electronic Health Record Data: An Agile Framework

    PubMed Central

    Kannan, V; Fish, JS; Mutz, JM; Carrington, AR; Lai, K; Davis, LS; Youngblood, JE; Rauschuber, MR; Flores, KA; Sara, EJ; Bhat, DG; Willett, DL

    2017-01-01

    Summary Background Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. Objective To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. Methods We adopted as guiding principles to (a) capture data as a by-product of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed—either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM)—were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined “grains” from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week “sprints” for rapid-cycle feedback and refinement. Results Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. Conclusions This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often. PMID:28930362
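
    The shared dimensional model can be pictured as a single CQM fact table keyed by registry, patient, measure and date; a minimal Python sketch with invented column names, not the authors' EDW schema:

        from dataclasses import dataclass
        from datetime import date

        @dataclass(frozen=True)
        class CQMFact:
            # One row of a single cross-registry CQM fact table
            # (column names are invented for illustration).
            registry: str
            patient_id: str
            measure: str
            as_of: date
            value: float  # 1.0 = measure met, 0.0 = not met

        facts = [
            CQMFact("diabetes", "p001", "hba1c_lt_8", date(2015, 6, 1), 1.0),
            CQMFact("diabetes", "p002", "hba1c_lt_8", date(2015, 6, 1), 0.0),
        ]
        # Dashboard-style aggregate: measure performance for one registry.
        met = sum(f.value for f in facts if f.measure == "hba1c_lt_8")
        print("hba1c_lt_8 performance: %.0f%%" % (100 * met / len(facts)))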

  7. Deep sub-wavelength metrology for advanced defect classification

    NASA Astrophysics Data System (ADS)

    van der Walle, P.; Kramer, E.; van der Donck, J. C. J.; Mulckhuyse, W.; Nijsten, L.; Bernal Arango, F. A.; de Jong, A.; van Zeijl, E.; Spruit, H. E. T.; van den Berg, J. H.; Nanda, G.; van Langen-Suurling, A. K.; Alkemade, P. F. A.; Pereira, S. F.; Maas, D. J.

    2017-06-01

    Particle defects are important contributors to yield loss in semiconductor manufacturing. Particles need to be detected and characterized in order to determine and eliminate their root cause. We have conceived a process flow for advanced defect classification (ADC) that distinguishes three consecutive steps: detection, review and classification. For defect detection, TNO has developed the Rapid Nano (RN3) particle scanner, which illuminates the sample from nine azimuth angles. The RN3 is capable of detecting 42 nm Latex Sphere Equivalent (LSE) particles on XXX-flat Silicon wafers. For each sample, the lower detection limit (LDL) can be verified by an analysis of the speckle signal, which originates from the surface roughness of the substrate. In detection mode (RN3.1), the signal from all illumination angles is added. In review mode (RN3.9), the signals from all nine arms are recorded individually and analyzed in order to retrieve additional information on the shape and size of deep sub-wavelength defects. This paper presents experimental and modelling results on the extraction of shape information, such as aspect ratio, skewness, and orientation of test defects, from the RN3.9 multi-azimuth signal. Both modeling and experimental work confirm that the RN3.9 signal contains detailed defect shape information. After review by RN3.9, defects are coarsely classified, yielding a purified Defect-of-Interest (DoI) list for further analysis on slower metrology tools, such as SEM, AFM or HIM, that provide more detailed review data and further classification. Purifying the DoI list via optical metrology with RN3.9 will make inspection time on slower review tools more efficient.

  8. Population and habitat viability assessments for Golden-cheeked Warblers and Black-capped Vireos: Usefulness to Partners in Flight Conservation Planning

    USGS Publications Warehouse

    Beardmore, C.J.; Hatfield, J.S.; Bonney, Rick; Pashley, David N.; Cooper, Robert; Niles, Larry

    2000-01-01

    Golden-cheeked Warblers and Black-capped Vireos are Neotropical migratory birds that are federally listed as endangered. Recovery plans for both species advise the use of viability modeling as a tool for setting specific recovery and management targets. Population and Habitat Viability Assessment workshops were conducted to develop population targets and conservation recommendations for these species. Results of the workshops were based on modeling demographic and environmental factors, as well as discussions of management issues, management options, and public outreach strategies. The approach is intended to be iterative, and to be tracked by research and monitoring efforts. This paper discusses the consensus-building workshop process and how the approach could be useful to Partners in Flight. Population and Habitat Viability Assessments (PHVA) were used to develop population targets and conservation recommendations for Golden-cheeked Warblers (Dendroica chrysoparia) and Black-capped Vireos (Vireo atricapillus). This paper explains what PHVAs are, discusses how they are conducted, describes the general results that are produced, and suggests how Partners in Flight (PIF) might use a similar process for bird conservation planning. Detailed results of the assessments are not discussed here; however they can be found elsewhere (U. S. Fish and Wildlife Service 1996a, U. S. Fish and Wildlife Service 1996b). PHVAs were considered for Golden-cheeked Warblers and Black-capped Vireos because they are controversial, endangered species, and the species' recovery plans list PHVAs as tools to develop recovery recommendations. The U. S. Fish and Wildlife Service (USFWS) realized that the data needed to perform PHVAs for these species is limited, but that various conservation efforts, such as the Balcones Canyonlands Conservation Plan and other endeavors, were proceeding without benefit of the biological summarization and guidance that a PHVA could provide.

  9. [Association between productivity, list size, patient and practice characteristics in general practice].

    PubMed

    Olsen, Kim Rose; Sørensen, Torben Højmark; Gyrd-Hansen, Dorte

    2010-04-19

    Due to a shortage of general practitioners, it may be necessary to improve productivity. We assess the association between productivity, list size, and patient and practice characteristics. A regression approach is used to perform productivity analysis based on national register data and survey data for 1,758 practices. Practices are divided into four groups according to list size and productivity. Statistical tests are used to assess differences in patient and practice characteristics. There is a significant, positive correlation between list size and productivity (p < 0.01). Nevertheless, 19% of the practices have a list size below and a productivity above the mean sample values. These practices have relatively demanding patients (older, low socioeconomic status, high use of pharmaceuticals), are frequently located in areas with limited access to specialized care, and have a low use of assisting personnel. 13% of the practices have a list size above and a productivity below the mean sample values. These practices have relatively less demanding patients, are located in areas with good access to specialized care, and have a high use of assisting personnel. Patient and practice characteristics have a substantial influence on both productivity and list size. Adjusting list size to external factors seems to be an effective tool to increase productivity in general practice.

  10. Solid rocket booster performance evaluation model. Volume 4: Program listing

    NASA Technical Reports Server (NTRS)

    1974-01-01

    All subprograms or routines associated with the solid rocket booster performance evaluation model are indexed in this computer listing. An alphanumeric list of each routine in the index is provided in a table of contents.

  11. Calibration data Analysis Package (CAP): An IDL based widget application for analysis of X-ray calibration data

    NASA Astrophysics Data System (ADS)

    Vaishali, S.; Narendranath, S.; Sreekumar, P.

    An IDL (Interactive Data Language) based widget application, developed for the calibration of the C1XS instrument (Narendranath et al., 2010) on Chandrayaan-1, has been modified to provide a generic package for the analysis of data from X-ray detectors. The package supports files in ASCII as well as FITS format. Data can be fitted with a list of inbuilt functions to derive the spectral redistribution function (SRF). We have incorporated functions such as 'HYPERMET' (Phillips & Marlow 1976), including non-Gaussian components in the SRF such as a low-energy tail, a low-energy shelf and an escape peak. In addition, users can incorporate additional models which may be required to model detector-specific features. Spectral fits use the routine 'mpfit', which implements the Levenberg-Marquardt least-squares fitting method. The SRF derived from this tool can be fed into an accompanying program to generate a redistribution matrix file (RMF) compatible with the X-ray spectral analysis package XSPEC. The tool provides a user-friendly interface to help beginners, and also provides transparency and advanced features for experts.
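
    The heart of deriving an SRF is fitting a peak-shape model to a calibration spectrum; a minimal Python sketch fitting a plain Gaussian photopeak with scipy, as a stand-in for the fuller HYPERMET shape with its tail and shelf terms:

        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(x, amp, mu, sigma):
            return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

        # Synthetic calibration peak near 5.9 keV (Mn K-alpha-like) with noise.
        x = np.linspace(5.0, 7.0, 200)
        y = (gaussian(x, 1000.0, 5.9, 0.08)
             + np.random.default_rng(42).normal(0.0, 10.0, x.size))

        popt, _ = curve_fit(gaussian, x, y, p0=[800.0, 5.8, 0.1])
        print("amp=%.1f  mu=%.3f keV  sigma=%.3f keV" % tuple(popt))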

  12. The CMIP5 Model Documentation Questionnaire: Development of a Metadata Retrieval System for the METAFOR Common Information Model

    NASA Astrophysics Data System (ADS)

    Pascoe, Charlotte; Lawrence, Bryan; Moine, Marie-Pierre; Ford, Rupert; Devine, Gerry

    2010-05-01

    The EU METAFOR Project (http://metaforclimate.eu) has created a web-based model documentation questionnaire to collect metadata from the modelling groups that are running simulations in support of the Coupled Model Intercomparison Project - 5 (CMIP5). The CMIP5 model documentation questionnaire will retrieve information about the details of the models used, how the simulations were carried out, how the simulations conformed to the CMIP5 experiment requirements and details of the hardware used to perform the simulations. The metadata collected by the CMIP5 questionnaire will allow CMIP5 data to be compared in a scientifically meaningful way. This paper describes the life-cycle of the CMIP5 questionnaire development, which starts with relatively unstructured input from domain specialists and ends with formal XML documents that comply with the METAFOR Common Information Model (CIM). Each development step is associated with a specific tool: (1) mind maps are used to capture information requirements from domain experts and build a controlled vocabulary; (2) a Python parser processes the XML files generated by the mind maps; (3) Django (Python) is used to generate the dynamic structure and content of the web-based questionnaire from the processed XML and the METAFOR CIM; (4) Python parsers ensure that information entered into the CMIP5 questionnaire is output as CIM-compliant XML; (5) CIM-compliant output allows automatic information capture tools to harvest questionnaire content into databases such as the Earth System Grid (ESG) metadata catalogue. This paper will focus on how Django (Python) and XML input files are used to generate the structure and content of the CMIP5 questionnaire. It will also address how the choice of development tools listed above provided a framework that enabled working scientists (whom we would never ordinarily get to interact with UML and XML) to be part of the iterative development process and ensure that the CMIP5 model documentation questionnaire reflects what scientists want to know about the models. Keywords: metadata, CMIP5, automatic information capture, tool development

  13. Clinical code set engineering for reusing EHR data for research: A review.

    PubMed

    Williams, Richard; Kontopantelis, Evangelos; Buchan, Iain; Peek, Niels

    2017-06-01

    The construction of reliable, reusable clinical code sets is essential when re-using Electronic Health Record (EHR) data for research. Yet code set definitions are rarely transparent and their sharing is almost non-existent. There is a lack of methodological standards for the management (construction, sharing, revision and reuse) of clinical code sets which needs to be addressed to ensure the reliability and credibility of studies which use code sets. To review methodological literature on the management of sets of clinical codes used in research on clinical databases and to provide a list of best practice recommendations for future studies and software tools. We performed an exhaustive search for methodological papers about clinical code set engineering for re-using EHR data in research. This was supplemented with papers identified by snowball sampling. In addition, a list of e-phenotyping systems was constructed by merging references from several systematic reviews on this topic, and the processes adopted by those systems for code set management was reviewed. Thirty methodological papers were reviewed. Common approaches included: creating an initial list of synonyms for the condition of interest (n=20); making use of the hierarchical nature of coding terminologies during searching (n=23); reviewing sets with clinician input (n=20); and reusing and updating an existing code set (n=20). Several open source software tools (n=3) were discovered. There is a need for software tools that enable users to easily and quickly create, revise, extend, review and share code sets and we provide a list of recommendations for their design and implementation. Research re-using EHR data could be improved through the further development, more widespread use and routine reporting of the methods by which clinical codes were selected. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
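
    Exploiting the hierarchy of a coding terminology, the most common approach above (n=23), often amounts to selecting every descendant of a set of root codes; a minimal Python sketch over an invented parent-child table:

        def expand_code_set(roots, children):
            # Return the roots plus all of their descendants in a
            # code hierarchy (e.g., a Read- or SNOMED-style terminology).
            selected, stack = set(), list(roots)
            while stack:
                code = stack.pop()
                if code not in selected:
                    selected.add(code)
                    stack.extend(children.get(code, ()))
            return selected

        # Invented miniature hierarchy: a diabetes root with descendants.
        children = {"C10": ["C101", "C102"], "C101": ["C1011"]}
        print(sorted(expand_code_set({"C10"}, children)))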

  14. O*NET[TM] Career Exploration Tools. Version 3.0.

    ERIC Educational Resources Information Center

    Employment and Training Administration (DOL), Washington, DC.

    Developed by the U.S. Department of Labor's Occupational Information Network (O*NET) team, the O*NET[TM] Career Exploration Tools (Version 3.0) consist of three main parts: (1) the Interest Profiler; (2) the Work Importance Locator; and (3) the O*NET[TM] Occupations Combined List. The Interest Profiler is a self-assessment career exploration tool…

  15. Family Myths, Beliefs, and Customs as a Research/Educational Tool to Explore Identity Formation

    ERIC Educational Resources Information Center

    Herman, William E.

    2008-01-01

    This paper outlines a qualitative research tool designed to explore personal identity formation as described by Erik Erikson and offers self-reflective and anonymous evaluative comments made by college students after completing this task. Subjects compiled a list of 200 myths, customs, fables, rituals, and beliefs from their family of origin and…

  16. The Practice Profile: An All Purpose Tool for Program Communication, Staff Development, Evaluation and Improvement.

    ERIC Educational Resources Information Center

    Loucks, Susan F.; Crandall, David P.

    The practice profile is a standardized, systematic, cost-effective tool for summarizing the components and requirements of a program in a manner that permits comparison with other programs or selection of discrete components from various programs. It provides a component checklist, a precise list of implementation requirements, and a system for…

  17. Evaluating Texts for Graphical Literacy Instruction: The Graphic Rating Tool

    ERIC Educational Resources Information Center

    Roberts, Kathryn L.; Brugar, Kristy A.; Norman, Rebecca R.

    2015-01-01

    In this article, we present the Graphical Rating Tool (GRT), which is designed to evaluate the graphical devices that are commonly found in content-area, non-fiction texts, in order to identify books that are well suited for teaching about those devices. We also present a "best of" list of science and social studies books, which includes…

  18. Enhancing Thematic Units Using the World Wide Web: Tools and Strategies for Students with Mild Disabilities.

    ERIC Educational Resources Information Center

    Gardner, J. Emmett; Wissick, Cheryl A.

    2002-01-01

    This article presents principles for using Web-based activities to support curriculum accommodations for students with mild disabilities. Tools, resources, and strategies are identified to help teachers construct meaningful and Web-enhanced thematic units. Web sites are listed in the areas of math, science, language arts, and social studies;…

  19. Characteristics of a Cognitive Tool That Helps Students Learn Diagnostic Problem Solving

    ERIC Educational Resources Information Center

    Danielson, Jared A.; Mills, Eric M.; Vermeer, Pamela J.; Preast, Vanessa A.; Young, Karen M.; Christopher, Mary M.; George, Jeanne W.; Wood, R. Darren; Bender, Holly S.

    2007-01-01

    Three related studies replicated and extended previous work (J.A. Danielson et al. (2003), "Educational Technology Research and Development," 51(3), 63-81) involving the Diagnostic Pathfinder (dP) (previously Problem List Generator [PLG]), a cognitive tool for learning diagnostic problem solving. In studies 1 and 2, groups of 126 and 113…

  20. Assessing Children for the Presence of a Disability. Resources You Can Use. NICHCY Bibliography. 2nd Edition.

    ERIC Educational Resources Information Center

    Gutierrez, Mary Kate, Comp.

    This resource list is intended to provide school systems with information on assessment of school-aged children for the presence of a disability. The 104 references are broken down into the following categories: general assessment information; assessment tools; critiques of assessment tools; curriculum-based assessment; assessments of different…

  1. Popular Music as a Learning Tool in the Social Studies.

    ERIC Educational Resources Information Center

    Litevich, John A., Jr.

    This teaching guide reflects the belief that popular music is an effective tool for teachers to use in presenting social studies lessons to students. Titles of songs representative of popular music from 1955 to 1982 are listed by subject matter and suggest a possible lesson to be used in teaching that particular issue. Subject areas listed…

  2. The GenABEL Project for statistical genomics.

    PubMed

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  3. Successful Manipulation in Stable Marriage Model with Complete Preference Lists

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hirotatsu; Matsui, Tomomi

    This paper deals with a strategic issue in the stable marriage model with complete preference lists (i.e., a preference list of an agent is a permutation of all the members of the opposite sex). Given complete preference lists of n men over n women, and a marriage µ, we consider the problem of finding preference lists of n women over n men such that the men-proposing deferred acceptance algorithm (Gale-Shapley algorithm) applied to the lists produces µ. We show a simple necessary and sufficient condition for the existence of such a set of preference lists of women over men. Our condition directly gives an O(n²) time algorithm for finding a set of preference lists, if one exists.
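
    For orientation, a minimal Python sketch of the men-proposing deferred acceptance (Gale-Shapley) algorithm that the manipulation question is posed against; the paper's O(n²) inverse construction itself is not reproduced here:

        def deferred_acceptance(men_prefs, women_prefs):
            # Men-proposing Gale-Shapley. Each prefs dict maps an agent
            # to a complete preference list over the opposite side.
            rank = {w: {m: i for i, m in enumerate(plist)}
                    for w, plist in women_prefs.items()}
            next_choice = {m: 0 for m in men_prefs}
            engaged = {}                      # woman -> man
            free_men = list(men_prefs)
            while free_men:
                m = free_men.pop()
                w = men_prefs[m][next_choice[m]]
                next_choice[m] += 1
                if w not in engaged:
                    engaged[w] = m            # w accepts her first proposal
                elif rank[w][m] < rank[w][engaged[w]]:
                    free_men.append(engaged[w])
                    engaged[w] = m            # w trades up to a better match
                else:
                    free_men.append(m)        # w rejects m
            return {m: w for w, m in engaged.items()}

        men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
        women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
        print(deferred_acceptance(men, women))  # {'m2': 'w1', 'm1': 'w2'}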

  4. 77 FR 77038 - Procurement List; Proposed Additions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-31

    ... purpose is to provide interested persons an opportunity to submit comments on the proposed actions...: Keystone Vocational Services, Inc., Sharon, PA Contracting Activity: General Services Administration, Tools...

  5. A Scientific Collaboration Tool Built on the Facebook Platform

    PubMed Central

    Bedrick, Steven D.; Sittig, Dean F.

    2008-01-01

    We describe an application (“Medline Publications”) written for the Facebook platform that allows users to maintain and publish a list of their own Medline-indexed publications, as well as easily access their contacts’ lists. The system is semi-automatic in that it interfaces directly with the National Library of Medicine’s PubMed database to find and retrieve citation data. Furthermore, the system has the capability to present the user with sets of other users with similar publication profiles. As of July 2008, Medline Publications has attracted approximately 759 users, 624 of whom have listed a total of 5,193 unique publications. PMID:18999247

  6. Orbit Software Suite

    NASA Technical Reports Server (NTRS)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  7. Authorship versus "credit" for participation in research: a case study of potential ethical dilemmas created by technical tools used by researchers and claims for authorship by their creators.

    PubMed

    Welker, James A; McCue, Jack D

    2007-01-01

The distinction between authorship and other forms of credit for contribution to a publication has been a persistent controversy that has resulted in numerous guidelines outlining the expected contributions of those claiming authorship. While there have been flagrant, well-publicized deviations from widely accepted standards, they are largely outnumbered by cases that are not publicity-worthy and therefore remain known only to those directly involved with the inappropriate conduct. We discuss the definition and ethical requirements of authorship, offer a case example of the authorship debate created by a technical tool at our institution, and review parallels that support and dispute the authorship claims of our software developers. Ultimately, we conclude that development of a technical tool that enables data collection does not adequately substitute for contributions to study design and manuscript preparation for authorship purposes. Unless the designers of such a technical tool prospectively participate as part of the project, they would not have an adequate understanding of the publication's genesis to defend it publicly and cannot be listed as authors. Therefore, it is incumbent upon project members to invite tool developers to participate at the beginning of such projects, and upon tool developers to contribute to study design and manuscript preparation when they desire authorship listings.

  8. A Breakdown, Application, and Evaluation of the Resiliency Analysis Support Tool (RAST) from the Operator’s Perspective

    DTIC Science & Technology

    2013-06-01

[Extraction residue from the report's front matter and figures: the list of figures (e.g. "Number of Natural Disasters Reported") and fragments of a Nepal case study in which RAST's Display Country Layers and List View functions are used to examine the current state of Nepal's public health infrastructure.]

  9. Novel tool for deprescribing in chronic patients with multimorbidity: List of Evidence-Based Deprescribing for Chronic Patients criteria.

    PubMed

    Rodríguez-Pérez, Aitana; Alfaro-Lara, Eva Rocío; Albiñana-Perez, Sandra; Nieto-Martín, María Dolores; Díez-Manglano, Jesús; Pérez-Guerrero, Concepción; Santos-Ramos, Bernardo

    2017-11-01

To create a tool to identify drugs and clinical situations that offer an opportunity for deprescribing in patients with multimorbidity, a literature review complemented by electronic brainstorming was carried out, and subsequently a panel of experts applied the Delphi methodology. The experts assessed the criteria identified in the literature review and brainstorming as possible situations for deprescribing. They were also asked to assess the influence of life prognosis on each criterion. A tool was composed of the most appropriate criteria according to the strength of their evidence, usefulness in patients with multimorbidity and applicability in clinical practice. Out of a total of 100 candidate criteria, 27 were selected for inclusion in the final list. The tool was named the LESS-CHRON criteria (List of Evidence-baSed depreScribing for CHRONic patients) and is organized by the anatomical group, in the Anatomical, Therapeutic, Chemical (ATC) classification system, of the drug to be deprescribed. Each criterion contains: the drug indication for which it is prescribed, the clinical situation that offers an opportunity to deprescribe, the clinical variable to be monitored, and the minimum time to follow up the patient after deprescribing. The LESS-CHRON criteria are the result of a comprehensive and standardized methodology to identify clinical situations for deprescribing drugs in chronic patients with multimorbidity. Geriatr Gerontol Int 2017; 17: 2200-2207. © 2017 Japan Geriatrics Society.

  10. Rethinking the outpatient medication list: increasing patient activation and education while architecting for centralization and improved medication reconciliation.

    PubMed

    Pandolfe, Frank; Wright, Adam; Slack, Warner V; Safran, Charles

    2018-05-17

Identify barriers impacting the time-consuming and error-fraught process of medication reconciliation. Design and implement an electronic medication management system in which patients and trusted healthcare proxies can participate in establishing and maintaining an inclusive and up-to-date list of medications. A patient-facing electronic medication manager was deployed within InfoSAGE, an existing AHRQ-funded research project focused on elder care management, allowing patients and patients' proxies to build and maintain an accurate and up-to-date medication list. Free and open-source tools available from the U.S. government were used to embed the tenets of centralization, interoperability, data federation, and patient activation into the design. Using patient-centered design and free, open-source tools, we implemented a web- and mobile-enabled patient-facing medication manager for complex medication management. Patient and caregiver participation are essential to improving medication safety. Our medication manager is an early step towards a patient-facing medication manager designed with data federation and interoperability in mind.

  11. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    2017-09-01

Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set, mimicking both background-quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
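
    For orientation, a minimal usage sketch following trident's published quickstart; the snapshot path, species list and instrument preset are placeholders and may need adjusting to the simulation at hand:

        import yt
        import trident

        # Load a simulation snapshot (path is a placeholder).
        ds = yt.load("enzo_cosmology/RD0009/RD0009")

        # Draw a sightline through the box and deposit ion fields along it.
        ray = trident.make_simple_ray(ds,
                                      start_position=ds.domain_left_edge,
                                      end_position=ds.domain_right_edge,
                                      data_filename="ray.h5",
                                      lines=["H", "C", "O"])

        # Synthesize a spectrum mimicking the Cosmic Origins Spectrograph.
        sg = trident.SpectrumGenerator("COS-G130M")
        sg.make_spectrum(ray, lines=["H", "C", "O"])
        sg.save_spectrum("spectrum.txt")
        sg.plot_spectrum("spectrum.png")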

  12. Graphics processing units in bioinformatics, computational biology and systems biology.

    PubMed

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), which limits their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and drawbacks of these parallel architectures. The complete list of GPU-powered tools reviewed here is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  13. Children and adolescents' performance on a medium-length/nonsemantic word-list test.

    PubMed

    Flores-Lázaro, Julio César; Salgado Soruco, María Alejandra; Stepanov, Igor I

    2017-01-01

Word-list learning tasks are among the most important and frequently used tests for declarative memory evaluation. For example, the California Verbal Learning Test-Children's Version (CVLT-C) and the Rey Auditory Verbal Learning Test provide important information about different cognitive-neuropsychological processes. However, the impact of test length (i.e., number of words) and semantic organization (i.e., type of words) on children's and adolescents' memory performance remains to be clarified, especially during this developmental stage. To explore whether a medium-length, non-semantically organized test can produce the typical curvilinear performance that semantically organized tests produce, reflecting executive control, we studied and compared the cognitive performance of normal children and adolescents using mathematical modeling. The model is based on the first-order system transfer function and has been successfully applied to learning curves for the CVLT-C (a 15-word, semantically organized paradigm). Results indicate that learning nine semantically unrelated words produces typical curvilinear (executive function) performance in children and younger adolescents and that this performance can be effectively analyzed with the mathematical model. This indicates that the exponential increase (curvilinear performance) in correctly learned words does not depend solely on semantic and/or length features. This type of test controls for semantic and length effects and may represent a complementary tool for executive function evaluation in clinical populations in which semantic and/or length processing is affected.
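
    One common first-order parameterization of such a learning curve (an assumption here; the abstract does not give the authors' exact equation) rises exponentially toward an asymptote and can be fitted directly:

        import numpy as np
        from scipy.optimize import curve_fit

        # First-order learning curve: recall starts near b2 and saturates at
        # b2 + b3 with rate b4. Parameter names and data are illustrative.
        def learning_curve(trial, b2, b3, b4):
            return b2 + b3 * (1.0 - np.exp(-b4 * (trial - 1)))

        trials = np.arange(1, 6)              # five learning trials
        correct = np.array([4, 6, 7, 8, 8])   # hypothetical words recalled
        (b2, b3, b4), _ = curve_fit(learning_curve, trials, correct, p0=[4, 4, 0.5])
        print(f"start={b2:.1f}, gain={b3:.1f}, rate={b4:.2f}")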

  14. Obs4MIPS: Satellite Observations for Model Evaluation

    NASA Astrophysics Data System (ADS)

    Ferraro, R.; Waliser, D. E.; Gleckler, P. J.

    2017-12-01

This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to perform evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows currently available and planned online ESGF tools to be used for analysis of model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs updated its submission guidelines to align closely with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the suitability of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, and update the list of current obs4MIPs holdings and their connection to the ESGF evaluation and analysis tools currently available and being developed for the CMIP6 experiments.

  15. Comparative study on DuPont analysis and DEA models for measuring stock performance using financial ratio

    NASA Astrophysics Data System (ADS)

    Arsad, Roslah; Shaari, Siti Nabilah Mohd; Isa, Zaidi

    2017-11-01

Determining stock performance using financial ratios is challenging for many investors and researchers. Financial ratios can indicate the strengths and weaknesses of a company's stock performance. There are five categories of financial ratios, namely liquidity, efficiency, leverage, profitability and market ratios. It is important to interpret the ratios correctly for proper financial decision making. The purpose of this study is to compare the performance of companies listed in Bursa Malaysia using Data Envelopment Analysis (DEA) and DuPont analysis models. The study is conducted in 2015 and involves 116 consumer products companies listed in Bursa Malaysia. The Data Envelopment Analysis estimation computes efficiency scores and ranks the companies accordingly; Alirezaee and Afsharian's method of analysis, based on the Charnes-Cooper-Rhodes (CCR) model with Constant Returns to Scale (CRS), is employed. DuPont analysis is a traditional tool for measuring the operating performance of companies. In this study, DuPont analysis is used to evaluate three different aspects: profitability, efficiency of asset utilization and financial leverage. Return on Equity (ROE) is also calculated in the DuPont analysis. This study finds that the two models provide different rankings of the selected samples. Hypothesis testing based on Pearson's correlation indicates that there is no correlation between the rankings produced by DEA and DuPont analysis. The DEA ranking model proposed by Alirezaee and Afsharian is unstable; the method cannot provide a complete ranking because the Balance Index values are tied at zero.
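
    For reference, the three-factor DuPont identity decomposes ROE into profit margin, asset turnover and an equity multiplier; a minimal sketch with invented figures:

        # Three-factor DuPont identity: ROE = margin * turnover * leverage.
        def dupont_roe(net_income, revenue, total_assets, equity):
            margin = net_income / revenue        # profitability
            turnover = revenue / total_assets    # efficiency of asset utilization
            leverage = total_assets / equity     # equity multiplier
            return margin * turnover * leverage, (margin, turnover, leverage)

        roe, parts = dupont_roe(net_income=120, revenue=1500,
                                total_assets=2000, equity=800)
        print(f"ROE = {roe:.1%}")                # -> ROE = 15.0%, i.e. 120/800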

  16. Program Evaluation Resources

    EPA Pesticide Factsheets

    These resources list tools to help you conduct evaluations, find organizations outside of EPA that are useful to evaluators, and find additional guides on how to do evaluations from organizations outside of EPA.

  17. Excel Spreadsheet Tools for Analyzing Groundwater Level Records and Displaying Information in ArcMap

    USGS Publications Warehouse

    Tillman, Fred D.

    2009-01-01

    When beginning hydrologic investigations, a first action is often to gather existing sources of well information, compile this information into a single dataset, and visualize this information in a geographic information system (GIS) environment. This report presents tools (macros) developed using Visual Basic for Applications (VBA) for Microsoft Excel 2007 to assist in these tasks. One tool combines multiple datasets into a single worksheet and formats the resulting data for use by the other tools. A second tool produces summary information about the dataset, such as a list of unique site identification numbers, the number of water-level observations for each, and a table of the number of sites with a listed number of water-level observations. A third tool creates subsets of the original dataset based on user-specified options and produces a worksheet with water-level information for each well in the subset, including the average and standard deviation of water-level observations and maximum decline and rise in water levels between any two observations, among other information. This water-level information worksheet can be imported directly into ESRI ArcMap as an 'XY Data' file, and each of the fields of summary well information can be used for custom display. A separate set of VBA tools distributed in an additional Excel workbook creates hydrograph charts of each of the wells in the data subset produced by the aforementioned tools and produces portable document format (PDF) versions of the hydrograph charts. These PDF hydrographs can be hyperlinked to well locations in ArcMap or other GIS applications.
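
    The summarize-then-subset workflow the macros implement maps naturally onto dataframe operations; an analogous sketch in Python/pandas (column names are hypothetical, and this is not the report's VBA):

        import pandas as pd

        # Hypothetical compiled dataset: one row per water-level observation.
        df = pd.DataFrame({
            "site_id": ["A1", "A1", "A1", "B2", "B2"],
            "water_level_ft": [101.2, 100.8, 99.9, 87.4, 88.1],
        })

        # Per-well summary, as in the second tool: counts and statistics.
        summary = df.groupby("site_id")["water_level_ft"].agg(
            n_obs="count", mean="mean", std="std",
            max_decline=lambda s: (s.cummax() - s).max(),  # largest drop from a prior high
        )

        # Subset, as in the third tool: wells with at least 3 observations.
        subset = summary[summary["n_obs"] >= 3]
        print(subset)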

18. High Throughput Protein Quantitation using MRM Viewer Software and Dynamic MRM on a Triple Quadrupole Mass Spectrometer

    PubMed Central

    Miller, C.; Waddell, K.; Tang, N.

    2010-01-01

RP-122 Peptide quantitation using Multiple Reaction Monitoring (MRM) has been established as an important methodology for biomarker verification and validation. This requires high throughput combined with high sensitivity to analyze potentially thousands of target peptides in each sample. Dynamic MRM allows the system to acquire the required MRMs of a peptide only during a retention window corresponding to when that peptide is eluting. This reduces the number of concurrent MRMs and therefore improves quantitation and sensitivity. MRM Selector allows the user to generate an MRM transition list with retention time information from discovery data obtained on a QTOF MS system. This list can be directly imported into the triple quadrupole acquisition software. However, situations can exist where (a) the list contains more MRM transitions than are allowable under the ideal acquisition conditions chosen (allowing for cycle time and chromatography conditions), or (b) too many transitions fall in a certain retention time region, which would result in an unacceptably low dwell time and cycle time. A new tool, MRM Viewer, has been developed to help users automatically generate multiple dynamic MRM methods from a single MRM list. In this study, a list of 3,293 MRM transitions from a human plasma sample was compiled. A single dynamic MRM method with 3,293 transitions results in a minimum dwell time of 2.18 ms. Using MRM Viewer, we can generate three dynamic MRM methods with a minimum dwell time of 20 ms, which gives better-quality MRM quantitation. This tool facilitates both high throughput and high sensitivity for MRM quantitation.
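
    The dwell-time arithmetic that motivates splitting one transition list into several dynamic methods can be sketched as follows; this illustrates the constraint only and is not Agilent's MRM Viewer implementation (cycle time, dwell floor and retention windows are invented):

        # Split an MRM transition list so that no retention-time window is so
        # crowded that per-transition dwell time falls below a floor.
        CYCLE_TIME_MS = 500.0   # assumed chromatographic cycle time
        MIN_DWELL_MS = 20.0     # desired minimum dwell time per transition

        def max_concurrent(windows):
            """Largest number of retention windows that overlap at once."""
            events = []
            for start, end in windows:
                events += [(start, 1), (end, -1)]
            load = peak = 0
            for _, step in sorted(events):
                load += step
                peak = max(peak, load)
            return peak

        def methods_needed(windows):
            # Dwell per transition ~ cycle time / concurrency; split the list
            # across methods until the dwell floor is met.
            concurrency = max_concurrent(windows)
            methods = 1
            while CYCLE_TIME_MS / -(-concurrency // methods) < MIN_DWELL_MS:
                methods += 1   # -(-a // b) is ceiling division
            return methods

        windows = [(10.0, 12.0)] * 75          # 75 co-eluting transitions
        print(methods_needed(windows))         # -> 3 (25 concurrent, 20 ms each)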

  19. Potentially inappropriate medicines in elderly hospitalised patients according to the EU(7)-PIM list, STOPP version 2 criteria and comprehensive protocol.

    PubMed

    Mucalo, Iva; Hadžiabdić, Maja Ortner; Brajković, Andrea; Lukić, Sonja; Marić, Patricia; Marinović, Ivana; Bačić-Vrca, Vesna

    2017-08-01

The aim of this study was to measure the prevalence of potentially inappropriate medications (PIMs) by using the EU(7)-PIM list, the STOPP (Screening Tool of Older Persons' potentially inappropriate Prescriptions) version 2 criteria and a new comprehensive protocol. This prospective study involved a sample of 276 consecutive elderly patients discharged from a university teaching hospital. Age, gender, diagnoses, medication history and medicines at discharge were recorded. The main outcome measure was the prevalence of PIMs according to each set of criteria: the EU(7)-PIM list, the STOPP version 2 criteria and the comprehensive protocol. The median patient age (range) was 74 (65-92) years. The median number of prescribed medications was 7 (1-17). STOPP identified 393 PIMs affecting 190 patients (69%), the EU(7)-PIM list identified 330 PIMs in 184 patients (66.7%), whilst the comprehensive protocol identified 134 PIMs in 102 patients (37%). The STOPP version 2 criteria identified significantly more PIMs per patient than the other two protocols (p < 0.001). Gender (p = 0.002), glomerular filtration rate (p = 0.039) and number of comorbidities (p = 0.001) were associated with the proportion of PIMs for the STOPP version 2 criteria only. A very high PIM prevalence at discharge was reported, suggesting an urgent need for actions to reduce it. The STOPP version 2 criteria identified significantly more PIMs than the EU(7)-PIM list and the comprehensive protocol, and were found to be a more sensitive tool for PIM detection.

  20. Extracting nursing practice patterns from structured labor and delivery data sets.

    PubMed

    Hall, Eric S; Thornton, Sidney N

    2007-10-11

    This study was designed to demonstrate the feasibility of a computerized care process model that provides real-time case profiling and outcome forecasting. A methodology was defined for extracting nursing practice patterns from structured point-of-care data collected using the labor and delivery information system at Intermountain Healthcare. Data collected during January 2006 were retrieved from Intermountain Healthcare's enterprise data warehouse for use in the study. The knowledge discovery in databases process provided a framework for data analysis including data selection, preprocessing, data-mining, and evaluation. Development of an interactive data-mining tool and construction of a data model for stratification of patient records into profiles supported the goals of the study. Five benefits of the practice pattern extraction capability, which extend to other clinical domains, are listed with supporting examples.

  1. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters a flexible and extensible capability for multi-scale, multi-physics hydrologic modeling that can be run independently of, or fully interactively with, the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART, and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email listserv, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are merging toward the capabilities of the National Water Model.

  2. Ultrasound waiting lists: rational queue or extended capacity?

    PubMed

    Brasted, Christopher

    2008-06-01

The features and issues regarding clinical waiting lists in general, and general ultrasound waiting lists in particular, are reviewed, and operational aspects of providing a general ultrasound service are also discussed. A case study is presented describing a service improvement intervention in a UK NHS hospital's ultrasound department, from which arise the requirements for a predictive planning model for an ultrasound waiting list. In the course of this, it becomes apparent that a booking system is a more appropriate way of describing the waiting list than a conventional queue. Distinctive features are identified from the literature and the case study as the basis for a predictive model, and a discrete event simulation model is presented which incorporates these distinctive features.
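
    Before committing to a full discrete event simulation, the booking-system view can be captured in a few lines; a toy sketch (capacity and referral rates are invented) showing how waits behave when demand hovers near capacity:

        import random

        # Toy booking system: each referral is booked into the earliest day
        # with a free slot; the "waiting list" is the book of future slots.
        random.seed(1)
        SLOTS_PER_DAY = 20
        booked = {}                                  # day -> slots taken
        waits = []
        for day in range(365):
            for _ in range(random.randint(15, 25)):  # daily referrals, mean ~ capacity
                d = day
                while booked.get(d, 0) >= SLOTS_PER_DAY:
                    d += 1
                booked[d] = booked.get(d, 0) + 1
                waits.append(d - day)                # wait = booked day - referral day

        print(f"mean wait {sum(waits) / len(waits):.1f} days; max {max(waits)} days")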

  3. The evolution analysis of listed companies co-holding non-listed financial companies based on two-mode heterogeneous networks

    NASA Astrophysics Data System (ADS)

    An, Pengli; Li, Huajiao; Zhou, Jinsheng; Chen, Fan

    2017-10-01

Complex network theory is a widely used tool in the empirical research of financial markets. Two-mode and multi-mode networks are newer approaches that can more accurately represent relationships between entities. In this paper, we use data for Chinese listed companies holding non-listed financial companies over a ten-year period to construct two networks: a two-mode primitive network in which listed companies and non-listed financial companies are considered actors and events, respectively, and a one-mode network constructed with the decreasing-mode method, in which listed companies are considered nodes. We analyze the evolution of the listed-company co-holding network from several perspectives, including that of the whole network, information control ability, implicit relationships, community division and small-world characteristics. The results of the analysis indicate that (1) China's developing stock market affects the share-holding condition of listed companies holding non-listed financial companies; (2) the information control ability of the co-holding network is concentrated in a few listed companies, and the implicit relationship of investment preference between listed companies is determined by co-holding behavior; (3) the community division of the co-holding network is increasingly obvious, as determined by the investment preferences among listed companies; and (4) the small-world characteristics of the co-holding network are increasingly obvious, resulting in reduced communication costs. In this paper, we conduct an evolution analysis and develop an understanding of the factors that influence the listed-company co-holding network, which should help illuminate future research on evolution analysis.
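
    Projecting a two-mode holding network onto a one-mode listed-company network (the spirit of the decreasing-mode construction) is a standard operation; a sketch with toy data using networkx:

        import networkx as nx
        from networkx.algorithms import bipartite

        # Toy two-mode network: listed companies (actors) hold stakes in
        # non-listed financial companies (events). Names are illustrative.
        B = nx.Graph()
        listed = ["LC1", "LC2", "LC3"]
        financial = ["F1", "F2"]
        B.add_nodes_from(listed, bipartite=0)
        B.add_nodes_from(financial, bipartite=1)
        B.add_edges_from([("LC1", "F1"), ("LC2", "F1"), ("LC2", "F2"), ("LC3", "F2")])

        # One-mode projection: listed companies linked by co-held financials.
        G = bipartite.weighted_projected_graph(B, listed)
        print(list(G.edges(data=True)))
        # -> [('LC1', 'LC2', {'weight': 1}), ('LC2', 'LC3', {'weight': 1})]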

  4. Social and Emotional Learning: A Resource Guide and New Approach to Measurement in ExpandED Schools. A TASC Resource Guide

    ERIC Educational Resources Information Center

    ExpandED Schools, 2014

    2014-01-01

    This guide is a list of tools that can be used in continued implementation of strong programming powered by Social and Emotional Learning (SEL) competencies. This curated resource pulls from across the landscape of policy, research and practice, with a description of each tool gathered directly from its website.

  5. The Photonovel: A Tool for Development. Program and Training Journal Manual Series Number 4.

    ERIC Educational Resources Information Center

    Weaks, Daniel

    Designed as a working and teaching tool for development workers, this manual includes the step-by-step process for preparing a photonovel. Chapter 1 introduces the photonovel, a blend of comic book and motion picture that substitutes photographs for the stylized drawings. It lists its advantages and disadvantages as an educational medium and tool…

  6. Final project memorandum: sea-level rise modeling handbook: resource guide for resource managers, engineers, and scientists

    USGS Publications Warehouse

    Doyle, Thomas W.

    2015-01-01

Coastal wetlands of the Southeastern United States are undergoing retreat and migration from increasing tidal inundation and saltwater intrusion attributed to climate variability and sea-level rise. Much of the literature describing potential sea-level rise projections and modeling predictions is found in peer-reviewed academic journals or government technical reports, largely suited to reading by other Ph.D. scientists who are more familiar with or engaged in the climate change debate. Various sea-level rise and coastal wetland models of different designs and scales of spatial and temporal complexity have been developed and applied for predicting habitat and environmental change, but they have not heretofore been synthesized to inform natural resource managers of their utility and limitations. Training sessions were conducted with Federal land managers from the U.S. Fish and Wildlife Service, National Park Service, and NOAA National Estuarine Research Reserves, as well as state partners and nongovernmental organizations across the northern Gulf Coast from Florida to Texas, to educate users and to evaluate their needs and understanding of concepts, data, and modeling tools for projecting sea-level rise and its impact on coastal habitats and wildlife. This handbook was constructed from these training and feedback sessions with coastal managers and biologists and covers published decision-support tools and simulation models for sea-level rise and climate change assessments. A simplified tabular context was developed listing the various kinds of decision-support tools and ecological models along with criteria to distinguish the source, scale, and quality of information input and geographic data sets; physical and biological constraints and relationships; datum characteristics of water and land elevation components; utility options for setting sea-level rise and climate change scenarios; and the ease or difficulty of storing, displaying, or interpreting model output. The handbook is designed to be a primer for understanding sea-level rise and a practical synthesis of the current state of knowledge and modeling tools, serving as a resource guide for DOI land management needs and facilitating Landscape Conservation Cooperative (LCC) research and conservation initiatives.

7. The Composites Institute's FirstSource directory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-31

This book is the gateway to the composites industry, containing pertinent phone numbers along with a glossary of terms. The glossary is a complete listing of current composites terminology and definitions, from Ablative Plastic to Young's Modulus. Contents include: (1) corporate index; (2) manufacturing processes; (3) materials suppliers; (4) markets--parts/products/components; (5) tooling; (6) processing equipment and supplies; (7) distributors/agents; (8) consulting, testing and other services; (9) geographical listing; and (10) glossary.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

IRIS is a search tool plug-in that is used to implement latent topic feedback for enhancing text navigation. It accepts a list of returned documents from an information retrieval system that is generated from keyword search queries. Data is pulled directly from a topic information database and processed by IRIS to determine the most prominent and relevant topics, along with topic n-grams, associated with the list of returned documents. User-selected topics are then used to expand the query and presumably refine the search results.
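
    A hedged sketch of the feedback idea (appending n-grams from user-selected topics to the original keyword query); the topic table and query syntax below are illustrative, not IRIS's internals:

        # Latent-topic feedback: expand a keyword query with n-grams drawn
        # from topics the user selected. All topic data here is hypothetical.
        topics = {
            "nuclear safety": ["reactor containment", "safety analysis"],
            "materials": ["radiation damage", "alloy corrosion"],
        }

        def expand_query(query, selected_topics):
            terms = [query]
            for t in selected_topics:
                terms += [f'"{ng}"' for ng in topics[t]]  # quote multi-word n-grams
            return " OR ".join(terms)

        print(expand_query("fuel cladding", ["nuclear safety"]))
        # -> fuel cladding OR "reactor containment" OR "safety analysis"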

  9. Assessment of using digital manipulation tools for diagnosing mandibular radiolucent lesions

    PubMed Central

    Raitz, R; Assunção Junior, JNR; Fenyo-Pereira, M; Correa, L; de Lima, LP

    2012-01-01

    Objective The purpose of this study was to analyse the use of digital tools for image enhancement of mandibular radiolucent lesions and the effects of this manipulation on the percentage of correct radiographic diagnoses. Methods 24 panoramic radiographs exhibiting radiolucent lesions were selected, digitized and evaluated by non-experts (undergraduate and newly graduated practitioners) and by professional experts in oral diagnosis. The percentages of correct and incorrect diagnoses, according to the use of brightness/contrast, sharpness, inversion, highlight and zoom tools, were compared. All dental professionals made their evaluations without (T1) and with (T2) a list of radiographic diagnostic parameters. Results Digital tools were used with low frequency mainly in T2. The most preferred tool was sharpness (45.2%). In the expert group, the percentage of correct diagnoses did not change when any of the digital tools were used. For the non-expert group, there was an increase in the frequency of correct diagnoses when brightness/contrast was used in T2 (p=0.008) and when brightness/contrast and sharpness were not used in T1 (p=0.027). The use or non-use of brightness/contrast, zoom and sharpness showed moderate agreement in the group of experts [kappa agreement coefficient (κ)=0.514, 0.425 and 0.335, respectively]. For the non-expert group there was slight agreement for all the tools used (κ≤0.237). Conclusions Consulting the list of radiographic parameters before image manipulation reduced the frequency of tool use in both groups of examiners. Consulting the radiographic parameters with the use of some digital tools was important for improving correct diagnosis only in the group of non-expert examiners. PMID:22116126

  10. Assessment of using digital manipulation tools for diagnosing mandibular radiolucent lesions.

    PubMed

    Raitz, R; Assunção Junior, J N R; Fenyo-Pereira, M; Correa, L; de Lima, L P

    2012-03-01

    The purpose of this study was to analyse the use of digital tools for image enhancement of mandibular radiolucent lesions and the effects of this manipulation on the percentage of correct radiographic diagnoses. 24 panoramic radiographs exhibiting radiolucent lesions were selected, digitized and evaluated by non-experts (undergraduate and newly graduated practitioners) and by professional experts in oral diagnosis. The percentages of correct and incorrect diagnoses, according to the use of brightness/contrast, sharpness, inversion, highlight and zoom tools, were compared. All dental professionals made their evaluations without (T₁) and with (T₂) a list of radiographic diagnostic parameters. Digital tools were used with low frequency mainly in T₂. The most preferred tool was sharpness (45.2%). In the expert group, the percentage of correct diagnoses did not change when any of the digital tools were used. For the non-expert group, there was an increase in the frequency of correct diagnoses when brightness/contrast was used in T₂ (p=0.008) and when brightness/contrast and sharpness were not used in T₁ (p=0.027). The use or non-use of brightness/contrast, zoom and sharpness showed moderate agreement in the group of experts [kappa agreement coefficient (κ) = 0.514, 0.425 and 0.335, respectively]. For the non-expert group there was slight agreement for all the tools used (κ ≤ 0.237). Consulting the list of radiographic parameters before image manipulation reduced the frequency of tool use in both groups of examiners. Consulting the radiographic parameters with the use of some digital tools was important for improving correct diagnosis only in the group of non-expert examiners.

  11. NASA Thesaurus. Volumes 1 and 2; Hierarchical Listing with Definitions; Rotated Term Display

    NASA Technical Reports Server (NTRS)

    2012-01-01

    The NASA Thesaurus contains the authorized subject terms by which the documents in the NASA STI Databases are indexed and retrieved. The scope of this controlled vocabulary includes not only aerospace engineering, but all supporting areas of engineering and physics, the natural space sciences (astronomy, astrophysics, planetary science), Earth sciences, and to some extent, the biological sciences. Volume 1 - Hierarchical Listing With Definitions contains over 18,400 subject terms, 4,300 definitions, and more than 4,500 USE cross references. The Hierarchical Listing presents full hierarchical structure for each term along with 'related term' lists, and can serve as an orthographic authority. Volume 2 - Rotated Term Display is a ready-reference tool which provides over 52,700 additional 'access points' to the thesaurus terminology. It contains the postable and nonpostable terms found in the Hierarchical Listing arranged in a KWIC (key-word-in-context) index. This CD-ROM version of the NASA Thesaurus is in PDF format and is updated to the current year of purchase.

  12. Phylogenetically-informed priorities for amphibian conservation.

    PubMed

    Isaac, Nick J B; Redding, David W; Meredith, Helen M; Safi, Kamran

    2012-01-01

    The amphibian decline and extinction crisis demands urgent action to prevent further large numbers of species extinctions. Lists of priority species for conservation, based on a combination of species' threat status and unique contribution to phylogenetic diversity, are one tool for the direction and catalyzation of conservation action. We describe the construction of a near-complete species-level phylogeny of 5713 amphibian species, which we use to create a list of evolutionarily distinct and globally endangered species (EDGE list) for the entire class Amphibia. We present sensitivity analyses to test the robustness of our priority list to uncertainty in species' phylogenetic position and threat status. We find that both sources of uncertainty have only minor impacts on our 'top 100' list of priority species, indicating the robustness of the approach. By contrast, our analyses suggest that a large number of Data Deficient species are likely to be high priorities for conservation action from the perspective of their contribution to the evolutionary history.
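
    The EDGE literature commonly scores species as EDGE = ln(1 + ED) + GE * ln 2, where ED is evolutionary distinctness and GE codes Red List status from 0 (Least Concern) to 4 (Critically Endangered); taking that formulation as an assumption, a minimal ranking sketch:

        import math

        # EDGE = ln(1 + ED) + GE * ln(2); GE: LC=0, NT=1, VU=2, EN=3, CR=4.
        # Species names, ED values and statuses are invented for illustration.
        def edge_score(ed, ge):
            return math.log(1.0 + ed) + ge * math.log(2.0)

        species = [("frog A", 25.0, 4), ("frog B", 3.0, 2), ("frog C", 60.0, 0)]
        for name, ed, ge in sorted(species, key=lambda s: edge_score(s[1], s[2]),
                                   reverse=True):
            print(f"{name}: EDGE = {edge_score(ed, ge):.2f}")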

  13. Expert consensus for performing right heart catheterisation for suspected pulmonary arterial hypertension in systemic sclerosis: a Delphi consensus study with cluster analysis.

    PubMed

    Avouac, Jérôme; Huscher, Dörte; Furst, Daniel E; Opitz, Christian F; Distler, Oliver; Allanore, Yannick

    2014-01-01

To establish an expert consensus on which criteria are the most appropriate in clinical practice for referring patients with systemic sclerosis (SSc) for right heart catheterisation (RHC) when pulmonary hypertension (PH) is suspected, a three-stage, internet-based Delphi consensus exercise involving worldwide PH experts was designed. In the first stage, a comprehensive list of domains and items combining evidence-based indications and expert opinions was obtained. In the second and third stages, experts were asked to rate each item selected in the list. After each of stages 2 and 3, the number of items and criteria was reduced according to a cluster analysis. A literature search and the opinions of 47 experts participating in Delphi stage 1 provided a list of seven domains containing 142 criteria. After stages 2 and 3, these domains and tools were reduced to three domains containing eight tools: clinical (progressive dyspnoea over the past 3 months, unexplained dyspnoea, worsening of WHO dyspnoea functional class, any finding on physical examination suggestive of elevated right heart pressures, and any sign of right heart failure), echocardiography (systolic pulmonary artery pressure >45 mm Hg and right ventricle dilation) and pulmonary function tests (diffusion lung capacity for carbon monoxide <50% without pulmonary fibrosis). Among experts in pulmonary arterial hypertension in SSc, a core set of criteria for referring SSc patients for RHC in clinical practice has been defined by Delphi consensus methods. Although these indications are recommended by this expert group as an interim tool, it will be necessary to formally validate them in further studies.

  14. Insights: Tools of the Trade.

    ERIC Educational Resources Information Center

    Bruno, Michael J.

    1988-01-01

    Describes a demonstration showing the chemical reversibility between the chromate and dichromate ions. Includes reaction equations and listing of equipment needed. Recommends a demonstration for illustrating Le Chatelier's principle and stoichiometric relationships. (ML)

  15. Two Rival Conceptions of Vocational Education: Adam Smith and Friedrich List.

    ERIC Educational Resources Information Center

    Winch, Christopher

    1998-01-01

    Examines and discusses two views of political economy: (1) the classical model of Adam Smith; and (2) the social capitalist model associated with Friedrich List. Explores two varieties of vocational education and training that emerge from a comparison of Smith's and List's ideas. (CMK)

  16. The primacy model: a new model of immediate serial recall.

    PubMed

    Page, M P; Norris, D

    1998-10-01

    A new model of immediate serial recall is presented: the primacy model. The primacy model stores order information by means of the assumption that the strength of activation of successive list items decreases across list position to form a primacy gradient. Ordered recall is supported by a repeated cycle of operations involving a noisy choice of the most active item followed by suppression of the chosen item. Word-length and list-length effects are attributed to a decay process that occurs both during input, when effective rehearsal is prevented, and during output. The phonological similarity effect is attributed to a second stage of processing at which phonological confusions occur. The primacy model produces accurate simulations of the effects of word length, list length, and phonological similarity.
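
    The recall cycle described above (a noisy choice of the most active item followed by suppression) is straightforward to simulate; a minimal sketch in which the gradient and noise parameters are invented rather than fitted:

        import random

        # Primacy gradient: activation falls off across list position; each
        # recall step picks the item with the highest noisy activation and
        # then suppresses it. Parameter values are illustrative only.
        random.seed(0)

        def recall(list_length, start=1.0, step=0.08, noise=0.06):
            acts = [start - step * i for i in range(list_length)]
            order = []
            for _ in range(list_length):
                noisy = [a + random.gauss(0, noise) for a in acts]
                pick = max(range(list_length), key=lambda i: noisy[i])
                order.append(pick)
                acts[pick] = float("-inf")   # suppression of the chosen item
            return order

        # Mostly in-order recall with occasional adjacent transpositions:
        print(recall(6))   # e.g. [0, 1, 2, 4, 3, 5]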

  17. Union List Development: Control of the Serial Literature *

    PubMed Central

    Sawyers, Elizabeth J.

    1972-01-01

    The discussion covers the development of a national union list or finding tool for biomedical serial holdings and its integration into the National Serials Data Program, which is being developed under the auspices of the three National Libraries. Specific topics which are covered include: (1) Selection of the Union Catalog of Medical Periodicals (UCMP) as the basis for a biomedical list and the status of that activity; (2) discussion of the various methods of recording holdings; (3) status of the National Serials Data Program and a discussion of its relationship to the UCMP file; and (4) status of the Standard Serial Number and its relationship to other existing coding schemes for serial titles. PMID:5054307

  18. Skyline: an open source document editor for creating and analyzing targeted proteomics experiments

    PubMed Central

    MacLean, Brendan; Tomazela, Daniela M.; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L.; Frewen, Barbara; Kern, Randall; Tabb, David L.; Liebler, Daniel C.; MacCoss, Michael J.

    2010-01-01

    Summary: Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Availability: Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project. Contact: brendanx@u.washington.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20147306

  19. Diversity in emergency medicine education: expanding the horizon.

    PubMed

    Bowman, Steven H; Moreno-Walton, Lisa; Ezenkwele, Ugo A; Heron, Sheryl L

    2011-10-01

    An emergency medicine (EM)-based curriculum on diversity, inclusion, and cultural competency can also serve as a mechanism to introduce topics on health care disparities. Although the objectives of such curricula and the potential benefits to EM trainees are apparent, there are relatively few resources available for EM program directors to use to develop these specialized curricula. The object of this article is to 1) broadly discuss the current state of curricula of diversity, inclusion, and cultural competency in EM training programs; 2) identify tools and disseminate strategies to embed issues of disparities in health care in the creation of the curriculum; and 3) provide resources for program directors to develop their own curricula. A group of EM program directors with an interest in cultural competency distributed a preworkshop survey through the Council of Emergency Medicine Residency Directors (CORD) e-mail list to EM program directors to assess the current state of diversity and cultural competency training in EM programs. Approximately 50 members attended a workshop during the 2011 CORD Academic Assembly as part of the Best Practices track, where the results of the survey were disseminated and discussed. In addition to the objectives listed above, the presenters reviewed the literature regarding the rationale for a cultural competency curriculum and its relationship to addressing health care disparities, the relationship to unconscious physician bias, and the Tool for Assessing Cultural Competence Training (TACCT) model for curriculum development. © 2011 by the Society for Academic Emergency Medicine.

  20. Multi-criteria development and incorporation into decision tools for health technology adoption.

    PubMed

    Poulin, Paule; Austen, Lea; Scott, Catherine M; Waddell, Cameron D; Dixon, Elijah; Poulin, Michelle; Lafrenière, René

    2013-01-01

    When introducing new health technologies, decision makers must integrate research evidence with local operational management information to guide decisions about whether and under what conditions the technology will be used. Multi-criteria decision analysis can support the adoption or prioritization of health interventions by using criteria to explicitly articulate the health organization's needs, limitations, and values in addition to evaluating evidence for safety and effectiveness. This paper seeks to describe the development of a framework to create agreed-upon criteria and decision tools to enhance a pre-existing local health technology assessment (HTA) decision support program. The authors compiled a list of published criteria from the literature, consulted with experts to refine the criteria list, and used a modified Delphi process with a group of key stakeholders to review, modify, and validate each criterion. In a workshop setting, the criteria were used to create decision tools. A set of user-validated criteria for new health technology evaluation and adoption was developed and integrated into the local HTA decision support program. Technology evaluation and decision guideline tools were created using these criteria to ensure that the decision process is systematic, consistent, and transparent. This framework can be used by others to develop decision-making criteria and tools to enhance similar technology adoption programs. The development of clear, user-validated criteria for evaluating new technologies adds a critical element to improve decision-making on technology adoption, and the decision tools ensure consistency, transparency, and real-world relevance.
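
    As one simple illustration of the multi-criteria idea, a weighted-sum score over explicit criteria; the criteria, weights and scores below are invented, and the paper's validated framework need not reduce to a weighted sum:

        # Weighted-sum multi-criteria score for one candidate technology.
        criteria = {                      # name: (weight summing to 1, score 0-10)
            "safety": (0.35, 8),
            "effectiveness": (0.30, 7),
            "cost_impact": (0.20, 5),
            "operational_fit": (0.15, 9),
        }

        total = sum(w * s for w, s in criteria.values())
        print(f"adoption score: {total:.2f} / 10")   # -> 7.25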

  1. 49 CFR 1242.28 - Roadway machines, small tools and supplies, and snow removal (accounts XX-19-36 to XX-19-38...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... snow removal (accounts XX-19-36 to XX-19-38, inclusive). 1242.28 Section 1242.28 Transportation Other... tools and supplies, and snow removal (accounts XX-19-36 to XX-19-38, inclusive). Separate common expenses according to distribution of common expenses listed in § 1242.10, Administration—Track (account XX...

  2. Tapir: A web interface for transit/eclipse observability

    NASA Astrophysics Data System (ADS)

    Jensen, Eric

    2013-06-01

    Tapir is a set of tools, written in Perl, that provides a web interface for showing the observability of periodic astronomical events, such as exoplanet transits or eclipsing binaries. The package provides tools for creating finding charts for each target and airmass plots for each event. The code can access target lists that are stored on-line in a Google spreadsheet or in a local text file.
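
    Tapir itself is written in Perl, but the airmass bookkeeping it automates is easy to reproduce; a sketch using astropy (target, site and times are placeholders, and the name and site lookups require network access):

        import numpy as np
        import astropy.units as u
        from astropy.coordinates import AltAz, EarthLocation, SkyCoord
        from astropy.time import Time

        # Placeholder target and observatory; Tapir reads these from a target list.
        target = SkyCoord.from_name("HD 189733")        # or pass RA/Dec directly
        site = EarthLocation.of_site("Kitt Peak")
        times = Time("2024-01-01 02:00:00") + np.linspace(0, 8, 17) * u.hour

        altaz = target.transform_to(AltAz(obstime=times, location=site))
        for t, secz in zip(times, altaz.secz):
            if secz >= 1:                               # above the horizon
                print(t.iso, f"airmass {float(secz):.2f}")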

  3. Higher Skills. A Case Study of the Role of Further Education Colleges in Meeting the Training Needs of the Small Plant and Tool Hire Industry.

    ERIC Educational Resources Information Center

    Further Education Unit, London (England).

    A British project explored ways further education colleges could help meet training needs of small businesses, specifically the small plant and tool hire industry. The industry's leading organization, the Hire Association of Europe (HAE), provided a list of members; responsibility for making contact rested with the colleges. The most effective…

  4. MannDB: A microbial annotation database for protein characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, C; Lam, M; Smith, J

    2006-05-19

MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from GenBank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the GenBank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO. MannDB comprises a large number of genomes and comprehensive protein sequence analyses representing organisms listed as high-priority agents on the websites of several governmental organizations concerned with bio-terrorism. MannDB provides the user with a BLAST interface for comparison of native and non-native sequences and a query tool for conveniently selecting proteins of interest. In addition, the user has access to a web-based browser that compiles comprehensive and extensive reports.

  5. [SCREENING OF NUTRITIONAL STATUS AMONG ELDERLY PEOPLE AT FAMILY MEDICINE].

    PubMed

    Račić, M; Ivković, N; Kusmuk, S

    2015-11-01

    The prevalence of malnutrition in elderly is high. Malnutrition or risk of malnutrition can be detected by use of nutritional screening or assessment tools. This systematic review aimed to identify tools that would be reliable, valid, sensitive and specific for nutritional status screening in patients older than 65 at family medicine. The review was performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Studies were retrieved using MEDLINE (via Ovid), PubMed and Cochrane Library electronic databases and by manual searching of relevant articles listed in reference list of key publications. The electronic databases were searched using defined key words adapted to each database and using MESH terms. Manual revision of reviews and original articles was performed using Electronic Journals Library. Included studies involved development and validation of screening tools in the community-dwelling elderly population. The tools, subjected to validity and reliability testing for use in the community-dwelling elderly population were Mini Nutritional Assessment (MNA), Mini Nutritional Assessment-Short Form (MNA-SF), Nutrition Screening Initiative (NSI), which includes DETERMINE list, Level I and II Screen, Seniors in the Community: Risk Evaluation for Eating, and Nutrition (SCREEN I and SCREEN II), Subjective Global Assessment (SGA), Nutritional Risk Index (NRI), and Malaysian and South African tool. MNA and MNA-SF appear to have highest reliability and validity for screening of community-dwelling elderly, while the reliability and validity of SCREEN II are good. The authors conclude that whilst several tools have been developed, most have not undergone extensive testing to demonstrate their ability to identify nutritional risk. MNA and MNA-SF have the highest reliability and validity for screening of nutritional status in the community-dwelling elderly, and the reliability and validity of SCREEN II are satisfactory. These instruments also contain all three nutritional status indicators and are practical for use in family medicine. However, the gold standard for screening cannot be set because testing of reliability and continuous validation in the study with a higher level of evidence need to be conducted in family medicine.

6. Unambiguous metabolite identification in high-throughput metabolomics by hybrid 1D ¹H NMR/ESI MS¹ approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Lawrence R.; Hoyt, David W.; Walker, S. Michael

We present a novel approach to improve the accuracy of metabolite identification by combining direct infusion ESI MS¹ with 1D ¹H NMR spectroscopy. The new approach first applies a standard 1D ¹H NMR metabolite identification protocol, matching the chemical shift, J-coupling and intensity information of experimental NMR signals against the NMR signals of standard metabolites in a metabolomics library. This generates a list of candidate metabolites, which contains false positive and ambiguous identifications. Next, we constrain the list with the chemical formulas derived from a high-resolution direct infusion ESI MS¹ spectrum of the same sample. Detection of the signals of a metabolite both in NMR and MS significantly improves the confidence of identification and eliminates false positive identifications. The 1D ¹H NMR and direct infusion ESI MS¹ spectra of a sample can be acquired in parallel in several minutes, which is highly beneficial for rapid and accurate screening of hundreds of samples in high-throughput metabolomics studies. To make this approach practical, we developed a software tool that is integrated into Chenomx NMR Suite. The approach is demonstrated on a model mixture, tomato and Arabidopsis thaliana metabolite extracts, and human urine.
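
    The cross-filtering step can be pictured as a set intersection between NMR candidates and MS¹-derived chemical formulas; a sketch with invented data, not the authors' software:

        # Constrain NMR-derived candidate metabolites by chemical formulas
        # inferred from high-resolution MS1 peaks. All data is illustrative.
        nmr_candidates = {
            "alanine": "C3H7NO2",
            "lactate": "C3H6O3",
            "glucose": "C6H12O6",
            "creatinine": "C4H7N3O",
        }
        ms1_formulas = {"C3H7NO2", "C6H12O6"}   # from accurate-mass peaks

        confirmed = {n: f for n, f in nmr_candidates.items() if f in ms1_formulas}
        ambiguous = sorted(set(nmr_candidates) - set(confirmed))
        print("confirmed by NMR + MS:", confirmed)
        print("NMR only, needs review:", ambiguous)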

  7. Aquarius Project: Research in the System Architecture of Accelerators for the High Performance Execution of Logic Programs.

    DTIC Science & Technology

    1991-05-31

[Extraction residue from the report's front matter: Appendix F lists the source code of the C and Prolog benchmarks (p. 220), and Appendix G lists the source code of the Aquarius Prolog compiler (p. 224); the report also introduces a notation used throughout the compiler's implementation. A garbled figure sketched the compiler pipeline, converting standard-form Prolog to kernel Prolog via transformations and symbolic execution.]

  8. Smoke Ready Toolbox for Wildfires

    EPA Pesticide Factsheets

    This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts and strategies to reduce exposure to smoke.

  9. The Sea Urchin Embryo: A Remarkable Classroom Tool.

    ERIC Educational Resources Information Center

    Oppenheimer, Steven B.

    1989-01-01

    Discussed are the uses of sea urchins in research and their usefulness and advantages in the classroom investigation of embryology. Ideas for classroom activities and student research are presented. Lists 25 references. (CW)

  10. ITS/operations resource guide 2007

    DOT National Transportation Integrated Search

    2007-01-01

    The U.S. Department of Transportation's (U.S. DOT's) ITS/Operations Resource Guide 2007 is a comprehensive listing of over 400 documents, videos, websites, training courses, software tools, and points of contact related to intelligent transportat...

  11. New support vector machine-based method for microRNA target prediction.

    PubMed

    Li, L; Gao, Q; Mao, X; Cao, Y

    2014-06-09

    MicroRNAs (miRNAs) play important roles in cell differentiation, proliferation, growth, mobility, and apoptosis. An accurate list of precise target genes is necessary in order to fully understand the importance of miRNAs in animal development and disease. Several computational methods have been proposed for miRNA target-gene identification. However, these methods still have limitations with respect to their sensitivity and accuracy. Thus, we developed a new miRNA target-prediction method based on the support vector machine (SVM) model. The model supplies information on two binding sites (primary and secondary) to a radial basis function kernel, which serves as the similarity measure over the SVM features. The features are categorized as structural, thermodynamic, and sequence-conservation information. Using high-confidence datasets selected from public miRNA target databases, we obtained a human miRNA target SVM classifier model with high performance, providing an efficient tool for human miRNA target-gene identification. Experiments have shown that our method is a reliable tool for miRNA target-gene prediction and a successful application of an SVM classifier. Compared with other methods, the method proposed here improves the sensitivity and accuracy of miRNA target prediction. Its performance can be further improved by providing more training examples.
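
    A hedged illustration of the core classifier choice (an RBF-kernel SVM over site-level features); the feature matrix and labels below are synthetic placeholders, not the authors' training data:

        # RBF-kernel SVM on a synthetic stand-in for site-level features
        # (e.g. pairing energy, conservation scores); illustrative only.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 6))                   # 6 hypothetical features per site
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # stand-in target/non-target labels

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
        print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))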

  12. The challenges of transitioning from linear to high-order overlay control in advanced lithography

    NASA Astrophysics Data System (ADS)

    Adel, M.; Izikson, P.; Tien, D.; Huang, C. K.; Robinson, J. C.; Eichelberger, B.

    2008-03-01

    In the lithography section of the ITRS 2006 update, at the top of the list of difficult challenges appears the text "overlay of multiple exposures including mask image placement". This reflects the fact that overlay is today becoming a major yield risk factor in semiconductor manufacturing. Historically, lithographers have achieved sufficient alignment accuracy, and hence layer-to-layer overlay control, by relying on models which define overlay as a linear function of the field and wafer coordinates. These linear terms were easily translated into correctables in the available exposure tool degrees of freedom on the wafer and reticle stages. However, as the 45 nm half-pitch node reaches production, exposure tool vendors have begun to make available, and lithographers have begun to utilize, so-called high-order wafer and field control, in which either look-up table or high-order polynomial models are modified on a product-by-product basis. In this paper, the major challenges of this transition are described, including characterization of the sources of variation which need to be controlled by these new models and the overlay and alignment sampling optimization problem which needs to be addressed, all while maintaining the ever-tightening demands on productivity and cost of ownership.
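
    The shift from linear to high-order overlay models can be illustrated with an ordinary least-squares fit; the wafer coordinates, error model, and polynomial terms below are invented for illustration, and real correctables map to specific exposure-tool degrees of freedom:

        # Contrast a linear overlay model with a high-order polynomial model
        # fitted to the same (synthetic) overlay-error data.
        import numpy as np

        rng = np.random.default_rng(1)
        x, y = rng.uniform(-1, 1, 300), rng.uniform(-1, 1, 300)   # normalized wafer coords
        dx = 2e-3 * x - 1e-3 * y + 5e-4 * x**3 + 1e-4 * rng.normal(size=300)

        # Linear model: dx = a + b*x + c*y
        A_lin = np.column_stack([np.ones_like(x), x, y])
        # High-order model adds 2nd/3rd-order wafer terms
        A_hi = np.column_stack([A_lin, x * y, x**2, y**2, x**3, y**3])

        for name, A in [("linear", A_lin), ("high-order", A_hi)]:
            coef, *_ = np.linalg.lstsq(A, dx, rcond=None)
            resid = dx - A @ coef
            print(name, "residual 3-sigma:", 3 * resid.std())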

  13. Database resources for the Tuberculosis community

    PubMed Central

    Lew, Jocelyne M.; Mao, Chunhong; Shukla, Maulik; Warren, Andrew; Will, Rebecca; Kuznetsov, Dmitry; Xenarios, Ioannis; Robertson, Brian D.; Gordon, Stephen V.; Schnappinger, Dirk; Cole, Stewart T.; Sobral, Bruno

    2013-01-01

    Summary Access to online repositories for genomic and associated “-omics” datasets is now an essential part of everyday research activity. It is therefore important that the Tuberculosis community is aware of the databases and tools available to them online, and that the database hosts know the needs of the research community. One of the goals of the Tuberculosis Annotation Jamboree, held in Washington DC on March 7th–8th 2012, was to provide an overview of the current status of three key Tuberculosis resources: TubercuList (tuberculist.epfl.ch), TB Database (www.tbdb.org), and the Pathosystems Resource Integration Center (PATRIC, www.patricbrc.org). Here we summarize some key updates and upcoming features in TubercuList, and provide an overview of the PATRIC site and its online tools for pathogen RNA-Seq analysis. PMID:23332401

  14. Different hip and knee priority score systems: are they good for the same thing?

    PubMed

    Escobar, Antonio; Quintana, Jose Maria; Espallargues, Mireia; Allepuz, Alejandro; Ibañez, Berta

    2010-10-01

    The aim of the present study was to compare two priority tools used for joint replacement for patients on waiting lists, which were developed by two different methods. Two prioritization tools developed and validated with different methodologies were applied to the same cohort of patients. The first, the IRYSS hip and knee priority score (IHKPS), developed by the RAND method, was applied while patients were on the waiting list. The other, the Catalonia hip-knee priority score (CHKPS), developed by conjoint analysis, was adapted and applied retrospectively. In addition, all patients completed the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) before intervention. Correlation between the two scores was studied by the Pearson correlation coefficient (r). Agreement was analysed by means of the intra-class correlation coefficient (ICC), the Kendall coefficient and Cohen's kappa. The relationships between IHKPS, CHKPS and baseline WOMAC scores were also studied using the r coefficient. The sample consisted of 774 consecutive patients. The Pearson correlation coefficient between IHKPS and CHKPS was 0.79. The agreement study showed that the ICC was 0.74, the Kendall coefficient 0.86 and kappa 0.66. Finally, the correlation between CHKPS and baseline WOMAC ranged from 0.43 to 0.64, and that between IHKPS and WOMAC from 0.50 to 0.74. The results support the hypothesis that, if the final objective of the prioritization tools is to organize and sort patients on the waiting list, then although they use different methodologies, the results are similar. © 2010 Blackwell Publishing Ltd.

  15. Optimising the selection of food items for FFQs using Mixed Integer Linear Programming.

    PubMed

    Gerdessen, Johanna C; Souverein, Olga W; van 't Veer, Pieter; de Vries, Jeanne Hm

    2015-01-01

    To support the selection of food items for FFQs in such a way that the amount of information on all relevant nutrients is maximised while the food list is as short as possible. Selection of the most informative food items to be included in FFQs was modelled as a Mixed Integer Linear Programming (MILP) model. The methodology was demonstrated for an FFQ with interest in energy, total protein, total fat, saturated fat, monounsaturated fat, polyunsaturated fat, total carbohydrates, mono- and disaccharides, dietary fibre and potassium. The food lists generated by the MILP model have good performance in terms of length, coverage and R2 (explained variance) of all nutrients. MILP-generated food lists were 32-40% shorter than a benchmark food list, whereas their quality in terms of R2 was similar to that of the benchmark. The results suggest that the MILP model makes the selection process faster, more standardised and transparent, and is especially helpful in coping with multiple nutrients. The complexity of the method does not increase with increasing number of nutrients. The generated food lists appear either shorter or provide more information than a food list generated without the MILP model.
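
    A toy MILP in the spirit of the paper, using the PuLP package: binary pick/drop variables per food item, a cap on list length, and a minimum-coverage constraint per nutrient. All item data and weights are invented:

        # Select at most K food items maximizing total nutrient information,
        # while guaranteeing minimal coverage of every nutrient.
        import pulp

        items = ["bread", "milk", "apple", "beef", "beans"]
        nutrients = ["protein", "fat", "fibre"]
        # info[i][n]: information item i contributes on nutrient n (made up)
        info = {
            "bread": {"protein": 2, "fat": 1, "fibre": 3},
            "milk":  {"protein": 3, "fat": 3, "fibre": 0},
            "apple": {"protein": 0, "fat": 0, "fibre": 2},
            "beef":  {"protein": 4, "fat": 3, "fibre": 0},
            "beans": {"protein": 3, "fat": 1, "fibre": 4},
        }
        K = 3  # maximum food-list length

        prob = pulp.LpProblem("ffq_item_selection", pulp.LpMaximize)
        x = pulp.LpVariable.dicts("pick", items, cat="Binary")
        prob += pulp.lpSum(info[i][n] * x[i] for i in items for n in nutrients)
        prob += pulp.lpSum(x[i] for i in items) <= K
        for n in nutrients:  # minimal coverage of every nutrient
            prob += pulp.lpSum(info[i][n] * x[i] for i in items) >= 2

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print([i for i in items if x[i].value() == 1])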

  16. NASA Remote Sensing Observations for Water Resource and Infrastructure Management

    NASA Astrophysics Data System (ADS)

    Granger, S. L.; Armstrong, L.; Farr, T.; Geller, G.; Heath, E.; Hyon, J.; Lavoie, S.; McDonald, K.; Realmuto, V.; Stough, T.; Szana, K.

    2008-12-01

    Decision support tools employed by water resource and infrastructure managers often utilize data products obtained from local sources or national/regional databases of historic surveys and observations. Incorporation of data from these sources can be laborious and time consuming as new products must be identified, cleaned and archived for each new study site. Adding remote sensing observations to the list of sources holds promise for a timely, consistent, global product to aid decision support at regional and global scales by providing global observations of geophysical parameters including soil moisture, precipitation, atmospheric temperature, derived evapotranspiration, and snow extent needed for hydrologic models and decision support tools. However, issues such as spatial and temporal resolution arise when attempting to integrate remote sensing observations into existing decision support tools. We are working to overcome these and other challenges through partnerships with water resource managers, tool developers and other stakeholders. We are developing a new data processing framework, enabled by a core GIS server, to seamlessly pull together observations from disparate sources for synthesis into information products and visualizations useful to the water resources community. A case study approach is being taken to develop the system by working closely with water infrastructure and resource managers to integrate remote observations into infrastructure, hydrologic and water resource decision tools. We present the results of a case study utilizing observations from the PALS aircraft instrument as a proxy for NASA's upcoming Soil Moisture Active Passive (SMAP) mission and an existing commercial decision support tool.

  17. The GenABEL Project for statistical genomics

    PubMed Central

    Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381

  18. Dependency visualization for complex system understanding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smart, J. Allison Cory

    1994-09-01

    With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs, and the list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem: the ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed-graph-based concepts in software tool design has been demonstrated in the literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.
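
    A small sketch of the graph-theoretic foundation described above: strongly connected components of a dependency digraph (the "spaghetti" cycles) can be collapsed into an acyclic hierarchy. The module names are hypothetical, and this uses the networkx package rather than the dissertation's own tooling:

        # Collapse dependency cycles into single nodes, then order the result.
        import networkx as nx

        g = nx.DiGraph([
            ("ui", "core"), ("core", "db"), ("db", "core"),   # core <-> db cycle
            ("core", "utils"), ("db", "utils"), ("ui", "utils"),
        ])

        sccs = list(nx.strongly_connected_components(g))
        print("components:", sccs)       # the core/db cycle becomes one component

        dag = nx.condensation(g)         # acyclic quotient graph of the SCCs
        order = list(nx.topological_sort(dag))
        print("layered order:", [sorted(dag.nodes[i]["members"]) for i in order])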

  19. Pediatric post-thrombotic syndrome in children: Toward the development of a new diagnostic and evaluative measurement tool.

    PubMed

    Avila, M L; Brandão, L R; Williams, S; Ward, L C; Montoya, M I; Stinson, J; Kiss, A; Lara-Corrales, I; Feldman, B M

    2016-08-01

    Our goal was to conduct the item generation and piloting phases of a new discriminative and evaluative tool for pediatric post-thrombotic syndrome. We followed a formative model for the development of the tool, focusing on the signs/symptoms (items) that define post-thrombotic syndrome. For item generation, pediatric thrombosis experts and subjects diagnosed with extremity post-thrombotic syndrome during childhood nominated items. In the piloting phase, items were cross-sectionally measured in children with limb deep vein thrombosis to examine item performance. Twenty-three experts and 16 subjects listed 34 items, which were then measured in 140 subjects with previous diagnosis of limb deep vein thrombosis (70 upper extremity and 70 lower extremity). The items with strongest correlation with post-thrombotic syndrome severity and largest area under the curve were pain (in older children), paresthesia, and swollen limb for the upper extremity group, and pain (in older children), tired limb, heaviness, tightness and paresthesia for the lower extremity group. The diagnostic properties of the items and their correlations with post-thrombotic syndrome severity varied according to the assessed venous territory. The information gathered in this study will help experts decide which item should be considered for inclusion in the new tool. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Charting taxonomic knowledge through ontologies and ranking algorithms

    NASA Astrophysics Data System (ADS)

    Huber, Robert; Klump, Jens

    2009-04-01

    Since the inception of geology as a modern science, paleontologists have described a large number of fossil species. This makes fossilized organisms an important tool in the study of stratigraphy and past environments. Since taxonomic classifications of organisms, and thereby their names, change frequently, the correct application of this tool requires taxonomic expertise in finding correct synonyms for a given species name. Much of this taxonomic information has already been published in journals and books, where it is compiled in carefully prepared synonymy lists. Because this information is scattered throughout the paleontological literature, it is difficult to find and sometimes not accessible. Also, taxonomic information in the literature is often difficult to interpret for non-taxonomists looking for taxonomic synonymies as part of their research. The highly formalized structure makes Open Nomenclature synonymy lists ideally suited for computer-aided identification of taxonomic synonyms. Because a synonymy list is a list of citations related to a taxon name, its bibliographic nature allows the application of bibliometric techniques to calculate the impact of synonymies and taxonomic concepts. TaxonRank is a ranking algorithm based on bibliometric analysis and Internet page ranking algorithms. TaxonRank uses published synonymy list data stored in TaxonConcept, a taxonomic information system. The basic ranking algorithm has been modified to include a measure of confidence on species identification based on the Open Nomenclature notation used in synonymy lists, as well as other synonymy-specific criteria. The results of our experiments show that the output of the proposed ranking algorithm gives a good estimate of the impact a published taxonomic concept has on the taxonomic opinions in the geological community. Also, our results show that treating taxonomic synonymies as part of an ontology is a way to record and manage taxonomic knowledge, and thus contribute to the preservation of our scientific heritage.
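
    A toy version of the ranking idea, assuming plain PageRank over a citation graph of synonymy lists; the real TaxonRank additionally weights edges by Open Nomenclature confidence qualifiers, and the concept names here are invented:

        # Concepts cited as valid by many well-ranked synonymy lists rank higher.
        import networkx as nx

        g = nx.DiGraph()
        # edge A -> B: a synonymy list published under concept A cites concept B
        g.add_edges_from([
            ("Smith1901", "Jones1955"), ("Smith1901", "Lee1978"),
            ("Jones1955", "Lee1978"), ("Kim1990", "Lee1978"),
            ("Lee1978", "Jones1955"),
        ])
        scores = nx.pagerank(g, alpha=0.85)
        for concept, s in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{concept}: {s:.3f}")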

  1. Acceptability of an open-label wait-listed trial design: Experiences from the PROUD PrEP study.

    PubMed

    Gafos, Mitzy; Brodnicki, Elizabeth; Desai, Monica; McCormack, Sheena; Nutland, Will; Wayal, Sonali; White, Ellen; Wood, Gemma; Barber, Tristan; Bell, Gill; Clarke, Amanda; Dolling, David; Dunn, David; Fox, Julie; Haddow, Lewis; Lacey, Charles; Nardone, Anthony; Quinn, Killian; Rae, Caroline; Reeves, Iain; Rayment, Michael; White, David; Apea, Vanessa; Ayap, Wilbert; Dewsnap, Claire; Collaco-Moraes, Yolanda; Schembri, Gabriel; Sowunmi, Yinka; Horne, Rob

    2017-01-01

    PROUD participants were randomly assigned to receive pre-exposure prophylaxis (PrEP) immediately or after a deferred period of one year. We report on the acceptability of this open-label wait-listed trial design. Participants completed an acceptability questionnaire, which included categorical study acceptability data and free-text data on the most and least liked aspects of the study. We also conducted in-depth interviews (IDI) with a purposely selected sub-sample of participants. Acceptability questionnaires were completed by 76% (415/544) of participants. After controlling for age, immediate-group participants were almost twice as likely as deferred-group participants to complete the questionnaire (AOR: 1.86; 95% CI: 1.24, 2.81). In the quantitative data, the majority of participants in both groups found the wait-listed design acceptable when measured by satisfaction with joining the study, intention to remain in the study, and interest in joining a subsequent study. However, three-quarters thought that the chance of being in the deferred group might put other volunteers off joining the study. In free-text responses, the data collection tools were the most frequently reported least-liked aspect of the study. A fifth of deferred participants reported 'being deferred' as the thing they least liked about the study. However, more deferred participants disliked the data collection tools than the fact that they had to wait a year to access PrEP. Participants in the IDIs had a good understanding of the rationale for the open-label wait-listed study design. Most accepted the design but acknowledged they were, or would have been, disappointed to be randomised to the deferred group. Five of the 25 participants interviewed reported some objection to the wait-listed design. The quantitative and qualitative findings suggest that in an environment where PrEP was not available, the rationale for the wait-listed trial design was well understood and generally acceptable to most participants in this study.

  2. Towards a Collaborative Filtering Approach to Medication Reconciliation

    PubMed Central

    Hasan, Sharique; Duncan, George T.; Neill, Daniel B.; Padman, Rema

    2008-01-01

    A physician’s prescribing decisions depend on knowledge of the patient’s medication list. This knowledge is often incomplete, and errors or omissions could result in adverse outcomes. To address this problem, the Joint Commission recommends medication reconciliation for creating a more accurate list of a patient’s medications. In this paper, we develop techniques for automatic detection of omissions in medication lists, identifying drugs that the patient may be taking but are not on the patient’s medication list. Our key insight is that this problem is analogous to the collaborative filtering framework increasingly used by online retailers to recommend relevant products to customers. The collaborative filtering approach enables a variety of solution techniques, including nearest neighbor and co-occurrence approaches. We evaluate the effectiveness of these approaches using medication data from a long-term care center in the Eastern US. Preliminary results suggest that this framework may become a valuable tool for medication reconciliation. PMID:18998834
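
    A minimal co-occurrence recommender in the spirit of the paper: score drugs absent from a patient's list by how often they co-occur with drugs that are present. The medication histories below are toy data, not the long-term-care dataset:

        # Build a symmetric co-occurrence table from other patients' lists,
        # then rank candidate omissions for one patient.
        from collections import Counter
        from itertools import combinations

        histories = [  # medication lists of other patients (invented)
            {"metformin", "lisinopril", "atorvastatin"},
            {"metformin", "atorvastatin"},
            {"lisinopril", "amlodipine"},
            {"metformin", "lisinopril", "aspirin"},
        ]
        cooc = Counter()
        for h in histories:
            for a, b in combinations(sorted(h), 2):
                cooc[(a, b)] += 1
                cooc[(b, a)] += 1

        patient = {"metformin", "lisinopril"}
        scores = Counter()
        for drug in patient:
            for (a, b), c in cooc.items():
                if a == drug and b not in patient:
                    scores[b] += c
        print(scores.most_common())  # candidate omissions, highest score first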

  3. Towards a collaborative filtering approach to medication reconciliation.

    PubMed

    Hasan, Sharique; Duncan, George T; Neill, Daniel B; Padman, Rema

    2008-11-06

    A physician's prescribing decisions depend on knowledge of the patient's medication list. This knowledge is often incomplete, and errors or omissions could result in adverse outcomes. To address this problem, the Joint Commission recommends medication reconciliation for creating a more accurate list of a patient's medications. In this paper, we develop techniques for automatic detection of omissions in medication lists, identifying drugs that the patient may be taking but are not on the patient's medication list. Our key insight is that this problem is analogous to the collaborative filtering framework increasingly used by online retailers to recommend relevant products to customers. The collaborative filtering approach enables a variety of solution techniques, including nearest neighbor and co-occurrence approaches. We evaluate the effectiveness of these approaches using medication data from a long-term care center in the Eastern US. Preliminary results suggest that this framework may become a valuable tool for medication reconciliation.

  4. The Secret List of Dos and Don'ts for Filmmaking

    NASA Astrophysics Data System (ADS)

    Kramer, N.

    2012-12-01

    Science is a massive black box to billions of people who walk the streets. However, the process of filmmaking can be equally mystifying. As with the development of many scientific experiments, the process starts on a napkin at a restaurant…but then what? The road to scientific publication is propelled by a canonical list of several dos and don'ts that fit most situations. An equally useful list exists for up-and-coming producers. The list streamlines efforts, optimizes your use of the tools at your fingertips and enhances impact. Many fundamentals can be learned from books, but during this talk we will project and discuss several examples of best practices, from honing a story, to identifying audience appeal, filming, editing and the secrets of inexpensively acquiring expert help. Whether your goal is a two-minute webisode or a 90-minute documentary, these time-tested practices, with a little awareness, can give life to your films.

  5. Impact of maternal education about complementary feeding and provision of complementary foods on child growth in developing countries

    PubMed Central

    2011-01-01

    Background Childhood undernutrition is prevalent in low- and middle-income countries. It is an important indirect cause of child mortality in these countries. According to one estimate, stunting (height-for-age Z score < -2) and wasting (weight-for-height Z score < -2), along with intrauterine growth restriction, are responsible for about 2.1 million deaths worldwide in children < 5 years of age. This comprises 21% of all deaths in this age group worldwide. The incidence of stunting is highest in the first two years of life, especially after six months, when exclusive breastfeeding alone cannot fulfill the energy needs of a rapidly growing child. Complementary feeding for an infant refers to the timely introduction of safe and nutritional foods in addition to breast-feeding (BF), i.e. clean and nutritionally rich additional foods introduced at about six months of age. Complementary feeding strategies encompass a wide variety of interventions designed to improve not only the quality and quantity of these foods but also the feeding behaviors. In this review, we evaluated the effectiveness on growth of the two most commonly applied complementary feeding strategies, i.e. timely provision of appropriate complementary foods (±nutritional counseling) and education of mothers about complementary feeding practices. Recommendations have been made for input to the Lives Saved Tool (LiST) model by following standardized guidelines developed by the Child Health Epidemiology Reference Group (CHERG). Methods We conducted a systematic review of published randomized and quasi-randomized trials in PubMed, the Cochrane Library and WHO regional databases. The included studies were abstracted and graded according to study design, limitations, intervention details and outcome effects. The primary outcomes were change in weight and height during the study period among children 6-24 months of age. We hypothesized that provision of complementary food and education of mothers about complementary feeding would significantly improve the nutritional status of children in the intervention group compared to controls. Meta-analyses were generated for change in weight and height by two methods. First, we pooled the results to obtain the weighted mean difference (WMD), which allows pooling of studies with different units of measurement and different durations. A second meta-analysis was conducted to obtain a pooled estimate of the actual increase in weight (kg) and length (cm) in relation to the intervention, for input into the LiST model. Results After screening 3795 titles, we selected 17 studies for inclusion in the review. The included studies evaluated the impact of provision of complementary foods (±nutritional counseling) and of nutritional counseling alone. Both interventions were found to result in a significant increase in weight [WMD 0.34 SD, 95% CI 0.11-0.56, and 0.30 SD, 95% CI 0.05-0.54, respectively] and linear growth [WMD 0.26 SD, 95% CI 0.08-0.43, and 0.21 SD, 95% CI 0.01-0.41, respectively]. Pooled results for the actual increase in weight in kilograms and length in centimeters showed that provision of appropriate complementary foods (±nutritional counseling) resulted in an extra gain of 0.25 kg (±0.18) in weight and 0.54 cm (±0.38) in height in children aged 6-24 months. The overall quality grade for these estimates was ‘moderate’. These estimates have been recommended for inclusion in the Lives Saved Tool (LiST) model.
Education of mothers about complementary feeding led to an extra weight gain of 0.30 kg (±0.26) and a gain of 0.49 cm (±0.50) in height in the intervention group compared to controls. These estimates have also been recommended for inclusion in the LiST model, with an overall quality grade of ‘moderate’. Conclusion Provision of appropriate complementary food, with or without nutritional education, and maternal nutritional counseling alone lead to significant increases in weight and height in children 6-24 months of age. These interventions can significantly reduce the risk of stunting in developing countries and are recommended for inclusion in the LiST tool. PMID:21501443
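
    For readers wanting the mechanics, a generic fixed-effect inverse-variance pooling sketch, the textbook computation behind a pooled weighted mean difference; the study numbers are invented, and the review's own analysis may have used different (e.g. random-effects) weighting:

        # Fixed-effect inverse-variance pooling of per-study mean differences.
        import math

        studies = [  # (mean difference in weight gain, kg; standard error)
            (0.30, 0.10), (0.20, 0.08), (0.35, 0.15),
        ]
        weights = [1 / se**2 for _, se in studies]
        pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
        se_pooled = math.sqrt(1 / sum(weights))
        lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
        print(f"pooled WMD = {pooled:.2f} kg (95% CI {lo:.2f} to {hi:.2f})")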

  6. 49 CFR 573.8 - Lists of purchasers, owners, dealers, distributors, lessors, and lessees.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... The list shall include the vehicle identification number for each vehicle and the status of remedy... CFR 577.5(h). The list shall also include the make, model, model year, and vehicle identification..., distributors, lessors, and lessees. (a) Each manufacturer of motor vehicles shall maintain, in a form suitable...

  7. 49 CFR 573.8 - Lists of purchasers, owners, dealers, distributors, lessors, and lessees.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... The list shall include the vehicle identification number for each vehicle and the status of remedy... CFR 577.5(h). The list shall also include the make, model, model year, and vehicle identification..., distributors, lessors, and lessees. (a) Each manufacturer of motor vehicles shall maintain, in a form suitable...

  8. 49 CFR 573.8 - Lists of purchasers, owners, dealers, distributors, lessors, and lessees.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... The list shall include the vehicle identification number for each vehicle and the status of remedy... CFR 577.5(h). The list shall also include the make, model, model year, and vehicle identification..., distributors, lessors, and lessees. (a) Each manufacturer of motor vehicles shall maintain, in a form suitable...

  9. 49 CFR 573.8 - Lists of purchasers, owners, dealers, distributors, lessors, and lessees.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... The list shall include the vehicle identification number for each vehicle and the status of remedy... CFR 577.5(h). The list shall also include the make, model, model year, and vehicle identification..., distributors, lessors, and lessees. (a) Each manufacturer of motor vehicles shall maintain, in a form suitable...

  10. 49 CFR 573.8 - Lists of purchasers, owners, dealers, distributors, lessors, and lessees.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... The list shall include the vehicle identification number for each vehicle and the status of remedy... CFR 577.5(h). The list shall also include the make, model, model year, and vehicle identification..., distributors, lessors, and lessees. (a) Each manufacturer of motor vehicles shall maintain, in a form suitable...

  11. Discriminating response groups in metabolic and regulatory pathway networks.

    PubMed

    Van Hemert, John L; Dickerson, Julie A

    2012-04-01

    Analysis of omics experiments generates lists of entities (genes, metabolites, etc.) selected based on specific behavior, such as changes in response to stress or other signals. Functional interpretation of these lists often uses category enrichment tests based on functional annotations, such as Gene Ontology terms and pathway membership. This approach does not consider the connected structure of biochemical pathways or the causal directionality of events. The Omics Response Group (ORG) method, described in this work, interprets omics lists in the context of metabolic pathway and regulatory networks using a statistical model for flow within the networks. Statistical results for all response groups are visualized in a novel Pathway Flow plot. The statistical tests are based on the Erlang distribution model, under the assumption of independent and identically Exponential-distributed random-walk flows through pathways. As a proof of concept, we applied our method to an Escherichia coli transcriptomics dataset, where we confirmed common knowledge of the E. coli transcriptional response to Lipid A deprivation. The main response is related to osmotic stress, and we were also able to detect novel responses that are supported by the literature. We also applied our method to an Arabidopsis thaliana expression dataset from an abscisic acid study. In both cases, conventional pathway enrichment tests detected nothing, while our approach discovered biological processes beyond the original studies. We created a prototype for an interactive ORG web tool at http://ecoserver.vrac.iastate.edu/pathwayflow (source code is available from https://subversion.vrac.iastate.edu/Subversion/jlv/public/jlv/pathwayflow). The prototype is described along with additional figures and tables in the Supplementary Material. julied@iastate.edu Supplementary data are available at Bioinformatics online.
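
    A sketch of the distributional idea only, not the exact ORG statistic: a walk traversing k pathway steps, each an independent Exponential with rate lam, has an Erlang-distributed total, so unusually small totals can be flagged. The numbers are invented:

        # The sum of k iid Exponential(rate=lam) step lengths is Erlang(k, 1/lam).
        from scipy import stats

        k, lam = 5, 2.0                      # steps in the path, per-step rate
        erlang = stats.erlang(a=k, scale=1.0 / lam)
        observed = 1.2                       # hypothetical total traversal "length"
        p_small = erlang.cdf(observed)       # probability of a total this small or smaller
        print(f"P(T <= {observed}) = {p_small:.3f}")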

  12. An Overview of the Thermal Challenges of Designing Microgravity Furnaces

    NASA Technical Reports Server (NTRS)

    Westra, Douglas G.

    2001-01-01

    Marshall Space Flight Center is involved in a wide variety of microgravity projects that require furnaces, with hot zone temperatures ranging from 300 C to 2300 C, requirements for gradient processing and rapid quench, and both semiconductor and metal materials. On these types of projects, the thermal engineer is a key player in the design process. Microgravity furnaces present unique challenges to the thermal designer. One challenge is designing a sample containment assembly that achieves dual containment yet allows a high radial heat flux. Another challenge is providing a high axial gradient but a very low radial gradient. These furnaces also present unique challenges to the thermal analyst. First, there are several orders of magnitude difference in the size of the thermal 'conductors' between various parts of the model. A second challenge is providing high fidelity in the sample model, and connecting the sample with the rest of the furnace model, while maintaining some sanity in the number of total nodes in the model. The purpose of this paper is to present an overview of the challenges involved in designing and analyzing microgravity furnaces and how some of these challenges have been overcome. The thermal analysis tools presently used to analyze microgravity furnaces will be listed. Challenges for the future and a description of future analysis tools will be given.

  13. Template-based combinatorial enumeration of virtual compound libraries for lipids

    PubMed Central

    2012-01-01

    A variety of software packages are available for the combinatorial enumeration of virtual libraries for small molecules, starting from specifications of core scaffolds with attachment points and lists of R-groups as SMILES or SD files. Although SD files include atomic coordinates for core scaffolds and R-groups, it is not possible to control the 2-dimensional (2D) layout of the enumerated structures generated for virtual compound libraries, because different packages generate different 2D representations for the same structure. We have developed a software package called LipidMapsTools for the template-based combinatorial enumeration of virtual compound libraries for lipids. Virtual libraries are enumerated for the specified lipid abbreviations using matching lists of pre-defined templates and chain abbreviations, instead of core scaffolds and lists of R-groups provided by the user. 2D structures of the enumerated lipids are drawn in a specific and consistent fashion adhering to the framework for representing lipid structures proposed by the LIPID MAPS consortium. LipidMapsTools is lightweight, relatively fast and contains no external dependencies. It is an open source package and freely available under the terms of the modified BSD license. PMID:23006594
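
    A conceptual sketch of template-based enumeration, assuming a hypothetical head-group template and chain-abbreviation lists; LipidMapsTools' actual templates also carry 2D drawing information, which is omitted here:

        # Combine a head-group template with sn-1/sn-2 chain abbreviations
        # to enumerate abbreviations for a small virtual library.
        from itertools import product

        head = "PC"  # glycerophosphocholine head-group template
        sn1_chains = ["16:0", "18:0", "18:1(9Z)"]
        sn2_chains = ["18:1(9Z)", "20:4(5Z,8Z,11Z,14Z)"]

        library = [f"{head}({sn1}/{sn2})" for sn1, sn2 in product(sn1_chains, sn2_chains)]
        print(len(library), library[0])  # 6 PC(16:0/18:1(9Z))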

  14. [Evaluation of the quality of clinical practice guidelines published in the Annales de Biologie Clinique with the help of the EFLM checklist].

    PubMed

    Wils, Julien; Fonfrède, Michèle; Augereau, Christine; Watine, Joseph

    2014-01-01

    Several tools are available to help evaluate the quality of clinical practice guidelines (CPG). The AGREE instrument (Appraisal of Guidelines for Research & Evaluation) is the most widely accepted tool, but it was designed to assess CPG methodology only. The European federation of laboratory medicine (EFLM) recently designed a checklist dedicated to laboratory medicine, intended to be comprehensive, which therefore makes it possible to evaluate the quality of CPG in laboratory medicine more thoroughly. In the present work we test the comprehensiveness of this checklist on a sample of CPG written in French and published in the Annales de biologie clinique (ABC). We show that some work remains to be done before a truly comprehensive checklist is achieved. We also show that there is room for improvement in the CPG published in ABC, for example regarding the fact that some of these CPG do not provide any information about allowed durations of transport and storage of biological samples before analysis, about standards of minimal analytical performance, or about the sensitivities or specificities of the recommended tests.

  15. Template-based combinatorial enumeration of virtual compound libraries for lipids.

    PubMed

    Sud, Manish; Fahy, Eoin; Subramaniam, Shankar

    2012-09-25

    A variety of software packages are available for the combinatorial enumeration of virtual libraries for small molecules, starting from specifications of core scaffolds with attachment points and lists of R-groups as SMILES or SD files. Although SD files include atomic coordinates for core scaffolds and R-groups, it is not possible to control the 2-dimensional (2D) layout of the enumerated structures generated for virtual compound libraries, because different packages generate different 2D representations for the same structure. We have developed a software package called LipidMapsTools for the template-based combinatorial enumeration of virtual compound libraries for lipids. Virtual libraries are enumerated for the specified lipid abbreviations using matching lists of pre-defined templates and chain abbreviations, instead of core scaffolds and lists of R-groups provided by the user. 2D structures of the enumerated lipids are drawn in a specific and consistent fashion adhering to the framework for representing lipid structures proposed by the LIPID MAPS consortium. LipidMapsTools is lightweight, relatively fast and contains no external dependencies. It is an open source package and freely available under the terms of the modified BSD license.

  16. Highlights of the Workshop

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1997-01-01

    Economic stresses are forcing many industries to reduce cost and time-to-market, and to insert emerging technologies into their products. Engineers are asked to design faster, ever more complex systems. Hence, there is a need for novel design paradigms and effective design tools to reduce the design and development times. Several computational tools and facilities have been developed to support the design process. Some of these are described in subsequent presentations. The focus of the workshop is on the computational tools and facilities which have high potential for use in future design environment for aerospace systems. The outline for the introductory remarks is given. First, the characteristics and design drivers for future aerospace systems are outlined; second, simulation-based design environment, and some of its key modules are described; third, the vision for the next-generation design environment being planned by NASA, the UVA ACT Center and JPL is presented. The anticipated major benefits of the planned environment are listed; fourth, some of the government-supported programs related to simulation-based design are listed; and fifth, the objectives and format of the workshop are presented.

  17. eBird—Using citizen-science data to help solve real-world conservation challenges (Invited)

    NASA Astrophysics Data System (ADS)

    Sullivan, B. L.; Iliff, M. J.; Wood, C. L.; Fink, D.; Kelling, S.

    2010-12-01

    eBird (www.ebird.org) is an Internet-based citizen-science project that collects bird observations worldwide. eBird is foremost a tool for birders, providing users with a resource for bird information and a way to keep track of their personal bird lists, thus establishing a model for sustained participation and new project growth. Importantly, eBird data are shared with scientists and conservationists working to save birds and their habitats. Here we highlight two different ways these data are used: as a real-time data gathering and visualization tool; and as the primary resource for developing large-scale bird distribution models that explore species-habitat associations and climate change scenarios. eBird provides data across broad temporal and spatial scales, and is a valuable tool for documenting and monitoring bird populations facing a multitude of anthropogenic and environmental impacts. For example, a focused effort to monitor birds on Gulf Coast beaches using eBird is providing essential baseline data and enabling long-term monitoring of bird populations throughout the region. Additionally, new data visualization tools that incorporate data from eBird, NOAA, and Google, are specifically designed to highlight the potential impacts of the Gulf oil spill on bird populations. Through a collaboration of partners in the DataONE network, such as the Oak Ridge National Laboratory, we will use supercomputing time from the National Science Foundation’s TeraGrid to allow Lab scientists to model bird migration phenology at the population level based on eBird data. The process involves combining bird observations with remotely sensed variables such as landcover and greening index to predict bird movements. Preliminary results of these models allow us to animate bird movements across large spatial scales, and to explore how migration timing might be affected under different climate change scenarios.

  18. BANYAN. XI. The BANYAN Σ Multivariate Bayesian Algorithm to Identify Members of Young Associations within 150 pc

    NASA Astrophysics Data System (ADS)

    Gagné, Jonathan; Mamajek, Eric E.; Malo, Lison; Riedel, Adric; Rodriguez, David; Lafrenière, David; Faherty, Jacqueline K.; Roy-Loubier, Olivier; Pueyo, Laurent; Robin, Annie C.; Doyon, René

    2018-03-01

    BANYAN Σ is a new Bayesian algorithm to identify members of young stellar associations within 150 pc of the Sun. It includes 27 young associations with ages in the range ∼1–800 Myr, modeled with multivariate Gaussians in six-dimensional (6D) XYZUVW space. It is the first such multi-association classification tool to include the nearest sub-groups of the Sco-Cen OB star-forming region, the IC 2602, IC 2391, Pleiades and Platais 8 clusters, and the ρ Ophiuchi, Corona Australis, and Taurus star formation regions. A model of field stars is built from a mixture of multivariate Gaussians based on the Besançon Galactic model. The algorithm can derive membership probabilities for objects with only sky coordinates and proper motion, but can also include parallax and radial velocity measurements, as well as spectrophotometric distance constraints from sequences in color–magnitude or spectral type–magnitude diagrams. BANYAN Σ benefits from an analytical solution to the Bayesian marginalization integrals over unknown radial velocities and distances that makes it more accurate and significantly faster than its predecessor BANYAN II. A contamination versus hit rate analysis is presented and demonstrates that BANYAN Σ achieves a better classification performance than other moving group tools available in the literature, especially in terms of cross-contamination between young associations. An updated list of bona fide members in the 27 young associations, augmented by the Gaia-DR1 release, as well as all parameters for the 6D multivariate Gaussian models for each association and the Galactic field neighborhood within 300 pc are presented. This new tool will make it possible to analyze large data sets such as the upcoming Gaia-DR2 to identify new young stars. IDL and Python versions of BANYAN Σ are made available with this publication, and a more limited online web tool is available at http://www.exoplanetes.umontreal.ca/banyan/banyansigma.php.
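
    A conceptual sketch of the classification step, assuming invented Gaussian parameters: compare the likelihood of a star's 6D XYZUVW vector under an association model and a field model, then apply Bayes' rule. BANYAN Σ additionally marginalizes analytically over unknown distance and radial velocity, which this sketch does not do:

        # Bayesian membership probability from 6D multivariate Gaussian models.
        import numpy as np
        from scipy.stats import multivariate_normal

        star = np.array([10.0, -5.0, 2.0, -11.0, -16.0, -9.0])  # XYZUVW (pc, km/s)

        models = {
            "assoc": multivariate_normal(
                mean=[9.0, -6.0, 1.0, -10.9, -16.0, -9.0],
                cov=np.diag([15.0] * 3 + [1.5] * 3)),
            "field": multivariate_normal(
                mean=[0.0, 0.0, 0.0, -10.0, -20.0, -7.0],
                cov=np.diag([100.0] * 3 + [50.0] * 3)),
        }
        priors = {"assoc": 0.01, "field": 0.99}

        post = {k: priors[k] * m.pdf(star) for k, m in models.items()}
        total = sum(post.values())
        for k in post:
            print(f"P({k} | star) = {post[k] / total:.3f}")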

  19. Electromagnetic field strength prediction in an urban environment: A useful tool for the planning of LMSS

    NASA Technical Reports Server (NTRS)

    Vandooren, G. A. J.; Herben, M. H. A. J.; Brussaard, G.; Sforza, M.; Poiaresbaptista, J. P. V.

    1993-01-01

    A model for the prediction of the electromagnetic field strength in an urban environment is presented. The ray model, which is based on the Uniform Theory of Diffraction (UTD), includes the effects of the non-perfect conductivity of the obstacles and their surface roughness. The urban environment is transformed into a list of standardized obstacles that have various shapes and material properties. The model is capable of accurately predicting the field strength in the urban environment by calculating different types of wave contributions, such as reflected, edge- and corner-diffracted waves, and combinations thereof. Antenna weight functions are also introduced to simulate the spatial filtering by the mobile antenna. Communication channel parameters such as signal fading, time delay profiles, Doppler shifts and delay-Doppler spectra can be derived from the ray-tracing procedure using post-processing routines. The model has been tested against results from scaled measurements at 50 GHz and proves to be accurate.

  20. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding, an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design, which saves time and ultimately spacecraft weight.

  1. Applying Results Findings: The Recovery Potential Project

    EPA Pesticide Factsheets

    The document describes a pilot study using the Illinois 303(d) listed waters, aimed at developing tools and data to help state TMDL and restoration programs decide where best to use their limited restoration resources.

  2. Generality of a congruity effect in judgements of relative order.

    PubMed

    Liu, Yang S; Chan, Michelle; Caplan, Jeremy B

    2014-10-01

    The judgement of relative order (JOR) procedure is used to investigate serial-order memory. In response-time measurements, the wording of the instructions (whether the earlier or the later item was designated as the target) reversed the direction of search in subspan lists (Chan, Ross, Earle, & Caplan, Psychonomic Bulletin & Review, 16(5), 945-951, 2009). If a similar congruity effect applied to above-span lists and, furthermore, with error rate as the measure, this could suggest how to model order memory across scales. Participants performed JORs on lists of nouns (Experiment 1: list lengths = 4, 6, 8, 10) or consonants (Experiment 2: list lengths = 4, 8). In addition to the usual distance, primacy, and recency effects, instructions interacted with the serial position of the later probe in both experiments, not only in response time but also in error rate, suggesting that availability, not just accessibility, is affected by instructions. The congruity effect challenges current memory models. We fitted Hacker's (Journal of Experimental Psychology: Human Learning and Memory, 6(6), 651-675, 1980) self-terminating search model to our data and found that a switch in search direction could explain the congruity effect for short lists, but not longer lists. This suggests that JORs may need to be understood via direct-access models, adapted to produce a congruity effect, or a mix of mechanisms.

  3. The EU(7)-PIM list: a list of potentially inappropriate medications for older people consented by experts from seven European countries.

    PubMed

    Renom-Guiteras, Anna; Meyer, Gabriele; Thürmann, Petra A

    2015-07-01

    The aim of the study was to develop a European list of potentially inappropriate medications (PIM) for older people, which can be used for the analysis and comparison of prescribing patterns across European countries and for clinical practice. A preliminary PIM list was developed, based on the German PRISCUS list of potentially inappropriate medications and other PIM lists from the USA, Canada and France. Thirty experts on geriatric prescribing from Estonia, Finland, France, the Netherlands, Spain and Sweden participated; eight experts performed a structured expansion of the list, suggesting further medications; twenty-seven experts participated in a two-round Delphi survey assessing the appropriateness of drugs and suggesting dose adjustments and therapeutic alternatives. Finally, twelve experts completed a brief final survey to decide upon issues requiring further consensus. Experts reached a consensus that 282 chemical substances or drug classes from 34 therapeutic groups are PIM for older people; some PIM are restricted to a certain dose or duration of use. The PIM list contains suggestions for dose adjustments and therapeutic alternatives. The European Union (EU)(7)-PIM list is a screening tool, developed with participation of experts from seven European countries, that allows identification and comparison of PIM prescribing patterns for older people across European countries. It can also be used as a guide in clinical practice, although it does not substitute the decision-making process of individualised prescribing for older people. Further research is needed to investigate the feasibility and applicability and, finally, the clinical benefits of the newly developed list.

  4. Spatial heterogeneity in fishing creates de facto refugia for endangered Celtic Sea elasmobranchs.

    PubMed

    Shephard, Samuel; Gerritsen, Hans; Kaiser, Michel J; Reid, David G

    2012-01-01

    The life history characteristics of some elasmobranchs make them particularly vulnerable to fishing mortality; about a third of all species are listed by the IUCN as Threatened or Near Threatened. Marine Protected Areas (MPAs) have been suggested as a tool for conservation of elasmobranchs, but they are likely to be effective only if such populations respond to fishing impacts at spatial-scales corresponding to MPA size. Using the example of the Celtic Sea, we modelled elasmobranch biomass (kg h(-1)) in fisheries-independent survey hauls as a function of environmental variables and 'local' (within 20 km radius) fishing effort (h y(-1)) recorded from Vessel Monitoring Systems data. Model selection using AIC suggested strongest support for linear mixed effects models in which the variables (i) fishing effort, (ii) geographic location and (iii) demersal fish assemblage had approximately equal importance in explaining elasmobranch biomass. In the eastern Celtic Sea, sampling sites that occurred in the lowest 10% of the observed fishing effort range recorded 10 species of elasmobranch including the critically endangered Dipturus spp. The most intensely fished 10% of sites had only three elasmobranch species, with two IUCN listed as Least Concern. Our results suggest that stable spatial heterogeneity in fishing effort creates de facto refugia for elasmobranchs in the Celtic Sea. However, changes in the present fisheries management regime could impair the refuge effect by changing fisher's behaviour and displacing effort into these areas.
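
    A hedged sketch of the model family described (a linear mixed-effects model with a regional grouping factor), using the statsmodels package on placeholder data rather than the survey and VMS datasets:

        # Linear mixed-effects model: biomass ~ fishing effort, with a random
        # intercept per region (all variables and data are invented).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 200
        df = pd.DataFrame({
            "effort": rng.gamma(2.0, 50.0, n),                  # local fishing h/yr
            "region": rng.choice(["NE", "NW", "SE", "SW"], n),  # grouping factor
        })
        df["biomass"] = 5.0 - 0.01 * df["effort"] + rng.normal(0, 1, n)

        m = smf.mixedlm("biomass ~ effort", df, groups=df["region"]).fit()
        print(m.params["effort"])  # negative slope: more effort, less biomass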

  5. Spatial Heterogeneity in Fishing Creates de facto Refugia for Endangered Celtic Sea Elasmobranchs

    PubMed Central

    Shephard, Samuel; Gerritsen, Hans; Kaiser, Michel J.; Reid, David G.

    2012-01-01

    The life history characteristics of some elasmobranchs make them particularly vulnerable to fishing mortality; about a third of all species are listed by the IUCN as Threatened or Near Threatened. Marine Protected Areas (MPAs) have been suggested as a tool for conservation of elasmobranchs, but they are likely to be effective only if such populations respond to fishing impacts at spatial-scales corresponding to MPA size. Using the example of the Celtic Sea, we modelled elasmobranch biomass (kg h−1) in fisheries-independent survey hauls as a function of environmental variables and ‘local’ (within 20 km radius) fishing effort (h y−1) recorded from Vessel Monitoring Systems data. Model selection using AIC suggested strongest support for linear mixed effects models in which the variables (i) fishing effort, (ii) geographic location and (iii) demersal fish assemblage had approximately equal importance in explaining elasmobranch biomass. In the eastern Celtic Sea, sampling sites that occurred in the lowest 10% of the observed fishing effort range recorded 10 species of elasmobranch including the critically endangered Dipturus spp. The most intensely fished 10% of sites had only three elasmobranch species, with two IUCN listed as Least Concern. Our results suggest that stable spatial heterogeneity in fishing effort creates de facto refugia for elasmobranchs in the Celtic Sea. However, changes in the present fisheries management regime could impair the refuge effect by changing fisher's behaviour and displacing effort into these areas. PMID:23166635

  6. TAM: a method for enrichment and depletion analysis of a microRNA category in a list of microRNAs.

    PubMed

    Lu, Ming; Shi, Bing; Wang, Juan; Cao, Qun; Cui, Qinghua

    2010-08-09

    MicroRNAs (miRNAs) are a class of important gene regulators. The number of identified miRNAs has been increasing dramatically in recent years. An emerging major challenge is the interpretation of genome-scale miRNA datasets, including those derived from microarray and deep-sequencing experiments. It is interesting and important to know the common rules or patterns behind a list of miRNAs (i.e. the deregulated miRNAs resulting from a miRNA microarray or deep-sequencing experiment). For this purpose, this study presents a method and develops a tool (TAM) for the annotation of meaningful human miRNA categories. We first integrated miRNAs into various meaningful categories according to prior knowledge, such as miRNA family, miRNA cluster, miRNA function, miRNA-associated diseases, and tissue specificity. Using TAM, given lists of miRNAs can be rapidly annotated and summarized according to the integrated miRNA categorical data. Moreover, given a list of miRNAs, TAM can be used to predict novel related miRNAs. Finally, we confirmed the usefulness and reliability of TAM by applying it to deregulated miRNAs in acute myocardial infarction (AMI) from two independent experiments. TAM can efficiently identify meaningful categories for given miRNAs. In addition, TAM can be used to identify novel miRNA biomarkers. The TAM tool, source codes, and miRNA category data are freely available at http://cmbi.bjmu.edu.cn/tam.
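
    A generic hypergeometric enrichment test for one miRNA category in a query list, the usual statistic behind tools of this kind; the counts are invented, and TAM's exact model is not restated here:

        # P(overlap >= k) between a query list and a category, under random
        # sampling without replacement from all annotated miRNAs.
        from scipy.stats import hypergeom

        M = 1000   # all annotated human miRNAs
        K = 40     # miRNAs in the category (e.g. a disease association)
        n = 50     # miRNAs in the query (deregulated) list
        k = 8      # overlap between query list and category

        p = hypergeom.sf(k - 1, M, K, n)
        print(f"enrichment p-value = {p:.2e}")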

  7. New Tools in an Old Trade: Teachers Talk About Use of the Internet in the Teaching of French as a Second or Foreign Language.

    ERIC Educational Resources Information Center

    Murphy, Elizabeth

    2002-01-01

    Presents findings of a study of teachers' beliefs about teaching and learning French as a second or foreign language using the Internet. An international, online discussion list and an open-ended questionnaire provided an opportunity for teachers to talk about their experiences with the new tools of the Internet. Challenges related to the teaching…

  8. Manufacturing process applications team (MATEAM). [technology transfer in the areas of machine tools and robots

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The transfer of NASA technology to the industrial sector is reported. Presentations to the machine tool and robot industries and direct technology transfers of the Adams Manipulator arm, a-c motor control, and the bolt tension monitor are discussed. A listing of proposed RTOP programs with strong potential is included. A detailed description of the rotor technology available to industry is given.

  9. Process Guide for Deburring Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frey, David L.

    This report is an updated and consolidated view of the current deburring processes at the Kansas City Plant (KCP). It includes specific examples of current burr problems and the methods used for their detection. Also included is a pictorial review of the large variety of available deburr tools, along with a complete numerical listing of existing tools and their descriptions. The process for deburring all the major part feature categories is discussed.

  10. Graphical Contingency Analysis for the Nation's Electric Grid

    ScienceCinema

    Zhenyu (Henry) Huang

    2017-12-09

    PNNL has developed a new tool to manage the electric grid more effectively, helping prevent blackouts and brownouts--and possibly avoiding millions of dollars in fines for system violations. The Graphical Contingency Analysis tool monitors grid performance, shows prioritized lists of problems, provides visualizations of potential consequences, and helps operators identify the most effective courses of action. This technology yields faster, better decisions and a more stable and reliable power grid.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favorite, Jeffrey A.

    SENSMG is a tool for computing first-order sensitivities of neutron reaction rates, reaction-rate ratios, leakage, keff, and α using the PARTISN multigroup discrete-ordinates code. SENSMG computes sensitivities to all of the transport cross sections and data (total, fission, nu, chi, and all scattering moments), two edit cross sections (absorption and capture), and the density for every isotope and energy group. It also computes sensitivities to the mass density for every material and derivatives with respect to all interface locations. The tool can be used for one-dimensional spherical (r) and two-dimensional cylindrical (r-z) geometries. The tool can be used for fixed-source and eigenvalue problems. The tool implements Generalized Perturbation Theory (GPT) as discussed by Williams and Stacey. Section II of this report describes the theory behind adjoint-based sensitivities, gives the equations that SENSMG solves, and defines the sensitivities that are output. Section III describes the user interface, including the input file and command line options. Section IV describes the output. Section V gives some notes about the coding that may be of interest. Section VI discusses verification, which is ongoing. Section VII lists needs and ideas for future work. Appendix A lists all of the input files whose results are presented in Sec. VI.

  12. Proteasix: a tool for automated and large-scale prediction of proteases involved in naturally occurring peptide generation.

    PubMed

    Klein, Julie; Eales, James; Zürbig, Petra; Vlahou, Antonia; Mischak, Harald; Stevens, Robert

    2013-04-01

    In this study, we developed Proteasix, an open-source peptide-centric tool that can be used to predict in silico the proteases involved in naturally occurring peptide generation. We developed a curated cleavage site (CS) database containing 3500 entries on human protease/CS combinations. On top of this database, we built the Proteasix tool, which retrieves CSs and their protease associations from a list of peptides. To establish proof of concept for the approach, we used a list of 1388 peptides identified from human urine samples and compared the prediction to the analysis of 1003 randomly generated amino acid sequences. Metalloprotease activity was predominantly involved in urinary peptide generation, particularly for peptides associated with extracellular matrix remodelling, compared with proteins from other origins. In comparison, the random sequences returned almost no results, highlighting the specificity of the prediction. This study provides a tool that can facilitate the linking of identified protein fragments to predicted protease activity, and therefore to presumed mechanisms of disease. Experiments are needed to confirm the in silico hypotheses; nevertheless, this approach may be of great help in better understanding the molecular mechanisms of disease and in defining new biomarkers and therapeutic targets. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
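
    The core lookup Proteasix performs can be sketched as matching the two termini of an observed peptide against a curated protease/cleavage-site table. The sketch below is illustrative only; the sequences and protease assignments are hypothetical, not entries from the actual CS database:

        # Hypothetical (P1, P1') residue pairs mapped to candidate proteases
        CLEAVAGE_DB = {
            ("K", "G"): ["MMP-9"],
            ("G", "L"): ["MMP-2", "MMP-9"],
        }

        def predict_proteases(parent_seq, peptide):
            """Proteases whose known P1|P1' cleavage sites match the peptide termini."""
            start = parent_seq.find(peptide)
            hits = set()
            if start > 0:  # N-terminal cut: residue before | first residue of peptide
                hits.update(CLEAVAGE_DB.get((parent_seq[start - 1], peptide[0]), []))
            end = start + len(peptide)
            if start >= 0 and end < len(parent_seq):  # C-terminal cut
                hits.update(CLEAVAGE_DB.get((peptide[-1], parent_seq[end]), []))
            return hits

        print(predict_proteases("MKGLVAKGTA", "GLVAK"))  # {'MMP-9'}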

  13. Completely automated modal analysis procedure based on the combination of different OMA methods

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and all its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been improved, becoming more robust and giving as results only the real natural frequencies, damping ratios and mode shapes of the system. The effect of temperature can be taken into account as well, leading to a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building present in the laboratories of Politecnico di Milano.

  14. Defending science education against intelligent design: a call to action

    PubMed Central

    Attie, Alan D.; Sober, Elliot; Numbers, Ronald L.; Amasino, Richard M.; Cox, Beth; Berceau, Terese; Powell, Thomas; Cox, Michael M.

    2006-01-01

    We review here the current political landscape and our own efforts to address the attempts to undermine science education in Wisconsin. To mount an effective response, expertise in evolutionary biology and in the history of the public controversy is useful but not essential. However, entering the fray requires a minimal tool kit of information. Here, we summarize some of the scientific and legal history of this issue and list a series of actions that scientists can take to help facilitate good science education and an improved atmosphere for the scientific enterprise nationally. Finally, we provide some model legislation that has been introduced in Wisconsin to strengthen the teaching of science. PMID:16670753

  15. Toward designing for trust in database automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duez, P. P.; Jamieson, G. A.

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We show an example of the application of the AH to automation in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  16. A cyclostationary multi-domain analysis of fluid instability in Kaplan turbines

    NASA Astrophysics Data System (ADS)

    Pennacchi, P.; Borghesani, P.; Chatterton, S.

    2015-08-01

    Hydraulic instabilities represent a critical problem for Francis and Kaplan turbines, reducing their useful life through increased fatigue on the components and cavitation phenomena. Whereas an exhaustive list of publications on computational fluid-dynamic models of hydraulic instability is available, the possibility of applying diagnostic techniques based on vibration measurements has not been investigated sufficiently, partly because hydro turbine units are seldom equipped with the appropriate sensors. The aim of this study is to fill this knowledge gap and to exploit fully, for this purpose, the potential of combining cyclostationary analysis tools, able to describe complex dynamics such as those of fluid-structure interactions, with order tracking procedures, which allow domain transformations and consequently the separation of synchronous and non-synchronous components. This paper focuses on experimental data obtained on a full-scale Kaplan turbine unit, operating in a real power plant, tackling the issues of adapting such diagnostic tools for the analysis of hydraulic instabilities and proposing techniques and methodologies for a highly automated condition monitoring system.
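
    Of the two ingredients, order tracking is the more easily illustrated. The sketch below (synthetic signal and speed profile, not the authors' code) resamples a vibration signal onto uniform shaft-angle increments so that shaft-synchronous components become stationary:

        import numpy as np

        fs = 1000.0                                      # sample rate (Hz)
        t = np.arange(0.0, 10.0, 1.0 / fs)
        speed_hz = 5.0 + 0.5 * np.sin(0.2 * np.pi * t)   # varying shaft speed
        phase = 2.0 * np.pi * np.cumsum(speed_hz) / fs   # shaft angle (rad)
        # First-order (shaft-synchronous) component plus broadband noise
        x = np.sin(phase) + 0.1 * np.random.default_rng(0).normal(size=t.size)

        samples_per_rev = 64                             # uniform angular sampling
        theta = np.arange(phase[0], phase[-1], 2.0 * np.pi / samples_per_rev)
        x_order = np.interp(theta, phase, x)             # signal vs. angle, not time
        # An FFT of x_order now places shaft orders in fixed bins despite the
        # speed variation, the prerequisite for separating synchronous from
        # non-synchronous (e.g. instability-related) components.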

  17. Software Tools for In-Situ Documentation of Built Heritage

    NASA Astrophysics Data System (ADS)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  18. [A model list of high risk drugs].

    PubMed

    Cotrina Luque, J; Guerrero Aznar, M D; Alvarez del Vayo Benito, C; Jimenez Mesa, E; Guzman Laura, K P; Fernández Fernández, L

    2013-12-01

    «High-risk drugs» are those that have a very high «risk» of causing death or serious injury if an error occurs during their use. The Institute for Safe Medication Practices (ISMP) has prepared a high-risk drugs list applicable to the general population (with no differences between the pediatric and adult populations). Thus, there is a lack of information for the pediatric population. The main objective of this work was to develop a high-risk drug list adapted to the neonatal and pediatric population as a reference model for the pediatric hospital health workforce. We performed a literature search in May 2012 to identify any published lists or references relating to pediatric and/or neonatal high-risk drugs. A total of 15 studies were found, from which 9 were selected. A model list was developed based mainly on the ISMP list, adding drugs strongly perceived as pediatric risks and removing those whose pediatric use was anecdotal. There is no published list that suits pediatric risk management. The list of pediatric and neonatal high-risk drugs presented here could serve as a «reference list of high-risk drugs» for pediatric hospitals. Using this list, together with training, will help to prevent medication errors at each step of the drug supply chain (prescribing, transcribing, dispensing and administration). Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.

  19. The NASA/MSFC global reference atmospheric model: 1990 version (GRAM-90). Part 2: Program/data listings

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Alyea, F. N.; Cunnold, D. M.; Jeffries, W. R., III; Johnson, D. L.

    1991-01-01

    A new (1990) version of the NASA/MSFC Global Reference Atmospheric Model (GRAM-90) was completed, and the program and key database listings are presented. GRAM-90 incorporates extensive new data, mostly collected under the Middle Atmosphere Program, to produce a completely revised middle atmosphere model (20 to 120 km). At altitudes greater than 120 km, GRAM-90 uses the NASA Marshall Engineering Thermosphere model. Complete listings of all programs and major data bases are presented. Also, a test case is included.

  20. CBP PHASE I CODE INTEGRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Brown, K.; Flach, G.

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was developed to link GoldSim with external codes (Smith III et al. 2010). The DLL uses a list of code inputs provided by GoldSim to create an input file for the external application, runs the external code, and returns a list of outputs (read from files created by the external application) back to GoldSim. In this way GoldSim provides: (1) a unified user interface to the applications, (2) the capability of coupling selected codes in a synergistic manner, and (3) the capability of performing probabilistic uncertainty analysis with the codes. GoldSim is made available by the GoldSim Technology Group as a free 'Player' version that allows running but not editing GoldSim models. The player version makes the software readily available to a wider community of users that wish to use the CBP application but do not have a license for GoldSim.
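
    The DLL contract described above follows a simple wrapper pattern: take a list of inputs, write the external code's input file, run it, and read back a list of outputs. A minimal sketch of that pattern, in Python rather than a compiled DLL, with hypothetical file and executable names:

        import subprocess

        def run_external_code(inputs):
            """Mimic the CBP DLL contract: inputs in, input file written,
            external code run, outputs parsed from its files and returned."""
            with open("external.inp", "w") as f:
                f.write("\n".join(str(v) for v in inputs))
            # 'external_code' stands in for a partner code (e.g. a cement
            # degradation solver); the executable name is hypothetical.
            subprocess.run(["external_code", "external.inp"], check=True)
            with open("external.out") as f:
                return [float(line) for line in f]

        # The caller (GoldSim, in the real system) sees only lists in and out.
        outputs = run_external_code([1.0, 2.5, 0.03])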

  1. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to, but not on, the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects, such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, which is discussed later.
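
    The parametric storage idea is the key design point: a grid point stores (patch, u, v) rather than fixed coordinates, so editing a surface automatically moves every derived point. A minimal sketch of that idea, with hypothetical class names rather than GridTool's actual ANSI C data structures:

        class BilinearPatch:
            """Four corner points; geometry is evaluated on demand."""
            def __init__(self, p00, p10, p01, p11):
                self.corners = [p00, p10, p01, p11]

            def eval(self, u, v):
                p00, p10, p01, p11 = self.corners
                return tuple((1 - u) * (1 - v) * a + u * (1 - v) * b
                             + (1 - u) * v * c + u * v * d
                             for a, b, c, d in zip(p00, p10, p01, p11))

        class GridPoint:
            """Stored parametrically (patch, u, v), never as frozen coordinates."""
            def __init__(self, patch, u, v):
                self.patch, self.u, self.v = patch, u, v

            @property
            def xyz(self):
                return self.patch.eval(self.u, self.v)  # recomputed on demand

        patch = BilinearPatch((0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0))
        pt = GridPoint(patch, 0.5, 0.5)
        print(pt.xyz)                 # (0.5, 0.5, 0.0)
        patch.corners[3] = (1, 1, 2)  # edit the surface...
        print(pt.xyz)                 # ...and the derived point updates automatically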

  2. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
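
    Typical usage follows the package's documented ray-then-spectrum workflow. The sketch below is indicative only: the dataset path is a placeholder, and exact call signatures should be checked against the trident documentation:

        import yt
        import trident

        ds = yt.load("galaxy_sim/output_0042")  # placeholder dataset path

        # Cast a sightline through the simulated volume (quasar-style trajectory)
        ray = trident.make_simple_ray(
            ds,
            start_position=ds.domain_left_edge,
            end_position=ds.domain_right_edge,
            data_filename="ray.h5",
            lines=["H I 1216", "C IV"],
        )

        # Mimic the Cosmic Origins Spectrograph, then generate and save a spectrum
        sg = trident.SpectrumGenerator("COS-G130M")
        sg.make_spectrum(ray, lines=["H I 1216", "C IV"])
        sg.save_spectrum("spec.txt")
        sg.plot_spectrum("spec.png")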

  3. Science Opportunity Analyzer (SOA): Science Planning Made Simple

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Polanskey, Carol A.

    2004-01-01

    For the first time at JPL, the Cassini mission to Saturn is using distributed science operations for developing its experiments. Remote scientists needed the ability to: a) identify observation opportunities; b) create accurate, detailed designs for their observations; c) verify that their designs meet their objectives; d) check their observations against project flight rules and constraints; e) communicate their observations to other scientists. Many existing tools provide one or more of these functions, but Science Opportunity Analyzer (SOA) has been built to unify these tasks into a single application. Accurate: utilizes the JPL Navigation and Ancillary Information Facility (NAIF) SPICE software tool kit - provides high fidelity modeling - facilitates rapid adaptation to other flight projects. Portable: available in Unix, Windows and Linux. Adaptable: designed to be a multi-mission tool so it can be readily adapted to other flight projects. Implemented in Java, Java 3D and other innovative technologies. Conclusion: SOA is easy to use. It only requires 6 simple steps. SOA's ability to show the same accurate information in multiple ways (multiple visualization formats, data plots, listings and file output) is essential to meet the needs of a diverse, distributed science operations environment.

  4. Software Comparison for Renewable Energy Deployment in a Distribution Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source packages, have the capability to simulate networks with fluctuating data values. These packages allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial package, allows for time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increases the time necessary to become familiar with the software packages.

  5. Trends in Wait-list Mortality in Children Listed for Heart Transplantation in the United States

    PubMed Central

    Singh, Tajinder P.; Almond, Christopher S.; Piercey, Gary; Gauvreau, Kimberlee

    2014-01-01

    We sought to evaluate trends in overall and race-specific pediatric heart transplant (HT) wait-list mortality in the United States (US) during the last 20 years. We identified all children <18 years old listed for primary HT in the US during 1989–2009 (N=8096, 62% white, 19% black, 13% Hispanic, 6% other) using the Organ Procurement and Transplant Network database. Wait-list mortality was assessed in 4 successive eras (1989–1994, 1995–1999, 2000–2004, and 2005–2009). Overall wait-list mortality declined in successive eras (26%, 23%, 18% and 13%, respectively). The decline across eras remained significant in adjusted analysis (hazard ratio [HR] 0.70 in successive eras, 95% confidence interval [CI] 0.67, 0.74) and was 67% lower for children listed during 2005–2009 vs. those listed during 1989–1994 (HR 0.33, CI 0.28, 0.39). In models stratified by race, wait-list mortality decreased in all racial groups in successive eras. In models stratified by era, minority children were not at higher risk of wait-list mortality in the most recent era. We conclude that the risk of wait-list mortality among US children listed for HT has decreased by two-thirds during the last 20 years. Racial gaps in wait-list mortality present variably in the past are not present in the current era. PMID:21883920

  6. Reference Tools for Data Processing, Office Automation, and Data Communications: An Introductory Guide.

    ERIC Educational Resources Information Center

    Cupoli, Patricia Dymkar

    1981-01-01

    Provides an introduction to various reference sources which are useful in dealing with the areas of data processing, office automation, and communications technologies. A bibliography with vendor listings is included. (FM)

  7. Is My Facility a Good Candidate for CHP?

    EPA Pesticide Factsheets

    Learn if a facility is a good candidate for CHP by answering a list of questions, and access the CHP Spark Spread Estimator, a tool that helps evaluate a prospective CHP system for its potential economic feasibility.

  8. The 10 Hottest Technologies in Telecom.

    ERIC Educational Resources Information Center

    Flanagan, Patrick

    1997-01-01

    Presents the fourth annual listing of the 10 "hottest" telecommunications technologies. Describes Web broadcasting, remote-access servers, extranets, Internet telephony, enterprise network directory services, Web site management tools, IP (Internet Protocols) switching, wavelength division multiplexing, digital subscriber lines, and…

  9. Finding the Right Educational Software for Your Child.

    ERIC Educational Resources Information Center

    Moore, Jack

    1990-01-01

    Ideas are presented for identifying, evaluating, and selecting instructional software for children with special needs. The article notes several library research tools as sources of information and lists specific questions to consider when evaluating software. (JDD)

  10. Effectiveness of Multimedia for Transplant Preparation for Kidney Transplant Waiting List Patients.

    PubMed

    Charoenthanakit, C; Junchotikul, P; Sittiudomsuk, R; Saiyud, A; Pratumphai, P

    2016-04-01

    A multimedia program could effectively advise patients about preparing for transplantation while on the waiting list for a kidney transplant. This study aimed to compare knowledge about transplant preparation among patients on a kidney transplant waiting list before and after participating in a multimedia program, and to evaluate patient satisfaction with the multimedia program. The research design was quasi-experimental, using a single group. Subjects were 186 patients on the kidney transplant waiting list after HLA matching in Ramathibodi Hospital. The questionnaires were developed by the researchers. The statistical tools used were basic statistics (percentage, average, standard deviation) and a paired t test of the difference in scores between before and after participation in the multimedia program. Knowledge of transplant preparation after participating in the multimedia program averaged 85.40%, with scores improving by an average of 3.27 points out of a possible 20 (P < .05). Patient satisfaction with the multimedia program averaged a good 4.58. Copyright © 2016 Elsevier Inc. All rights reserved.
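
    The before/after comparison described is a paired t test. A minimal sketch with made-up scores, for illustration only:

        import numpy as np
        from scipy import stats

        # Hypothetical knowledge scores (out of 20) for the same respondents
        before = np.array([13, 14, 12, 15, 13, 14])
        after = np.array([16, 17, 15, 18, 17, 17])

        t, p = stats.ttest_rel(after, before)  # paired (repeated-measures) t test
        print(f"mean improvement = {np.mean(after - before):.2f} points, p = {p:.4f}")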

  11. Database resources for the tuberculosis community.

    PubMed

    Lew, Jocelyne M; Mao, Chunhong; Shukla, Maulik; Warren, Andrew; Will, Rebecca; Kuznetsov, Dmitry; Xenarios, Ioannis; Robertson, Brian D; Gordon, Stephen V; Schnappinger, Dirk; Cole, Stewart T; Sobral, Bruno

    2013-01-01

    Access to online repositories for genomic and associated "-omics" datasets is now an essential part of everyday research activity. It is therefore important that the tuberculosis community is aware of the databases and tools available to it online, and that the database hosts know the needs of the research community. One of the goals of the Tuberculosis Annotation Jamboree, held in Washington DC on March 7th-8th 2012, was therefore to provide an overview of the current status of three key tuberculosis resources: TubercuList (tuberculist.epfl.ch), TB Database (www.tbdb.org), and the Pathosystems Resource Integration Center (PATRIC, www.patricbrc.org). Here we summarize some key updates and upcoming features in TubercuList, and provide an overview of the PATRIC site and its online tools for pathogen RNA-Seq analysis. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. CoPub: a literature-based keyword enrichment tool for microarray data analysis.

    PubMed

    Frijters, Raoul; Heupers, Bart; van Beek, Pieter; Bouwhuis, Maurice; van Schaik, René; de Vlieg, Jacob; Polman, Jan; Alkema, Wynand

    2008-07-01

    Medline is a rich information source, from which links between genes and keywords describing biological processes, pathways, drugs, pathologies and diseases can be extracted. We developed a publicly available tool called CoPub that uses the information in the Medline database for the biological interpretation of microarray data. CoPub allows batch input of multiple human, mouse or rat genes and produces lists of keywords from several biomedical thesauri that are significantly correlated with the set of input genes. These lists link to Medline abstracts in which the co-occurring input genes and correlated keywords are highlighted. Furthermore, CoPub can graphically visualize differentially expressed genes and over-represented keywords in a network, providing detailed insight in the relationships between genes and keywords, and revealing the most influential genes as highly connected hubs. CoPub is freely accessible at http://services.nbic.nl/cgi-bin/copub/CoPub.pl.

  13. Pacific Fleet Regional Inventory Stocking Model (PRISM)

    DTIC Science & Technology

    2003-06-01

    The indexed excerpt consists of the report's list of figures, including: Fleet Inventory Management Form; Master Parts List Input Form; Master Parts List Update Form; and Master Parts List by APL Report.

  14. In vitro models for the prediction of in vivo performance of oral dosage forms.

    PubMed

    Kostewicz, Edmund S; Abrahamsson, Bertil; Brewster, Marcus; Brouwers, Joachim; Butler, James; Carlert, Sara; Dickinson, Paul A; Dressman, Jennifer; Holm, René; Klein, Sandra; Mann, James; McAllister, Mark; Minekus, Mans; Muenster, Uwe; Müllertz, Anette; Verwei, Miriam; Vertzoni, Maria; Weitschies, Werner; Augustijns, Patrick

    2014-06-16

    Accurate prediction of the in vivo biopharmaceutical performance of oral drug formulations is critical to efficient drug development. Traditionally, in vitro evaluation of oral drug formulations has focused on disintegration and dissolution testing for quality control (QC) purposes. The connection with in vivo biopharmaceutical performance has often been ignored. More recently, the switch to assessing drug products in a more biorelevant and mechanistic manner has advanced the understanding of drug formulation behavior. Notwithstanding this evolution, predicting the in vivo biopharmaceutical performance of formulations that rely on complex intraluminal processes (e.g. solubilization, supersaturation, precipitation…) remains extremely challenging. Concomitantly, the increasing demand for complex formulations to overcome low drug solubility or to control drug release rates urges the development of new in vitro tools. Developing and optimizing innovative, predictive Oral Biopharmaceutical Tools is the main target of the OrBiTo project within the Innovative Medicines Initiative (IMI) framework. A combination of physico-chemical measurements, in vitro tests, in vivo methods, and physiology-based pharmacokinetic modeling is expected to create a unique knowledge platform, enabling the bottlenecks in drug development to be removed and the whole process of drug development to become more efficient. As part of the basis for the OrBiTo project, this review summarizes the current status of predictive in vitro assessment tools for formulation behavior. Both pharmacopoeia-listed apparatus and more advanced tools are discussed. Special attention is paid to major issues limiting the predictive power of traditional tools, including the simulation of dynamic changes in gastrointestinal conditions, the adequate reproduction of gastrointestinal motility, the simulation of supersaturation and precipitation, and the implementation of the solubility-permeability interplay. It is anticipated that the innovative in vitro biopharmaceutical tools arising from the OrBiTo project will lead to improved predictions for in vivo behavior of drug formulations in the GI tract. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. MRO Sequence Checking Tool

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    The MRO Sequence Checking Tool program, mro_check, automates significant portions of the MRO (Mars Reconnaissance Orbiter) sequence checking procedure. Though MRO has checks similar to those of ODY's (Mars Odyssey) Mega Check tool, the checks needed for MRO are unique to the MRO spacecraft. The MRO sequence checking tool automates the majority of the sequence validation procedure and checklists that are used to validate the sequences generated by the MRO MPST (mission planning and sequencing team). The tool performs more than 50 different checks on the sequence. The automation varies from summarizing data about the sequence needed for visual verification, to performing automated checks and providing a report for each step. To allow for the addition of new checks as needed, this tool is built in a modular fashion.
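
    The modular design described amounts to a registry of independent check functions run in turn over the sequence data, with a report entry per check. A minimal sketch of that pattern (hypothetical checks, not the actual mro_check code):

        def check_command_count(seq):
            return f"commands: {len(seq)}"

        def check_duplicate_ids(seq):
            ids = [cmd["id"] for cmd in seq]
            dupes = {i for i in ids if ids.count(i) > 1}
            return "duplicate ids: " + (", ".join(sorted(dupes)) if dupes else "none")

        # New checks are added by appending to this registry; nothing else changes.
        CHECKS = [check_command_count, check_duplicate_ids]

        def run_checks(seq):
            return [check(seq) for check in CHECKS]

        print(run_checks([{"id": "A1"}, {"id": "A2"}, {"id": "A1"}]))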

  16. Approaches for the Application of Physiologically Based ...

    EPA Pesticide Factsheets

    This draft report of Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications. Topics covered include: the types of data required for use of PBPK models in risk assessment; the evaluation of PBPK models for use in risk assessment; and the application of these models to address uncertainties resulting from extrapolations (e.g. interspecies extrapolation) often used in risk assessment. In addition, appendices are provided that include a compilation of chemical partition coefficients and rate constants, algorithms for estimating chemical-specific parameters, and a list of publications relating to PBPK modeling. This report is primarily meant to serve as a learning tool for EPA scientists and risk assessors who may be less familiar with the field. In addition, this report can be informative to PBPK modelers within and outside the Agency, as it provides an assessment of the types of data and models that the EPA requires for consideration of a model for use in risk assessment.
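
    At their core, PBPK models chain together compartment mass balances solved as ODEs. A deliberately minimal sketch, far simpler than a real PBPK model and with illustrative parameter values only:

        from scipy.integrate import solve_ivp

        V, CL = 42.0, 0.3   # distribution volume (L) and clearance (L/h), illustrative
        dose_rate = 1.0     # constant infusion rate (mg/h)

        def dcdt(t, c):
            # Single-compartment mass balance: input minus clearance, per volume.
            # A full PBPK model has one such balance per organ, linked by blood flows.
            return [(dose_rate - CL * c[0]) / V]

        sol = solve_ivp(dcdt, (0.0, 48.0), [0.0])
        print(f"C(48 h) = {sol.y[0, -1]:.3f} mg/L; steady state = {dose_rate / CL:.3f} mg/L")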

  17. Modeling the interdependent network based on two-mode networks

    NASA Astrophysics Data System (ADS)

    An, Feng; Gao, Xiangyun; Guan, Jianhe; Huang, Shupei; Liu, Qian

    2017-10-01

    Among heterogeneous networks, there exist obvious and close interdependent linkages. Unlike existing research, which focuses primarily on theoretical models of physical interdependent networks, we propose a two-layer interdependent network model based on two-mode networks to explore interdependent features in reality. Specifically, we construct a two-layer interdependent loan network and develop several indices of dependent features. The model is verified to enable us to capture the loan-dependent features of listed companies based on loan behaviors and shared shareholders. Taking the Chinese debit and credit market as a case study, the main conclusions are: (1) only a few listed companies shoulder the main capital transmission (20% of listed companies account for almost 70% of the dependent degree). (2) Control of these key listed companies would be more effective in avoiding the spread of financial risks. (3) Identifying the companies with high betweenness centrality and controlling them could help monitor the spread of financial risk. (4) The capital transmission channels between Chinese financial listed companies and Chinese non-financial listed companies are relatively strong. However, under greater pressure on the demand for capital transmission (70% of edges failed), the transmission channel constructed by debit and credit behavior will eventually collapse.
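
    The construction described, a two-mode (e.g. company-to-bank) network projected onto one layer and scored by betweenness centrality, can be sketched with networkx on toy data (illustrative only, not the authors' model):

        import networkx as nx
        from networkx.algorithms import bipartite

        # Two-mode network: companies on one side, banks on the other
        B = nx.Graph()
        companies = ["C1", "C2", "C3", "C4"]
        banks = ["B1", "B2"]
        B.add_nodes_from(companies, bipartite=0)
        B.add_nodes_from(banks, bipartite=1)
        B.add_edges_from([("C1", "B1"), ("C2", "B1"), ("C2", "B2"),
                          ("C3", "B2"), ("C4", "B2")])

        # Project onto the company layer: companies linked via shared banks
        G = bipartite.projected_graph(B, companies)

        # High-betweenness companies are candidate risk-spreading hubs
        print(nx.betweenness_centrality(G))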

  18. Smartphone tool to collect repeated 24 h dietary recall data in Nepal.

    PubMed

    Harris-Fry, Helen; Beard, B James; Harrisson, Tom; Paudel, Puskar; Shrestha, Niva; Jha, Sonali; Shrestha, Bhim P; Manandhar, Dharma S; Costello, Anthony; Saville, Naomi M

    2018-02-01

    To outline the development of a smartphone-based tool to collect thrice-repeated 24 h dietary recall data in rural Nepal, and to describe energy intakes, common errors and researchers' experiences using the tool. We designed a novel tool to collect multi-pass 24 h dietary recalls in rural Nepal by combining the use of a CommCare questionnaire on smartphones, a paper form, a QR (quick response)-coded list of foods and a photographic atlas of portion sizes. Twenty interviewers collected dietary data on three non-consecutive days per respondent, with three respondents per household. Intakes were converted into nutrients using databases on nutritional composition of foods, recipes and portion sizes. Dhanusha and Mahottari districts, Nepal. Pregnant women, their mothers-in-law and male household heads. Energy intakes assessed in 150 households; data corrections and our experiences reported from 805 households and 6765 individual recalls. Dietary intake estimates gave plausible values, with male household heads appearing to have higher energy intakes (median (25th-75th centile): 12 079 (9293-14 108) kJ/d) than female members (8979 (7234-11 042) kJ/d for pregnant women). Manual editing of data was required when interviewers mistook portions for food codes and for coding items not on the food list. Smartphones enabled quick monitoring of data and interviewer performance, but we initially faced technical challenges with CommCare forms crashing. With sufficient time dedicated to development and pre-testing, this novel smartphone-based tool provides a useful method to collect data. Future work is needed to further validate this tool and adapt it for other contexts.

  19. Toward competency-based curriculum: Application of workplace-based assessment tools in the National Saudi Arabian Anesthesia Training Program.

    PubMed

    Boker, Ama

    2016-01-01

    The anesthesia training program of the Saudi Commission for Health Specialties introduced a competency-based anesthesia residency program starting in 2015, with the utilization of workplace-based assessment (WBA) tools, namely mini-clinical exercises (mini-CEX), direct observation of procedural skills (DOPS), and case-based discussion (CBD). This work aimed to describe the process of developing anesthesia-specific lists of mini-CEX, DOPS, and CBD tools within the Saudi Arabian Anesthesia Training Programs. To introduce the main concepts of formative WBA tools and to develop anesthesia-specific applications for each of the selected WBA tools, four 1-day workshops were conducted at the level of the major training committees in the eastern (Dammam), western (Jeddah), and central (Riyadh) regions of the Kingdom. Sixty-seven faculty members participated in these workshops. After the four workshops, the anesthesia-specific application settings of the mini-CEX, DOPS, and CBD tools across the five training-year levels were fully described. The level of appropriate consultation skills was divided according to case complexity, adopted from the American Society of Anesthesiologists physical classification for adult, obstetric, and pediatric patients, as well as by the type of the targeted anesthetic procedure. The WBA anesthesia-specific lists of mini-CEX, DOPS, and CBD forms were first incorporated into guidelines to support the initial stage of implementing formative assessment in the Saudi Arabian Anesthesia Residency Program, and this can be helpful for replicating such a program within other training programs in Saudi Arabia and abroad.

  20. Intervention mapping for development of a participatory return-to-work intervention for temporary agency workers and unemployed workers sick-listed due to musculoskeletal disorders

    PubMed Central

    Vermeulen, Sylvia J; Anema, Johannes R; Schellart, Antonius JM; van Mechelen, Willem; van der Beek, Allard J

    2009-01-01

    Background: In the past decade, activities aiming at return-to-work (RTW) have shown a growing awareness of the need to change the focus from sickness and work disability to recovery and work ability. To date, this process in occupational health care (OHC) has mainly been directed towards employees. However, within the working population there are two vulnerable groups: temporary agency workers and unemployed workers, since they have no workplace/employer to return to when sick-listed. For these groups there is a need for tailored RTW strategies and interventions. Therefore, this paper aims to describe the structured and stepwise process of development, implementation and evaluation of a theory- and practice-based participatory RTW program for temporary agency workers and unemployed workers, sick-listed due to musculoskeletal disorders (MSD). This program is based on the already developed and cost-effective RTW program for employees sick-listed due to low back pain. Methods: The Intervention Mapping (IM) protocol was used to develop a tailor-made RTW program for temporary agency workers and unemployed workers sick-listed due to MSD. The Attitude-Social influence-self-Efficacy (ASE) model was used as a theoretical framework for determinants of behaviour regarding RTW of the sick-listed worker and for development of the intervention. To ensure participation and facilitate successful adoption and implementation, important stakeholders were involved in all steps of program development and implementation. Results of semi-structured interviews and 'fine-tuning' meetings were used to design the final participatory RTW program. Results: A structured stepwise RTW program was developed, aimed at making a consensus-based RTW implementation plan. The new program starts with identifying obstacles to RTW, followed by a brainstorm session in which the sick-listed worker and the labour expert of the Social Security Agency (SSA) formulate solutions/possibilities for suitable (therapeutic) work. This process is guided by an independent RTW coordinator to achieve consensus. Based on the resulting RTW implementation plan, to create an actual RTW perspective, a vocational rehabilitation agency is assigned to find a matching (therapeutic) workplace. The cost-effectiveness of this participatory RTW program will be evaluated in a randomised controlled trial. Conclusion: IM is a promising tool for the development of tailor-made OHC interventions for the vulnerable working population. PMID:19573229

  1. Atmospheric Photochemical Modeling of Turbine Engine Fuels and Exhausts. Phase 2. Computer Model Development. Volume 2

    DTIC Science & Technology

    1988-05-01

    The indexed excerpt is fragmentary. It includes a table of emitted organics included in all models (CO, carbon monoxide; C:C, ethene; HCHO, formaldehyde; CCHO, acetaldehyde; RCHO, propionaldehyde and other higher aldehydes); a note that, for proper use of this program, composition files should be "normalized," i.e., the number of carbons in the mixture should ...; and command documentation: valid parmtypes are SCEN, PHYS, CHEM, VP, NSP, OUTP, SCHEDS; LIST ALLCOMP lists all available composition filenames; LIST ALLSCE ...

  2. Evaluation of RSDL, M291 SDK, 0.5 Bleach, 1% Soapy Water and SERPACWA: Part 11: Challenge with EA4243 (VR, Russian VX)

    DTIC Science & Technology

    2016-01-01

    This report evaluates the efficacy of the barrier skin cream SERPACWA and the four listed decontamination products (RSDL, M291 SDK, 0.5 bleach, and 1% soapy water) in the haired guinea pig model following exposure to VR (Russian VX, EA4243, Soviet V-gas).

  3. A Breakdown, Application, and Evaluation of the Resiliency Analysis Support Tool (RAST) from the Operator’s Perspective

    DTIC Science & Technology

    2013-04-17

    The indexed excerpt consists of table-of-contents and list-of-figures residue, including the entries "Pre-Crisis View—List View" and "Implementation of RAST in Nepal—Post...," figure captions such as "Number of Natural Disasters Reported From 1975–2011 (EM-DAT, 2012)" and "Snapshot of RAST...," and a fragment noting the locations of resources and the number of data sets that the RAST has acquired to quantify the scoring of Nepal.

  4. The 1985-86 NASA space/gravitational biology accomplishments

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Individual Technical summaries of research projects of NASA's Space/Gravitational Biology Program are presented. This Program is concerned with using the unique characteristics of the space environment, particularly microgravity, as a tool to advance knowledge in the biological sciences; understanding how gravity has shaped and affected life on Earth; and understanding how the space environment affects both plant and animal species. The summaries for each project include a description of the research, a listing of the accomplishments, an explanation of the significance of the accomplishments, and a list of publications.

  5. Network of TAMCNS: Identifying Influence Regions Within the GCSS-MC Database

    DTIC Science & Technology

    2017-06-01

    ...relationships between objects and provides tools to quantitatively determine objects whose influence impacts other objects or the system as a whole. This methodology identifies the most important TAMCN and provides a list of TAMCNs in order of importance. We also analyze the community and core structure of...

  6. Actual and potential use of population viability analyses in recovery of plant species listed under the US endangered species act.

    PubMed

    Zeigler, Sara L; Che-Castaldo, Judy P; Neel, Maile C

    2013-12-01

    Use of population viability analyses (PVAs) in endangered species recovery planning has been met with both support and criticism. Previous reviews promote use of PVA for setting scientifically based, measurable, and objective recovery criteria and recommend improvements to increase the framework's utility. However, others have questioned the value of PVA models for setting recovery criteria and assert that PVAs are more appropriate for understanding relative trade-offs between alternative management actions. We reviewed 258 final recovery plans for 642 plants listed under the U.S. Endangered Species Act to determine the number of plans that used or recommended PVA in recovery planning. We also reviewed 223 publications that describe plant PVAs to assess how these models were designed and whether those designs reflected previous recommendations for improvement of PVAs. Twenty-four percent of listed species had recovery plans that used or recommended PVA. In publications, the typical model was a matrix population model parameterized with ≤5 years of demographic data that did not consider stochasticity, genetics, density dependence, seed banks, vegetative reproduction, dormancy, threats, or management strategies. Population growth rates for different populations of the same species or for the same population at different points in time were often statistically different or varied by >10%. Therefore, PVAs parameterized with underlying vital rates that vary to this degree may not accurately predict recovery objectives across a species' entire distribution or over longer time scales. We assert that PVA, although an important tool as part of an adaptive-management program, can help to determine quantitative recovery criteria only if more long-term data sets that capture spatiotemporal variability in vital rates become available. Lacking this, there is a strong need for viable and comprehensive methods for determining quantitative, science-based recovery criteria for endangered species with minimal data availability. © 2013 Society for Conservation Biology.
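
    The "typical model" the review found, a stage-structured matrix population model, is compact enough to sketch: the asymptotic growth rate lambda is the dominant eigenvalue of the projection matrix. The vital rates below are hypothetical:

        import numpy as np

        # Stages: seedling, juvenile, adult (illustrative vital rates only)
        A = np.array([
            [0.00, 0.00, 4.50],   # top row: per-capita fecundities
            [0.30, 0.40, 0.00],   # survival/transition into the juvenile stage
            [0.00, 0.35, 0.90],   # survival/transition into the adult stage
        ])

        lam = max(np.linalg.eigvals(A).real)  # asymptotic population growth rate
        print(f"lambda = {lam:.3f} ({'growing' if lam > 1 else 'declining'})")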

  7. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software packages. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  8. Circular displays: control/display arrangements and stereotype strength with eight different display locations.

    PubMed

    Chan, Alan H S; Hoffmann, Errol R

    2015-01-01

    Two experiments are reported that were designed to investigate control/display arrangements having high stereotype strengths when using circular displays. Eight display locations relative to the operator and control were tested, with rotational and translational controls situated on different planes according to the Frame of Reference Transformation Tool (FORT) model of Wickens et al. (2010; "Left. No, Right! Development of the Frame of Reference Transformation Tool (FORT)," Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting, 54: 1022-1026). In many cases, there was little effect of display location, indicating the importance of the Worringham and Beringer (1998; "Directional stimulus-response compatibility: a test of three alternative principles," Ergonomics, 41(6), 864-880) Visual Field principle and an extension of this principle for rotary controls (Hoffmann and Chan, 2013; "The Worringham and Beringer 'visual field' principle for rotary controls," Ergonomics, 56(10), 1620-1624). The initial indicator position (12, 3, 6 and 9 o'clock) had a major effect on control/display stereotype strength for many of the six controls tested. The best display/control arrangements are listed for each of the different control types (rotational and translational) and for the planes on which they are mounted. The data have application where a circular display is used due to limited display panel space, and apply to spacecraft, robotics operators, hospital equipment and home appliances. Practitioner Summary: Circular displays are often used when there is limited space available on a control panel. Display/control arrangements having high stereotype strength are listed for four initial indicator positions. These arrangements are best for design purposes.

  9. National Space Science Data Center (NSSDC) Data Listing

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Satellite and nonsatellite data available from the National Space Science Data Center are listed. The Satellite Data listing includes the spacecraft name, launch date, and an alphabetical list of experiments. The Non-Satellite Data listing contains ground based data, models, computer routines, and composite spacecraft data. The data set name, data form code, quantity of data, and the time space covered are included in the data sets of both listings where appropriate. Geodetic tracking data sets are also included.

  10. Terminology for Neuroscience Data Discovery: Multi-tree Syntax and Investigator-Derived Semantics

    PubMed Central

    Goldberg, David H.; Grafstein, Bernice; Robert, Adrian; Gardner, Esther P.

    2009-01-01

    The Neuroscience Information Framework (NIF), developed for the NIH Blueprint for Neuroscience Research and available at http://nif.nih.gov and http://neurogateway.org, is built upon a set of coordinated terminology components enabling data and web-resource description and selection. Core NIF terminologies use a straightforward syntax designed for ease of use and for navigation by familiar web interfaces, and are readily exportable to aid development of relational-model databases for neuroscience data sharing. Datasets, data analysis tools, web resources, and other entities are characterized by multiple descriptors, each addressing core concepts, including data type, acquisition technique, neuroanatomy, and cell class. Terms for each concept are organized in a tree structure, providing is-a and has-a relations. Broad general terms near each root span the category or concept and spawn more detailed entries for specificity. Related but distinct concepts (e.g., brain area and depth) are specified by separate trees, for easier navigation than would be required by graph representation. Semantics enabling NIF data discovery were selected at one or more workshops by investigators expert in particular systems (vision, olfaction, behavioral neuroscience, neurodevelopment), brain areas (cerebellum, thalamus, hippocampus), preparations (molluscs, fly), diseases (neurodegenerative disease), or techniques (microscopy, computation and modeling, neurogenetics). Workshop-derived integrated term lists are available Open Source at http://brainml.org; a complete list of participants is at http://brainml.org/workshops. PMID:18958630
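
    The multi-tree organization can be sketched as one small tree per concept, with each child holding an is-a link to its parent. The terms below are illustrative only, not the actual NIF term lists:

        from dataclasses import dataclass, field

        @dataclass
        class Term:
            name: str
            children: list["Term"] = field(default_factory=list)

            def add(self, name):
                child = Term(name)           # child is-a self
                self.children.append(child)
                return child

            def walk(self, depth=0):
                print("  " * depth + self.name)
                for c in self.children:
                    c.walk(depth + 1)

        # One tree per concept; the root spans the whole concept
        brain_area = Term("brain area")
        thalamus = brain_area.add("thalamus")
        thalamus.add("lateral geniculate nucleus")
        brain_area.add("hippocampus")
        brain_area.walk()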

  11. Three-dimensional structural representation of the sleep-wake adaptability.

    PubMed

    Putilov, Arcady A

    2016-01-01

    Various characteristics of the sleep-wake cycle can determine the success or failure of individual adjustment to certain temporal conditions of today's society. However, it remains to be explored how many such characteristics can be self-assessed and how they are inter-related. The aim of the present report was to apply a three-dimensional structural representation of sleep-wake adaptability in the form of a "rugby cake" (a scalene or triaxial ellipsoid) to explain the results of an analysis of the pattern of correlations between responses to the initial 320-item list of a new inventory and scores on the six scales designed for multidimensional self-assessment of sleep-wake adaptability (Morning and Evening Lateness, Anytime and Nighttime Sleepability, and Anytime and Daytime Wakeability). The results obtained for a sample of 149 respondents were confirmed by the results of a similar analysis of earlier collected responses of 139 respondents to the same list of 320 items, and of the responses of 1213 respondents to the 72 items of one of the earlier established questionnaire tools. Empirical evidence was provided in support of the model-driven prediction of the possibility of identifying items linked to as many as 36 narrow (6 core and 30 mixed) adaptabilities of the sleep-wake cycle. The results enabled the selection of 168 items for self-assessment of all these adaptabilities predicted by the rugby cake model.

  12. Managing Florida's fracture critical bridges - phases 1 and 2 [summary].

    DOT National Transportation Integrated Search

    2016-05-01

    Florida International University researchers examined the possibility of removing twin steel box-girder bridges from the list of fracture critical structures. They studied the behavior of steel twin box-girder bridges and developed a tool to ...

  13. DBMS as a Tool for Project Management

    NASA Technical Reports Server (NTRS)

    Linder, H.

    1984-01-01

    Scientific objectives of the crustal dynamics project are listed, as well as the contents of the project's centralized data information system. The system provides project observation schedules, project configuration control information, and project site information.
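
    To make the idea concrete, a centralized project database of this kind might expose sites and observation schedules as related tables. The sketch below is a hypothetical illustration using SQLite from Python, not the actual NASA system's schema; every table and column name is an assumption.

      import sqlite3

      # In-memory database standing in for a centralized project information system
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE site (
              site_id  INTEGER PRIMARY KEY,
              name     TEXT,
              location TEXT
          );
          CREATE TABLE observation_schedule (
              schedule_id INTEGER PRIMARY KEY,
              site_id     INTEGER REFERENCES site(site_id),
              start_date  TEXT,
              end_date    TEXT
          );
      """)
      conn.execute("INSERT INTO site VALUES (1, 'example site', 'example location')")
      conn.execute("INSERT INTO observation_schedule VALUES (1, 1, '1984-01-01', '1984-01-31')")

      # Typical query: which sites are scheduled for observation, and when?
      for row in conn.execute("""
              SELECT s.name, o.start_date, o.end_date
              FROM site s JOIN observation_schedule o USING (site_id)"""):
          print(row)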

  14. Complexity and Chaos - State-of-the-Art; List of Works, Experts, Organizations, Projects, Journals, Conferences and Tools

    DTIC Science & Technology

    2007-09-01

    Partially recovered table of contents: Adaptive Systems; Connectivity and Communication in Complex Adaptive Systems; Human Factors: Perception, Comprehension, Communication and Collaboration; Catastrophe, Conflict, Crisis.

  15. SITE-SPECIFIC DIAGNOSTIC TOOLS

    EPA Science Inventory

    US EPA's Office of Water is proposing Combined Assessment and Listing Methods (CALM) to meet reporting requirements under both Sections 305(b) and 303(d) for chemical and nonchemical stressors in the nation's waterbodies. Current Environmental Monitoring and Assessment Prog...

  16. Potentially Inappropriate Active Pharmaceutical Ingredients in Older Adults: The IFAsPIAM List (Lista IFAsPIAM): An Argentine Consensus Panel.

    PubMed

    Marzi, Marta M; Pires, Miryam S; Quaglia, Nora B

    2018-04-18

    To produce a list, agreed by Argentinean experts and adapted to the local context, of potentially inappropriate (PI) medications in older people (OP), using a Delphi consensus technique optimized for this subject. A preliminary list of potentially inappropriate medications (PIMs) was drawn up based on foreign PIM lists and a selective search of the scientific literature. An iterative Delphi process was used to submit the active pharmaceutical ingredients (APIs) of the preliminary PIM list to a panel of Argentinean experts. The answers were analyzed to determine whether consensus had been reached, applying three criteria specially defined for this purpose. After two Delphi rounds, agreement was not reached on 12 APIs. The list of explicit criteria for PI APIs for use in OP (IFAsPIAM List) finally comprised 128 APIs, organized into 9 groups of the ATC classification system. For each API, information justifying the unfavorable benefit/risk profile, together with therapeutic alternatives or recommendations/precautions, was recorded. The group with the most PI APIs was N (Nervous System) (60; 47%), followed by groups C (Cardiovascular) and M (Musculoskeletal). This study presents the first Latin American list of PIMs in OP developed using an expert consensus technique. The IFAsPIAM List would contribute to the rational use of drugs in the elderly population, constituting a valuable tool for Argentinean public health. Copyright © 2018. Published by Elsevier Inc.
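
    As a rough sketch of how answers from one Delphi round might be tallied against explicit consensus criteria, consider the following; the thresholds, vote encoding, and API names are invented for illustration and are not the three criteria the authors actually defined.

      # Hypothetical tally of one Delphi round: each expert rates an API as
      # potentially inappropriate (1) or not (0). Consensus rules are invented.
      ratings = {
          "api_example_1": [1, 1, 1, 1, 0, 1, 1, 1],
          "api_example_2": [1, 0, 1, 0, 1, 0, 0, 1],
      }

      def round_outcome(votes, agree=0.8):
          """Classify an API after a round: consensus in/out, or carry to next round."""
          share = sum(votes) / len(votes)
          if share >= agree:
              return "include as potentially inappropriate"
          if share <= 1 - agree:
              return "exclude"
          return "no consensus; resubmit next round"

      for api, votes in ratings.items():
          print(api, "->", round_outcome(votes))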

  17. Do screencasts help to revise prerequisite mathematics? An investigation of student performance and perception

    NASA Astrophysics Data System (ADS)

    Loch, Birgit; Jordan, Camilla R.; Lowe, Tim W.; Mestel, Ben D.

    2014-02-01

    Basic calculus skills that are prerequisites for advanced mathematical studies continue to be a problem for a significant proportion of higher education students. While there are many types of revision material that could be offered to students, in this paper we investigate whether short, narrated video recordings of mathematical explanations (screencasts) are a useful tool to enhance student learning when revisiting prerequisite topics. We report on the outcomes of a study that was designed both to measure change in student performance before and after watching screencasts, and to capture students' perception of the usefulness of screencasts in their learning. Volunteers were recruited from students enrolled on an entry module for the Mathematics Master of Science programme at the Open University to watch two screencasts sandwiched between two online calculus quizzes. A statistical analysis of student responses to the quizzes shows that screencasts can have a positive effect on student performance. Further analysis of student feedback shows that student confidence was increased by watching the screencasts. Student views on the value of screencasts for their learning indicated that they appreciated being able to: watch a problem being solved and explained by an experienced mathematician; hear the motivation for a particular problem-solving approach; and engage more readily with the material being presented, thereby retaining it more easily. The positive student views and impact on student scores indicate that short screencasts could play a useful role in revising prerequisite mathematics.
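
    A minimal sketch of the kind of before/after comparison described, not the authors' actual analysis, might use a paired t-test on pre- and post-screencast quiz scores for the same students; the scores below are invented.

      from scipy import stats

      # Invented quiz scores for the same ten students, before and after the screencasts
      pre = [4, 5, 3, 6, 5, 4, 7, 5, 4, 6]
      post = [6, 6, 4, 7, 6, 5, 8, 6, 5, 7]

      # Paired t-test: did scores change across matched before/after observations?
      t_stat, p_value = stats.ttest_rel(post, pre)
      print(f"t = {t_stat:.2f}, p = {p_value:.4f}")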

  18. Strategic Regulatory Evaluation and Endorsement of the Hollow Fiber Tuberculosis System as a Novel Drug Development Tool.

    PubMed

    Romero, Klaus; Clay, Robert; Hanna, Debra

    2015-08-15

    The first nonclinical drug development tool (DDT) advanced by the Critical Path to TB Drug Regimens (CPTR) Initiative through a regulatory review process has been endorsed by leading global regulatory authorities. DDTs with demonstrated predictive accuracy for clinical and microbiological outcomes are needed to support decision making. Regulatory endorsement of these DDTs is critical for drug developers, as it promotes confidence in their use in Investigational New Drug and New Drug Application filings. The in vitro hollow fiber system model of tuberculosis (HFS-TB) recapitulates the concentration-time profiles (exposures) observed in patients for single drugs and combinations, allowing exposure measures to be evaluated for their ability to kill Mycobacterium tuberculosis under different physiologic conditions. Monte Carlo simulations make this quantitative output useful for informing susceptibility breakpoints, dosage, and optimal combination regimens in patients, and for designing nonclinical experiments in animal models. The Pre-Clinical and Clinical Sciences Working Group within CPTR executed an evidence-based evaluation of the HFS-TB for predictive accuracy. This extensive effort was enabled through the collaboration of subject matter experts representing the pharmaceutical industry, academia, product development partnerships, and regulatory authorities including the Food and Drug Administration (FDA) and the European Medicines Agency (EMA). A comprehensive analysis plan following the regulatory guidance documents for DDT qualification was developed, followed by individual discussions with the FDA and the EMA. The results from the quantitative analyses were submitted to both agencies in pursuit of regulatory DDT endorsement. The EMA Qualification Opinion for the HFS-TB DDT was published on 26 January 2015 (available at: http://www.ema.europa.eu/ema/index.jsp?curl=pages/regulation/document_listing/document_listing_000319.jsp). © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
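
    To illustrate the kind of Monte Carlo step described (simulating patient-to-patient variability in drug exposure to estimate how often a pharmacodynamic target is attained), here is a minimal sketch; the distribution, target, and MIC value are all invented, not those used by CPTR.

      import random

      random.seed(0)

      # Simulate AUC for many virtual patients and ask how often AUC/MIC exceeds
      # an invented target of the sort an exposure-response analysis (e.g., one
      # fitted in a hollow fiber system) might supply.
      N = 10_000
      target_auc_mic = 25.0   # hypothetical pharmacodynamic target
      mic = 0.5               # hypothetical MIC (mg/L)

      hits = 0
      for _ in range(N):
          auc = random.lognormvariate(3.0, 0.4)   # hypothetical AUC distribution (mg*h/L)
          if auc / mic >= target_auc_mic:
              hits += 1

      print(f"Probability of target attainment: {hits / N:.1%}")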

  19. 46 CFR 162.039-7 - Procedure for listing and labeling.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., Marine Type § 162.039-7 Procedure for listing and labeling. (a) Manufacturers having models of..., and testing necessary for such listing and labeling. All costs in connection with the examinations...

  20. Building on IUCN regional red lists to produce lists of species of conservation priority: a model with Irish bees.

    PubMed

    Fitzpatrick, Una; Murray, Tomás E; Paxton, Robert J; Brown, Mark J F

    2007-10-01

    A World Conservation Union (IUCN) regional red list is an objective assessment of regional extinction risk and is not the same as a list of conservation priority species. Recent research reveals the widespread, but incorrect, assumption that IUCN Red List categories represent a hierarchical list of priorities for conservation action. We developed a simple eight-step priority-setting process and applied it to the conservation of bees in Ireland. Our model is based on the national red list but also considers the global significance of the national population; the conservation status at global, continental, and regional levels; and key biological, economic, and societal factors; and it is compatible with existing conservation agreements and legislation. Throughout Ireland, almost one-third of the bee fauna is threatened (30 of 100 species), but our methodology resulted in a reduced list of only 17 priority species. We did not use the priority species list to broadly assign species to categories of conservation action; instead, we indicated the individual action required for each threatened, near-threatened, and data-deficient species on the national red list, based on the IUCN's conservation-actions template file. Priority species lists will strongly influence the prioritization of conservation actions at national levels, but action should not be exclusive to listed species, nor will all species on such a list necessarily require immediate action. Our method is transparent, reproducible, and readily applicable to other taxa and regions.
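
    A schematic of the kind of stepwise filtering such a process implies (not the authors' actual eight steps) might look like the following; the species names, flags, and combination rule are invented for illustration.

      # Hypothetical species records: red-list category plus extra criteria of the
      # kind the paper layers on top (global significance, legal protection, ...).
      species = [
          {"name": "species A", "red_list": "EN", "globally_significant": True,  "legislation": False},
          {"name": "species B", "red_list": "VU", "globally_significant": False, "legislation": False},
          {"name": "species C", "red_list": "NT", "globally_significant": True,  "legislation": True},
      ]

      THREATENED = {"CR", "EN", "VU"}

      def is_priority(sp):
          """Invented rule: threatened AND (globally significant OR legally protected)."""
          return sp["red_list"] in THREATENED and (
              sp["globally_significant"] or sp["legislation"]
          )

      priority_list = [sp["name"] for sp in species if is_priority(sp)]
      print(priority_list)   # -> ['species A']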
