Sample records for "represent important tools"

  1. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
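    The static elements described above are largely breakpoint tables with interpolation. As a rough illustration (not taken from the standard itself), the Python sketch below implements a 1-D linear table lookup of the kind a DAVE-ML aerodynamic model encodes; the angle-of-attack breakpoints and lift-coefficient values are invented for illustration.

```python
# A minimal sketch of the kind of static element a DAVE-ML model encodes:
# a breakpoint table with 1-D linear interpolation. The breakpoints and
# coefficient values below are hypothetical, not from any real model.

def interp1(breakpoints, values, x):
    """Linearly interpolate values over breakpoints, clamping at the ends."""
    if x <= breakpoints[0]:
        return values[0]
    if x >= breakpoints[-1]:
        return values[-1]
    for i in range(len(breakpoints) - 1):
        x0, x1 = breakpoints[i], breakpoints[i + 1]
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return values[i] + t * (values[i + 1] - values[i])

# Hypothetical lift-coefficient table indexed by angle of attack in degrees.
alpha_bp = [-4.0, 0.0, 4.0, 8.0, 12.0]
cl_vals = [-0.2, 0.2, 0.6, 1.0, 1.2]

print(interp1(alpha_bp, cl_vals, 2.0))  # halfway between 0.2 and 0.6, about 0.4
```

    A real DAVE-ML file would carry such tables as XML together with variable definitions and checkcase data for the automatic verification the abstract mentions.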

  2. Update 76: Selected Recent Works in the Social Sciences.

    ERIC Educational Resources Information Center

    Pike, Mary L., Ed.; Lusignan, Louise, Ed.

    This is a selected bibliography of current reference and acquisition tools in the social sciences. The tools include sourcebooks, dictionaries, indexes, conference proceedings, special bibliographies, directories, research reports, and journals. Most citations represent works published since 1970 and new editions of important earlier works.…

  3. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    ERIC Educational Resources Information Center

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  4. A new framework for modeling decentralized low impact developments using Soil and Water Assessment Tool

    USDA-ARS?s Scientific Manuscript database

    Assessing the performance of Low Impact Development (LID) practices at a catchment scale is important in managing urban watersheds. Few modeling tools exist that are capable of explicitly representing the hydrological mechanisms of LIDs while considering the diverse land uses of urban watersheds. ...

  5. Literacy Program. National Issues Forums Special Report.

    ERIC Educational Resources Information Center

    National Issues Forums, Dayton, OH.

    In the spring of 1988, 33 representatives from 20 institutions or organizations sponsoring National Issues Forum (NIF) literacy programs attended a national conference in Washington, D.C. Throughout the conference, representatives from the organizations sponsoring NIF literacy programs made statements on the importance of NIF as a tool for…

  6. An Overview of Public Access Computer Software Management Tools for Libraries

    ERIC Educational Resources Information Center

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  7. EPA’s Experimental Stream Facility: Design and Research Supporting Watershed Management

    EPA Science Inventory

    The EPA’s Experimental Stream Facility (ESF) represents an important tool in research that is underway to further understanding of the relative importance of stream ecosystems and the services they provide for effective watershed management. The ESF is operated under the goal of ...

  8. Robust detection of rare species using environmental DNA: The importance of primer specificity

    Treesearch

    Taylor M. Wilcox; Kevin S. McKelvey; Michael K. Young; Stephen F. Jane; Winsor H. Lowe; Andrew R. Whiteley; Michael K. Schwartz

    2013-01-01

    Environmental DNA (eDNA) is being rapidly adopted as a tool to detect rare animals. Quantitative PCR (qPCR) using probe-based chemistries may represent a particularly powerful tool because of the method's sensitivity, specificity, and potential to quantify target DNA. However, there has been little work understanding the performance of these assays in the presence...

  9. Non-transferrin bound iron: a key role in iron overload and iron toxicity.

    PubMed

    Brissot, Pierre; Ropert, Martine; Le Lan, Caroline; Loréal, Olivier

    2012-03-01

    Besides transferrin iron, which represents the normal form of circulating iron, non-transferrin bound iron (NTBI) has been identified in the plasma of patients with various pathological conditions in which transferrin saturation is significantly elevated. To show that: i) NTBI is present not only during chronic iron overload disorders (hemochromatosis, transfusional iron overload) but also in miscellaneous diseases which are not primarily iron overloaded conditions; ii) this iron species represents a potentially toxic iron form due to its high propensity to induce reactive oxygen species and is responsible for cellular damage not only at the plasma membrane level but also towards different intracellular organelles; iii) the NTBI concept may be expanded to include intracytosolic iron forms which are not linked to ferritin, the major storage protein which exerts, at the cellular level, the same type of protective effect towards the intracellular environment as transferrin in the plasma. Plasma NTBI and especially labile plasma iron determinations represent an important new biological tool since elimination of this toxic iron species is a major therapeutic goal. The NTBI approach represents an important mechanistic concept for explaining cellular iron excess and toxicity and provides important new biochemical diagnostic tools. This article is part of a Special Issue entitled Transferrins: Molecular mechanisms of iron transport and disorders. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. CNTRO: A Semantic Web Ontology for Temporal Relation Inferencing in Clinical Narratives.

    PubMed

    Tao, Cui; Wei, Wei-Qi; Solbrig, Harold R; Savova, Guergana; Chute, Christopher G

    2010-11-13

    Using Semantic-Web specifications to represent temporal information in clinical narratives is an important step for temporal reasoning and answering time-oriented queries. Existing temporal models are either not compatible with the powerful reasoning tools developed for the Semantic Web, or designed only for structured clinical data and therefore are not ready to be applied to natural-language-based clinical narrative reports directly. We have developed a Semantic-Web ontology called the Clinical Narrative Temporal Relation Ontology (CNTRO). Using this ontology, temporal information in clinical narratives can be represented as RDF (Resource Description Framework) triples. More temporal information and relations can then be inferred by Semantic-Web based reasoning tools. Experimental results show that this ontology can represent temporal information in real clinical narratives successfully.
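    As a rough illustration of the triple-based inference described above, the Python sketch below stores hypothetical clinical events as (subject, predicate, object) tuples and derives new "before" relations by transitivity. The event names are invented, and the real ontology uses RDF with Semantic-Web reasoners rather than this hand-rolled closure.

```python
# Sketch of temporal inference over (subject, predicate, object) triples:
# derive new "before" facts by transitivity. Event names are hypothetical.

def infer_before(triples):
    """Compute the transitive closure of the 'before' relation."""
    before = {(s, o) for (s, p, o) in triples if p == "before"}
    changed = True
    while changed:
        changed = False
        for (a, b) in list(before):
            for (c, d) in list(before):
                if b == c and (a, d) not in before:
                    before.add((a, d))
                    changed = True
    return before

triples = [
    ("admission", "before", "surgery"),
    ("surgery", "before", "discharge"),
]
closure = infer_before(triples)
print(("admission", "discharge") in closure)  # True: inferred, not asserted
```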

  11. Importance and pitfalls of molecular analysis to parasite epidemiology.

    PubMed

    Constantine, Clare C

    2003-08-01

    Molecular tools are increasingly being used to address questions about parasite epidemiology. Parasites represent a diverse group and they might not fit traditional population genetic models. Testing hypotheses depends equally on correct sampling, appropriate tool and/or marker choice, appropriate analysis and careful interpretation. All methods of analysis make assumptions which, if violated, make the results invalid. Some guidelines to avoid common pitfalls are offered here.

  12. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used to satisfy the information needs of software maintainers; tool support is especially essential when maintaining large-scale legacy systems. Reverse engineering tools offer various kinds of capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  13. The effect of stereochemistry on the biological activity of natural phytotoxins, fungicides, insecticides and herbicides.

    PubMed

    Evidente, Antonio; Cimmino, Alessio; Andolfi, Anna

    2013-02-01

    Phytotoxins are secondary microbial metabolites that play an essential role in the development of disease symptoms induced by fungi on host plants. Although phytotoxins can cause extensive, and in some cases devastating, damage to agricultural crops, they can also represent an important tool to develop natural herbicides when produced by fungi and plants to inhibit the growth and spread of weeds. An alternative strategy to biologically control parasitic plants is based on the use of plant and fungal metabolites, which stimulate seed germination in the absence of the host plant. Nontoxigenic fungi also produce bioactive metabolites with potential fungicide and insecticide activity, and could be applied for crop protection. All these metabolites represent important tools to develop eco-friendly pesticides. This review deals with the relationships between the biological activity of some phytotoxins, seed germination stimulants, fungicides and insecticides, and their stereochemistry. Copyright © 2012 Wiley Periodicals, Inc.

  14. Engagement and Empowerment Through Self-Service.

    PubMed

    Endriss, Jason

    2016-01-01

    Self-service tools represent the next frontier for leave and disability. This article discusses several critical components of a successful leave and disability self-service tool. If given the proper investment and thoughtfully designed, self-service tools have the potential to augment an organization's existing interaction channels, improving the employee experience while delivering efficiencies for an administrative model. In an operating environment in which cost savings sometimes are at the expense of employee experience, such a win-win solution should not be taken lightly and, more importantly, should not be missed.

  15. Identification and characterization of miRNAs transcriptome in the South African abalone, Haliotis midae.

    PubMed

    Picone, Barbara; Rhode, Clint; Roodt-Wilding, Rouvay

    2017-02-01

    Aquatic animal diseases are one of the most important limitations to the growth of aquaculture. miRNAs represent an important class of small ncRNAs able to modulate host immune and stress responses. In Mollusca, a large phylum of invertebrates, miRNAs have been identified in several species. The current preliminary study identified known miRNAs from the South African abalone, Haliotis midae. The economic and ecological importance of abalone makes this species a suitable model for studying and understanding stress response in marine gastropods. Furthermore, the identification of miRNAs represents an alternative and powerful tool to combat infectious disease. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Biopathways representation and simulation on hybrid functional petri net.

    PubMed

    Matsuno, Hiroshi; Tanaka, Yukiko; Aoshima, Hitoshi; Doi, Atsushi; Matsui, Mika; Miyano, Satoru

    2011-01-01

    The following two matters should be resolved in order for biosimulation tools to be accepted by users in biology/medicine: (1) remove issues which are irrelevant to biological importance, and (2) allow users to represent biopathways intuitively and understand/manage easily the details of representation and simulation mechanism. From these criteria, we first define a novel notion of Petri net called Hybrid Functional Petri Net (HFPN). Then, we introduce a software tool, Genomic Object Net, for representing and simulating biopathways, which we have developed by employing the architecture of HFPN. In order to show the usefulness of Genomic Object Net for representing and simulating biopathways, we show two HFPN representations of gene regulation mechanisms of Drosophila melanogaster (fruit fly) circadian rhythm and apoptosis induced by Fas ligand. The simulation results of these biopathways are also correlated with biological observations. The software is available to academic users from http://www.GenomicObject.Net/.
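    To make the base formalism concrete, the Python sketch below steps a plain discrete Petri net; HFPN extends this idea with continuous places and functional arc weights. The toy gene/mRNA/protein net is hypothetical and not taken from the paper.

```python
# A plain discrete Petri net stepper, illustrating the base formalism that
# HFPN extends. The gene -> mRNA -> protein net below is a hypothetical toy.

def enabled(marking, transition):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking[p] >= w for p, w in transition["in"].items())

def fire(marking, transition):
    """Consume input tokens, produce output tokens; return a new marking."""
    m = dict(marking)
    for p, w in transition["in"].items():
        m[p] -= w
    for p, w in transition["out"].items():
        m[p] = m.get(p, 0) + w
    return m

marking = {"gene": 1, "mRNA": 0, "protein": 0}
transcribe = {"in": {"gene": 1}, "out": {"gene": 1, "mRNA": 1}}
translate = {"in": {"mRNA": 1}, "out": {"mRNA": 1, "protein": 1}}

marking = fire(marking, transcribe)      # gene is read, an mRNA token appears
if enabled(marking, translate):
    marking = fire(marking, translate)   # mRNA is read, a protein token appears
print(marking)  # {'gene': 1, 'mRNA': 1, 'protein': 1}
```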

  17. A computer-aided ECG diagnostic tool.

    PubMed

    Oweis, Rami; Hijazi, Lily

    2006-03-01

    Jordan lacks companies that provide local medical facilities with products that are of help in daily performed medical procedures. Because of this, the country imports most of these expensive products. Consequently, a local interest in producing such products has emerged and resulted in serious research efforts in this area. The main goal of this paper is to provide local (the north of Jordan) clinics with a computer-aided electrocardiogram (ECG) diagnostic tool in an attempt to reduce time and work demands for busy physicians especially in areas where only one general medicine doctor is employed and a bulk of cases are to be diagnosed. The tool was designed to help in detecting heart defects such as arrhythmias and heart blocks using ECG signal analysis depending on the time-domain representation, the frequency-domain spectrum, and the relationship between them. The application studied here represents a state-of-the-art ECG diagnostic tool that was designed, implemented, and tested in Jordan to serve a wide spectrum of the population from poor families. The results of applying the tool on a randomly selected representative sample showed about 99% matching with those results obtained at specialized medical facilities. Costs, ease of interface, and accuracy indicated the usefulness of the tool and its use as an assisting diagnostic tool.
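    The time-domain/frequency-domain relationship the tool relies on can be sketched with a naive discrete Fourier transform. The "signal" below is just a 5 Hz sinusoid sampled at 100 Hz, purely for illustration; a real ECG pipeline would use an FFT and clinically validated feature detection.

```python
import cmath

# Naive O(n^2) DFT giving the magnitude spectrum of a sampled signal.
# The synthetic input is a 5 Hz sinusoid, standing in for an ECG trace.

def dft_magnitudes(samples):
    """Return |X[k]| for each frequency bin k of the input samples."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

fs = 100                                    # sampling rate, Hz
signal = [cmath.cos(2 * cmath.pi * 5 * t / fs).real for t in range(fs)]
mags = dft_magnitudes(signal)
peak_bin = max(range(fs // 2), key=lambda k: mags[k])
print(peak_bin)  # 5 -> a 5 Hz component, since bin width is fs/n = 1 Hz
```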

  18. Scientists' Views about Communication Training

    ERIC Educational Resources Information Center

    Besley, John C.; Dudo, Anthony; Storksdieck, Martin

    2015-01-01

    This study assesses how scientists think about science communication training based on the argument that such training represents an important tool in improving the quality of interactions between scientists and the public. It specifically focuses on training related to five goals, including views about training to make science messages…

  19. Use of Gene Expression Changes in Blood to Elucidate Mechanistic Indicators of Childhood Asthma (MICA)

    EPA Science Inventory

    Risk assessment increasingly relies more heavily on mode of action, thus the identification of human bioindicators of disease becomes all the more important. Genomic methods represent a tool for both mode of action determination and bioindicator identification. The Mechanistic In...

  20. Materials science tools for regenerative medicine

    NASA Astrophysics Data System (ADS)

    Richardson, Wade Nicholas

    Regenerative therapies originating from recent technological advances in biology could revolutionize medicine in the coming years. In particular, the advent of human pluripotent stem cells (hPSCs), with their ability to become any cell in the adult body, has opened the door to an entirely new way of treating disease. However, currently these medical breakthroughs remain only a promise. To make them a reality, new tools must be developed to surmount the new technical hurdles that have arisen from the dramatic departure from convention that this field represents. The collected work presented in this dissertation covers several projects that seek to apply the skills and knowledge of materials science to this tool-synthesizing effort. The work is divided into three chapters. The first deals with our work to apply Raman spectroscopy, a tool widely used for materials characterization, to degeneration in cartilage. We have shown that Raman can effectively distinguish the matrix material of healthy and diseased tissue. The second area of work covered is the development of a new confocal image analysis method for studying hPSC colonies that are chemically confined to uniform growth regions. This tool has important application in understanding the heterogeneity that may slow the development of hPSC-based treatment, as well as the use of such confinement in the eventual large-scale manufacture of hPSCs for therapeutic use. Third, the use of structural templating in tissue engineering scaffolds is detailed. We have utilized templating to tailor scaffold structures for engineering of constructs mimicking two tissues: cartilage and lung. The work described here represents several important early steps towards large goals in regenerative medicine. These tools show a great deal of potential for accelerating progress in this field that seems on the cusp of helping a great many people with otherwise incurable disease.

  1. Single-Subject Experimental Design for Evidence-Based Practice

    ERIC Educational Resources Information Center

    Byiers, Breanne J.; Reichle, Joe; Symons, Frank J.

    2012-01-01

    Purpose: Single-subject experimental designs (SSEDs) represent an important tool in the development and implementation of evidence-based practice in communication sciences and disorders. The purpose of this article is to review the strategies and tactics of SSEDs and their application in speech-language pathology research. Method: The authors…

  2. Secondary Data Analysis: An Important Tool for Addressing Developmental Questions

    ERIC Educational Resources Information Center

    Greenhoot, Andrea Follmer; Dowsett, Chantelle J.

    2012-01-01

    Existing data sets can be an efficient, powerful, and readily available resource for addressing questions about developmental science. Many of the available databases contain hundreds of variables of interest to developmental psychologists, track participants longitudinally, and have representative samples. In this article, the authors discuss the…

  3. Using Diagrams as Tools for the Solution of Non-Routine Mathematical Problems

    ERIC Educational Resources Information Center

    Pantziara, Marilena; Gagatsis, Athanasios; Elia, Iliada

    2009-01-01

    The Mathematics education community has long recognized the importance of diagrams in the solution of mathematical problems. Particularly, it is stated that diagrams facilitate the solution of mathematical problems because they represent problems' structure and information (Novick & Hurley, 2001; Diezmann, 2005). Novick and Hurley were the first…

  4. An interactive GIS based tool on Chinese history and its topography

    NASA Astrophysics Data System (ADS)

    Konda, Ashish Reddy

    The aim of the thesis is to demonstrate how China was attacked by foreign powers, the rise and fall of the empires, the border conflicts with India, Russia, Vietnam and territorial disputes in the South China Sea. This thesis is focused on creating a GIS tool showcasing modern Chinese history, which includes the major wars fought during that period. This tool is developed using the features of Google Maps that shows the location of the wars. The topography of China is also represented on the interactive Google Map by creating layers for rivers, mountain ranges and deserts. The provinces with the highest population are also represented on the Google Map with circles. The application also shows the historical events in chronological order using a timeline feature. This has been implemented using JQuery, JavaScript, HTML5 and CSS. Chinese culture and biographies of important leaders are also included in this thesis, which is embedded with pictures and videos.

  5. An efficient reliability algorithm for locating design point using the combination of importance sampling concepts and response surface method

    NASA Astrophysics Data System (ADS)

    Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin

    2017-06-01

    Monte Carlo simulation (MCS) is a useful tool for computation of probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. Response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm, employing the combination of importance sampling, as a class of MCS, and RSM is proposed. In the proposed algorithm, analysis starts with importance sampling concepts and uses a proposed two-step updating rule for the design point. This part finishes after a small number of samples are generated. Then RSM starts to work using Bucher experimental design, with the last design point and a proposed effective length as the center point and radius of Bucher's approach, respectively. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules are shown.
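    The importance-sampling half of such an algorithm can be illustrated with a toy sketch: sample from a density centered on the design point and reweight each failure sample by the ratio of the true to the sampling density. The limit state g(x) = b - x with standard normal x is a textbook example, not the problem solved in the paper.

```python
import math
import random

# Toy importance-sampling estimate of a failure probability, with the
# sampling density centered on an (assumed known) design point x* = b.

def normal_pdf(x, mu):
    """Standard-deviation-1 normal density centered at mu."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

def failure_prob_is(b, n=100_000, seed=0):
    """Estimate P(g(X) < 0) for g(x) = b - x, X ~ N(0, 1), sampling N(b, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(b, 1.0)               # sample near the design point
        if b - x < 0:                       # failure event g(x) < 0
            total += normal_pdf(x, 0.0) / normal_pdf(x, b)  # importance weight
    return total / n

# The exact answer is 1 - Phi(b); for b = 3 that is about 1.35e-3.
print(failure_prob_is(3.0))
```

    Centering the samples on the design point makes almost every draw informative, which is why far fewer samples are needed than with crude MCS.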

  6. Robust retention and transfer of tool construction techniques in chimpanzees (Pan troglodytes).

    PubMed

    Vale, Gill L; Flynn, Emma G; Pender, Lydia; Price, Elizabeth; Whiten, Andrew; Lambeth, Susan P; Schapiro, Steven J; Kendal, Rachel L

    2016-02-01

    Long-term memory can be critical to a species' survival in environments with seasonal and even longer-term cycles of resource availability. The present, longitudinal study investigated whether complex tool behaviors used to gain an out-of-reach reward, following a hiatus of about 3 years and 7 months since initial experiences with a tool use task, were retained and subsequently executed more quickly by experienced than by naïve chimpanzees. Ten of the 11 retested chimpanzees displayed impressive long-term procedural memory, creating elongated tools using the same methods employed years previously, either combining 2 tools or extending a single tool. The complex tool behaviors were also transferred to a different task context, showing behavioral flexibility. This represents some of the first evidence for appreciable long-term procedural memory, and improvements in the utility of complex tool manufacture in chimpanzees. Such long-term procedural memory and behavioral flexibility have important implications for the longevity and transmission of behavioral traditions. (c) 2016 APA, all rights reserved.

  7. Robust Retention and Transfer of Tool Construction Techniques in Chimpanzees (Pan troglodytes)

    PubMed Central

    Vale, Gill L.; Flynn, Emma G.; Pender, Lydia; Price, Elizabeth; Whiten, Andrew; Lambeth, Susan P.; Schapiro, Steven J.; Kendal, Rachel L.

    2016-01-01

    Long-term memory can be critical to a species’ survival in environments with seasonal and even longer-term cycles of resource availability. The present, longitudinal study investigated whether complex tool behaviors used to gain an out-of-reach reward, following a hiatus of about 3 years and 7 months since initial experiences with a tool use task, were retained and subsequently executed more quickly by experienced than by naïve chimpanzees. Ten of the 11 retested chimpanzees displayed impressive long-term procedural memory, creating elongated tools using the same methods employed years previously, either combining 2 tools or extending a single tool. The complex tool behaviors were also transferred to a different task context, showing behavioral flexibility. This represents some of the first evidence for appreciable long-term procedural memory, and improvements in the utility of complex tool manufacture in chimpanzees. Such long-term procedural memory and behavioral flexibility have important implications for the longevity and transmission of behavioral traditions. PMID:26881941

  8. Books and National Development. Seminar Report April 27-29, 1968, Academy House, Seoul.

    ERIC Educational Resources Information Center

    Korean Publishers Association, Seoul.

    Representatives from Korea, China, Indonesia, Japan, Thailand, the United States and Southeast Asian Ministers of Education (SEAMES) attended an international seminar to reaffirm the importance of books as national development tools, to seek measures for having it reflected in the national policies and to promote international cooperation in book…

  9. Employer Branding - Source of Competitiveness of the Industrial Plants

    NASA Astrophysics Data System (ADS)

    Babčanová, Dagmar; Babčan, Miroslav; Odlerová, Eva

    2010-01-01

    The paper deals with the concept of employer branding, which is very important to follow, as an employer brand represents the core values of an organization. Organizations considered good employers have a strong identity and a positive image in the marketplace. To be successful, organizations need to attract the employee market. Marketing tools associated with Brand Management have been applied by HR (Human Resources) in order to attract, engage and retain employees in the same way as marketing applies such tools to attract and retain customers.

  10. A high-level object-oriented model for representing relationships in an electronic medical record.

    PubMed Central

    Dolin, R. H.

    1994-01-01

    The importance of electronic medical records to improve the quality and cost-effectiveness of medical care continues to be realized. This growing importance has spawned efforts at defining the structure and content of medical data, which is heterogeneous, highly inter-related, and complex. Computer-assisted data modeling tools have greatly facilitated the process of representing medical data; however, the complex inter-relationships of medical information can result in data models that are large and cumbersome to manipulate and view. This report presents a high-level object-oriented model for representing the relationships between objects or entities that might exist in an electronic medical record. By defining the relationship between objects at a high level and providing for inheritance, this model enables relating any medical entity to any other medical entity, even though the relationships were not directly specified or known during data model design. PMID:7949981

  11. Modeling greenhouse gas emissions from dairy farms.

    PubMed

    Rotz, C Alan

    2017-11-15

    Dairy farms have been identified as an important source of greenhouse gas emissions. Within the farm, important emissions include enteric CH4 from the animals, CH4 and N2O from manure in housing facilities during long-term storage and during field application, and N2O from nitrification and denitrification processes in the soil used to produce feed crops and pasture. Models using a wide range in level of detail have been developed to represent or predict these emissions. They include constant emission factors, variable process-related emission factors, empirical or statistical models, mechanistic process simulations, and life cycle assessment. To fully represent farm emissions, models representing the various emission sources must be integrated to capture the combined effects and interactions of all important components. Farm models have been developed using relationships across the full scale of detail, from constant emission factors to detailed mechanistic simulations. Simpler models, based upon emission factors and empirical relationships, tend to provide better tools for decision support, whereas more complex farm simulations provide better tools for research and education. To look beyond the farm boundaries, life cycle assessment provides an environmental accounting tool for quantifying and evaluating emissions over the full cycle, from producing the resources used on the farm through processing, distribution, consumption, and waste handling of the milk and dairy products produced. Models are useful for improving our understanding of farm processes and their interacting effects on greenhouse gas emissions. Through better understanding, they assist in the development and evaluation of mitigation strategies for reducing emissions and improving overall sustainability of dairy farms. The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/3.0/).
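    The simplest model class described above, constant emission factors, can be sketched in a few lines. The per-cow and per-hectare factor values below are illustrative assumptions, not published coefficients; the GWP100 weights are the IPCC AR5 values.

```python
# A constant-emission-factor sketch: whole-farm CH4 and N2O totals from
# per-cow and per-hectare factors. The factor values are assumed for
# illustration; only the GWP100 weights (IPCC AR5) are real constants.

def farm_emissions(n_cows, hectares,
                   enteric_ch4_per_cow=120.0,   # kg CH4/cow/year (assumed)
                   manure_ch4_per_cow=25.0,     # kg CH4/cow/year (assumed)
                   soil_n2o_per_ha=2.5):        # kg N2O/ha/year (assumed)
    """Return annual emissions in kg CO2-equivalent using GWP100 weights."""
    GWP_CH4, GWP_N2O = 28.0, 265.0              # IPCC AR5 100-year GWPs
    ch4 = n_cows * (enteric_ch4_per_cow + manure_ch4_per_cow)
    n2o = hectares * soil_n2o_per_ha
    return ch4 * GWP_CH4 + n2o * GWP_N2O

print(farm_emissions(100, 80))  # kg CO2e/year for a hypothetical 100-cow farm
```

    A process-based model would replace each constant factor with a simulated flux that responds to diet, climate, and manure management.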

  12. PredictSNP: Robust and Accurate Consensus Classifier for Prediction of Disease-Related Mutations

    PubMed Central

    Bendl, Jaroslav; Stourac, Jan; Salanda, Ondrej; Pavelka, Antonin; Wieben, Eric D.; Zendulka, Jaroslav; Brezovsky, Jan; Damborsky, Jiri

    2014-01-01

    Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for the prediction of the effects of mutations on protein function are very important for analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement is hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicities, inconsistencies and mutations previously used in the training of evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into a consensus classifier PredictSNP, resulting in significantly improved prediction performance while at the same time returning results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp. PMID:24453961
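    The consensus idea can be sketched as a simple vote over per-tool predictions. The tool names below come from the abstract, but the predictions are invented, and the real classifier weights tools by their measured accuracy rather than voting uniformly.

```python
# Sketch of consensus classification: uniform majority vote over per-tool
# {'deleterious', 'neutral'} calls. The individual predictions are invented.

def consensus(predictions):
    """Majority vote; returns 'unknown' on a tie."""
    votes = sum(1 if p == "deleterious" else -1 for p in predictions.values())
    if votes == 0:
        return "unknown"
    return "deleterious" if votes > 0 else "neutral"

preds = {
    "MAPP": "deleterious",
    "PhD-SNP": "deleterious",
    "PolyPhen-2": "deleterious",
    "SIFT": "neutral",
    "SNAP": "deleterious",
    "PANTHER": "neutral",
}
print(consensus(preds))  # deleterious (4 of 6 tools agree)
```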

  13. Assessing the Operational Effectiveness of a Small Surface Combat Ship in an Anti-Surface Warfare Environment

    DTIC Science & Technology

    2013-06-01

    ...realistically representing the world in a simulation environment. A screenshot of the combat model used for this research is shown below. There are six... changes in use of technology (Ryan & Jons, 1992). Cost effectiveness and operational effectiveness are important, and it is extremely hard to achieve... effectiveness of ships using simulation and analytical models, to create a ship synthesis model, and most importantly, to develop decision-making tools...

  14. Observing and understanding the ultrafast photochemistry in small molecules: applications to sunscreens.

    PubMed

    Baker, Lewis A; Stavros, Vasilios G

    2016-09-01

    In this review, we discuss the importance of biological and artificial photoprotection against overexposure to harmful ultraviolet radiation. Transient electronic and transient vibrational absorption spectroscopies are highlighted as important tools for understanding energy transfer in small molecules, with a focus on their application to commercial sunscreens, and representative examples are given. Oxybenzone, a common ingredient in commercial sunscreens, and sinapoyl malate, a biological sunscreen found in plant leaves, are presented as case studies.

  15. Empathic engineering: helping deliver dignity through design

    PubMed Central

    Hosking, Ian; Cornish, Katie; Bradley, Mike; Clarkson, P. John

    2015-01-01

    Abstract Dignity is a key value within healthcare. Technology is also recognized as being a fundamental part of healthcare delivery, but also a potential cause of dehumanization of the patient. Therefore, understanding how medical devices can be designed to help deliver dignity is important. This paper explores the role of empathy tools as a way of engendering empathy in engineers and designers to enable them to design for dignity. A framework is proposed that makes the link between empathy tools and outcomes of feelings of dignity. It represents a broad systems view that provides a structure for reviewing the evidence for the efficacy of empathy tools and also how dignity can be systematically understood for particular medical devices. PMID:26453036

  16. Electromagnetic Launch Vehicle Fairing and Acoustic Blanket Model of Received Power Using FEKO

    NASA Technical Reports Server (NTRS)

    Trout, Dawn H.; Stanley, James E.; Wahid, Parveen F.

    2011-01-01

    Evaluating the impact of radio frequency transmission in vehicle fairings is important to sensitive spacecraft. This paper employs the Multilevel Fast Multipole Method (MLFMM) feature of a commercial electromagnetic tool to model the fairing electromagnetic environment in the presence of an internal transmitter. This work is an extension of the perfect electric conductor model that was used to represent the bare aluminum internal fairing cavity. The present fairing model includes typical acoustic blanketing commonly used in vehicle fairings. Representative material models within FEKO were successfully used to simulate the test case.

  17. Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu Dianliang; Zhu Hongmin; Shanghai Key Laboratory of Advance Manufacturing Environment

    Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operation. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment of automobile assembly are given. Then, a unified modeling method for the information expression and function realization of general tools and equipment is presented, and the handling methods for manual, semi-automatic and automatic tools and equipment are discussed. Finally, an application in the assembly simulation of the rear and front suspensions of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable to the interactive simulation of various tools and equipment, and can also be used to support assembly process planning in a virtual environment.

  18. A computational platform to maintain and migrate manual functional annotations for BioCyc databases.

    PubMed

    Walsh, Jesse R; Sen, Taner Z; Dickerson, Julie A

    2014-10-12

    BioCyc databases are an important resource for information on biological pathways and genomic data. Such databases represent the accumulation of biological data, some of which has been manually curated from the literature. An essential feature of these databases is continuing data integration as new knowledge is discovered. As functional annotations are improved, scalable methods are needed for curators to manage annotations without detailed knowledge of the specific design of the BioCyc database. We have developed CycTools, a software tool which allows curators to maintain functional annotations in a model organism database. This tool builds on existing software to improve and simplify the import of user-provided annotation data into BioCyc databases. Additionally, CycTools automatically resolves synonyms and alternate identifiers contained within the database into the appropriate internal identifiers. Automating steps in the manual data entry process can improve curation efforts for major biological databases. The functionality of CycTools is demonstrated by transferring GO term annotations from MaizeCyc to matching proteins in CornCyc, both maize metabolic pathway databases available at MaizeGDB, and by creating strain-specific databases for metabolic engineering.
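
    The synonym-resolution step can be pictured as a lookup from any known alias to the database's internal identifier. This is only a sketch under assumptions: the function name, table contents and identifier formats below are hypothetical illustrations, not CycTools' actual API.

```python
def resolve_identifier(external_id, synonym_table):
    """Map a user-supplied identifier (gene name, synonym or alternate
    accession) to the database-internal identifier, case-insensitively.
    Returns None when the identifier is unknown."""
    return synonym_table.get(external_id.lower())

# hypothetical synonym table: lowercase aliases -> internal frame IDs
synonyms = {
    "adh1": "GENE-1001",
    "alcohol dehydrogenase 1": "GENE-1001",
    "zm00001d012345": "GENE-1001",
}
print(resolve_identifier("ADH1", synonyms))  # GENE-1001
```

    Collapsing every alias to one internal identifier is what lets annotations supplied against external names land on the correct database object without manual matching by the curator.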

  19. Ergonomics for the inclusion of older workers in the knowledge workforce and a guidance tool for designers.

    PubMed

    Gonzalez, I; Morer, P

    2016-03-01

    The ageing of the population and the inverted population pyramid is bringing important changes to society as a whole. These changes are associated with the inclusion of an older workforce in knowledge work and the challenge they represent in adapting the work environment accordingly. In order to approach a more universal design of the work environment, industrial designers need support from user-sensitive inclusive design studies. While there are plenty of guidelines and tools containing relevant information, there is a need to develop more appropriate tools for Industrial Designers that cover the initial phase of the design process. This study provides a review of the available tools and guidelines and proposes a theoretical framework intended for developing a design guidance tool for inclusive workstation design. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  20. An Effective Educational Tool: Construction Kits for Fun and Meaningful Learning

    ERIC Educational Resources Information Center

    Somyürek, Sibel

    2015-01-01

    The integration of robotics in education is still relatively new and represents an important advance in education practices. So, this paper aims to share the results from the perspectives of both students and trainers in an experimental case research in which LEGO Mindstorms construction kits were used. Sixty-two students between the ages of 8 and…

  1. The Development and Application of Policy-Based Tools for Institutional Green Buildings

    ERIC Educational Resources Information Center

    Cupido, Anthony F.

    2010-01-01

    In 2008, APPA forwarded a Web-based survey on the author's behalf to all designated representatives of APPA member institutions. The purpose of the survey was to determine if institutional policies are an important criterion for an institution's sustainable building practices and the use of Leadership in Energy and Environmental Design (LEED[R]).…

  2. Narratives for Obesity: Effects of Weight Loss and Attribution on Empathy and Policy Support

    ERIC Educational Resources Information Center

    Thibodeau, Paul H.; Uri, Rachel; Thompson, Briana; Flusberg, Stephen J.

    2017-01-01

    Despite an urgent need to address the issue of obesity, little research has examined the psychological factors that influence support for obesity-related policy initiatives, which represent an important tool for addressing this complex health issue. In the present study, we measured the degree to which people supported obesity-related policy…

  3. A Computer System to Rate the Variety of Color in Drawings

    ERIC Educational Resources Information Center

    Kim, Seong-in; Hameed, Ibrahim A.

    2009-01-01

    For mental health professionals, art assessment is a useful tool for patient evaluation and diagnosis. Consideration of various color-related elements is important in art assessment. This correlational study introduces the concept of variety of color as a new color-related element of an artwork. This term represents a comprehensive use of color,…

  4. Implementation of channel-routing routines in the Water Erosion Prediction Project (WEPP) model

    Treesearch

    Li Wang; Joan Q. Wu; William J. Elliott; Shuhui Dun; Sergey Lapin; Fritz R. Fiedler; Dennis C. Flanagan

    2010-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based, continuous-simulation, watershed hydrology and erosion model. It is an important tool for water erosion simulation owing to its unique functionality in representing diverse landuse and management conditions. Its applicability is limited to relatively small watersheds since its current version does...

  5. Using the Leitz LMS 2000 for monitoring and improvement of an e-beam

    NASA Astrophysics Data System (ADS)

    Blaesing-Bangert, Carola; Roeth, Klaus-Dieter; Ogawa, Yoichi

    1994-11-01

    Kaizen--continuous improvement--is a philosophy practiced in Japan which is also becoming more and more important in Western companies. To implement this philosophy in the semiconductor industry, a high performance metrology tool is essential for determining the status of production quality periodically. An important prerequisite for statistical process control is the high stability of the metrology tool over several months or years; the tool-induced shift should be as small as possible. The pattern placement metrology tool Leitz LMS 2000 has been used in a major European mask house for several years now to qualify masks within the tightest specifications and to monitor the MEBES III and its cassettes. The mask shop's internal specification for the long-term repeatability of the pattern placement metrology tool is 19 nm instead of the 42 nm specified by the supplier of the tool. The process capability of the LMS 2000 over 18 months is represented by an average cpk value of 2.8 for orthogonality, 5.2 for x-scaling, and 3.0 for y-scaling. The process capability of the MEBES III and its cassettes was improved in the past years. For instance, 100% of the masks produced with a process tolerance of +/- 200 nm are now within this limit.

  6. Determining the relative importance of figures in journal articles to find representative images

    NASA Astrophysics Data System (ADS)

    Müller, Henning; Foncubierta-Rodríguez, Antonio; Lin, Chang; Eggel, Ivan

    2013-03-01

    When physicians are searching for articles in the medical literature, the images in the articles can help determine the relevance of the article content for a specific information need. A visual image representation can be an advantage in effectiveness (quality of found articles) and also in efficiency (speed of determining relevance or irrelevance), as many articles can likely be excluded much more quickly by looking at a few representative images. In domains such as medical information retrieval, being able to determine relevance quickly and accurately is an important criterion. This becomes even more important when small interfaces are used, as is frequently the case when mobile phones and tablets are used to access scientific data whenever information needs arise. Many figures are used in scientific articles, and particularly in the biomedical literature only a subset may be relevant for determining the relevance of a specific article to an information need. In many cases clinical images can be seen as more important for visual appearance than graphs or histograms, which require looking at the context for interpretation. To get a clearer idea of image relevance in articles, a user test was performed in which a physician classified the images of biomedical research articles into categories of importance; these categories can subsequently be used to evaluate algorithms that automatically select images as representative examples. The manual sorting by importance of the images in 50 BioMedCentral journal articles, each containing more than 8 figures, also allows several rules to be derived that determine how to choose images and how to develop algorithms for choosing the most representative images of specific texts. This article describes the user tests and can be a first important step toward evaluating automatic tools that select representative images for articles, and potentially also for images in other contexts, for example when representing patient records or other medical concepts, or when selecting images to represent RadLex terms in tutorials or interactive interfaces. This can help to make the image retrieval process more efficient and effective for physicians.

  7. Time series analysis of tool wear in sheet metal stamping using acoustic emission

    NASA Astrophysics Data System (ADS)

    Vignesh Shanbhag, V.; Pereira, P. Michael; Rolfe, F. Bernard; Arunachalam, N.

    2017-09-01

    Galling is an adhesive wear mode that often affects the lifespan of stamping tools. Since stamping tools represent significant economic cost, even a slight improvement in maintenance cost is of high importance for the stamping industry. In other manufacturing industries, online tool condition monitoring has been used to prevent tool wear-related failure. However, monitoring the acoustic emission signal from a stamping process is a non-trivial task since the acoustic emission signal is non-stationary and non-transient. There have been numerous studies examining acoustic emissions in sheet metal stamping, but very few have focused in detail on how the signals change as wear on the tool surface progresses prior to failure. In this study, time domain analysis was applied to the acoustic emission signals to extract features related to tool wear. To understand the wear progression, accelerated stamping tests were performed using a semi-industrial stamping setup which can perform clamping, piercing and stamping in a single cycle. The time domain features related to stamping were computed for the acoustic emission signal of each part. The sidewalls of the stamped parts were scanned using an optical profilometer to obtain profiles of the worn parts, which were qualitatively correlated with the acoustic emission signal. Based on the wear behaviour, the wear data can be divided into three stages: in the first stage, no wear is observed; in the second stage, adhesive wear is likely to occur; and in the third stage, severe abrasive plus adhesive wear is likely to occur. Scanning electron microscopy showed the formation of lumps on the stamping tool, which represents galling behavior. The correlation between the time domain features of the acoustic emission signal and the wear progression identified in this study lays the basis for tool diagnostics in the stamping industry.
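
    Time-domain features of the kind used for this sort of monitoring can be sketched as below. This is a generic illustration, assuming a common feature set (RMS, peak, crest factor, kurtosis); the paper's exact features are not specified in the abstract.

```python
import math

def time_domain_features(signal):
    """Common time-domain features for condition monitoring of a
    sampled signal: RMS, peak, crest factor and kurtosis."""
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak = max(abs(x) for x in signal)
    var = sum((x - mean) ** 2 for x in signal) / n
    kurt = (sum((x - mean) ** 4 for x in signal) / n) / var ** 2 if var else 0.0
    return {"rms": rms, "peak": peak, "crest": peak / rms, "kurtosis": kurt}

feats = time_domain_features([0.0, 1.0, -1.0, 0.5, -0.5])
print(round(feats["rms"], 4))  # 0.7071
```

    In wear studies of this kind, a drift in such per-part features over successive stamping cycles is what gets correlated against the profilometer measurements of the part sidewalls.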

  8. Systematic review and meta-analysis: tools for the information age.

    PubMed

    Weatherall, Mark

    2017-11-01

    The amount of available biomedical information is vast and growing. Natural limitations of the way clinicians and researchers approach this treasure trove of information comprise difficulties locating the information, and once located, cognitive biases may lead to inappropriate use of the information. Systematic reviews and meta-analyses represent important tools in the information age to improve knowledge and action. Systematic reviews represent a census approach to identifying literature to avoid non-response bias. They are a necessary prelude to producing combined quantitative summaries of associations or treatment effects. Meta-analysis comprises the arithmetical techniques for producing combined summaries from individual study reports. Careful, thoughtful and rigorous use of these tools is likely to enhance knowledge and action. Use of standard guidelines, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, or embedding these activities within collaborative groups such as the Cochrane Collaboration, are likely to lead to more useful systematic review and meta-analysis reporting. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
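
    The "arithmetical techniques for producing combined summaries" mentioned above can be illustrated with the standard inverse-variance weighted (fixed-effect) pooled estimate. The numbers below are made up for illustration and not tied to any particular review.

```python
import math

def fixed_effect_meta(effects, standard_errors):
    """Fixed-effect meta-analysis: weight each study's effect estimate
    by the inverse of its variance and return the pooled estimate and
    its standard error."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# three hypothetical studies: effect sizes with their standard errors
est, se = fixed_effect_meta([0.2, 0.5, 0.3], [0.1, 0.2, 0.15])
print(round(est, 3), round(se, 3))
```

    Note that more precise studies (smaller standard errors) dominate the pooled result, which is why the pooled standard error is always smaller than that of any individual study.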

  9. Intellectual Property: a powerful tool to develop biotech research

    PubMed Central

    Giugni, Diego; Giugni, Valter

    2010-01-01

    Summary Today biotechnology is perhaps the most important technology field because of the strong health and food implications. However, due to the nature of said technology, there is the need of a huge amount of investments to sustain the experimentation costs. Consequently, investors aim to safeguard as much as possible their investments. Intellectual Property, and in particular patents, has been demonstrated to actually constitute a powerful tool to help them. Moreover, patents represent an extremely important means to disclose biotechnology inventions. Patentable biotechnology inventions involve products as nucleotide and amino acid sequences, microorganisms, processes or methods for modifying said products, uses for the manufacture of medicaments, etc. There are several ways to protect inventions, but all follow the three main patentability requirements: novelty, inventive step and industrial application. PMID:21255349

  10. Potential application of influence diagram as a risk assessment tool in Brownfields sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Attoh-Okine, N.O.

    Brownfields are vacant, abandoned, or underutilized commercial and industrial sites and facilities where real or perceived environmental contamination is an obstacle to redevelopment. These sites are vacant because they often do not meet the strict remediation requirements of the Superfund Law. The sites are accessible locations with much of the infrastructure, albeit deteriorated, in place. Thus they also represent an opportunity to slow down suburban and rural sprawl. As a liability, the concern stems from the environmental liability of both known and unknown site contamination. Influence diagrams are tools used to represent complex decision problems based on incomplete and uncertain information from a variety of sources. Influence diagrams can be used to divide all uncertainties (Brownfields site infrastructure impact assessment) into subfactors until the level has been reached at which intuitive functions are most effective. Given the importance of the uncertainties and the utilities of the Brownfields infrastructure, the use of influence diagrams seems appropriate for representing and solving the risks involved in Brownfields infrastructure assessment.
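
    The decision calculation at the heart of an influence diagram is an expected-utility maximization over actions given uncertain chance nodes. The tiny sketch below is purely illustrative: the action names, probability and utility values are hypothetical, and a real Brownfields assessment would involve many linked chance and value nodes rather than one.

```python
def best_action(p_contaminated, utilities):
    """Pick the action with the highest expected utility given the
    probability that the site is contaminated.

    utilities: dict mapping action -> (utility if clean, utility if
    contaminated)."""
    best, best_eu = None, float("-inf")
    for action, (u_clean, u_dirty) in utilities.items():
        eu = (1 - p_contaminated) * u_clean + p_contaminated * u_dirty
        if eu > best_eu:
            best, best_eu = action, eu
    return best, best_eu

# hypothetical payoffs for a redevelopment decision
actions = {"redevelop": (100.0, -50.0), "walk_away": (0.0, 0.0)}
choice, eu = best_action(0.3, actions)
print(choice)
```

    Running the same calculation while varying `p_contaminated` shows the break-even contamination probability at which the preferred action flips, which is the kind of sensitivity question such diagrams are built to answer.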

  11. From Structure to Function: A Comprehensive Compendium of Tools to Unveil Protein Domains and Understand Their Role in Cytokinesis.

    PubMed

    Rincon, Sergio A; Paoletti, Anne

    2016-01-01

    Unveiling the function of a novel protein is a challenging task that requires careful experimental design. Yeast cytokinesis is a conserved process that involves modular structural and regulatory proteins. For such proteins, an important step is to identify their domains and structural organization. Here we briefly discuss a collection of methods commonly used for sequence alignment and prediction of protein structure that represent powerful tools for the identification of homologous domains and the design of structure-function approaches to test experimentally the function of multi-domain proteins such as those implicated in yeast cytokinesis.

  12. Data to knowledge: how to get meaning from your result.

    PubMed

    Berman, Helen M; Gabanyi, Margaret J; Groom, Colin R; Johnson, John E; Murshudov, Garib N; Nicholls, Robert A; Reddy, Vijay; Schwede, Torsten; Zimmerman, Matthew D; Westbrook, John; Minor, Wladek

    2015-01-01

    Structural and functional studies require the development of sophisticated 'Big Data' technologies and software to increase the knowledge derived and ensure reproducibility of the data. This paper presents summaries of the Structural Biology Knowledge Base, the VIPERdb Virus Structure Database, evaluation of homology modeling by the Protein Model Portal, the ProSMART tool for conformation-independent structure comparison, the LabDB 'super' laboratory information management system and the Cambridge Structural Database. These techniques and technologies represent important tools for the transformation of crystallographic data into knowledge and information, in an effort to address the problem of non-reproducibility of experimental results.

  13. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology

    PubMed Central

    Latendresse, Mario; Paley, Suzanne M.; Krummenacker, Markus; Ong, Quang D.; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M.; Caspi, Ron

    2016-01-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms. PMID:26454094

  14. RHydro - Hydrological models and tools to represent and analyze hydrological data in R

    NASA Astrophysics Data System (ADS)

    Reusser, Dominik; Buytaert, Wouter

    2010-05-01

    In hydrology, basic equations and procedures keep being implemented from scratch by scientists, with the potential for errors and inefficiency. The use of libraries can overcome these problems. Other scientific disciplines such as mathematics and physics have benefited significantly from such an approach, with freely available implementations of many routines. As an example, hydrological libraries could contain: major representations of hydrological processes such as infiltration, sub-surface runoff and routing algorithms; scaling functions, for instance to combine remote sensing precipitation fields with rain gauge data; data consistency checks; and performance measures. Here we present a beginning for such a library, implemented in the high-level data programming language R. Currently, Top-model, data import routines for WaSiM-ETH, as well as basic visualization and evaluation tools are implemented. The design is such that defining import scripts for additional models is sufficient to gain access to the full set of evaluation and visualization tools.
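
    One of the performance measures such a library would typically provide is the Nash-Sutcliffe efficiency (NSE), a standard goodness-of-fit score for hydrological simulations. RHydro is an R package; the Python sketch below is only an illustration of the measure itself, not of the package's API.

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the
    simulation is no better than the mean of the observations, and
    negative values mean it is worse."""
    mean_obs = sum(observed) / len(observed)
    residual = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - residual / variance

# simulated discharge close to the observed series scores near 1
print(round(nash_sutcliffe([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]), 2))  # 0.97
```

    Packaging such measures once, rather than re-deriving them per project, is exactly the kind of duplication-of-effort the library aims to remove.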

  15. Pharmaceutical representatives' beliefs and practices about their professional practice: a study in Sudan.

    PubMed

    Idris, K M; Mustafa, A F; Yousif, M A

    2012-08-01

    Pharmaceutical representatives are an important promotional tool for pharmaceutical companies. This cross-sectional, exploratory study aimed to determine pharmaceutical representatives' beliefs and practices about their professional practice in Sudan. A random sample of 160 pharmaceutical representatives were interviewed using a pretested questionnaire. The majority were male (84.4%) and had received training in professional sales skills (86.3%) and about the products being promoted (82.5%). Only 65.6% agreed that they provided full and balanced information about products. Not providing balanced information was attributed by 23.1% to doctors' lack of time. However, 28.1% confessed they sometimes felt like hiding unfavourable information, 21.9% were sometimes or always inclined to give untrue information to make sales and 66.9% considered free gifts as ethically acceptable. More attention is needed to dissemination of ethical codes of conduct and training about the ethics of drug promotion for pharmaceutical representatives in Sudan.

  16. Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2011-01-01

    Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to appropriately utilize these resources. Indeed, users of one popular dataset were generally found "not" to have modeled…

  17. Home-Based vs. Laboratory-Based Practical Activities in the Learning of Human Physiology: The Perception of Students

    ERIC Educational Resources Information Center

    Neves, Ben-Hur S.; Altermann, Caroline; Gonçalves, Rithiele; Lara, Marcus Vinícius; Mello-Carpes, Pâmela B.

    2017-01-01

    Different tools have been used to facilitate the teaching and learning process in different areas of knowledge. Practical activities represent a form of teaching in which students not only listen to theoretical concepts but are also able to link theory and practice, and their importance in the biological sciences is notable. Sometimes, however,…

  18. Lasers in automobile production

    NASA Astrophysics Data System (ADS)

    Pizzi, P.

    There is a trend in mechanical equipment to replace complicated mechanical components with electronics, especially microprocessors; in this context, laser technology represents an important new tool. The effects of laser technology can be seen in production systems concerned with cutting, welding, heat treatment, and the alloying of mechanical components. Applications in the automobile industry today are not very widespread and are concerned essentially with welding and cutting.

  19. An Interactive Iterative Method for Electronic Searching of Large Literature Databases

    ERIC Educational Resources Information Center

    Hernandez, Marco A.

    2013-01-01

    PubMed® is an on-line literature database hosted by the U.S. National Library of Medicine. Containing over 21 million citations for biomedical literature--both abstracts and full text--in the areas of the life sciences, behavioral studies, chemistry, and bioengineering, PubMed® represents an important tool for researchers. PubMed® searches return…

  20. Is the Oxygen Atom Static or Dynamic? The Effect of Generating Animations on Students' Mental Models of Atomic Structure

    ERIC Educational Resources Information Center

    Akaygun, Sevil

    2016-01-01

    Visualizing the chemical structure and dynamics of particles has been challenging for many students; therefore, various visualizations and tools have been used in chemistry education. For science educators, it has been important to understand how students visualize and represent particular phenomena--i.e., their mental models-- to design more…

  1. Assessment Innovation and Student Experience: A New Assessment Challenge and Call for a Multi-Perspective Approach to Assessment Research

    ERIC Educational Resources Information Center

    Bevitt, Sheena

    2015-01-01

    The impact of innovative assessment on student experience in higher education is a neglected research topic. This represents an important gap in the literature-given debate around the marketisation of higher education, international focus on student satisfaction measurement tools and political calls to put students at the heart of higher education…

  2. How well does the current metabolizable protein system account for protein supply and demand of beef females within extensive western grazing systems?

    USDA-ARS?s Scientific Manuscript database

    Extensive western beef livestock production systems within the Southern and Northern Plains and Pacific West combined represent 60% (approximately 17.5 million) of total beef cows in the United States. The beef NRC is an important tool and excellent resource for both professionals and producers to u...

  3. Rudiments of curvelet with applications

    NASA Astrophysics Data System (ADS)

    Zahra, Noor e.

    2012-07-01

    The curvelet transform is nowadays a favored tool for image processing. Edges are an important part of an image, and usually they are not straight lines. Curvelets prove to be very efficient in representing curve-like edges. In this chapter the application of curvelets is shown with examples such as seismic wave analysis, oil exploration, fingerprint identification, and biomedical images like mammography and MRI.

  4. Development strategy and process models for phased automation of design and digital manufacturing electronics

    NASA Astrophysics Data System (ADS)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    The strategy for assuring the quality of electronics is presented as most important. To provide quality, the sequence of processes is considered and modeled by a Markov chain. The improvement is distinguished by simple database means of design for manufacturing, enabling future step-by-step development. Phased automation of the design and digital manufacturing of electronics is proposed. The MatLab modelling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product in the sequence of processes, from individual processes up to the whole life cycle.

  5. Impact Of The Material Variability On The Stamping Process: Numerical And Analytical Analysis

    NASA Astrophysics Data System (ADS)

    Ledoux, Yann; Sergent, Alain; Arrieux, Robert

    2007-05-01

    The finite element simulation is a very useful tool in the deep drawing industry. It is used particularly for the development and validation of new stamping tools, and it allows cost and time for tooling design and set-up to be decreased. However, one of the most important difficulties in obtaining good agreement between the simulation and the real process comes from the definition of the numerical conditions (mesh, punch travel speed, limit conditions, ...) and the parameters which model the material behavior. Indeed, in the press shop, when the sheet set changes, a variation of the formed part geometry is often observed, reflecting the variability of the material properties between the different sets. This parameter probably represents one of the main sources of process deviation when the process is set up. That is why it is important to study the influence of material data variation on the geometry of a classical stamped part. The chosen geometry is an omega-shaped part because of its simplicity and because it is representative of parts in the automotive industry (car body reinforcement); moreover, it shows important springback deviations. An isotropic behaviour law is assumed. The impact of the statistical deviation of the three law coefficients characterizing the material, and of the friction coefficient, around their nominal values is tested. A Gaussian distribution is assumed, and the impact on the geometry variation is studied by FE simulation. Another approach is also envisaged, consisting of modeling the process variability by a mathematical model; then, as a function of the variability of the input parameters, an analytical model is defined which leads to the variability of the part geometry around the nominal shape. These two approaches allow the process capability to be predicted as a function of the material parameter variability.
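
    The statistical approach described above can be sketched as a Monte Carlo propagation: sample the material coefficients from Gaussian distributions around their nominal values and observe the spread of the predicted response. Everything below is a hypothetical stand-in; the surrogate response function replaces the actual FE springback simulation, and the parameter names and values are invented for illustration.

```python
import random
import statistics

def propagate_variability(nominal, cv, n_samples, response, seed=0):
    """Sample each parameter from a Gaussian centered on its nominal
    value (standard deviation = cv * nominal) and return the mean and
    standard deviation of the response over all samples."""
    rng = random.Random(seed)  # seeded for reproducibility
    outputs = []
    for _ in range(n_samples):
        sample = {k: rng.gauss(v, cv * v) for k, v in nominal.items()}
        outputs.append(response(sample))
    return statistics.mean(outputs), statistics.stdev(outputs)

def surrogate(p):
    # hypothetical springback response: grows with yield stress,
    # shrinks with the hardening coefficient (stand-in for the FE model)
    return 0.05 * p["sigma_y"] - 2.0 * p["n_hard"]

mean, sd = propagate_variability(
    {"sigma_y": 200.0, "n_hard": 0.2}, cv=0.03, n_samples=2000,
    response=surrogate)
print(round(mean, 2), round(sd, 2))
```

    Comparing the output spread `sd` against the part tolerance is the step that turns such input variability into a process-capability statement.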

  6. Nursing Minimum Data Set Based on EHR Archetypes Approach.

    PubMed

    Spigolon, Dandara N; Moro, Cláudia M C

    2012-01-01

    The establishment of a Nursing Minimum Data Set (NMDS) can facilitate the use of health information systems. Adopting such sets and representing them with archetypes is one way of developing and supporting health systems. The objective of this paper is to describe the definition of a minimum data set for nursing in endometriosis, represented with archetypes. The study was divided into two steps: defining the Nursing Minimum Data Set for endometriosis, and developing archetypes related to the NMDS. The nursing data set for endometriosis was represented in the form of an archetype, using the evaluation items of whole perception, organs and senses. This form of representation is an important tool for semantic interoperability and knowledge representation in health information systems.

  7. Nursing Minimum Data Set Based on EHR Archetypes Approach

    PubMed Central

    Spigolon, Dandara N.; Moro, Cláudia M.C.

    2012-01-01

    The establishment of a Nursing Minimum Data Set (NMDS) can facilitate the use of health information systems. Adopting such sets and representing them with archetypes is one way of developing and supporting health systems. The objective of this paper is to describe the definition of a minimum data set for nursing in endometriosis, represented with archetypes. The study was divided into two steps: defining the Nursing Minimum Data Set for endometriosis, and developing archetypes related to the NMDS. The nursing data set for endometriosis was represented in the form of an archetype, using the evaluation items of whole perception, organs and senses. This form of representation is an important tool for semantic interoperability and knowledge representation in health information systems. PMID:24199126

  8. SpineCreator: a Graphical User Interface for the Creation of Layered Neural Models.

    PubMed

    Cope, A J; Richmond, P; James, S S; Gurney, K; Allerton, D J

    2017-01-01

    There is a growing requirement in computational neuroscience for tools that permit collaborative model building, model sharing, and the combination of existing models into larger systems (multi-scale model integration), and that can simulate models using a variety of simulation engines and hardware platforms. Layered XML model specification formats solve many of these problems, but they are difficult to write and visualise without tools. Here we describe a new graphical software tool, SpineCreator, which facilitates the creation and visualisation of layered models of point spiking neurons or rate-coded neurons without the need for programming. We demonstrate the tool through the reproduction and visualisation of published models, and show simulation results using code generation interfaced directly into SpineCreator. As a unique application for the graphical creation of neural networks, SpineCreator represents an important step forward for neuronal modelling.

  9. Tool compounds robustly increase turnover of an artificial substrate by glucocerebrosidase in human brain lysates.

    PubMed

    Berger, Zdenek; Perkins, Sarah; Ambroise, Claude; Oborski, Christine; Calabrese, Matthew; Noell, Stephen; Riddell, David; Hirst, Warren D

    2015-01-01

    Mutations in glucocerebrosidase (GBA1) cause Gaucher disease and also represent a common risk factor for Parkinson's disease and Dementia with Lewy bodies. Recently, new tool molecules were described which can increase turnover of the artificial substrate 4MUG when incubated with mutant N370S GBA1 from human spleen. Here we show that these compounds exert a similar effect on the wild-type enzyme in a cell-free system. In addition, these tool compounds robustly increase turnover of 4MUG by GBA1 derived from human cortex, despite substantially lower glycosylation of GBA1 in human brain, suggesting that the degree of glycosylation is not important for compound binding. Surprisingly, these tool compounds failed to robustly alter GBA1 turnover of 4MUG in mouse brain homogenate. Our data raise the possibility that in vivo models with humanized glucocerebrosidase may be needed for efficacy assessments of such small molecules.

  10. Hydrological modelling in forested systems

    EPA Pesticide Factsheets

    This chapter provides a brief overview of forest hydrology modelling approaches for answering important global research and management questions. Many hundreds of hydrological models have been applied globally across multiple decades to represent and predict forest hydrological processes. The focus of this chapter is on process-based models and approaches, specifically 'forest hydrology models'; that is, physically based simulation tools that quantify compartments of the forest hydrological cycle. Physically based models can be considered those that describe the conservation of mass, momentum and/or energy.
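A physically based model in the sense used here conserves mass at each step; a minimal single-bucket sketch of a forest water balance (all parameter values invented for illustration, not from any published model) might look like:

```python
# Minimal single-bucket forest water-balance model (conservation of
# mass): storage change = precipitation - evapotranspiration - runoff.

def simulate(precip, et_demand, capacity=150.0, storage=75.0, k=0.05):
    """Daily time step, quantities in mm. Runoff is overflow above
    soil storage capacity plus a linear baseflow fraction k of
    storage; ET is limited by available water. Returns runoff."""
    runoff_series = []
    for p, et in zip(precip, et_demand):
        storage += p                          # rain enters the bucket
        et_actual = min(et, storage)          # ET limited by storage
        storage -= et_actual
        overflow = max(0.0, storage - capacity)
        storage -= overflow
        baseflow = k * storage
        storage -= baseflow
        runoff_series.append(overflow + baseflow)
    return runoff_series

precip = [10.0, 0.0, 25.0, 5.0, 0.0, 40.0, 0.0]   # mm/day
et = [3.0, 3.5, 2.0, 3.0, 4.0, 1.5, 3.5]          # mm/day
q = simulate(precip, et)
print([round(v, 1) for v in q])
```

Real forest hydrology models add compartments (canopy interception, snow, multiple soil layers) but follow the same bookkeeping: every millimetre of water entering a compartment must leave it or remain in storage.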

  11. The Promise of Whole Genome Pathogen Sequencing for the Molecular Epidemiology of Emerging Aquaculture Pathogens

    PubMed Central

    Bayliss, Sion C.; Verner-Jeffreys, David W.; Bartie, Kerry L.; Aanensen, David M.; Sheppard, Samuel K.; Adams, Alexandra; Feil, Edward J.

    2017-01-01

    Aquaculture is the fastest growing food-producing sector, and the sustainability of this industry is critical both for global food security and economic welfare. The management of infectious disease represents a key challenge. Here, we discuss the opportunities afforded by whole genome sequencing of bacterial and viral pathogens of aquaculture to mitigate disease emergence and spread. We outline, by way of comparison, how sequencing technology is transforming the molecular epidemiology of pathogens of public health importance, emphasizing the importance of community-oriented databases and analysis tools. PMID:28217117

  12. Understanding Preferences for Treatment After Hypothetical First-Time Anterior Shoulder Dislocation: Surveying an Online Panel Utilizing a Novel Shared Decision-Making Tool.

    PubMed

    Streufert, Ben; Reed, Shelby D; Orlando, Lori A; Taylor, Dean C; Huber, Joel C; Mather, Richard C

    2017-03-01

    Although surgical management of a first-time anterior shoulder dislocation (FTASD) can reduce the risk of recurrent dislocation, other treatment characteristics, costs, and outcomes are important to patients considering treatment options. While patient preferences, such as those elicited by conjoint analysis, have been shown to be important in medical decision-making, the magnitudes and effects of patient preferences in treating an FTASD are unknown. The purpose was to test a novel shared decision-making tool for a hypothetical FTASD. Specifically measured were the following: (1) importance of aspects of operative versus nonoperative treatment, (2) respondents' agreement with results generated by the tool, (3) willingness to share these results with physicians, and (4) association of results with choice of treatment after FTASD. Cross-sectional study; Level of evidence, 3. A tool was designed and tested using members of Amazon Mechanical Turk, an online panel. The tool included an adaptive conjoint analysis exercise, a method to understand individuals' perceived importance of the following attributes of treatment: (1) chance of recurrent dislocation, (2) cost, (3) short-term limits on shoulder motion, (4) limits on participation in high-risk activities, and (5) duration of physical therapy. Respondents then chose between operative and nonoperative treatment for a hypothetical shoulder dislocation. Overall, 374 of 501 (75%) respondents met the inclusion criteria; most were young, active males, and one-third reported a prior dislocation. From the conjoint analysis, risk of recurrent dislocation and cost of treatment were the most important attributes. A substantial majority agreed with the tool's ability to generate representative preferences and indicated that they would share these preferences with their physician. Importance of recurrence proved significantly predictive of respondents' treatment choices, independent of sex or age; however, activity level was important to previous dislocators. A total of 125 (55%) males and 33 (23%) females chose surgery after FTASD, as did 37% of previous dislocators compared with 45% of nondislocators. When given thorough information about the risks and benefits, respondents had strong preferences for operative treatment after an FTASD. Respondents agreed with the survey results and wanted to share the information with providers. Recurrence was the most important attribute and played a role in decisions about treatment.
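The adaptive conjoint analysis described above derives attribute importance from part-worth utility ranges; a minimal sketch with invented utilities (the attribute names follow the abstract, the numbers do not) is:

```python
# Standard conjoint-analysis importance calculation: each attribute's
# importance is its part-worth utility range divided by the sum of
# all ranges. The part-worth values below are invented for
# illustration only.

part_worths = {
    "recurrence risk":  {"low": 1.8, "high": -1.8},
    "cost":             {"low": 1.2, "high": -1.2},
    "motion limits":    {"short": 0.4, "long": -0.4},
    "activity limits":  {"none": 0.5, "strict": -0.5},
    "therapy duration": {"short": 0.3, "long": -0.3},
}

ranges = {a: max(u.values()) - min(u.values())
          for a, u in part_worths.items()}
total = sum(ranges.values())
importance = {a: r / total for a, r in ranges.items()}

for attr, imp in sorted(importance.items(), key=lambda kv: -kv[1]):
    print(f"{attr:16s} {imp:.1%}")
```

With these made-up utilities, recurrence risk dominates, mirroring the study's finding that recurrence and cost were the most important attributes.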

  13. Target loads of atmospheric sulfur deposition for the protection and recovery of acid-sensitive streams in the Southern Blue Ridge Province

    Treesearch

    Timothy Sullivan; Bernard Cosby; William Jackson

    2011-01-01

    An important tool in the evaluation of acidification damage to aquatic and terrestrial ecosystems is the critical load (CL), which represents the steady-state level of acidic deposition below which ecological damage would not be expected to occur, according to current scientific understanding. A deposition load intended to be protective of a specified resource...
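The critical-load concept above reduces to a simple exceedance check: a resource is at risk when deposition exceeds its steady-state critical load. The stream names and deposition values below are illustrative only, not from the study.

```python
# Critical-load exceedance sketch: compare sulfur deposition against
# each stream's steady-state critical load. All values invented.

streams = {
    "stream A": {"critical_load": 50.0, "deposition": 72.0},  # meq/m2/yr
    "stream B": {"critical_load": 90.0, "deposition": 64.0},
}

results = {name: s["deposition"] - s["critical_load"]
           for name, s in streams.items()}

for name, exceedance in results.items():
    status = "exceeded" if exceedance > 0 else "protected"
    print(f"{name}: exceedance {exceedance:+.1f} meq/m2/yr ({status})")
```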

  14. Upper Rio Grande water operations model: A tool for enhanced system management

    Treesearch

    Gail Stockton; D. Michael Roark

    1999-01-01

    The Upper Rio Grande Water Operations Model (URGWOM) under development through a multi-agency effort has demonstrated capability to represent the physical river/reservoir system, to track and account for Rio Grande flows and imported San Juan flows, and to forecast flows at various points in the system. Testing of the Rio Chama portion of the water operations model was...

  15. Afghanistan: Reconstituting a Collapsed State

    DTIC Science & Technology

    2005-04-01

    tool in Afghanistan’s deeply religious culture. Persuading the EU to relax its Common Agricultural Policy to accept Afghan agricultural products...Afghanistan, representing 95 percent of its heroin consumption, it is in the EU’s best interests to relax some provisions of its Common Agricultural Policy (CAP), permitting Afghan agricultural products into the EU.45 In this case, the EU could collaborate with Afghanistan to import a high

  16. Benchmarking CRISPR on-target sgRNA design.

    PubMed

    Yan, Jifang; Chuai, Guohui; Zhou, Chi; Zhu, Chenyu; Yang, Jing; Zhang, Chao; Gu, Feng; Xu, Han; Wei, Jia; Liu, Qi

    2017-02-15

    CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)-based gene editing has been widely implemented in various cell types and organisms. A major challenge in the effective application of the CRISPR system is the need to design highly efficient single-guide RNA (sgRNA) with minimal off-target cleavage. Several tools are available for sgRNA design, but only a limited number have been compared. In our opinion, benchmarking the performance of the available tools and indicating their applicable scenarios are important issues. Moreover, whether the reported sgRNA design rules are reproducible across different sgRNA libraries, cell types and organisms remains unclear. In our study, a systematic and unbiased benchmark of sgRNA prediction efficacy was performed on nine representative on-target design tools, based on six benchmark data sets covering five different cell types. The benchmark study presented here provides novel quantitative insights into the available CRISPR tools.
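A benchmark of this kind typically rank-correlates each tool's predicted efficacy against measured efficacy; a stdlib-only sketch with invented tool names and scores (no ties in the data, so the simple ranking below suffices) is:

```python
# Benchmarking sketch: Spearman rank correlation between each
# (hypothetical) design tool's predicted sgRNA efficacy and the
# measured efficacy, as in typical on-target benchmark studies.

def rank(xs):
    """Ranks 1..n; assumes no tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    for r, i in enumerate(order):
        ranks[i] = float(r + 1)
    return ranks

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

measured = [0.12, 0.80, 0.45, 0.66, 0.30, 0.95]   # invented assay data
predictions = {                                    # invented tool scores
    "tool_x": [0.20, 0.75, 0.50, 0.60, 0.25, 0.90],
    "tool_y": [0.90, 0.10, 0.40, 0.30, 0.80, 0.20],
}

for tool, pred in predictions.items():
    print(f"{tool}: rho = {spearman(measured, pred):+.2f}")
```

Here `tool_x` ranks the guides exactly as the assay does (rho = 1), while `tool_y` ranks them nearly in reverse; a real benchmark averages such correlations over multiple data sets and cell types.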

  17. Data to knowledge: how to get meaning from your result

    PubMed Central

    Berman, Helen M.; Gabanyi, Margaret J.; Groom, Colin R.; Johnson, John E.; Murshudov, Garib N.; Nicholls, Robert A.; Reddy, Vijay; Schwede, Torsten; Zimmerman, Matthew D.; Westbrook, John; Minor, Wladek

    2015-01-01

    Structural and functional studies require the development of sophisticated ‘Big Data’ technologies and software to increase the knowledge derived and ensure reproducibility of the data. This paper presents summaries of the Structural Biology Knowledge Base, the VIPERdb Virus Structure Database, evaluation of homology modeling by the Protein Model Portal, the ProSMART tool for conformation-independent structure comparison, the LabDB ‘super’ laboratory information management system and the Cambridge Structural Database. These techniques and technologies represent important tools for the transformation of crystallographic data into knowledge and information, in an effort to address the problem of non-reproducibility of experimental results. PMID:25610627

  18. Virtual Interactomics of Proteins from Biochemical Standpoint

    PubMed Central

    Kubrycht, Jaroslav; Sigler, Karel; Souček, Pavel

    2012-01-01

    Virtual interactomics represents a rapidly developing scientific area on the boundary between bioinformatics and interactomics. Protein-related virtual interactomics comprises instrumental tools for the prediction, simulation, and networking of the majority of interactions important for structural and individual reproduction, differentiation, recognition, signaling, regulation, and metabolic pathways of cells and organisms. Here, we describe the main areas of virtual protein interactomics, that is, structurally based comparative analysis and prediction of functionally important interacting sites, mimotope-assisted and combined epitope prediction, molecular (protein) docking studies, and investigation of protein interaction networks. Detailed information about some interesting methodological approaches and online-accessible programs or databases is displayed in our tables. A considerable part of the text deals with searches for common conserved or functionally convergent protein regions and subgraphs of conserved interaction networks, as well as with new outstanding trends and clinically interesting results. In agreement with the presented data and relationships, virtual interactomic tools improve our scientific knowledge, help us to formulate working hypotheses, and frequently also mediate important in silico simulations. PMID:22928109

  19. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology.

    PubMed

    Karp, Peter D; Latendresse, Mario; Paley, Suzanne M; Krummenacker, Markus; Ong, Quang D; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M; Caspi, Ron

    2016-09-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms.

  20. An Arbitrary First Order Theory Can Be Represented by a Program: A Theorem

    NASA Technical Reports Server (NTRS)

    Hosheleva, Olga

    1997-01-01

    How can we represent knowledge inside a computer? For formalized knowledge, classical logic seems to be the most adequate tool. Classical logic is behind all the formalisms of classical mathematics, and behind many formalisms used in Artificial Intelligence. There is only one serious problem with classical logic: due to the famous Godel's theorem, classical logic is algorithmically undecidable; as a result, when knowledge is represented in the form of logical statements, it is very difficult to check whether, based on these statements, a given query is true or not. To make knowledge representation more algorithmic, the special field of logic programming was invented. An important portion of logic programming is algorithmically decidable. To cover knowledge that cannot be represented in this portion, several extensions of the decidable fragments have been proposed. In the spirit of logic programming, these extensions are usually introduced in such a way that even if a general algorithm is not available, good heuristic methods exist. It is important to check whether the already proposed extensions are sufficient, or whether further extensions are necessary. In the present paper, we show that one particular extension, namely logic programming with classical negation, introduced by M. Gelfond and V. Lifschitz, can represent (in some reasonable sense) an arbitrary first-order logical theory.

  1. Business Process-Based Resource Importance Determination

    NASA Astrophysics Data System (ADS)

    Fenz, Stefan; Ekelhart, Andreas; Neubauer, Thomas

    Information security risk management (ISRM) heavily depends on realistic impact values representing the resources’ importance in the overall organizational context. Although a variety of ISRM approaches have been proposed, well-founded methods that provide an answer to the following question are still missing: how can business processes be used to determine resources’ importance in the overall organizational context? We answer this question by measuring the actual importance level of resources based on business processes. This paper therefore presents our novel business process-based resource importance determination method, which provides ISRM with an efficient and powerful tool for deriving realistic resource importance figures solely from existing business processes. The conducted evaluation has shown that the calculation results of the developed method comply with the results gained in traditional workshop-based assessments.
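One plausible way to read "deriving resource importance from business processes" is to aggregate the criticality of the processes that depend on each resource; the process names, weights, and max-aggregation rule below are assumptions for illustration, not the paper's actual method.

```python
# Sketch: a resource inherits the criticality of the most critical
# business process that depends on it. Processes and weights are
# invented examples.

processes = {   # process -> (criticality weight, resources used)
    "order handling": (0.9, {"ERP server", "database"}),
    "payroll":        (0.7, {"HR system", "database"}),
    "marketing blog": (0.2, {"web server"}),
}

def resource_importance(processes):
    scores = {}
    for weight, resources in processes.values():
        for r in resources:
            scores[r] = max(scores.get(r, 0.0), weight)
    return scores

scores = resource_importance(processes)
for r, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{r:12s} {s:.2f}")
```

The appeal of such a scheme is that impact values fall out of process documentation the organization already maintains, instead of being estimated per resource in workshops.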

  2. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into sub-problems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.
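The idea of a model that is itself an executable program, hierarchically composed like the application it describes, can be sketched as follows; the sub-models and constants are invented, and this is not Palm's actual annotation syntax.

```python
# Conceptual sketch (not Palm's annotation language): a performance
# model as an executable program whose parts mirror the modeled
# application's structure, composing analytic sub-models.

def model_read(n_bytes, bandwidth=2e9):
    """Predicted I/O time (s) for reading n_bytes at a hypothetical
    sustained bandwidth."""
    return n_bytes / bandwidth

def model_compute(n_elems, flops_per_elem=8, peak=1e10):
    """Predicted compute time (s) at a hypothetical achieved rate."""
    return n_elems * flops_per_elem / peak

def model_app(n_elems, elem_size=8):
    # Hierarchy: the application model composes sub-models the same
    # way the application composes its phases (read, then compute).
    return model_read(n_elems * elem_size) + model_compute(n_elems)

t = model_app(1_000_000)
print(f"predicted runtime: {t * 1e3:.3f} ms")
```

Because the model is a program, it can be rerun, diffed against measurements, and refined per sub-model, which is the reproducibility property the abstract emphasizes.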

  3. Overview of Seismic Hazard and Vulnerability of Ordinary Buildings in Belgium: Methodological Aspects and Study Cases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barszez, Anne-Marie; Camelbeeck, Thierry; Plumier, Andre

    Northwest Europe is a region in which damaging earthquakes occur. Assessing the risk of damage is useful, but it is not easy work based on exact science. In this paper, we propose a general tool for a first-level assessment of seismic risk (rapid diagnosis). General methodological aspects are presented. For a given building, the risk is represented by a volume in a multi-dimensional space. This space is defined by axes representing the main parameters that have an influence on the risk. We notably express the importance of including a parameter to account for the specific value of cultural heritage. We then apply the proposed tool to analyze and compare methods of seismic risk assessment used in Belgium, which differ by the spatial scale of the studied area. Study cases for the whole Belgian territory and for parts of the cities of Liege and Mons (Be) also aim to give some sense of the overall risk in Belgium.

  4. CORE_TF: a user-friendly interface to identify evolutionary conserved transcription factor binding sites in sets of co-regulated genes

    PubMed Central

    Hestand, Matthew S; van Galen, Michiel; Villerius, Michel P; van Ommen, Gert-Jan B; den Dunnen, Johan T; 't Hoen, Peter AC

    2008-01-01

    Background The identification of transcription factor binding sites is difficult, since the sites are only a small number of nucleotides in size, resulting in large numbers of false positives and false negatives in current approaches. Computational methods to reduce false positives look for over-representation of transcription factor binding sites in a set of similarly regulated promoters, or for conservation in orthologous promoter alignments. Results We have developed a novel tool, "CORE_TF" (Conserved and Over-REpresented Transcription Factor binding sites), that identifies common transcription factor binding sites in promoters of co-regulated genes. To improve upon existing binding-site predictions, the tool searches for position weight matrices from the TRANSFAC® database that are over-represented in an experimental set compared to a random set of promoters, and identifies cross-species conservation of the predicted transcription factor binding sites. The algorithm has been evaluated with expression and chromatin-immunoprecipitation-on-microarray data. We also implement and demonstrate the importance of matching the random set of promoters to the experimental promoters by GC content, which is a unique feature of our tool. Conclusion The program CORE_TF is accessible in a user-friendly web interface at . It provides a table of over-represented transcription factor binding sites in the promoters of the user's input genes and a graphical view of evolutionarily conserved transcription factor binding sites. In our test data sets it successfully predicts target transcription factors and their binding sites. PMID:19036135
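Over-representation of a binding site in the experimental promoter set versus the random set can be tested with a one-sided hypergeometric (Fisher exact) test; whether CORE_TF uses exactly this statistic is an assumption on my part, and the counts below are invented.

```python
# Hypergeometric over-representation test, stdlib only: how likely is
# it to see at least k site-containing promoters among the N
# experimental promoters, drawing from M promoters total of which n
# contain the site?

from math import comb

def hypergeom_sf(k, M, n, N):
    """P(X >= k) for a hypergeometric draw of N from M with n successes."""
    return sum(
        comb(n, i) * comb(M - n, N - i)
        for i in range(k, min(n, N) + 1)
    ) / comb(M, N)

# Invented counts: 8 of 10 experimental promoters contain the site,
# versus 30 of 200 random promoters.
M = 210      # total promoters (10 experimental + 200 random)
n = 38       # promoters containing the site overall (8 + 30)
N = 10       # experimental promoters drawn
k = 8        # site-containing promoters observed in the experimental set
p = hypergeom_sf(k, M, n, N)
print(f"P(>= {k} hits by chance) = {p:.2e}")
```

The expected number of hits by chance here is about 1.8, so eight observed hits yields a very small p-value, flagging the matrix as over-represented.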

  5. The Effect of Dynamic and Interactive Mathematics Learning Environments (DIMLE), Supporting Multiple Representations, on Perceptions of Elementary Mathematics Pre-Service Teachers in Problem Solving Process

    ERIC Educational Resources Information Center

    Ozdemir, S.; Reis, Z. Ayvaz

    2013-01-01

    Mathematics is an important discipline, providing crucial tools, such as problem solving, to improve our cognitive abilities. In order to solve a problem, it helps to envision and represent it through multiple means. Multiple representations can help a person to redefine a problem in his/her own words during that envisioning process. Dynamic and…

  6. Paradox and Polarity: Tools For Managing Complexity

    DTIC Science & Technology

    2016-02-09

    Chapter Four will examine the significant influence of Clausewitz and Sun Tzu on the Western and Eastern philosophies of war, respectively. The importance...intellectual giants have shaped human thinking about war: Carl von Clausewitz and Sun Tzu . Clausewitz represents a western and Sun Tzu an eastern...that victory in the war fails to yield peace (e.g., the American experience in Afghanistan and Iraq). Sun Tzu has a very different understanding both

  7. Paradox and Polarity: Tools for Managing Complexity

    DTIC Science & Technology

    2016-04-19

    Chapter Four will examine the significant influence of Clausewitz and Sun Tzu on the Western and Eastern philosophies of war, respectively. The importance...intellectual giants have shaped human thinking about war: Carl von Clausewitz and Sun Tzu . Clausewitz represents a western and Sun Tzu an eastern...that victory in the war fails to yield peace (e.g., the American experience in Afghanistan and Iraq). Sun Tzu has a very different understanding both

  8. Process Definition and Modeling Guidebook. Version 01.00.02

    DTIC Science & Technology

    1992-12-01

    material (and throughout the guidebook) process definition is considered to be the act of representing the important characteristics of a process in a...characterized by software standards and guidelines, software inspections and reviews, and more formalized testing (including test plans, test sup- port tools...paper-based approach works well for training, examples, and possibly even small pilot projects and case studies. However, large projects will benefit from

  9. Outcome measurement of hand function following mirror therapy for stroke rehabilitation: A systematic review.

    PubMed

    Cantero-Téllez, Raquel; Naughton, Nancy; Algar, Lori; Valdes, Kristin

    2018-02-28

    Systematic review. Mirror therapy is a treatment used to address hand function following a stroke. Measurement of outcomes using appropriate assessment tools is crucial; however, many assessment options exist. The purpose of this study is to systematically review the outcome measures used to assess hand function following mirror therapy after stroke and, in addition, to identify the psychometric and descriptive properties of the included measures and, through the linking process, determine whether the outcome measures are representative of the International Classification of Functioning, Disability and Health (ICF). Following a comprehensive literature search, outcome measures used in the included studies were linked to the ICF and analyzed based on descriptive information and psychometric properties. Eleven studies met the inclusion criteria and included 24 different assessment tools to measure hand or upper limb function. Most outcome measures used in the selected studies (63%) were rated by the evaluating therapist. Thirteen outcome measures (54%) linked to the ICF body function category and 10 measures (42%) linked to activities and participation. One outcome measure linked to the 'not defined' category, and all other ICF categories were not represented. A majority of the outcome measures have been assessed for validity, reliability, and responsiveness, although responsiveness was the least investigated psychometric property. Current studies on mirror therapy after stroke are not consistent in the assessment tools used to determine hand function. Understanding of study outcomes requires analysis of the assessment tools. The outcome measures used in the included studies are not representative of personal and environmental factors, but tools linking to body functions and to activities and participation provide important information on functional outcome. Integrating a combination of measures that are psychometrically sound and reflective of the ICF should be considered for the assessment of hand function after mirror therapy following stroke.

  10. Intellectual Property: a powerful tool to develop biotech research.

    PubMed

    Giugni, Diego; Giugni, Valter

    2010-09-01

    Today biotechnology is perhaps the most important technology field because of its strong health and food implications. However, given the nature of this technology, a huge amount of investment is needed to sustain experimentation costs. Consequently, investors aim to safeguard their investments as much as possible. Intellectual Property, and in particular patents, has been shown to constitute a powerful tool to help them. Moreover, patents represent an extremely important means of disclosing biotechnology inventions. Patentable biotechnology inventions involve products such as nucleotide and amino acid sequences, microorganisms, processes or methods for modifying said products, uses for the manufacture of medicaments, etc. There are several ways to protect inventions, but all follow the three main patentability requirements: novelty, inventive step and industrial application.

  11. Dependency visualization for complex system understanding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smart, J. Allison Cory

    1994-09-01

    With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now being sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex ``spaghetti`` networks result in nearly all large-system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed-graph-based concepts in software tool design has been demonstrated in the literature, these tools still lack the capabilities necessary for large-system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.
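The directed-graph foundation the dissertation builds on can be made concrete with a small dependency graph and a transitive "who depends on this?" query; the module names below are invented.

```python
# Minimal directed dependency graph: components as nodes, dependency
# edges, and a transitive-dependents query ("what is impacted if X
# changes?").

from collections import defaultdict

deps = {                      # module -> modules it depends on
    "ui":   {"core", "net"},
    "net":  {"core"},
    "core": {"util"},
    "util": set(),
}

dependents = defaultdict(set)  # reverse edges: module -> its dependents
for mod, uses in deps.items():
    for u in uses:
        dependents[u].add(mod)

def impacted(module):
    """All modules that transitively depend on `module` (DFS)."""
    seen, stack = set(), [module]
    while stack:
        for d in dependents[stack.pop()]:
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return seen

print(sorted(impacted("util")))
```

On real systems this reverse-reachability set is exactly what grows into the "spaghetti" the text describes, which is why additional organizational constructs (clustering, layering) are needed on top of the raw graph.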

  12. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    NASA Astrophysics Data System (ADS)

    Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2008-08-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real-world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.

  13. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2010-06-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real-world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.

  14. Validation of the tool assessment of clinical education (AssCE): A study using Delphi method and clinical experts.

    PubMed

    Löfmark, Anna; Mårtensson, Gunilla

    2017-03-01

    The aim of the present study was to establish the validity of the tool Assessment of Clinical Education (AssCE). The tool is widely used in Sweden and some Nordic countries for assessing nursing students' performance in clinical education, and it is important that tools in use be subjected to regular audit and critical review. The validation process, performed in two stages, concluded with a high level of congruence. In the first stage, the Delphi technique was used to elaborate the AssCE tool with a group of 35 clinical nurse lecturers; after three rounds, consensus was reached. In the second stage, a group of 46 clinical nurse lecturers representing 12 universities in Sweden and Norway audited the revised version of the AssCE in relation to learning outcomes from the last clinical course at their respective institutions. Validation of the revised AssCE was established with high congruence between the factors in the AssCE and the examined learning outcomes. The revised AssCE tool thus appears to meet its objective of being a validated assessment tool for use in clinical nursing education. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. The SEQUEST Family Tree

    NASA Astrophysics Data System (ADS)

    Tabb, David L.

    2015-11-01

    Since its introduction in 1994, SEQUEST has gained many important new capabilities, and a host of successor algorithms have built upon its successes. This Account and Perspective maps the evolution of this important tool and charts the relationships among contributions to the SEQUEST legacy. Many of the changes represented improvements in computing speed by clusters and graphics cards. Mass spectrometry innovations in mass accuracy and activation methods led to shifts in fragment modeling and scoring strategies. These changes, as well as the movement of laboratories and lab members, have led to great diversity among the members of the SEQUEST family.

  16. Antimicrobial Stewardship in the Emergency Department and Guidelines for Development

    PubMed Central

    May, Larissa; Cosgrove, Sara; L’Archeveque, Michelle; Talan, David A.; Payne, Perry; Rothman, Richard E.

    2013-01-01

    Antimicrobial resistance is a mounting public health concern. Emergency departments (EDs) represent a particularly important setting for addressing inappropriate antimicrobial prescribing practices, given the frequent use of antibiotics in this setting, which sits at the interface of the community and the hospital. This article outlines the importance of antimicrobial stewardship in the ED setting and provides practical recommendations, drawn from existing evidence, for the application of various strategies and tools that could be implemented in the ED, including advancement of clinical guidelines, clinical decision support systems, rapid diagnostics, and expansion of ED pharmacist programs. PMID:23122955

  17. The minimal scenario of leptogenesis

    NASA Astrophysics Data System (ADS)

    Blanchet, Steve; Di Bari, Pasquale

    2012-12-01

    We review the main features and results of thermal leptogenesis within the type I seesaw mechanism, the minimal extension of the Standard Model explaining neutrino masses and mixing. After presenting the simplest approach, the vanilla scenario, we discuss various important developments of recent years, such as the inclusion of lepton and heavy neutrino flavour effects, a description beyond a hierarchical heavy neutrino mass spectrum and an improved kinetic description within the density matrix and the closed-time-path formalisms. We also discuss how leptogenesis can ultimately represent an important phenomenological tool to test the seesaw mechanism and the underlying model of new physics.

  18. Echocardiography for patent ductus arteriosus including closure in adults.

    PubMed

    Chugh, Reema; Salem, Morris M

    2015-01-01

    Patent ductus arteriosus (PDA) represents at least 5-10% of all congenital heart defects (CHDs), making it a commonly diagnosed and clinically important lesion. Although the PDA closes spontaneously within 24 to 48 hours after birth in the majority of cases, children who do not undergo natural or surgical closure may carry a persistent PDA into adulthood. The diagnosis is most often confirmed by echocardiography, which also guides catheter-based interventions and surgery. Echocardiography continues to be the most important tool in the long-term follow-up of residua and sequelae. © 2014, Wiley Periodicals, Inc.

  19. Models with Men and Women: Representing Gender in Dynamic Modeling of Social Systems.

    PubMed

    Palmer, Erika; Wilson, Benedicte

    2018-04-01

    Dynamic engineering models have yet to be evaluated in the context of feminist engineering ethics. Decision-making concerning gender in dynamic modeling design is a gender and ethical issue that is important to address regardless of the system in which the dynamic modeling is applied. Many dynamic modeling tools operationally include the female population; however, there is an important distinction between females and women: the difference between biological sex and the social construct of gender, which is fluid and changes over time and geography. The ethical oversight of failing to represent, or misrepresenting, gender in model design when it is relevant to the model purpose can have implications for model validity and policy model development. This paper highlights this gender issue in the context of feminist engineering ethics using a dynamic population model. Women are often represented in this type of model only in their biological capacity, while lacking their gender identity. This illustrative example also highlights how language, including the naming of variables and communication with decision-makers, plays a role in this gender issue.

  20. Connected health and multiple sclerosis.

    PubMed

    Cohen, M

    2018-06-01

    There is as yet no consensus definition of "connected health". In general, the term refers to the growing use of technology, in particular mobile technology, in medicine. Over the past 10 years, there have been an increasing number of published reports on the wide-ranging and heterogeneous applications of technology in medicine, ranging from telemedicine to tools that improve patients' evaluation and monitoring by physicians, as well as a multitude of patient-centered applications. These technologies also represent promising tools in the field of clinical research. This report reviews the importance of using such technology in the management of multiple sclerosis patients. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  1. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollingsworth, Jeff

    2014-07-31

    The purpose of this project was to develop tools and techniques to improve the ability of computational scientists to investigate and correct problems (bugs) in their programs. Specifically, the University of Maryland component of this project focused on the problems associated with the finite number of bits available in a computer to represent numeric values. In large-scale scientific computation, numbers are frequently added to and multiplied with each other billions of times, so even small errors due to the representation of numbers can accumulate into large errors. However, using too many bits to represent a number incurs additional computation, memory, and energy costs, so it is critical to find the right size for numbers. This project addressed several aspects of this general problem. First, we developed a tool to look for cancellations, the catastrophic loss of precision that occurs when two numbers whose true values are nearly equal, and whose computer representations are identical or nearly so, are subtracted. Second, we developed a suite of tools that allow programmers to identify exactly how much precision is required for each operation in their program. These tools let programmers verify that enough precision is available and, more importantly, find cases where extra precision could be eliminated to allow the program to use less memory, computer time, or energy. The tools use advanced binary modification techniques to allow the analysis of actual optimized code. The system, called Craft, has been applied to a number of benchmarks and real applications.
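The kind of cancellation such tools hunt for is easy to reproduce. The following sketch (illustrative only, not part of Craft) shows 1 - cos(x) losing every significant digit for small x in double precision, while an algebraically equivalent formulation keeps them:

```python
import math

x = 1e-8

# Naive form: cos(1e-8) rounds to exactly 1.0 in double precision,
# so the subtraction cancels catastrophically and returns 0.0.
naive = 1.0 - math.cos(x)

# Stable form: the half-angle identity 1 - cos(x) = 2*sin(x/2)**2
# avoids subtracting nearly equal numbers entirely.
stable = 2.0 * math.sin(x / 2.0) ** 2   # ~5e-17, the correct value
```

A precision-analysis tool would flag the naive subtraction as a cancellation site; the fix here is a rewrite, but in other cases the remedy is simply choosing a wider (or confirming a narrower) floating-point type.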

  2. Implementation and evaluation of health passport communication tools in emergency departments.

    PubMed

    Heifetz, Marina; Lunsky, Yona

    2018-01-01

    People with IDD (intellectual or developmental disabilities) and their families consistently report dissatisfaction with their emergency department experience. Clear care plans and communication tools may not only improve the quality of patient care, but also can prevent unnecessary visits and reduce the likelihood of return visits. To evaluate communication tools to be used by people with IDD in psychiatric and general emergency departments in three different regions of Ontario. Health passport communication tools were locally tailored and implemented in each of the three regions. A total of 28 questionnaires and 18 interviews with stakeholders (e.g., hospital staff, community agency representatives, families) were completed across the regions to obtain feedback on the implementation of health passports with people with IDD. Participants felt that the health passport tools provided helpful information, improved communication between patients with IDD and hospital staff, and were user friendly. Continued efforts are needed to work with communities on maintenance of this tool, ensuring all hospital staff are utilizing the information. These findings emphasize the merits of health passport tools being implemented in the health system to support communication between patients with IDD and health care practitioners and the importance of tailoring tools to local settings. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Characteristics of good quality pharmaceutical services common to community pharmacies and dispensing general practices.

    PubMed

    Grey, Elisabeth; Harris, Michael; Rodham, Karen; Weiss, Marjorie C

    2016-10-01

    In the United Kingdom, pharmaceutical services can be delivered by both community pharmacies (CPs) and dispensing doctor practices (DPs). Both must adhere to minimum standards set out in NHS regulations; however, no common framework exists to guide quality improvement. Previous phases of this research had developed a set of characteristics indicative of good pharmaceutical service provision. To ask key stakeholders to confirm, and rank the importance of, a set of characteristics of good pharmaceutical service provision. A two-round Delphi-type survey was conducted in south-west England and was sent to participants representing three stakeholder groups: DPs, CPs and patients/lay members. Participants were asked to confirm, and rank, the importance of these characteristics as representing good quality pharmaceutical services. Thirty people were sent the first round survey; 22 participants completed both rounds. Median ratings for the 23 characteristics showed that all were seen to represent important aspects of pharmaceutical service provision. Participants' comments highlighted potential problems with the practicality of the characteristics. Characteristics relating to patient safety were deemed to be the most important and those relating to public health the least important. A set of 23 characteristics for providing good pharmaceutical services in CPs and DPs was developed and attained approval from a sample of stakeholders. With further testing and wider discussion, it is hoped that the characteristics will form the basis of a quality improvement tool for CPs and DPs. © 2016 Royal Pharmaceutical Society.

  4. Interactive Planning under Uncertainty with Causal Modeling and Analysis

    DTIC Science & Technology

    2006-01-01

    Tool (CAT), a system for creating and analyzing causal models similar to Bayes networks. In order to use CAT as a tool for planning, users go through... an iterative process in which they use CAT to create and analyze alternative plans. One of the biggest difficulties is that the number of possible... Causal Analysis Tool (CAT), which is a tool for representing and analyzing causal networks similar to Bayesian networks. In order to represent plans

  5. Evolvable social agents for bacterial systems modeling.

    PubMed

    Paton, Ray; Gregory, Richard; Vlachos, Costas; Saunders, Jon; Wu, Henry

    2004-09-01

    We present two approaches to the individual-based modeling (IbM) of bacterial ecologies and evolution using computational tools. The IbM approach is introduced, and its important complementary role to biosystems modeling is discussed. A fine-grained model of bacterial evolution is then presented that is based on networks of interactivity between computational objects representing genes and proteins. This is followed by a coarser grained agent-based model, which is designed to explore the evolvability of adaptive behavioral strategies in artificial bacteria represented by learning classifier systems. The structure and implementation of the two proposed individual-based bacterial models are discussed, and some results from simulation experiments are presented, illustrating their adaptive properties.

  6. Functional Characterization of Two scFv-Fc Antibodies from an HIV Controller Selected on Soluble HIV-1 Env Complexes: A Neutralizing V3- and a Trimer-Specific gp41 Antibody

    PubMed Central

    Trott, Maria; Weiß, Svenja; Antoni, Sascha; Koch, Joachim; von Briesen, Hagen; Hust, Michael; Dietrich, Ursula

    2014-01-01

    HIV neutralizing antibodies (nAbs) represent an important tool in view of prophylactic and therapeutic applications for HIV-1 infection. Patients chronically infected by HIV-1 represent a valuable source for nAbs. HIV controllers, including long-term non-progressors (LTNP) and elite controllers (EC), represent an interesting subgroup in this regard, as here nAbs can develop over time in a rather healthy immune system and in the absence of any therapeutic selection pressure. In this study, we characterized two particular antibodies that were selected as scFv antibody fragments from a phage immune library generated from an LTNP with HIV neutralizing antibodies in his plasma. The phage library was screened on recombinant soluble gp140 envelope (Env) proteins. Sequencing the selected peptide inserts revealed two major classes of antibody sequences. Binding analysis of the corresponding scFv-Fc derivatives to various trimeric and monomeric Env constructs as well as to peptide arrays showed that one class, represented by monoclonal antibody (mAb) A2, specifically recognizes an epitope localized in the pocket binding domain of the C heptad repeat (CHR) in the ectodomain of gp41, but only in the trimeric context. Thus, this antibody represents an interesting tool for trimer identification. MAb A7, representing the second class, binds to structural elements of the third variable loop V3 and neutralizes tier 1 and tier 2 HIV-1 isolates of different subtypes with matching critical amino acids in the linear epitope sequence. In conclusion, HIV controllers are a valuable source for the selection of functionally interesting antibodies that can be selected on soluble gp140 proteins with properties from the native envelope spike. PMID:24828352

  7. NAP: The Network Analysis Profiler, a web tool for easier topological analysis and comparison of medium-scale biological networks.

    PubMed

    Theodosiou, Theodosios; Efstathiou, Georgios; Papanikolaou, Nikolas; Kyrpides, Nikos C; Bagos, Pantelis G; Iliopoulos, Ioannis; Pavlopoulos, Georgios A

    2017-07-14

    Nowadays, due to the technological advances of high-throughput techniques, Systems Biology has seen a tremendous growth of data generation. With network analysis, looking at biological systems at a higher level in order to better understand a system, its topology and the relationships between its components is of great importance. Gene expression, signal transduction, protein/chemical interactions, and biomedical literature co-occurrences are a few of the examples captured in biological network representations, where nodes represent certain bioentities and edges represent the connections between them. Today, many tools for network visualization and analysis are available. Nevertheless, most of them are standalone applications that often (i) burden users with computing and calculation time depending on the network's size and (ii) focus on handling, editing and exploring a network interactively. While such functionality is of great importance, limited efforts have been made towards the comparison of the topological analysis of multiple networks. Network Analysis Provider (NAP) is a comprehensive web tool to automate network profiling and intra/inter-network topology comparison. It is designed to bridge the gap between network analysis, statistics, graph theory and, partially, visualization in a user-friendly way. It is freely available and aims to become a very appealing tool for the broader community. It hosts a great plethora of topological analysis methods, such as node and edge rankings. A few of its powerful characteristics are: its ability to enable easy profile comparisons across multiple networks, find their intersection, and provide users with simplified, high quality plots of any of the offered topological characteristics against any other within the same network. It is written in R and Shiny, it is based on the igraph library, and it is able to handle medium-scale weighted/unweighted, directed/undirected and bipartite graphs. NAP is available at http://bioinformatics.med.uoc.gr/NAP .
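NAP itself is written in R/Shiny on top of igraph; as a language-neutral illustration of the kind of topological operation it automates, this sketch (with invented gene networks and edge lists) ranks nodes by degree and intersects the node sets of two networks:

```python
from collections import Counter

def degrees(edges):
    """Degree of every node in an undirected edge list."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg

# Two hypothetical protein-interaction networks (names are illustrative).
net_a = [("p53", "mdm2"), ("p53", "atm"), ("atm", "chk2")]
net_b = [("p53", "mdm2"), ("mdm2", "akt1")]

deg_a, deg_b = degrees(net_a), degrees(net_b)

# Node ranking (hubs first) within one network, as in NAP's node rankings.
ranking_a = sorted(deg_a, key=deg_a.get, reverse=True)

# Inter-network comparison: nodes shared by both networks.
shared = set(deg_a) & set(deg_b)
```

A tool like NAP runs dozens of such rankings (betweenness, closeness, eccentricity, etc.) across all loaded networks at once and plots the resulting profiles against each other; the value is in the automation and comparison, not any single metric.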

  8. Application of Genomic Technologies to the Breeding of Trees

    PubMed Central

    Badenes, Maria L.; Fernández i Martí, Angel; Ríos, Gabino; Rubio-Cabetas, María J.

    2016-01-01

    The recent introduction of next generation sequencing (NGS) technologies represents a major revolution in providing new tools for identifying the genes and/or genomic intervals controlling important traits for selection in breeding programs. In perennial fruit trees with long generation times and large adult plant sizes, the impact of these techniques is even more important. High-throughput DNA sequencing technologies have provided complete annotated sequences in many important tree species. Most of the high-throughput genotyping platforms described are being used for studies of genetic diversity and population structure. Dissection of complex traits became possible through the availability of genome sequences along with phenotypic variation data, which make it possible to elucidate the causative genetic differences that give rise to observed phenotypic variation. Association mapping facilitates the association between genetic markers and phenotype in unstructured and complex populations, identifying molecular markers for assisted selection and breeding. Genomic data also provide in silico identification and characterization of genes and gene families related to important traits, enabling new tools for molecular marker-assisted selection in tree breeding. Deep sequencing of transcriptomes is also a powerful tool for the analysis of precise expression levels of each gene in a sample. It consists of quantifying short cDNA reads, obtained by NGS technologies, in order to compare entire transcriptomes between genotypes and environmental conditions. The miRNAs are non-coding short RNAs involved in the regulation of different physiological processes, which can be identified by high-throughput sequencing of RNA libraries obtained by reverse transcription of purified short RNAs, and by in silico comparison with known miRNAs from other species. Altogether, NGS techniques and their applications have increased the resources for plant breeding in tree species, closing the former gap in genetic tools between trees and annual species. PMID:27895664

  9. Application of Genomic Technologies to the Breeding of Trees.

    PubMed

    Badenes, Maria L; Fernández I Martí, Angel; Ríos, Gabino; Rubio-Cabetas, María J

    2016-01-01

    The recent introduction of next generation sequencing (NGS) technologies represents a major revolution in providing new tools for identifying the genes and/or genomic intervals controlling important traits for selection in breeding programs. In perennial fruit trees with long generation times and large adult plant sizes, the impact of these techniques is even more important. High-throughput DNA sequencing technologies have provided complete annotated sequences in many important tree species. Most of the high-throughput genotyping platforms described are being used for studies of genetic diversity and population structure. Dissection of complex traits became possible through the availability of genome sequences along with phenotypic variation data, which make it possible to elucidate the causative genetic differences that give rise to observed phenotypic variation. Association mapping facilitates the association between genetic markers and phenotype in unstructured and complex populations, identifying molecular markers for assisted selection and breeding. Genomic data also provide in silico identification and characterization of genes and gene families related to important traits, enabling new tools for molecular marker-assisted selection in tree breeding. Deep sequencing of transcriptomes is also a powerful tool for the analysis of precise expression levels of each gene in a sample. It consists of quantifying short cDNA reads, obtained by NGS technologies, in order to compare entire transcriptomes between genotypes and environmental conditions. The miRNAs are non-coding short RNAs involved in the regulation of different physiological processes, which can be identified by high-throughput sequencing of RNA libraries obtained by reverse transcription of purified short RNAs, and by in silico comparison with known miRNAs from other species. Altogether, NGS techniques and their applications have increased the resources for plant breeding in tree species, closing the former gap in genetic tools between trees and annual species.

  10. Health economics in public health.

    PubMed

    Ammerman, Alice S; Farrelly, Matthew A; Cavallo, David N; Ickes, Scott B; Hoerger, Thomas J

    2009-03-01

    Economic analysis is an important tool in deciding how to allocate scarce public health resources; however, there is currently a dearth of such analysis by public health researchers. Public health researchers and practitioners were surveyed to determine their current use of health economics and to identify barriers to use as well as potential strategies to decrease those barriers in order to allow them to more effectively incorporate economic analyses into their work. Data collected from five focus groups informed survey development. The survey included a demographic section and 14 multi-part questions. Participants were recruited in 2006 from three national public health organizations through e-mail; 294 academicians, practitioners, and community representatives answered the survey. Survey data were analyzed in 2007. Despite an expressed belief in the importance of health economics, more than half of the respondents reported very little or no current use of health economics in their work. Of those using health economics, cost-benefit and cost-effectiveness analysis and determination of public health costs were cited as the measures used most frequently. The most important barriers were lack of expertise, funding, time, tools, and data, as well as discomfort with economic theory. The resource deemed most important to using health economics was collaboration with economists or those with economic training. Respondents indicated a desire to learn more about health economics and tools for performing economic analysis. Given the importance of incorporating economic analysis into public health interventions, and the desire of survey respondents for more collaboration with health economists, opportunities for such collaborations should be increased.

  11. SeeGH--a software tool for visualization of whole genome array comparative genomic hybridization data.

    PubMed

    Chi, Bryan; DeLeeuw, Ronald J; Coe, Bradley P; MacAulay, Calum; Lam, Wan L

    2004-02-09

    Array comparative genomic hybridization (CGH) is a technique which detects copy number differences in DNA segments. Complete sequencing of the human genome and the development of an array representing a tiling set of tens of thousands of DNA segments spanning the entire human genome have made high resolution copy number analysis throughout the genome possible. Since array CGH provides a signal ratio for each DNA segment, visualization requires the reassembly of individual data points into chromosome profiles. We have developed a visualization tool for displaying whole genome array CGH data in the context of chromosomal location. SeeGH is an application that translates spot signal ratio data from array CGH experiments into displays of high resolution chromosome profiles. Data are imported from a simple tab-delimited text file obtained from standard microarray image analysis software. SeeGH processes the signal ratio data and graphically displays it in a conventional CGH karyotype diagram with the added features of magnification and DNA segment annotation. In this process, SeeGH imports the data into a database, calculates the average ratio and standard deviation for each replicate spot, and links them to chromosome regions for graphical display. Once the data are displayed, users have the option of hiding or flagging DNA segments based on user-defined criteria, and can retrieve annotation information such as clone name, NCBI sequence accession number, ratio, base pair position on the chromosome, and standard deviation. SeeGH is a novel software tool for viewing and analyzing array CGH data. The software gives users the ability to view the data in an overall genomic view as well as to magnify specific chromosomal regions, facilitating the precise localization of genetic alterations. SeeGH is easily installed and runs on Microsoft Windows 2000 or later environments.
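The replicate-spot summarization step described above can be sketched in a few lines. This is illustrative only: the clone names and ratios are invented, and SeeGH's actual tab-delimited parsing and database storage are not reproduced.

```python
import statistics
from collections import defaultdict

# Hypothetical (clone, signal ratio) pairs as they might arrive from
# microarray image analysis; each clone is spotted in replicate.
spots = [
    ("RP11-88L12", 1.02),
    ("RP11-88L12", 0.98),
    ("RP11-88L12", 1.00),
    ("RP11-23A7", 1.51),
    ("RP11-23A7", 1.49),
]

# Group replicate spots by clone.
by_clone = defaultdict(list)
for clone, ratio in spots:
    by_clone[clone].append(ratio)

# Per-clone mean ratio and standard deviation, as SeeGH computes before
# linking each clone to its chromosome region for display.
profile = {
    clone: (statistics.mean(r), statistics.stdev(r) if len(r) > 1 else 0.0)
    for clone, r in by_clone.items()
}
```

The standard deviation is what lets the viewer flag noisy clones: a segment whose replicates disagree widely can be hidden or highlighted before interpreting its profile position.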

  12. Personal Electronic Health Records: Understanding User Requirements and Needs in Chronic Cancer Care

    PubMed Central

    Winkler, Eva; Kamradt, Martina; Längst, Gerda; Eckrich, Felicitas; Heinze, Oliver; Bergh, Bjoern; Szecsenyi, Joachim; Ose, Dominik

    2015-01-01

    Background The integration of new information and communication technologies (ICTs) is becoming increasingly important in reorganizing health care. Adapting ICTs as supportive tools to users' needs and daily practices is vital for adoption and use. Objective In order to develop a Web-based personal electronic health record (PEPA), we explored user requirements and needs with regard to desired information and functions. Methods A qualitative study across health care sectors and health professions was conducted in a regional health care setting in Germany. Overall, 10 semistructured focus groups were performed, collecting views of 3 prospective user groups: patients with colorectal cancer (n=12) and representatives from patient support groups (n=2), physicians (n=17), and non-medical HCPs (n=16). Data were audio- and videotaped, transcribed verbatim, and thematically analyzed using qualitative content analysis. Results For both patients and HCPs, it was central to have a tool representing the chronology of illness and its care processes, for example, patients wanted to track their long-term laboratory findings (eg, tumor markers). Designing health information in a patient accessible way was highlighted as important. Users wanted to have general and tumor-specific health information available in a PEPA. Functions such as filtering information and adding information by patients (eg, on their well-being or electronic communication with HCPs via email) were discussed. Conclusions In order to develop a patient/user centered tool that is tailored to user needs, it is essential to address their perspectives. A challenge for implementation will be how to design PEPA’s health data in a patient accessible way. Adequate patient support and technical advice for users have to be addressed. PMID:25998006

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Hua

    Combustion represents a key chemical process in energy consumption in modern societies, and a clear and comprehensive understanding of the elementary reactions in combustion is of great importance to a number of challenging areas such as engine efficiency and environmental protection. In this award, we proposed to develop new theoretical tools to understand elementary chemical processes in combustion environments. With the support of this DOE grant, we have made significant advances in developing new, more efficient, and more accurate algorithms to characterize reaction dynamics.

  14. Complete mitochondrial genome phylogeographic analysis of killer whales (Orcinus orca) indicates multiple species.

    PubMed

    Morin, Phillip A; Archer, Frederick I; Foote, Andrew D; Vilstrup, Julia; Allen, Eric E; Wade, Paul; Durban, John; Parsons, Kim; Pitman, Robert; Li, Lewyn; Bouffard, Pascal; Abel Nielsen, Sandra C; Rasmussen, Morten; Willerslev, Eske; Gilbert, M Thomas P; Harkins, Timothy

    2010-07-01

    Killer whales (Orcinus orca) currently comprise a single, cosmopolitan species with a diverse diet. However, studies over the last 30 yr have revealed populations of sympatric "ecotypes" with discrete prey preferences, morphology, and behaviors. Although these ecotypes avoid social interactions and are not known to interbreed, genetic studies to date have found extremely low levels of diversity in the mitochondrial control region, and few clear phylogeographic patterns worldwide. This low level of diversity is likely due to low mitochondrial mutation rates that are common to cetaceans. Using killer whales as a case study, we have developed a method to readily sequence, assemble, and analyze complete mitochondrial genomes from large numbers of samples to more accurately assess phylogeography and estimate divergence times. This represents an important tool for wildlife management, not only for killer whales but for many marine taxa. We used high-throughput sequencing to survey whole mitochondrial genome variation of 139 samples from the North Pacific, North Atlantic, and southern oceans. Phylogenetic analysis indicated that each of the known ecotypes represents a strongly supported clade with divergence times ranging from approximately 150,000 to 700,000 yr ago. We recommend that three named ecotypes be elevated to full species, and that the remaining types be recognized as subspecies pending additional data. Establishing appropriate taxonomic designations will greatly aid in understanding the ecological impacts and conservation needs of these important marine predators. We predict that phylogeographic mitogenomics will become an important tool for improved statistical phylogeography and more precise estimates of divergence times.

  15. Complete mitochondrial genome phylogeographic analysis of killer whales (Orcinus orca) indicates multiple species

    PubMed Central

    Morin, Phillip A.; Archer, Frederick I.; Foote, Andrew D.; Vilstrup, Julia; Allen, Eric E.; Wade, Paul; Durban, John; Parsons, Kim; Pitman, Robert; Li, Lewyn; Bouffard, Pascal; Abel Nielsen, Sandra C.; Rasmussen, Morten; Willerslev, Eske; Gilbert, M. Thomas P.; Harkins, Timothy

    2010-01-01

    Killer whales (Orcinus orca) currently comprise a single, cosmopolitan species with a diverse diet. However, studies over the last 30 yr have revealed populations of sympatric “ecotypes” with discrete prey preferences, morphology, and behaviors. Although these ecotypes avoid social interactions and are not known to interbreed, genetic studies to date have found extremely low levels of diversity in the mitochondrial control region, and few clear phylogeographic patterns worldwide. This low level of diversity is likely due to low mitochondrial mutation rates that are common to cetaceans. Using killer whales as a case study, we have developed a method to readily sequence, assemble, and analyze complete mitochondrial genomes from large numbers of samples to more accurately assess phylogeography and estimate divergence times. This represents an important tool for wildlife management, not only for killer whales but for many marine taxa. We used high-throughput sequencing to survey whole mitochondrial genome variation of 139 samples from the North Pacific, North Atlantic, and southern oceans. Phylogenetic analysis indicated that each of the known ecotypes represents a strongly supported clade with divergence times ranging from ∼150,000 to 700,000 yr ago. We recommend that three named ecotypes be elevated to full species, and that the remaining types be recognized as subspecies pending additional data. Establishing appropriate taxonomic designations will greatly aid in understanding the ecological impacts and conservation needs of these important marine predators. We predict that phylogeographic mitogenomics will become an important tool for improved statistical phylogeography and more precise estimates of divergence times. PMID:20413674

  16. "Proximal Sensing" capabilities for snow cover monitoring

    NASA Astrophysics Data System (ADS)

    Valt, Mauro; Salvatori, Rosamaria; Plini, Paolo; Salzano, Roberto; Giusti, Marco; Montagnoli, Mauro; Sigismondi, Daniele; Cagnati, Anselmo

    2013-04-01

    The seasonal snow cover represents one of the most important land cover classes for environmental studies in mountain areas, especially considering its variation over time. Snow cover and its extension play a relevant role in studies of atmospheric dynamics and the evolution of climate. It is also important for the analysis and management of water resources and for the management of touristic activities in mountain areas. Recently, webcam images collected at daily or even hourly intervals have been used to observe snow-covered areas; properly processed, these images can be considered a very important environmental data source. Images captured by digital cameras are a useful tool at the local scale, providing data even when cloud coverage makes observation by satellite sensors impossible. When suitably processed, these images can be used for scientific purposes, having a good resolution (at least 800x600 at 16 million colours) and a very good sampling frequency (hourly images taken throughout the year). Once stored in databases, these images therefore represent an important source of information for the study of recent climatic changes, for evaluating the available water resources, and for analysing the daily surface evolution of the snow cover. The Snow-noSnow software has been specifically designed to automatically detect the extension of snow cover in webcam images with very limited human intervention. The software was tested on images collected in the Alps (ARPAV webcam network) and in the Apennines at a pilot station equipped for this project by CNR-IIA. The results obtained with Snow-noSnow are comparable to those achieved by photo-interpretation and can be considered better than those obtained using the image segmentation routines implemented in commercial image processing software. Additionally, Snow-noSnow operates in a semi-automatic way and has a reduced processing time. The analysis of this kind of image could represent a useful element to support the interpretation of remote sensing images, especially those provided by high spatial resolution sensors. Keywords: snow cover monitoring, digital images, software, Alps, Apennines.
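The per-pixel classification at the heart of such webcam snow mapping can be sketched in a few lines. The brightness/saturation rule and thresholds below are illustrative assumptions, not the actual Snow-noSnow algorithm:

```python
import numpy as np

def snow_fraction(rgb, brightness_min=170, saturation_max=40):
    """Fraction of pixels classified as snow in an 8-bit RGB frame.

    Toy rule: snow pixels are bright and nearly achromatic (small spread
    between the colour channels). Thresholds are illustrative only.
    """
    rgb = np.asarray(rgb, dtype=np.float64)
    brightness = rgb.mean(axis=-1)                    # per-pixel mean intensity
    saturation = rgb.max(axis=-1) - rgb.min(axis=-1)  # channel spread
    snow = (brightness >= brightness_min) & (saturation <= saturation_max)
    return snow.mean()

# Synthetic 4x4 frame: top half bright grey "snow", bottom half dark "rock"
frame = np.zeros((4, 4, 3))
frame[:2] = [230, 230, 235]
frame[2:] = [90, 60, 40]
print(snow_fraction(frame))  # -> 0.5
```

A real tool would add masking of sky and fixed scene elements and per-camera calibration; the point here is only that a simple per-pixel decision already yields a snow-cover estimate.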

  17. Ethnobotanical and economic value of Ravenala madagascariensis Sonn. in Eastern Madagascar.

    PubMed

    Rakotoarivelo, Nivo; Razanatsima, Aina; Rakotoarivony, Fortunat; Rasoaviety, Lucien; Ramarosandratana, Aro Vonjy; Jeannoda, Vololoniaina; Kuhlman, Alyse R; Randrianasolo, Armand; Bussmann, Rainer W

    2014-07-15

    Known worldwide as the "traveler's tree", the Malagasy endemic species Ravenala madagascariensis Sonn. (Strelitziaceae) is considered an iconic symbol of Madagascar. It is a widespread species in the eastern part of the country, with four different varieties that are well represented in the Ambalabe community. All of them are used for different purposes, and the species holds an important cultural value in the lives of the local population. However, the uses of Ravenala are generally well known only by the local population. Thus, in this study, we report on the different uses of Ravenala and its importance to the Ambalabe local people. Semi-structured interviews with 116 people (59 men and 57 women, aged 17 to 84 years), free listing, and market surveys were conducted in order to collect the vernacular names, the uses of Ravenala madagascariensis, and the prices of plant parts sold in the local market. The uses were then categorized according to the classification of Cámara-Leret et al. Different parts of the plant are currently used by the local population, grouped as heart, trunk, leaves, petioles, and rachis. Seven categories of use were recorded; the most cited include human food, utensils and tools, and house building. The most commonly used parts are the trunk, heart, leaves, and petioles, whose prices vary between $3 and $15. Uses mentioned for construction (floors, roofs, and walls), human food, and utensils and tools are the most frequent and salient for the local population, but use of the plant as primary material for house building proved to be the most important. Ravenala madagascariensis is very important to the Ambalabe communities because, for the local population, it embodies the Betsimisaraka cultural tradition of using the plant for house building. Moreover, none of its parts are discarded. The harvest and sale of R. madagascariensis for building materials can also provide an additional source of income for families. Besides, using Ravenala in house construction reduces the use of slow-growing trees and contributes to the sustainable use of natural forest resources.

  18. Planning for disaster resilience in rural, remote, and coastal communities: moving from thought to action.

    PubMed

    Murphy, Brenda L; Anderson, Gregory S; Bowles, Ron; Cox, Robin S

    2014-01-01

    Disaster resilience is the cornerstone of effective emergency management across all phases of a disaster, from preparedness through response and recovery. To support community resilience planning in the Rural Disaster Resilience Project (RDRP) Planning Framework, a print-based version of the guide book and a suite of resilience planning tools were field tested in three communities representing different regions and geographies within Canada. The results provide a cross-case study analysis from which lessons learned can be extracted. The authors demonstrate that by encouraging resilience thinking and proactive planning, even very small rural communities can harness their inherent strengths and resources to enhance their own disaster resilience, as undertaking the resilience planning process was as important as the outcomes. The resilience enhancement planning process must be flexible enough to allow each community to act independently to meet its own needs. The field sites demonstrate that any motivated group of individuals representing a neighborhood or some larger area can undertake a resilience initiative, especially with the assistance of a bridging organization or a tool such as the RDRP Planning Framework.

  19. The HepaRG cell line: biological properties and relevance as a tool for cell biology, drug metabolism, and virology studies.

    PubMed

    Marion, Marie-Jeanne; Hantz, Olivier; Durantel, David

    2010-01-01

    Liver progenitor cells may play an important role in carcinogenesis in vivo and represent therefore useful cellular materials for in vitro studies. The HepaRG cell line, which is a human bipotent progenitor cell line capable to differentiate toward two different cell phenotypes (i.e., biliary-like and hepatocyte-like cells), has been established from a liver tumor associated with chronic hepatitis C. This cell line represents a valuable alternative to ex vivo cultivated primary human hepatocytes (PHH), as HepaRG cells share some features and properties with adult hepatocytes. The cell line is particularly useful to evaluate drugs and perform drug metabolism studies, as many detoxifying enzymes are expressed and functional. It is also an interesting tool to study some aspect of progenitor biology (e.g., differentiation process), carcinogenesis, and the infection by some pathogens for which the cell line is permissive (e.g., HBV infection). Overall, this chapter gives a concise overview of the biological properties and potential applications of this cell line.

  20. Review of free software tools for image analysis of fluorescence cell micrographs.

    PubMed

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools users should choose to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools, and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks involving figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and use. Most importantly, a tool's usability must match its functionality: to be usable, specialized tools with less functionality need to fulfill fewer usability criteria, whereas multipurpose tools need a well-structured menu and an intuitive graphical user interface. © 2014 Fraunhofer Institute for Integrated Circuits IIS. Journal of Microscopy © 2014 Royal Microscopical Society.
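The thresholding that sufficed for figure-ground separation in the case study is typically a global histogram method such as Otsu's, which picks the cut maximizing between-class variance. A minimal sketch (a generic illustration, not code from any of the reviewed tools):

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Global threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values.ravel(), bins=nbins)
    hist = hist.astype(np.float64)
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                          # pixel weight at or below cut
    w1 = w0[-1] - w0                              # pixel weight above cut
    m = np.cumsum(hist * centers)
    mu0 = m / np.where(w0 == 0, 1, w0)            # mean intensity below cut
    mu1 = (m[-1] - m) / np.where(w1 == 0, 1, w1)  # mean intensity above cut
    between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(between)]

# Tiny two-population sample: dim background pixels vs bright nuclei
img = np.array([10.0] * 8 + [200.0] * 2)
t = otsu_threshold(img)   # falls between the two populations
mask = img > t            # figure-ground separation: True = foreground
```

Cell separation, by contrast, needs instance-level methods (e.g. watershed on a distance transform), which is why the reviewed tools found it so much harder to automate.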

  1. Hemostats, sealants, and adhesives: components of the surgical toolbox.

    PubMed

    Spotnitz, William D; Burks, Sandra

    2008-07-01

    The surgical toolbox is expanding, and newer products are being developed to improve results. Reducing blood loss so that bloodless surgery can be performed may help minimize morbidity and length of stay. As patients, hospital administrators, and government regulators desire less invasive procedures, the surgical technical challenge is increasing. More operations are being performed through minimally invasive incisions with laparoscopic, endoscopic, and robotic approaches. In this setting, tools that can reduce bleeding by causing blood to clot, sealing vessels, or gluing tissues are gaining increasing importance. Thus, hemostats, sealants, and adhesives are becoming a more important element of surgical practice. This review is designed to facilitate the reader's basic knowledge of these tools so that informed choices are made for controlling bleeding in specific clinical situations. Such information is useful for all members of the operative team. The team includes surgeons, anesthesiologists, residents, and nurses as well as hematologists and other medical specialists who may be involved in the perioperative care of surgical patients. An understanding of these therapeutic options may also be helpful to the transfusion service. In some cases, these materials may be stored in the blood bank, and their appropriate use may reduce demand for other transfusion components. The product classification used in this review includes hemostats as represented by product categories that include mechanical agents, active agents, flowables, and fibrin sealants; sealants as represented by fibrin sealants and polyethylene glycol hydrogels; and adhesives as represented by cyanoacrylates and albumin cross-linked with glutaraldehyde. Only those agents approved by the Food and Drug Administration (FDA) and presently available (February 2008) for sale in the United States are discussed in this review.

  2. CellNetVis: a web tool for visualization of biological networks using force-directed layout constrained by cellular components.

    PubMed

    Heberle, Henry; Carazzolle, Marcelo Falsarella; Telles, Guilherme P; Meirelles, Gabriela Vaz; Minghim, Rosane

    2017-09-13

    The advent of "omics" science has brought new perspectives to contemporary biology through the high-throughput analyses of molecular interactions, providing new clues to protein/gene function and to the organization of biological pathways. Biomolecular interaction networks, or graphs, are simple abstract representations where the components of a cell (e.g., proteins, metabolites) are represented by nodes and their interactions are represented by edges. An appropriate visualization of data is crucial for understanding such networks, since pathways are related to functions that occur in specific regions of the cell. The force-directed layout is an important and widely used technique to draw networks according to their topologies. Placing the networks into cellular compartments helps to quickly identify where network elements are located and, more specifically, concentrated. Currently, only a few tools provide the capability of visually organizing networks by cellular compartments, and most of them cannot handle large and dense networks. Even for small networks with hundreds of nodes, the available tools are not able to reposition the network while the user is interacting, limiting the visual exploration capability. Here we propose CellNetVis, a web tool to easily display biological networks in a cell diagram employing a constrained force-directed layout algorithm. The tool is freely available and open source. It was originally designed for networks generated by the Integrated Interactome System and can be used with networks from other databases, like InnateDB. CellNetVis has proven applicable for dynamic investigation of complex networks over a consistent representation of a cell on the Web, with capabilities not matched elsewhere.
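The idea of a force-directed layout constrained by compartments can be illustrated with a toy iteration: spring-like forces move the nodes, then a projection step keeps each node inside a circular compartment. This is a hedged sketch of the general technique, not CellNetVis's actual algorithm; all constants and the circular-compartment shape are illustrative:

```python
import numpy as np

def constrained_layout_step(pos, edges, centers, radii, k=0.1, step=0.05):
    """One force-directed iteration with circular compartment constraints.

    pos: (n, 2) node coordinates; edges: list of (i, j) index pairs;
    centers/radii: each node's compartment circle.
    """
    n = len(pos)
    disp = np.zeros_like(pos)
    # pairwise repulsion between all nodes
    for i in range(n):
        d = pos[i] - pos                      # vectors from every node to i
        dist = np.linalg.norm(d, axis=1)
        dist[i] = np.inf                      # no self-repulsion
        disp[i] += (d / dist[:, None] ** 2 * k ** 2).sum(axis=0)
    # spring attraction along edges
    for a, b in edges:
        d = pos[b] - pos[a]
        f = np.linalg.norm(d) / k
        disp[a] += d * f * 0.01
        disp[b] -= d * f * 0.01
    pos = pos + step * disp
    # constraint: project each node back inside its compartment circle
    for i in range(n):
        v = pos[i] - centers[i]
        r = np.linalg.norm(v)
        if r > radii[i]:
            pos[i] = centers[i] + v / r * radii[i]
    return pos

# Three nodes in one compartment of radius 2 centered at the origin
pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
edges = [(0, 1), (1, 2)]
centers = np.zeros((3, 2))
radii = np.full(3, 2.0)
for _ in range(10):
    pos = constrained_layout_step(pos, edges, centers, radii)
```

The projection step is what distinguishes compartment-constrained layouts from a plain spring embedder: no matter how strong the forces, a node assigned to the nucleus can never drift into the cytoplasm.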

  3. Multidimensional stock network analysis: An Escoufier's RV coefficient approach

    NASA Astrophysics Data System (ADS)

    Lee, Gan Siew; Djauhari, Maman A.

    2013-09-01

    The current practice of stock network analysis is based on the assumption that the time series of closing stock prices can represent the behaviour of each stock. This assumption leads to considering the minimal spanning tree (MST) and the sub-dominant ultrametric (SDU) as indispensable tools to filter the economic information contained in the network. Recently, researchers have attempted to represent a stock not only as a univariate time series of closing prices but as a bivariate time series of closing price and volume, developing a so-called multidimensional MST to filter the important economic information. However, in this paper we show that their approach is applicable only to that bivariate case. This leads us to introduce a new methodology for constructing an MST in which each stock is represented by a multivariate time series. An example from the Malaysian stock exchange is presented and discussed to illustrate the advantages of the method.
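Escoufier's RV coefficient itself is straightforward to compute: it generalizes squared correlation to pairs of multivariate observations, and a Mantegna-style transform turns the similarity into a distance suitable for MST construction. A minimal sketch (the sqrt(2(1 - c)) transform follows the classical univariate form; the paper's exact normalization may differ):

```python
import numpy as np

def rv_coefficient(X, Y):
    """Escoufier's RV coefficient between two data matrices.

    X and Y are (n_observations, n_variables) arrays, e.g. a stock's
    closing price and volume recorded over the same trading days.
    RV lies in [0, 1] and generalizes squared correlation.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sxy = Xc.T @ Yc
    Sxx = Xc.T @ Xc
    Syy = Yc.T @ Yc
    return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))

def rv_distance(X, Y):
    """Mantegna-style distance sqrt(2(1 - c)); clipped against rounding."""
    return np.sqrt(max(0.0, 2.0 * (1.0 - rv_coefficient(X, Y))))
```

Feeding the pairwise `rv_distance` matrix to any standard MST routine (Kruskal or Prim) then yields the filtered network, exactly as the correlation-based distance does in the univariate case.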

  4. State-of-the-art characterization techniques for advanced lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Wu, Tianpin; Amine, Khalil

    2017-03-01

    To meet future needs for industries from personal devices to automobiles, state-of-the-art rechargeable lithium-ion batteries will require both improved durability and lowered costs. To enhance battery performance and lifetime, understanding electrode degradation mechanisms is of critical importance. Various advanced in situ and operando characterization tools developed during the past few years have proven indispensable for optimizing battery materials, understanding cell degradation mechanisms, and ultimately improving the overall battery performance. Here we review recent progress in the development and application of advanced characterization techniques such as in situ transmission electron microscopy for high-performance lithium-ion batteries. Using three representative electrode systems—layered metal oxides, Li-rich layered oxides and Si-based or Sn-based alloys—we discuss how these tools help researchers understand the battery process and design better battery systems. We also summarize the application of the characterization techniques to lithium-sulfur and lithium-air batteries and highlight the importance of those techniques in the development of next-generation batteries.

  5. Clinical nursing informatics. Developing tools for knowledge workers.

    PubMed

    Ozbolt, J G; Graves, J R

    1993-06-01

    Current research in clinical nursing informatics is proceeding along three important dimensions: (1) identifying and defining nursing's language and structuring its data; (2) understanding clinical judgment and how computer-based systems can facilitate and not replace it; and (3) discovering how well-designed systems can transform nursing practice. A number of efforts are underway to find and use language that accurately represents nursing and that can be incorporated into computer-based information systems. These efforts add to understanding nursing problems, interventions, and outcomes, and provide the elements for databases from which nursing's costs and effectiveness can be studied. Research on clinical judgment focuses on how nurses (perhaps with different levels of expertise) assess patient needs, set goals, and plan and deliver care, as well as how computer-based systems can be developed to aid these cognitive processes. Finally, investigators are studying not only how computers can help nurses with the mechanics and logistics of processing information but also and more importantly how access to informatics tools changes nursing care.

  6. The Dynamical Dipole Radiation in Dissipative Collisions with Exotic Beams

    NASA Astrophysics Data System (ADS)

    di Toro, M.; Colonna, M.; Rizzo, C.; Baran, V.

    Heavy Ion Collisions (HIC) represent a unique tool to probe the in-medium nuclear interaction in regions away from saturation. In this work we present a selection of reaction observables in dissipative collisions particularly sensitive to the isovector part of the interaction, i.e. to the symmetry term of the nuclear Equation of State (EoS). At low energies the behavior of the symmetry energy around saturation influences dissipation and fragment production mechanisms. We will first discuss the recently observed Dynamical Dipole Radiation, due to a collective neutron-proton oscillation during the charge equilibration in fusion and deep-inelastic collisions. We will review in detail all the main properties, yield, spectrum, damping and angular distributions, revealing important isospin effects. Reactions induced by unstable 132Sn beams appear to be very promising tools to test the sub-saturation Isovector EoS. Predictions are also presented for deep-inelastic and fragmentation collisions induced by neutron rich projectiles. The importance of studying violent collisions with radioactive beams at low and Fermi energies is finally stressed.

  7. Tool use as distributed cognition: how tools help, hinder and define manual skill.

    PubMed

    Baber, Chris; Parekh, Manish; Cengiz, Tulin G

    2014-01-01

    Our thesis in this paper is that, in order to appreciate the interplay between cognitive (goal-directed) and physical performance in tool use, it is necessary to determine the role that representations play in the use of tools. We argue that rather than being solely a matter of internal (mental) representation, tool use makes use of the external representations that define the human-environment-tool-object system. This requires the notion of Distributed Cognition to encompass not simply the manner in which artifacts represent concepts but also how they represent praxis. Our argument is that this can be extended to include how artifacts-in-context afford use and how this response to affordances constitutes a particular form of skilled performance. By artifacts-in-context, we do not mean solely the affordances offered by the physical dimensions of a tool but also the interaction between the tool and the object that it is being used on. From this, "affordance" does not simply relate to the physical appearance of the tool but anticipates subsequent actions by the user directed towards the goal of changing the state of the object, and this is best understood in terms of the "complementarity" in the system. This assertion raises two challenges which are explored in this paper. The first is to distinguish "affordance" from the adaptation that one might expect to see in descriptions of motor control: when we speak of "affordance" as a form of anticipation, don't we just mean the ability to adjust movements in response to physical demands? The second is to distinguish "affordance" from a schema of the tool: when we talk about anticipation, don't we just mean the ability to call on a schema representing a "recipe" for using that tool for that task? This question of representation, specifically what knowledge needs to be represented in tool use, is central to this paper.

  8. Educational software usability: Artifact or Design?

    PubMed

    Van Nuland, Sonya E; Eagleson, Roy; Rogers, Kem A

    2017-03-01

    Online educational technologies and e-learning tools are providing new opportunities for students to learn worldwide, and they continue to play an important role in anatomical sciences education. Yet, as we shift to teaching online, particularly within the anatomical sciences, it has become apparent that e-learning tool success is based on more than just user satisfaction and preliminary learning outcomes; rather, it is a multidimensional construct that should be addressed from an integrated perspective. The efficiency, effectiveness, and satisfaction with which a user can navigate an e-learning tool are collectively known as usability, a construct which we propose can be used to quantitatively evaluate e-learning tool success. To assess the usability of an e-learning tool, usability testing should be employed during the design and development phases (i.e., prior to its release to users) as well as during its delivery (i.e., following its release to users). However, both the commercial educational software industry and individual academic developers in the anatomical sciences have overlooked the added value of additional usability testing. Reducing learner frustration and anxiety during e-learning tool use is essential in ensuring e-learning tool success, and will require a commitment on the part of the developers to engage in usability testing during all stages of an e-learning tool's life cycle. Anat Sci Educ 10: 190-199. © 2016 American Association of Anatomists.

  9. Capturing the essence of a metabolic network: a flux balance analysis approach.

    PubMed

    Murabito, Ettore; Simeonidis, Evangelos; Smallbone, Kieran; Swinton, Jonathan

    2009-10-07

    As genome-scale metabolic reconstructions emerge, tools to manage their size and complexity will be increasingly important. Flux balance analysis (FBA) is a constraint-based approach widely used to study the metabolic capabilities of cellular or subcellular systems. FBA problems are highly underdetermined, and many different phenotypes can satisfy any set of constraints through which the metabolic system is represented. Two of the main concerns in FBA are exploring the space of solutions for a given metabolic network and finding a specific phenotype which is representative of a given task, such as maximal growth rate. Here, we introduce a recursive algorithm suitable for overcoming both of these concerns. The method proposed is able to find the alternate optimal patterns of active reactions of an FBA problem and identify the minimal subnetwork able to perform a specific task as optimally as the whole. Our method represents an alternative to, and an extension of, other approaches conceived for exploring the space of solutions of an FBA problem. It may also be particularly helpful in defining a scaffold of reactions upon which to build up a dynamic model, when the important pathways of the system have not yet been well defined.
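The core FBA computation is a linear program: maximize a target flux subject to the steady-state constraint S v = 0 and flux bounds. A toy three-reaction example (an illustrative network, not one from the paper), assuming SciPy is available:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions)
#   R1: -> A (uptake)   R2: A -> B   R3: B -> (biomass/export)
S = np.array([[1.0, -1.0,  0.0],   # mass balance of metabolite A
              [0.0,  1.0, -1.0]])  # mass balance of metabolite B
bounds = [(0, 10), (0, None), (0, None)]  # uptake R1 capped at 10

# FBA: maximize biomass flux v3 subject to S v = 0 (steady state).
# linprog minimizes, so the objective coefficient for v3 is negated.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # the uptake bound forces v1 = v2 = v3 = 10
```

The underdetermination the abstract describes appears as soon as the network offers parallel routes: many flux vectors v then achieve the same optimum, which is precisely the degeneracy the authors' recursive algorithm is designed to enumerate.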

  10. An overview of systematic review.

    PubMed

    Baker, Kathy A; Weeks, Susan Mace

    2014-12-01

    Systematic review is an invaluable tool for the practicing clinician. A well-designed systematic review represents the latest and most complete information available on a particular topic or intervention. This article highlights the key elements of systematic review, what it is and is not, and provides an overview of several reputable organizations supporting the methodological development and conduct of systematic review. Important aspects for evaluating the quality of a systematic review are also included. Copyright © 2014 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  11. Behavioural cues of reproductive status in seahorses Hippocampus abdominalis.

    PubMed

    Whittington, C M; Musolf, K; Sommer, S; Wilson, A B

    2013-07-01

    A method is described to assess the reproductive status of male Hippocampus abdominalis on the basis of behavioural traits. The non-invasive nature of this technique minimizes handling stress and reduces sampling requirements for experimental work. It represents a useful tool to assist researchers in sample collection for studies of reproduction and development in viviparous syngnathids, which are emerging as important model species. © 2013 The Authors. Journal of Fish Biology © 2013 The Fisheries Society of the British Isles.

  12. Chiral phosphoric acid catalysis: from numbers to insights.

    PubMed

    Maji, Rajat; Mallojjala, Sharath Chandra; Wheeler, Steven E

    2018-02-19

    Chiral phosphoric acids (CPAs) have emerged as powerful organocatalysts for asymmetric reactions, and applications of computational quantum chemistry have revealed important insights into the activity and selectivity of these catalysts. In this tutorial review, we provide an overview of computational tools at the disposal of computational organic chemists and demonstrate their application to a wide array of CPA catalysed reactions. Predictive models of the stereochemical outcome of these reactions are discussed along with specific examples of representative reactions and an outlook on remaining challenges in this area.

  13. Toward mapping the biology of the genome.

    PubMed

    Chanock, Stephen

    2012-09-01

    This issue of Genome Research presents new results, methods, and tools from The ENCODE Project (ENCyclopedia of DNA Elements), which collectively represents an important step in moving beyond a parts list of the genome and promises to shape the future of genomic research. This collection sheds light on basic biological questions and frames the current debate over the optimization of tools and methodological challenges necessary to compare and interpret large complex data sets focused on how the genome is organized and regulated. In a number of instances, the authors have highlighted the strengths and limitations of current computational and technical approaches, providing the community with useful standards, which should stimulate development of new tools. In many ways, these papers will ripple through the scientific community, as those in pursuit of understanding the "regulatory genome" will heavily traverse the maps and tools. Similarly, the work should have a substantive impact on how genetic variation contributes to specific diseases and traits by providing a compendium of functional elements for follow-up study. The success of these papers should not only be measured by the scope of the scientific insights and tools but also by their ability to attract new talent to mine existing and future data.

  14. The use of typed lambda calculus for comprehension and construction of simulation models in the domain of ecology

    NASA Technical Reports Server (NTRS)

    Uschold, Michael

    1992-01-01

    We are concerned with two important issues in simulation modelling: model comprehension and model construction. Model comprehension is limited because many important choices taken during the modelling process are not documented, which makes it difficult for models to be modified or used by others. A key factor hindering model construction is the vast modelling search space which must be navigated. This is exacerbated by the fact that many modellers are unfamiliar with the terms and concepts supported by current tools. The root of both problems is the lack of facilities for representing or reasoning about domain concepts in current simulation technology. The basis for our achievements in both of these areas is the development of a language with two distinct levels: one for representing domain information, and the other for representing the simulation model. Of equal importance is the fact that we make formal connections between these two levels. The domain we are concerned with is ecological modelling. The language, called Elklogic, is based on the typed lambda calculus. Important features include a rich type structure, the use of various higher-order functions, and a formal semantics. This enables complex expressions to be constructed from relatively few primitives. The meaning of each expression can be determined in terms of the domain, the simulation model, or the relationship between the two. We describe a novel representation for sets and substructure, and a variety of other general concepts that are especially useful in the ecological domain. We use the type structure in a novel way: for controlling the modelling search space, rather than a proof search space. We facilitate model comprehension by representing the modelling decisions that are embodied in the simulation model. We represent the simulation model separately from, but in terms of, a domain model. The explicit links between the two models constitute the modelling decisions.
The semantics of Elklogic enables English text to be generated to explain the simulation model in domain terms.

  15. Risk determination after an acute myocardial infarction: review of 3 clinical risk prediction tools.

    PubMed

    Scruth, Elizabeth Ann; Page, Karen; Cheng, Eugene; Campbell, Michelle; Worrall-Carter, Linda

    2012-01-01

    The objective of the study was to provide comprehensive information for the clinical nurse specialist (CNS) on commonly used clinical prediction (risk assessment) tools used to estimate the risk of a secondary cardiac or noncardiac event and mortality in patients undergoing primary percutaneous coronary intervention (PCI) for ST-elevation myocardial infarction (STEMI). The evolution and widespread adoption of primary PCI represent major advances in the treatment of acute myocardial infarction, specifically STEMI. The American College of Cardiology and the American Heart Association have recommended early risk stratification for patients presenting with acute coronary syndromes, using several clinical risk scores to identify patients' mortality and secondary event risk after PCI. Clinical nurse specialists are integral to any performance improvement strategy. Their knowledge and understanding of clinical prediction tools will be essential in carrying out important assessments, identifying and managing risk in patients who have sustained a STEMI, and enhancing discharge education, including counseling on medications and lifestyle changes. Over the past 2 decades, risk scores have been developed from clinical trials to facilitate risk assessment. Several risk scores can be used to determine in-hospital and short-term survival. This article critiques the most common tools: the Thrombolysis in Myocardial Infarction risk score, the Global Registry of Acute Coronary Events risk score, and the Controlled Abciximab and Device Investigation to Lower Late Angioplasty Complications risk score. The importance of incorporating risk screening assessment tools (which are important for clinical prediction models) to guide the therapeutic management of patients cannot be overestimated. 
The ability to forecast secondary risk after a STEMI will assist in determining which patients would require the most aggressive level of treatment and monitoring postintervention including outpatient monitoring. With an increased awareness of specialist assessment tools, the CNS can play an important role in risk prevention and ongoing cardiovascular health promotion in patients diagnosed with STEMI. Knowledge of clinical prediction tools to estimate risk for mortality and risk of secondary events after PCI for acute coronary syndromes including STEMI is essential for the CNS in assisting with improving short- and long-term outcomes and for performance improvement strategies. The risk score assessment utilizing a collaborative approach with the multidisciplinary healthcare team provides for the development of a treatment plan including any invasive intervention strategy for the patient. Copyright © 2012 Wolters Kluwer Health | Lippincott Williams & Wilkins

  16. a New Tool for Facilitating the Retrieval and Recording of the Place Name Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Bozzini, C.; Conedera, M.; Krebs, P.

    2013-07-01

    Traditional place names (toponyms) represent the immaterial cultural heritage of past land uses, particular characteristics of the territory, landscape-related events or inhabitants, as well as the related cultural and religious background. In European countries where the cultural landscape has a very long history, this heritage is particularly rich. Most of the detailed knowledge about traditional place names and their precise localization is unwritten and familiar only to elderly local residents who experienced the former rural civilization. In the near future this important heritage will be seriously threatened by the physical disappearance of its living custodians. One of the major problems faced when trying to trace and document the knowledge related to place names and their localization is translating the memory and former landscape experiences of the respondents into maps and structured records. In this contribution we present a new tool, based on the monoplotting principle and developed ad hoc, that enables the synchronization of terrestrial oblique landscape pictures with the corresponding digital elevation model. Local respondents are simply asked to indicate the place-name localization on historical landscape pictures they are familiar with, and the tool automatically returns the corresponding world coordinates, which makes the interviewing process more rapid and smooth as well as more motivating and less stressful for the informants.
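    The geometric core of such a tool can be illustrated with a pinhole camera model: a world point is projected into the oblique image, and a clicked pixel is mapped back to terrain coordinates by intersecting its viewing ray with the elevation model. The sketch below is a minimal stand-in, not the actual implementation: the camera pose and focal length are hypothetical, and a flat horizontal plane replaces the real DEM intersection.

```python
import numpy as np

def project(point_w, cam_pos, R, f):
    """Pinhole projection: R rotates world coordinates into the camera
    frame, f is the focal length in pixel units."""
    p_cam = R @ (np.asarray(point_w, float) - cam_pos)
    if p_cam[2] <= 0:
        return None                                 # behind the camera
    return f * p_cam[0] / p_cam[2], f * p_cam[1] / p_cam[2]

def pixel_to_ground(px, py, cam_pos, R, f, ground_z=0.0):
    """Invert the projection onto a flat plane at ground_z, a toy
    stand-in for the ray/DEM intersection monoplotting performs."""
    ray = R.T @ np.array([px / f, py / f, 1.0])     # viewing ray, world frame
    s = (ground_z - cam_pos[2]) / ray[2]            # distance along the ray
    return cam_pos + s * ray

# Hypothetical camera 100 m above the origin, looking straight down.
cam = np.array([0.0, 0.0, 100.0])
R = np.array([[1.0,  0.0,  0.0],
              [0.0, -1.0,  0.0],
              [0.0,  0.0, -1.0]])
px, py = project([10.0, 20.0, 0.0], cam, R, 1000.0)
print(pixel_to_ground(px, py, cam, R, 1000.0))      # recovers (10, 20, 0)
```

    Replacing the flat plane with a ray march over a gridded DEM gives the full monoplotting inverse; the forward projection is what ties the oblique photograph to the terrain.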

  17. Codifference as a practical tool to measure interdependence

    NASA Astrophysics Data System (ADS)

    Wyłomańska, Agnieszka; Chechkin, Aleksei; Gajda, Janusz; Sokolov, Igor M.

    2015-03-01

    Correlation and spectral analysis represent the standard tools for studying interdependence in statistical data. However, for stochastic processes with heavy-tailed distributions, for which the variance diverges, these tools are inadequate. Heavy-tailed processes are ubiquitous in nature and finance. We discuss codifference as a convenient measure of statistical interdependence and aim to give a short introductory review of its properties. Taking different known stochastic processes as generic examples, we present explicit formulas for their codifferences. We show that for Gaussian processes the codifference is equivalent to the covariance. For processes with finite variance these two measures behave similarly with time; for processes with infinite variance the covariance does not exist, but the codifference remains well defined. We demonstrate the practical importance of the codifference by extracting this function from simulated data as well as real data taken from the turbulent plasma of a fusion device and from a financial market. We conclude that the codifference serves as a convenient practical tool to study interdependence for stochastic processes with both finite and infinite variance.
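    The codifference has a standard definition in terms of characteristic functions, tau(X1, X2) = ln E[e^{i(X1-X2)}] - ln E[e^{i X1}] - ln E[e^{-i X2}], which for zero-mean jointly Gaussian variables reduces exactly to the covariance. The following sketch (not the authors' code; the bivariate Gaussian example is purely illustrative) estimates the codifference empirically and checks this Gaussian equivalence:

```python
import numpy as np

def codifference(x1, x2):
    """Empirical codifference from paired samples of X1 and X2:
    tau = ln E[e^{i(X1-X2)}] - ln E[e^{i X1}] - ln E[e^{-i X2}],
    estimated via empirical characteristic functions (real part)."""
    phi_diff = np.mean(np.exp(1j * (x1 - x2)))
    phi_1 = np.mean(np.exp(1j * x1))
    phi_2 = np.mean(np.exp(-1j * x2))
    return np.real(np.log(phi_diff) - np.log(phi_1) - np.log(phi_2))

rng = np.random.default_rng(0)
# Zero-mean bivariate Gaussian with covariance 0.5: for Gaussian data
# the codifference should reproduce the covariance.
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
samples = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
tau = codifference(samples[:, 0], samples[:, 1])
print(tau)   # close to 0.5
```

    Unlike the sample covariance, this estimator stays finite for heavy-tailed data, since characteristic functions exist even when second moments do not.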

  18. A Middle Palaeolithic wooden digging stick from Aranbaltza III, Spain.

    PubMed

    Rios-Garaizar, Joseba; López-Bultó, Oriol; Iriarte, Eneko; Pérez-Garrido, Carlos; Piqué, Raquel; Aranburu, Arantza; Iriarte-Chiapusso, María José; Ortega-Cordellat, Illuminada; Bourguignon, Laurence; Garate, Diego; Libano, Iñaki

    2018-01-01

    Aranbaltza is an archaeological complex formed by at least three open-air sites. Between 2014 and 2015 a test excavation carried out in Aranbaltza III revealed the presence of a sand and clay sedimentary sequence formed in floodplain environments, within which six sedimentary units have been identified. This sequence was formed between 137-50 ka, and includes several archaeological horizons, attesting to the long-term presence of Neanderthal communities in this area. One of these horizons, corresponding with Unit 4, yielded two wooden tools. One of these tools is a beveled pointed tool that was shaped through a complex operational sequence involving branch shaping, bark peeling, twig removal, shaping, polishing, thermal exposure and chopping. Use-wear analysis shows traces consistent with digging in soil, so the tool has been interpreted as a digging stick. This is the first time such a tool has been identified in a European Late Middle Palaeolithic context; it also represents one of the first well-preserved Middle Palaeolithic wooden tools found in southern Europe. This artefact is one of the few available examples of wooden tool preservation for the European Palaeolithic, allowing us to further explore the role wooden technologies played in Neanderthal communities.

  19. Current Challenges in Geothermal Reservoir Simulation

    NASA Astrophysics Data System (ADS)

    Driesner, T.

    2016-12-01

    Geothermal reservoir simulation has long been established as a valuable tool for geothermal reservoir management and research. Yet the current generation of simulation tools faces a number of severe challenges, in particular when applied to novel types of geothermal resources such as supercritical reservoirs or hydraulic stimulation. This contribution reviews three key problems. The first is representing the magmatic heat source of high-enthalpy resources in simulations. Current practice is to represent the deeper parts of a high-enthalpy reservoir by a heat-flux or temperature boundary condition. While this is sufficient for many reservoir management purposes, it precludes exploring the potential of very high enthalpy resources in the deepest parts of such systems as well as the development of reliable conceptual models. Recent 2D simulations with the CSMP++ simulation platform demonstrate the potential of explicitly including the heat source, namely for understanding supercritical resources. The second is the geometrically realistic incorporation of discrete fracture networks in simulations. A growing number of simulation tools can, in principle, handle flow and heat transport in discrete fracture networks. However, solving the governing equations and representing the physical properties are often biased by strongly simplifying assumptions, and including proper fracture mechanics in complex fracture network simulations remains an open challenge. The third is improving the simulation of chemical fluid-rock interaction in geothermal reservoirs. Major improvements have been made towards more stable and faster numerical solvers for multicomponent chemical fluid-rock interaction. However, the underlying thermodynamic models and databases are unable to correctly address a number of important regions in temperature-pressure-composition parameter space; in particular, there is currently no thermodynamic formalism to describe the relevant chemical reactions in supercritical reservoirs. Overcoming this unsatisfactory situation requires fundamental research in high-temperature physical chemistry rather than further numerical development.

  20. GraphCrunch 2: Software tool for network modeling, alignment and clustering.

    PubMed

    Kuchaiev, Oleksii; Stevanović, Aleksandar; Hayes, Wayne; Pržulj, Nataša

    2011-01-19

    Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other existing tool. 
    Finally, GraphCrunch 2 implements an algorithm for clustering nodes within a network based solely on their topological similarities. Using GraphCrunch 2, we demonstrate that eukaryotic and viral PPI networks may belong to different graph model families and show that topology-based clustering can reveal important functional similarities between proteins within yeast and human PPI networks. GraphCrunch 2 is a software tool that implements the latest research on biological network analysis. It parallelizes computationally intensive tasks to fully utilize the potential of modern multi-core CPUs. It is open-source and freely available for research use, and runs on Windows and Linux.
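    One of the "easily computable network properties" used for model-versus-data comparison is the average clustering coefficient. The self-contained sketch below (toy graphs only, not GraphCrunch's implementation) compares a small triangle-rich "data" network against an Erdos-Renyi G(n, p) model network:

```python
import random
from itertools import combinations

def clustering(adj):
    """Average local clustering coefficient of an undirected graph,
    given as a dict mapping each node to its set of neighbours."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue                                  # undefined; count as 0
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def er_graph(n, p, rng):
    """Erdos-Renyi G(n, p) random graph as an adjacency dict."""
    adj = {v: set() for v in range(n)}
    for a, b in combinations(range(n), 2):
        if rng.random() < p:
            adj[a].add(b)
            adj[b].add(a)
    return adj

# Toy "data" network: two shared triangles, hence high clustering.
data = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {1, 2}}
model = er_graph(100, 0.05, random.Random(42))
print(clustering(data), clustering(model))
# A large gap suggests G(n, p) is a poor model for this network.
```

    Real comparisons aggregate many such properties (degree distribution, graphlet counts, diameter) over many sampled model networks, but the per-property logic is this simple.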

  1. Tool use as distributed cognition: how tools help, hinder and define manual skill

    PubMed Central

    Baber, Chris; Parekh, Manish; Cengiz, Tulin G.

    2014-01-01

    Our thesis in this paper is that, in order to appreciate the interplay between cognitive (goal-directed) and physical performance in tool use, it is necessary to determine the role that representations play in the use of tools. We argue that rather than being solely a matter of internal (mental) representation, tool use makes use of the external representations that define the human–environment–tool–object system. This requires the notion of Distributed Cognition to encompass not simply the manner in which artifacts represent concepts but also how they represent praxis. Our argument is that this can be extended to include how artifacts-in-context afford use and how this response to affordances constitutes a particular form of skilled performance. By artifacts-in-context, we do not mean solely the affordances offered by the physical dimensions of a tool but also the interaction between the tool and the object that it is being used on. From this, “affordance” does not simply relate to the physical appearance of the tool but anticipates subsequent actions by the user directed towards the goal of changing the state of the object, and this is best understood in terms of the “complementarity” in the system. This assertion raises two challenges which are explored in this paper. The first is to distinguish “affordance” from the adaptation that one might expect to see in descriptions of motor control; when we speak of “affordance” as a form of anticipation, don’t we just mean the ability to adjust movements in response to physical demands? The second is to distinguish “affordance” from a schema of the tool; when we talk about anticipation, don’t we just mean the ability to call on a schema representing a “recipe” for using that tool for that task? This question of representation, specifically what knowledge needs to be represented in tool use, is central to this paper. PMID:24605103

  2. A biological tool to assess flow connectivity in reference temporary streams from the Mediterranean Basin.

    PubMed

    Cid, N; Verkaik, I; García-Roger, E M; Rieradevall, M; Bonada, N; Sánchez-Montoya, M M; Gómez, R; Suárez, M L; Vidal-Abarca, M R; Demartini, D; Buffagni, A; Erba, S; Karaouzas, I; Skoulikidis, N; Prat, N

    2016-01-01

    Many streams in the Mediterranean Basin have temporary flow regimes. While the timing of seasonal drought is predictable, they undergo strong inter-annual variability in flow intensity. This high hydrological variability and the associated ecological responses challenge the ecological status assessment of temporary streams, particularly when setting reference conditions. This study examined the effects of flow connectivity on aquatic macroinvertebrates in seven reference temporary streams across the Mediterranean Basin where hydrological variability and flow conditions are well studied. We tested for the effect of flow cessation on two streamflow indices and on community composition, and, by performing random forest and classification tree analyses, we identified important biological predictors for classifying the aquatic state as either flowing or disconnected pools. Flow cessation was critical for one of the streamflow indices studied and for community composition. Macroinvertebrate families found to be important for classifying the aquatic state were Hydrophilidae, Simuliidae, Hydropsychidae, Planorbiidae, Heptageniidae and Gerridae. For biological traits, trait categories associated with feeding habits, food, locomotion and substrate relation were the most important and provided more accurate predictions than taxonomy. A combination of selected metrics and associated thresholds based on the most important biological predictors (the Bio-AS Tool) was proposed in order to assess the aquatic state in reference temporary streams, especially in the absence of hydrological data. Although further development is needed, the tool can be of particular interest for monitoring, restoration, and conservation purposes, representing an important step towards adequate management of temporary rivers not only in the Mediterranean Basin but also in other regions vulnerable to the effects of climate change. Copyright © 2015 Elsevier B.V. All rights reserved.
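    The "selected metrics and associated thresholds" approach distilled from a classification tree can be pictured as a handful of rules on family abundances. The sketch below is purely illustrative: the threshold values and the choice of indicator families are hypothetical stand-ins, not the calibrated Bio-AS Tool metrics.

```python
def aquatic_state(sample):
    """Classify a temporary-stream reach as 'flowing' or
    'disconnected pools' from macroinvertebrate family abundances
    (individuals per sample). Thresholds are hypothetical."""
    # Rheophilic filter-feeders (Simuliidae, Hydropsychidae) need
    # current, so their abundance indicates flowing water.
    if sample.get("Simuliidae", 0) + sample.get("Hydropsychidae", 0) >= 5:
        return "flowing"
    # Air-breathing beetles and pulmonate snails tolerate stagnant
    # water, so their dominance indicates isolated pools.
    if sample.get("Hydrophilidae", 0) + sample.get("Planorbiidae", 0) >= 5:
        return "disconnected pools"
    return "indeterminate"

print(aquatic_state({"Simuliidae": 12, "Hydropsychidae": 3}))
print(aquatic_state({"Hydrophilidae": 6, "Planorbiidae": 4}))
```

    In the actual tool such rules are derived from random forest variable importance and classification tree splits rather than chosen by hand.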

  3. Representing Human Expertise by the OWL Web Ontology Language to Support Knowledge Engineering in Decision Support Systems.

    PubMed

    Ramzan, Asia; Wang, Hai; Buckingham, Christopher

    2014-01-01

    Clinical decision support systems (CDSSs) often base their knowledge and advice on human expertise. Knowledge representation needs to be in a format that can be easily understood by human users as well as supporting ongoing knowledge engineering, including evolution and consistency of knowledge. This paper reports on the development of an ontology specification for managing knowledge engineering in a CDSS for assessing and managing risks associated with mental-health problems. The Galatean Risk and Safety Tool, GRiST, represents mental-health expertise in the form of a psychological model of classification. The hierarchical structure was directly represented in the machine using an XML document. Functionality of the model and knowledge management were controlled using attributes in the XML nodes, with an accompanying paper manual for specifying how end-user tools should behave when interfacing with the XML. This paper explains the advantages of using the web-ontology language, OWL, as the specification, details some of the issues and problems encountered in translating the psychological model to OWL, and shows how OWL benefits knowledge engineering. The conclusions are that OWL can have an important role in managing complex knowledge domains for systems based on human expertise without impeding the end-users' understanding of the knowledge base. The generic classification model underpinning GRiST makes it applicable to many decision domains and the accompanying OWL specification facilitates its implementation.

  4. Machine assisted histogram classification

    NASA Astrophysics Data System (ADS)

    Benyó, B.; Gaspar, C.; Somogyi, P.

    2010-04-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmaps. We prove the concept by detecting ion-feedback events in the LHCb experiment's RICH subdetector.
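    The essence of automated histogram monitoring is a distance between an observed histogram and a reference shape, with an alert when the distance is large. The sketch below uses a symmetric chi-square distance on hypothetical bin contents; it is a minimal illustration, not the LHCb tool's actual algorithm.

```python
def chi2_distance(h1, h2):
    """Symmetric chi-square distance between two histograms,
    compared after normalisation to unit area."""
    s1, s2 = sum(h1), sum(h2)
    p = [x / s1 for x in h1]
    q = [x / s2 for x in h2]
    return 0.5 * sum((a - b) ** 2 / (a + b) for a, b in zip(p, q) if a + b > 0)

reference = [10, 40, 80, 40, 10]    # expected detector response (hypothetical)
nominal   = [12, 38, 82, 39, 11]    # healthy channel: small fluctuations
anomalous = [10, 40, 5, 40, 10]     # central bins missing: possible fault

THRESHOLD = 0.05                    # illustrative alert threshold
for name, h in (("nominal", nominal), ("anomalous", anomalous)):
    d = chi2_distance(reference, h)
    print(name, round(d, 4), "ALERT" if d > THRESHOLD else "ok")
```

    A graph-based clustering step would then group channels by pairwise distance so that experts inspect whole clusters of similar anomalies rather than thousands of individual histograms.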

  5. Cognitive mapping tools: review and risk management needs.

    PubMed

    Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor

    2012-08-01

    Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding the perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models (decision-analysis-based mental modeling, concept mapping, and semantic web analysis) and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.

  6. Thermal Characterization of Carbon Nanotubes by Photothermal Techniques

    NASA Astrophysics Data System (ADS)

    Leahu, G.; Li Voti, R.; Larciprete, M. C.; Sibilia, C.; Bertolotti, M.; Nefedov, I.; Anoshkin, I. V.

    2015-06-01

    Carbon nanotubes (CNTs) are multifunctional materials commonly used in a large number of applications in electronics, sensors, nanocomposites, thermal management, actuators, energy storage and conversion, and drug delivery. Despite recent important advances in the development of CNT purity assessment tools and atomic resolution imaging of individual nanotubes by scanning tunnelling microscopy and high-resolution transmission electron microscopy, the macroscale assessment of the overall surface qualities of commercial CNT materials remains a great challenge. The lack of quantitative measurement technology to characterize and compare the surface qualities of bulk manufactured and engineered CNT materials has negative impacts on the reliable and consistent nanomanufacturing of CNT products. In this paper it is shown how photoacoustic spectroscopy and photothermal radiometry represent useful non-destructive tools to study the optothermal properties of carbon nanotube thin films.

  7. The AstroVR Collaboratory, an On-line Multi-User Environment for Research in Astrophysics

    NASA Astrophysics Data System (ADS)

    van Buren, D.; Curtis, P.; Nichols, D. A.; Brundage, M.

    We describe our experiment with an on-line collaborative environment where users share the execution of programs and communicate via audio, video, and typed text. Collaborative environments represent the next step in computer-mediated conferencing, combining powerful compute engines, data persistence, shared applications, and teleconferencing tools. As proof of concept, we have implemented a shared image analysis tool, allowing geographically distinct users to analyze FITS images together. We anticipate that AstroVR (http://astrovr.ipac.caltech.edu:8888) and similar systems will become an important part of collaborative work in the next decade, with applications in remote observing, spacecraft operations, and on-line meetings, as well as day-to-day research activities. The technology is generic and promises to find uses in business, medicine, government, and education.

  8. Front-line ordering clinicians: matching workforce to workload.

    PubMed

    Fieldston, Evan S; Zaoutis, Lisa B; Hicks, Patricia J; Kolb, Susan; Sladek, Erin; Geiger, Debra; Agosto, Paula M; Boswinkel, Jan P; Bell, Louis M

    2014-07-01

    Matching workforce to workload is particularly important in healthcare delivery, where an excess of workload for the available workforce may negatively impact processes and outcomes of patient care and resident learning. Hospitals currently lack a means to measure and match dynamic workload and workforce factors. This article describes our work to develop and obtain consensus for use of an objective tool to dynamically match the front-line ordering clinician (FLOC) workforce to clinical workload in a variety of inpatient settings. We undertook development of a tool to represent hospital workload and workforce based on literature reviews, discussions with clinical leadership, and repeated validation sessions. We met with physicians and nurses from every clinical care area of our large, urban children's hospital at least twice. We successfully created a tool in a matrix format that is objective and flexible and can be applied to a variety of settings. We presented the tool in 14 hospital divisions and received widespread acceptance among physician, nursing, and administrative leadership. The hospital uses the tool to identify gaps in FLOC coverage and guide staffing decisions. Hospitals can better match workload to workforce if they can define and measure these elements. The Care Model Matrix is a flexible, objective tool that quantifies the multidimensional aspects of workload and workforce. The tool, which uses multiple variables that are easily modifiable, can be adapted to a variety of settings. © 2014 Society of Hospital Medicine.

  9. Entrainment in the master equation.

    PubMed

    Margaliot, Michael; Grüne, Lars; Kriecherbauer, Thomas

    2018-04-01

    The master equation plays an important role in many scientific fields including physics, chemistry, systems biology, physical finance and sociodynamics. We consider the master equation with periodic transition rates. This may represent an external periodic excitation like the 24 h solar day in biological systems or periodic traffic lights in a model of vehicular traffic. Using tools from systems and control theory, we prove that under mild technical conditions every solution of the master equation converges to a periodic solution with the same period as the rates. In other words, the master equation entrains (or phase locks) to periodic excitations. We describe two applications of our theoretical results to important models from statistical mechanics and epidemiology.
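    The entrainment result can be seen directly in the simplest case, a two-state master equation dp/dt = -k12(t) p + k21(t) (1 - p) with T-periodic rates. The sketch below (illustrative rate functions of our own choosing, not the paper's examples) integrates two trajectories from opposite initial conditions and shows they converge to the same periodic solution:

```python
import math

def simulate(p0, periods=40, steps=2000, T=1.0):
    """Two-state master equation with T-periodic transition rates,
    integrated by forward Euler; returns the probability of state 1
    after an integer number of periods (i.e. sampled at fixed phase)."""
    dt = T / steps
    p = p0
    for n in range(periods * steps):
        t = n * dt
        k12 = 2.0 + math.sin(2.0 * math.pi * t / T)   # rate 1 -> 2
        k21 = 2.0 + math.cos(2.0 * math.pi * t / T)   # rate 2 -> 1
        p += dt * (-k12 * p + k21 * (1.0 - p))
    return p

# Trajectories started from the two extremes end up at the same
# fixed-phase value: the system has entrained to the periodic rates.
a, b = simulate(0.0), simulate(1.0)
print(a, b)
```

    The contraction rate per period is governed by exp(-∫(k12 + k21) dt), so with strictly positive rates any two solutions collapse onto one periodic orbit, which is the entrainment statement of the abstract in miniature.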

  10. Natriuretic Peptides as Biomarkers in Heart Failure

    PubMed Central

    Januzzi, James L.

    2014-01-01

    Following the initial discovery of a natriuretic and diuretic peptide factor present in atrial myocardial tissue homogenates, subsequent elucidation of the natriuretic peptide family has led to substantial advances in the understanding of the autocrine, paracrine, and endocrine regulation of the cardiovascular system. Furthermore, with the development of assays for the measurement of the natriuretic peptides, these important biomarkers have gone from being regarded as biological mediators of the cardiovascular system to now representing important clinical tools for the diagnostic and prognostic evaluation of patients with heart failure, and they may have potential as a therapeutic target in this setting as well. A historical perspective on the natriuretic peptides, from bench-to-bedside translation, will be discussed. PMID:23661103

  11. Entrainment in the master equation

    PubMed Central

    Grüne, Lars; Kriecherbauer, Thomas

    2018-01-01

    The master equation plays an important role in many scientific fields including physics, chemistry, systems biology, physical finance and sociodynamics. We consider the master equation with periodic transition rates. This may represent an external periodic excitation like the 24 h solar day in biological systems or periodic traffic lights in a model of vehicular traffic. Using tools from systems and control theory, we prove that under mild technical conditions every solution of the master equation converges to a periodic solution with the same period as the rates. In other words, the master equation entrains (or phase locks) to periodic excitations. We describe two applications of our theoretical results to important models from statistical mechanics and epidemiology. PMID:29765669

  12. Interaction of memory systems during acquisition of tool knowledge and skills in Parkinson's disease.

    PubMed

    Roy, Shumita; Park, Norman W; Roy, Eric A; Almeida, Quincy J

    2015-01-01

    Previous research suggests that different aspects of tool knowledge are mediated by different memory systems. It is believed that tool attributes (e.g., function, color) are represented as declarative memory while skill learning is supported by procedural memory. It has been proposed that other aspects (e.g., skilled tool use) may rely on an interaction of both declarative and procedural memory. However, the specific form of procedural memory underlying skilled tool use and the nature of interaction between declarative and procedural memory systems remain unclear. In the current study, individuals with Parkinson's disease (PD) and healthy controls were trained over 2 sessions, 3 weeks apart, to use a set of novel complex tools. They were also tested on their ability to recall tool attributes as well as their ability to demonstrate grasp and use of the tools to command. Results showed that, compared to controls, participants with PD showed intact motor skill acquisition and tool use to command within sessions, but failed to retain performance across sessions. In contrast, people with PD showed equivalent recall of tool attributes and tool grasping relative to controls, both within and across sessions. Current findings demonstrate that the frontal-striatal network, compromised in PD, mediates long-term retention of motor skills. Intact initial skill learning raises the possibility of compensation from declarative memory for frontal-striatal dysfunction. Lastly, skilled tool use appears to rely on both memory systems which may reflect a cooperative interaction between the two systems. Current findings regarding memory representations of tool knowledge and skill learning may have important implications for delivery of rehabilitation programs for individuals with PD. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Evaluation of Radiation Belt Space Weather Forecasts for Internal Charging Analyses

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Coffey, Victoria N.; Jun, Insoo; Garrett, Henry B.

    2007-01-01

    A variety of static electron radiation belt models, space weather prediction tools, and energetic electron datasets are used by spacecraft designers and operations support personnel as internal charging code inputs to evaluate electrostatic discharge risks in space systems due to exposure to relativistic electron environments. Evaluating the environment inputs is often accomplished by assessing whether the data set or forecast tool reliably predicts the measured electron flux (or fluence over a given period) for some chosen period. While this technique is useful as a model metric, it does not provide the information necessary to evaluate whether short-term deviations of the predicted flux are important in the charging evaluations. In this paper, we use a 1-D internal charging model to compute electric fields generated in insulating materials as a function of time when exposed to relativistic electrons in the Earth's magnetosphere. The resulting fields are assumed to represent the "true" electric fields and are compared with electric field values computed from relativistic electron environments derived from a variety of space environment and forecast tools. Deviations of the predicted fields from the "true" fields, which depend on insulator charging time constants, will be evaluated as a potential metric for determining the importance of predicted and measured relativistic electron flux deviations over a range of time scales.
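    The kind of 1-D charging calculation described above can be sketched as a single ordinary differential equation: the internal field builds up with the insulator's charging time constant tau = eps/sigma toward the steady-state value J/sigma. This is a minimal illustration, not the authors' code, and all material parameters below are assumed values.

```python
EPS0 = 8.854e-12  # vacuum permittivity [F/m]

def charge_field(J, eps, sigma, dt, n_steps):
    """Integrate dE/dt = J/eps - E/tau, with tau = eps/sigma (explicit Euler).

    J     : deposited current density in the insulator [A/m^2]
    eps   : insulator permittivity [F/m]
    sigma : bulk conductivity [S/m]
    Returns the internal-field history [V/m]; the steady state is J/sigma.
    """
    tau = eps / sigma
    E, history = 0.0, []
    for _ in range(n_steps):
        E += dt * (J / eps - E / tau)
        history.append(E)
    return history

# Assumed, Teflon-like numbers: eps_r ~ 2, sigma ~ 1e-16 S/m, J = 1e-8 A/m^2.
fields = charge_field(J=1e-8, eps=2.0 * EPS0, sigma=1e-16, dt=3600.0, n_steps=2000)
```

    With these assumed parameters the charging time constant is roughly two days, which is why short-term flux deviations matter more or less depending on the insulator.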

  14. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.

  15. A dynamic simulation based water resources education tool.

    PubMed

    Williams, Alison; Lansey, Kevin; Washburne, James

    2009-01-01

    Educational tools to assist the public in recognizing impacts of water policy in a realistic context are not generally available. This project developed systems with modeling-based educational decision support simulation tools to satisfy this need. The goal of this model is to teach undergraduate students and the general public about the implications of common water management alternatives so that they can better understand or become involved in water policy and make more knowledgeable personal or community decisions. The model is based on Powersim, a dynamic simulation software package capable of producing web-accessible, intuitive, graphic, user-friendly interfaces. Modules are included to represent residential, agricultural, industrial, and turf uses, as well as non-market values, water quality, reservoir, flow, and climate conditions. Supplementary materials emphasize important concepts and lead learners through the model, culminating in an open-ended water management project. The model is used in a University of Arizona undergraduate class and within the Arizona Master Watershed Stewards Program. Evaluation results demonstrated improved understanding of concepts and system interactions, fulfilling the project's objectives.
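    The bookkeeping at the heart of such a dynamic simulation can be illustrated with a minimal stock-and-flow sketch of a single reservoir. Powersim is a commercial package, so the function below is a hypothetical stand-in, and the ordering of evaporation, release, and spill is an arbitrary modeling assumption.

```python
def simulate_reservoir(storage, capacity, inflows, demands, evap_frac):
    """Advance a single reservoir stock one step per (inflow, demand) pair.

    Evaporation is a fixed fraction of storage, deliveries are cut when
    the reservoir runs dry, and water above capacity spills downstream.
    Returns the storage trajectory and the total water delivered.
    """
    trajectory, delivered = [], 0.0
    for inflow, demand in zip(inflows, demands):
        storage += inflow
        storage -= storage * evap_frac      # evaporative loss
        release = min(demand, storage)      # cannot deliver more than stored
        storage -= release
        delivered += release
        storage = min(storage, capacity)    # excess spills downstream
        trajectory.append(storage)
    return trajectory, delivered

traj, delivered = simulate_reservoir(
    storage=50.0, capacity=100.0,
    inflows=[30, 10, 5, 40], demands=[20, 20, 20, 20], evap_frac=0.02)
```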

  16. The FinFET Breakthrough and Networks of Innovation in the Semiconductor Industry, 1980-2005: Applying Digital Tools to the History of Technology.

    PubMed

    O'Reagan, Douglas; Fleming, Lee

    2018-01-01

    The "FinFET" design for transistors, developed at the University of California, Berkeley, in the 1990s, represented a major leap forward in the semiconductor industry. Understanding its origins and importance requires deep knowledge of local factors, such as the relationships among the lab's principal investigators, students, staff, and the institution. It also requires understanding this lab within the broader network of relationships that comprise the semiconductor industry-a much more difficult task using traditional historical methods, due to the paucity of sources on industrial research. This article is simultaneously 1) a history of an impactful technology and its social context, 2) an experiment in using data tools and visualizations as a complement to archival and oral history sources, to clarify and explore these "big picture" dimensions, and 3) an introduction to specific data visualization tools that we hope will be useful to historians of technology more generally.

  17. Perceived Utility of Pharmacy Licensure Examination Preparation Tools

    PubMed Central

    Peak, Amy Sutton; Sheehan, Amy Heck; Arnett, Stephanie

    2006-01-01

    Objectives To identify board examination preparation tools most commonly used by recent pharmacy graduates and determine which tools are perceived as most valuable and representative of the actual content of licensure examinations. Methods An electronic survey was sent to all 2004 graduates of colleges of pharmacy in Indiana. Participants identified which specific preparation tools were used and rated tools based on usefulness, representativeness of licensure examination, and monetary value, and provided overall recommendations to future graduates. Results The most commonly used preparation tools were the Pharmacy Law Review Session offered by Dr. Thomas Wilson at Purdue University, the Complete Review for Pharmacy, Pre-NAPLEX, PharmPrep, and the Kaplan NAPLEX Review. Tools receiving high ratings in all categories included Dr. Wilson's Pharmacy Law Review Session, Pre-NAPLEX, Comprehensive Pharmacy Review, Kaplan NAPLEX Review, and Review of Pharmacy. Conclusions Although no preparation tool was associated with a higher examination pass rate, certain tools were clearly rated higher than others by test takers. PMID:17149406

  18. A Middle Palaeolithic wooden digging stick from Aranbaltza III, Spain

    PubMed Central

    López-Bultó, Oriol; Iriarte, Eneko; Pérez-Garrido, Carlos; Piqué, Raquel; Aranburu, Arantza; Iriarte-Chiapusso, María José; Ortega-Cordellat, Illuminada; Bourguignon, Laurence; Garate, Diego; Libano, Iñaki

    2018-01-01

    Aranbaltza is an archaeological complex formed by at least three open-air sites. Between 2014 and 2015 a test excavation carried out in Aranbaltza III revealed the presence of a sand and clay sedimentary sequence formed in floodplain environments, within which six sedimentary units have been identified. This sequence was formed between 137–50 ka, and includes several archaeological horizons, attesting to the long-term presence of Neanderthal communities in this area. One of these horizons, corresponding with Unit 4, yielded two wooden tools. One of these tools is a beveled pointed tool that was shaped through a complex operational sequence involving branch shaping, bark peeling, twig removal, shaping, polishing, thermal exposition and chopping. A use-wear analysis shows traces consistent with digging soil, so the tool has been interpreted as a digging stick. This is the first time such a tool has been identified in a European Late Middle Palaeolithic context; it also represents one of the first well-preserved Middle Palaeolithic wooden tools found in southern Europe. This artefact is one of the few examples available of wooden tool preservation for the European Palaeolithic, allowing us to further explore the role wooden technologies played in Neanderthal communities. PMID:29590205

  19. Combining multi-criteria decision analysis and mini-health technology assessment: A funding decision-support tool for medical devices in a university hospital setting.

    PubMed

    Martelli, Nicolas; Hansen, Paul; van den Brink, Hélène; Boudard, Aurélie; Cordonnier, Anne-Laure; Devaux, Capucine; Pineau, Judith; Prognon, Patrice; Borget, Isabelle

    2016-02-01

    At the hospital level, decisions about purchasing new and oftentimes expensive medical devices must take into account multiple criteria simultaneously. Multi-criteria decision analysis (MCDA) is increasingly used for health technology assessment (HTA). One of the most successful hospital-based HTA approaches is mini-HTA, of which a notable example is the Matrix4value model. The objective was to develop a funding decision-support tool combining MCDA and mini-HTA, based on Matrix4value, suitable for medical devices for individual patient use in French university hospitals - known as the IDA tool, short for 'innovative device assessment'. Criteria for assessing medical devices were identified from a literature review and a survey of 18 French university hospitals. Weights for the criteria, representing their relative importance, were derived from a survey of 25 members of a medical devices committee using an elicitation technique involving pairwise comparisons. As a test of its usefulness, the IDA tool was applied to two new drug-eluting beads (DEBs) for transcatheter arterial chemoembolization. The IDA tool comprises five criteria and weights for each of two over-arching categories: risk and value. The tool revealed that the two new DEBs conferred no additional value relative to DEBs currently available. Feedback from participating decision-makers about the IDA tool was very positive. The tool could help to promote a more structured and transparent approach to HTA decision-making in French university hospitals. Copyright © 2015 Elsevier Inc. All rights reserved.
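    The scoring step of a weighted-sum MCDA model of this kind can be sketched as follows. The IDA tool's actual criteria and weights are not reproduced in the abstract, so the names and numbers below are purely illustrative.

```python
def weighted_score(ratings, weights):
    """Weighted-sum MCDA score; ratings and weights are keyed by criterion."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * ratings[c] for c in weights)

# Hypothetical value criteria, weights, and 0-100 ratings for one device.
value_weights = {"clinical_benefit": 0.4, "innovation": 0.25,
                 "organisational_impact": 0.2, "cost_offset": 0.15}
device_a = {"clinical_benefit": 70, "innovation": 50,
            "organisational_impact": 60, "cost_offset": 40}
score_a = weighted_score(device_a, value_weights)
```

    In a two-category scheme like Matrix4value's risk/value matrix, the same aggregation would be run once per category and the two scores plotted against each other.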

  20. Judgment skills, a missing component in health literacy: development of a tool for asthma patients in the Italian-speaking region of Switzerland.

    PubMed

    Moreno Londoño, Ana Maria; Schulz, Peter J

    2014-04-01

    Health literacy has been recognized as an important factor influencing health behaviors and health outcomes. However, its definition is still evolving, and the tools available for its measurement are limited in scope. Based on the conceptualization of health literacy within the Health Empowerment Model, the present study developed and validated a tool to assess patients' health knowledge use within the context of asthma self-management. A review of the scientific literature on asthma self-management and several interviews with pulmonologists and asthma patients were conducted. From these, 19 scenarios with 4 response options each were drafted and assembled in a scenario-based questionnaire. Furthermore, a three-round Delphi procedure was carried out to validate the tool, with the participation of 12 specialists in lung diseases. The face and content validity of the tool were established through face-to-face interviews with 2 pulmonologists and 5 patients. Consensus among the specialists on the adequacy of the response options was achieved after the three-round Delphi procedure. The final tool has a 0.97 intra-class correlation coefficient (ICC), indicating a strong level of agreement among experts on the ratings of the response options. The ICCs for single scenarios range from 0.92 to 0.99. The newly developed tool provides a final score representing patients' health knowledge use, based on the specialists' consensus. This tool contributes to enriching the measurement of a more advanced health literacy dimension.
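    The agreement statistic reported here can be illustrated with a consistency-type intraclass correlation, ICC(3,1), computed from two-way ANOVA mean squares. The paper does not state which ICC variant was used, so this is an assumed formulation.

```python
def icc_consistency(ratings):
    """Consistency ICC for single ratings, ICC(3,1), from two-way ANOVA
    mean squares. `ratings` is a list of rows: one row per scenario,
    one column per rater."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(map(sum, ratings)) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```

    Perfectly concordant raters yield an ICC of 1; values near the paper's 0.97 indicate agreement that is almost, but not quite, perfect.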

  1. Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice

    NASA Astrophysics Data System (ADS)

    Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.

    2013-10-01

    Various nanoparticle (NP) properties such as shape and surface charge have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to determine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the precise quantity of distributed NPs. High performance liquid chromatography (HPLC) represents a more powerful tool in quantifying NP biodistribution compared to conventional analytical methods such as an in vivo imaging system (IVIS). This is due, in part, to the better curve linearity offered by HPLC than by IVIS. Furthermore, HPLC enables us to fully analyze each gram of NPs present in the organs without compromising the signals and the depth-related sensitivity, as is the case in IVIS measurements. In addition, we found that changing physiological conditions improved large NP (200-500 nm) distribution in brain tissue. These results reveal the importance of selecting analytic tools and physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics and diagnostics. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr03954d

  2. Measuring the progress of capacity building in the Alberta Policy Coalition for Cancer Prevention.

    PubMed

    Raine, Kim D; Sosa Hernandez, Cristabel; Nykiforuk, Candace I J; Reed, Shandy; Montemurro, Genevieve; Lytvyak, Ellina; MacLellan-Wright, Mary-Frances

    2014-07-01

    The Alberta Policy Coalition for Cancer Prevention (APCCP) represents practitioners, policy makers, researchers, and community organizations working together to coordinate efforts and advocate for policy change to reduce chronic diseases. The aim of this research was to capture changes in the APCCP's capacity to advance its goals over the course of its operation. We adapted the Public Health Agency of Canada's validated Community Capacity-Building Tool to capture policy work. All members of the APCCP were invited to complete the tool in 2010 and 2011. Responses were analyzed using descriptive statistics and t tests. Qualitative comments were analyzed using thematic content analysis. A group process for reaching consensus provided context to the survey responses and contributed to a participatory analysis. Significant improvement was observed in eight out of nine capacity domains. Lessons learned highlight the importance of balancing volume and diversity of intersectoral representation to ensure effective participation, as well as aligning professional and economic resources. Defining involvement and roles within a coalition can be a challenging activity contingent on the interests of each sector represented. The participatory analysis enabled the group to reflect on progress made and future directions for policy advocacy. © 2013 Society for Public Health Education.

  3. Using the Lorenz Curve to Characterize Risk Predictiveness and Etiologic Heterogeneity

    PubMed Central

    Mauguen, Audrey; Begg, Colin B.

    2017-01-01

    The Lorenz curve is a graphical tool that is used widely in econometrics. It represents the spread of a probability distribution, and its traditional use has been to characterize population distributions of wealth or income, or more specifically, inequalities in wealth or income. However, its utility in public health research has not been broadly established. The purpose of this article is to explain its special usefulness for characterizing the population distribution of disease risks, and in particular for identifying the precise disease burden that can be predicted to occur in segments of the population that are known to have especially high (or low) risks, a feature that is important for evaluating the yield of screening or other disease prevention initiatives. We demonstrate that, although the Lorenz curve represents the distribution of predicted risks in a population at risk for the disease, in fact it can be estimated from a case–control study conducted in the population without the need for information on absolute risks. We explore two different estimation strategies and compare their statistical properties using simulations. The Lorenz curve is a statistical tool that deserves wider use in public health research. PMID:27096256
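    The construction the authors describe can be sketched directly: sort the predicted risks, then plot the cumulative share of the total risk (the predictable disease burden) against the cumulative share of the population. A minimal version, assuming individual-level predicted risks are available:

```python
def lorenz_curve(risks):
    """Return (population_fraction, cumulative_risk_share) points.

    Risks are sorted ascending, so the curve shows what share of the
    total predicted disease burden falls in the lowest-risk fraction of
    the population; only relative risks are needed, not absolute ones.
    """
    risks = sorted(risks)
    total = sum(risks)
    n = len(risks)
    points, running = [(0.0, 0.0)], 0.0
    for i, r in enumerate(risks, start=1):
        running += r
        points.append((i / n, running / total))
    return points

# Example: four people with relative risks 1, 2, 3, 4. The lowest-risk
# half of the population carries only 3/10 of the predicted burden.
curve = lorenz_curve([4, 1, 3, 2])
```

    The gap between the curve and the diagonal is what makes a risk model useful for targeting: a curve hugging the diagonal means screening the high-risk segment yields little extra burden.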

  4. Comparison of Yeasts as Hosts for Recombinant Protein Production.

    PubMed

    Vieira Gomes, Antonio Milton; Souza Carmo, Talita; Silva Carvalho, Lucas; Mendonça Bahia, Frederico; Parachin, Nádia Skorupa

    2018-04-29

    Recombinant protein production emerged in the early 1980s with the development of genetic engineering tools, which represented a compelling alternative to protein extraction from natural sources. Over the years, high levels of heterologous protein production have been achieved in a variety of hosts ranging from the bacterium Escherichia coli to mammalian cells. The importance of recombinant proteins is reflected in their market size, which reached $1654 million in 2016 and is expected to reach $2850.5 million by 2022. Among the available hosts, yeasts have been used for producing a great variety of proteins applied to chemicals, fuels, food, and pharmaceuticals, making them among the most widely used hosts for recombinant production today. Historically, Saccharomyces cerevisiae was the dominant yeast host for heterologous protein production. Lately, other yeasts such as Komagataella sp., Kluyveromyces lactis, and Yarrowia lipolytica have emerged as advantageous hosts. In this review, a comparative analysis lists the advantages and disadvantages of each host regarding the availability of genetic tools, strategies for cultivation in bioreactors, and the main techniques utilized for protein purification. Finally, examples for each host are discussed regarding the total amount of protein recovered and its bioactivity due to correct folding and glycosylation patterns.

  5. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures.

    PubMed

    Ceroni, Alessio; Dell, Anne; Haslam, Stuart M

    2007-08-07

    Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations make the input and display of glycans far less straightforward than, for example, the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only a few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other applications to create intuitive and appealing user interfaces: an example is the "GlycoWorkbench", a software tool for assisted annotation of glycan mass spectra. The "GlycanBuilder" represents a flexible, reliable and efficient solution to the problem of input and output of glycan structures in any glycomic tool or database.
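    The flavor of an automatic tree-rendering pass can be conveyed with a toy layout routine: leaves take successive vertical slots and each internal node is centred over its children in a single post-order traversal. This is not the GlycanBuilder's actual algorithm, and it assumes residue names are unique.

```python
def layout(tree):
    """Assign (depth, y) drawing positions to a glycan tree given as
    nested (name, children) tuples. Leaves take successive y slots and
    each internal node is centred over its children (post-order).
    Residue names double as dict keys, so they must be unique here."""
    positions = {}
    next_slot = [0]

    def place(node, depth):
        name, children = node
        if not children:
            y = float(next_slot[0])
            next_slot[0] += 1
        else:
            y = sum(place(child, depth + 1) for child in children) / len(children)
        positions[name] = (depth, y)
        return y

    place(tree, 0)
    return positions

# A toy core fragment: one GlcNAc whose Man branches into two Man residues.
pos = layout(("GlcNAc", [("Man", [("Man-3", []), ("Man-6", [])])]))
```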

  6. Biophysical EPR Studies Applied to Membrane Proteins

    PubMed Central

    Sahu, Indra D; Lorigan, Gary A

    2015-01-01

    Membrane proteins are very important in controlling bioenergetics, functional activity, and initializing signal pathways in a wide variety of complicated biological systems. They also represent approximately 50% of the potential drug targets. EPR spectroscopy is a very popular and powerful biophysical tool that is used to study the structural and dynamic properties of membrane proteins. In this article, a basic overview of the most commonly used EPR techniques and examples of recent applications to answer pertinent structural and dynamic related questions on membrane protein systems will be presented. PMID:26855825

  7. On Animating 2D Velocity Fields

    NASA Technical Reports Server (NTRS)

    Kao, David; Pang, Alex; Yan, Jerry (Technical Monitor)

    2001-01-01

    A velocity field, even one that represents a steady state flow, implies a dynamical system. Animating velocity fields is an important tool for understanding such complex phenomena. This paper looks at a number of techniques that animate velocity fields and proposes two new alternatives: texture advection and streamline cycling. The common theme among these techniques is the use of advection on some texture to generate a realistic animation of the velocity field. Texture synthesis and selection for these methods are presented. Strengths and weaknesses of the techniques are also discussed in conjunction with several examples.
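    The core of texture advection is backward mapping: each output pixel traces the velocity field backward and samples the texture at the upstream location. A nearest-neighbour sketch (real implementations typically interpolate and periodically re-inject noise):

```python
def advect(texture, vx, vy, dt):
    """One backward-mapping advection step on a 2D texture (list of rows).

    Each output pixel (i, j) samples the input texture at the upstream
    point (i - dt*vy[i][j], j - dt*vx[i][j]), rounded to the nearest
    pixel and clamped at the borders.
    """
    h, w = len(texture), len(texture[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            si = min(max(int(round(i - dt * vy[i][j])), 0), h - 1)
            sj = min(max(int(round(j - dt * vx[i][j])), 0), w - 1)
            out[i][j] = texture[si][sj]
    return out
```

    Iterating this step on a noise texture produces the streaking animation described above; a steady flow still yields a moving picture because the texture keeps being carried along.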

  8. On Animating 2D Velocity Fields

    NASA Technical Reports Server (NTRS)

    Kao, David; Pang, Alex

    2000-01-01

    A velocity field, even one that represents a steady state flow, implies a dynamical system. Animating velocity fields is an important tool for understanding such complex phenomena. This paper looks at a number of techniques that animate velocity fields and proposes two new alternatives: texture advection and streamline cycling. The common theme among these techniques is the use of advection on some texture to generate a realistic animation of the velocity field. Texture synthesis and selection for these methods are presented. Strengths and weaknesses of the techniques are also discussed in conjunction with several examples.

  9. Multiple utility constrained multi-objective programs using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Abbasian, Pooneh; Mahdavi-Amiri, Nezam; Fazlollahtabar, Hamed

    2018-03-01

    A utility function is an important tool for representing a decision maker's (DM's) preferences. We adjoin utility functions to multi-objective optimization problems. In current studies, usually one utility function is used for each objective function. Situations may arise, however, in which a goal has multiple utility functions. Here, we consider a constrained multi-objective problem in which each objective has multiple utility functions. We induce the probability of the utilities for each objective function using Bayesian theory. Illustrative examples considering dependence and independence of variables are worked through to demonstrate the usefulness of the proposed model.
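    One simple way to fold several utility functions for a single objective into one value, in the spirit of the probability weighting described above (the paper's exact Bayesian formulation is not reproduced here), is a probability-weighted expected utility:

```python
def expected_utility(x, utilities, probs):
    """Probability-weighted combination of several utility functions
    attached to one objective."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * u(x) for u, p in zip(utilities, probs))

# Two hypothetical utilities for one goal, with posterior weights 0.7 / 0.3.
u1 = lambda x: 1.0 - x       # prefers small x
u2 = lambda x: x ** 2        # prefers large x on [0, 1]
val = expected_utility(0.5, [u1, u2], [0.7, 0.3])
```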

  10. Microscopic analysis of Hopper flow with ellipsoidal particles

    NASA Astrophysics Data System (ADS)

    Liu, Sida; Zhou, Zongyan; Zou, Ruiping; Pinson, David; Yu, Aibing

    2013-06-01

    Hoppers are widely used in process industries. With such widespread application, difficulties in achieving desired operational behaviors have led to extensive experimental and mathematical studies in the past decades. Particularly, the discrete element method has become one of the most important simulation tools for design and analysis. So far, most studies are on spherical particles for computational convenience. In this work, ellipsoidal particles are used as they can represent a large variation of particle shapes. Hopper flow with ellipsoidal particles is presented highlighting the effect of particle shape on the microscopic properties.

  11. Large eddy simulation modeling of particle-laden flows in complex terrain

    NASA Astrophysics Data System (ADS)

    Salesky, S.; Giometto, M. G.; Chamecki, M.; Lehning, M.; Parlange, M. B.

    2017-12-01

    The transport, deposition, and erosion of heavy particles over complex terrain in the atmospheric boundary layer is an important process for hydrology, air quality forecasting, biology, and geomorphology. However, in situ observations can be challenging in complex terrain due to spatial heterogeneity. Furthermore, there is a need to develop numerical tools that can accurately represent the physics of these multiphase flows over complex surfaces. We present a new numerical approach to accurately model the transport and deposition of heavy particles in complex terrain using large eddy simulation (LES). Particle transport is represented through solution of the advection-diffusion equation including terms that represent gravitational settling and inertia. The particle conservation equation is discretized in a cut-cell finite volume framework in order to accurately enforce mass conservation. Simulation results will be validated with experimental data, and numerical considerations required to enforce boundary conditions at the surface will be discussed. Applications will be presented in the context of snow deposition and transport, as well as urban dispersion.
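    The settling term described above can be sketched in a 1-D finite-volume form with explicit upwind fluxes and a depositing bottom boundary; the check that stored plus deposited mass stays constant mirrors the mass-conservation concern the abstract raises for the cut-cell scheme. This is an illustrative discretization, not the authors' LES code.

```python
def settle(conc, ws, dz, dt, n_steps):
    """Explicit upwind gravitational settling of a 1-D concentration column.

    conc : cell-average concentrations, index 0 = top of the column
    ws   : settling velocity, positive downward [m/s]
    Requires the CFL condition ws*dt/dz <= 1. Mass leaving the bottom
    cell accumulates in a ground-deposition budget.
    Returns (final column, deposited mass per unit area).
    """
    c = list(conc)
    deposited = 0.0
    courant = ws * dt / dz
    for _ in range(n_steps):
        flux = [courant * ci for ci in c]   # concentration leaving each cell
        deposited += flux[-1] * dz          # bottom cell deposits to ground
        for i in range(len(c) - 1, 0, -1):
            c[i] += flux[i - 1] - flux[i]   # receive from above, lose below
        c[0] -= flux[0]                     # top cell only loses mass
    return c, deposited

column, deposited = settle([1.0, 1.0, 1.0, 0.0], ws=0.1, dz=1.0, dt=1.0, n_steps=10)
```

    Summing the column plus the deposited budget recovers the initial mass exactly, which is the discrete analogue of enforcing conservation in the finite-volume framework.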

  12. Study of the therapeutic benefit of cationic copolymer administration to vascular endothelium under mechanical stress

    PubMed Central

    Giantsos-Adams, Kristina; Lopez-Quintero, Veronica; Kopeckova, Pavla; Kopecek, Jindrich; Tarbell, John M.; Dull, Randal

    2015-01-01

    Pulmonary edema and the associated increases in vascular permeability continue to represent a significant clinical problem in the intensive care setting, with no current treatment modality other than supportive care and mechanical ventilation. Therapeutic compound(s) capable of attenuating changes in vascular barrier function would represent a significant advance in critical care medicine. We have previously reported the development of HPMA-based copolymers, targeted to the endothelial glycocalyx, that are able to enhance barrier function. In this work, we report the refinement of copolymer design and extend our physiological studies to demonstrate that the polymers: 1) reduce both shear stress- and pressure-mediated increases in hydraulic conductivity, 2) reduce nitric oxide production in response to elevated hydrostatic pressure and, 3) reduce the capillary filtration coefficient (Kfc) in an isolated perfused mouse lung model. These copolymers represent an important tool for use in mechanotransduction research and a novel strategy for developing clinically useful copolymers for the treatment of vascular permeability. PMID:20932573

  13. A novel method for automated assessment of megakaryocyte differentiation and proplatelet formation.

    PubMed

    Salzmann, M; Hoesel, B; Haase, M; Mussbacher, M; Schrottmaier, W C; Kral-Pointner, J B; Finsterbusch, M; Mazharian, A; Assinger, A; Schmid, J A

    2018-06-01

    Transfusion of platelet concentrates represents an important treatment for various bleeding complications. However, the short half-life and frequent contamination with bacteria restrict the availability of platelet concentrates and raise a clear demand for platelets generated ex vivo. Therefore, in vitro platelet generation from megakaryocytes (MKs) represents an important research topic. A vital step in this process is the accurate analysis of thrombopoiesis and proplatelet formation, which is usually conducted manually. We aimed to develop a novel method for automated classification and analysis of proplatelet-forming megakaryocytes in vitro. After fluorescent labelling of surface and nucleus, MKs were automatically categorized and analysed with a novel pipeline of the open source software CellProfiler. Our new workflow is able to detect and quantify four subtypes of megakaryocytes undergoing thrombopoiesis: proplatelet-forming, spreading, pseudopodia-forming and terminally differentiated, anucleated megakaryocytes. Furthermore, we were able to characterize the inhibitory effect of dasatinib on thrombopoiesis in more detail. Our new workflow enabled rapid, unbiased, quantitative and qualitative in-depth analysis of proplatelet formation based on morphological characteristics. Clinicians and basic researchers alike will benefit from this novel technique, which allows reliable and unbiased quantification of proplatelet formation. It thereby provides a valuable tool for the development of methods to generate platelets ex vivo and to detect effects of drugs on megakaryocyte differentiation.
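    A rule-based classifier over morphological features gives the flavor of the four-way categorization, though the real pipeline derives its measurements and rules inside CellProfiler; every feature name and threshold below is invented for illustration.

```python
def classify_megakaryocyte(has_nucleus, n_protrusions, area_um2, is_round):
    """Toy decision rules for the four subtypes named in the abstract.
    Features and thresholds are illustrative, not CellProfiler's."""
    if not has_nucleus:
        return "terminally differentiated"
    if n_protrusions >= 3 and not is_round:
        return "proplatelet-forming"
    if n_protrusions >= 1:
        return "pseudopodia-forming"
    if area_um2 > 500 and not is_round:
        return "spreading"
    return "unclassified"
```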

  14. Modelling entomological-climatic interactions of Plasmodium falciparum malaria transmission in two Colombian endemic-regions: contributions to a National Malaria Early Warning System

    PubMed Central

    Ruiz, Daniel; Poveda, Germán; Vélez, Iván D; Quiñones, Martha L; Rúa, Guillermo L; Velásquez, Luz E; Zuluaga, Juan S

    2006-01-01

    Background Malaria has recently re-emerged as a public health burden in Colombia. Although the problem seems to be climate-driven, there remain significant knowledge gaps in the understanding of the complexity of malaria transmission, which have motivated attempts to develop a comprehensive model. Methods The mathematical tool was applied to represent Plasmodium falciparum malaria transmission in two endemic areas. Entomological exogenous variables were estimated through field campaigns and laboratory experiments. Availability of breeding places was included towards representing fluctuations in vector densities. Diverse scenarios, sensitivity analyses and instability cases were considered during the experimentation-validation process. Results Correlation coefficients and mean square errors between observed and modelled incidences reached 0.897–0.668 (P > 0.95) and 0.0002–0.0005, respectively. Temperature became the most relevant climatic parameter driving the final incidence. Accordingly, malaria outbreaks are possible during the favourable epochs following the onset of El Niño warm events. Sporogonic and gonotrophic cycles proved to be the key entomological variables controlling the transmission potential of the mosquito population. Simulation results also showed that seasonality of vector density is an important factor towards understanding disease transmission. Conclusion The model constitutes a promising tool to deepen the understanding of the multiple interactions related to malaria transmission conducive to outbreaks. In the foreseeable future it could be implemented as a tool to diagnose possible dynamical patterns of malaria incidence under several scenarios, as well as a decision-making tool for the early detection and control of outbreaks. The model can also be merged with forecasts of El Niño events to provide a National Malaria Early Warning System. PMID:16882349

  15. Modelling entomological-climatic interactions of Plasmodium falciparum malaria transmission in two Colombian endemic-regions: contributions to a National Malaria Early Warning System.

    PubMed

    Ruiz, Daniel; Poveda, Germán; Vélez, Iván D; Quiñones, Martha L; Rúa, Guillermo L; Velásquez, Luz E; Zuluaga, Juan S

    2006-08-01

    Malaria has recently re-emerged as a public health burden in Colombia. Although the problem seems to be climate-driven, significant gaps remain in the understanding of the complexity of malaria transmission, which have motivated attempts to develop a comprehensive model. The mathematical tool was applied to represent Plasmodium falciparum malaria transmission in two endemic areas. Exogenous entomological variables were estimated through field campaigns and laboratory experiments. Availability of breeding places was included to represent fluctuations in vector densities. Diverse scenarios, sensitivity analyses and instability cases were considered during the experimentation-validation process. Correlation coefficients and mean square errors between observed and modelled incidences reached 0.897-0.668 (P > 0.95) and 0.0002-0.0005, respectively. Temperature proved to be the most relevant climatic parameter driving the final incidence. Accordingly, malaria outbreaks are possible during the favourable epochs following the onset of El Niño warm events. The sporogonic and gonotrophic cycles proved to be the key entomological variables controlling the transmission potential of the mosquito population. Simulation results also showed that the seasonality of vector density is an important factor in understanding disease transmission. The model constitutes a promising tool to deepen the understanding of the multiple interactions related to malaria transmission conducive to outbreaks. In the foreseeable future it could be implemented as a tool to diagnose possible dynamical patterns of malaria incidence under several scenarios, as well as a decision-making tool for the early detection and control of outbreaks. The model could also be merged with forecasts of El Niño events to provide a National Malaria Early Warning System.
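The goodness-of-fit measures reported in this record (a Pearson correlation coefficient and a mean square error between observed and modelled incidence series) can be sketched in a few lines of Python; the incidence values below are hypothetical illustrations, not the study's data:

```python
import numpy as np

def fit_metrics(observed, modelled):
    """Pearson correlation and mean square error between two incidence series."""
    observed = np.asarray(observed, dtype=float)
    modelled = np.asarray(modelled, dtype=float)
    r = np.corrcoef(observed, modelled)[0, 1]   # Pearson correlation coefficient
    mse = np.mean((observed - modelled) ** 2)   # mean square error
    return r, mse

# Hypothetical weekly incidence fractions (cases per inhabitant)
obs = [0.010, 0.012, 0.020, 0.035, 0.030, 0.018]
mod = [0.011, 0.013, 0.018, 0.033, 0.028, 0.020]
r, mse = fit_metrics(obs, mod)
```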

  16. Malaria parasite mutants with altered erythrocyte permeability: a new drug resistance mechanism and important molecular tool

    PubMed Central

    Hill, David A; Desai, Sanjay A

    2010-01-01

    Erythrocytes infected with plasmodia, including those that cause human malaria, have increased permeability to a diverse collection of organic and inorganic solutes. While these increases have been known for decades, their mechanistic basis was unclear until electrophysiological studies revealed flux through one or more ion channels on the infected erythrocyte membrane. Current debates have centered on the number of distinct ion channels, which channels mediate the transport of each solute and whether the channels represent parasite-encoded proteins or human channels activated after infection. This article reviews the identification of the plasmodial surface anion channel and other proposed channels with an emphasis on two distinct channel mutants generated through in vitro selection. These mutants implicate parasite genetic elements in the parasite-induced permeability, reveal an important new antimalarial drug resistance mechanism and provide tools for molecular studies. We also critically examine the technical issues relevant to the detection of ion channels by electrophysiological methods; these technical considerations have general applicability for interpreting studies of various ion channels proposed for the infected erythrocyte membrane. PMID:20020831

  17. Aerobiology: Experimental Considerations, Observations, and Future Tools

    PubMed Central

    Haddrell, Allen E.

    2017-01-01

    Understanding airborne survival and decay of microorganisms is important for a range of public health and biodefense applications, including epidemiological and risk analysis modeling. Techniques for experimental aerosol generation, retention in the aerosol phase, and sampling require careful consideration and understanding so that they are representative of the conditions the bioaerosol would experience in the environment. This review explores the current understanding of atmospheric transport in relation to advances and limitations of aerosol generation, maintenance in the aerosol phase, and sampling techniques. Potential tools for the future are examined at the interface between atmospheric chemistry, aerosol physics, and molecular microbiology, where the heterogeneity and variability of aerosols can be explored at the single-droplet and single-microorganism levels within a bioaerosol. The review highlights the importance of method comparison and validation in bioaerosol research and the benefits that the application of novel techniques could bring to increasing the understanding of aerobiological phenomena in diverse research fields, particularly during atmospheric transport, where complex interdependent physicochemical and biological processes occur within bioaerosol particles. PMID:28667111

  18. Design of the VISITOR Tool: A Versatile ImpulSive Interplanetary Trajectory OptimizeR

    NASA Technical Reports Server (NTRS)

    Corpaccioli, Luca; Linskens, Harry; Komar, David R.

    2014-01-01

    The design of trajectories for interplanetary missions represents one of the most complex and important problems to solve during conceptual space mission design. To facilitate conceptual mission sizing activities, it is essential to obtain sufficiently accurate trajectories in a fast and repeatable manner. To this end, the VISITOR tool was developed. This tool modularly augments a patched conic MGA-1DSM model with a mass model, launch window analysis, and the ability to simulate more realistic arrival and departure operations. This was implemented in MATLAB, exploiting the built-in optimization tools and vector analysis routines. The chosen optimization strategy uses a grid search and pattern search, an iterative variable grid method. A genetic algorithm can be selectively used to improve search space pruning, at the cost of losing the repeatability of the results and increased computation time. The tool was validated against seven flown missions: the average total mission (Delta)V offset from the nominal trajectory was 9.1%, which was reduced to 7.3% when using the genetic algorithm at the cost of an increase in computation time by a factor 5.7. It was found that VISITOR was well-suited for the conceptual design of interplanetary trajectories, while also facilitating future improvements due to its modular structure.
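The optimization strategy described in this record — a coarse grid search followed by an iterative pattern search over a shrinking grid — can be sketched as follows (illustrative Python, not the tool's MATLAB implementation; the quadratic objective is a stand-in for the total mission Delta-V):

```python
import itertools
import numpy as np

def grid_then_pattern_search(f, bounds, coarse=11, shrink=0.5, tol=1e-6):
    """Coarse grid search, then compass/pattern search with a shrinking step."""
    # 1) Coarse grid: evaluate f on a regular lattice over the bounds.
    axes = [np.linspace(lo, hi, coarse) for lo, hi in bounds]
    best = np.array(min(itertools.product(*axes), key=lambda p: f(p)))
    # 2) Pattern search: poll +/- one step along each axis; shrink on failure.
    step = np.array([(hi - lo) / (coarse - 1) for lo, hi in bounds])
    while step.max() > tol:
        improved = False
        for i in range(len(best)):
            for sign in (+1.0, -1.0):
                trial = best.copy()
                trial[i] += sign * step[i]
                if f(trial) < f(best):
                    best, improved = trial, True
        if not improved:
            step *= shrink   # no poll improved: refine the grid around best
    return best

# Stand-in objective with its minimum at (3, -2)
f = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 2.0) ** 2
x = grid_then_pattern_search(f, [(-5.0, 5.0), (-5.0, 5.0)])
```

The record's optional genetic algorithm would replace the coarse grid stage with a stochastic search, which is why it trades repeatability for better pruning.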

  19. Dual phylogenetic staining protocol for simultaneous analysis of yeast and bacteria in artworks

    NASA Astrophysics Data System (ADS)

    González-Pérez, Marina; Brinco, Catarina; Vieira, Ricardo; Rosado, Tânia; Mauran, Guilhem; Pereira, António; Candeias, António; Caldeira, Ana Teresa

    2017-02-01

    The detection and analysis of metabolically active microorganisms are useful to determine those directly involved in the biodeterioration of cultural heritage (CH). Fluorescence in situ hybridization with oligonucleotide probes targeted at rRNA (RNA-FISH) has proven to be a powerful tool for signaling them. However, more effort is required for the technique to become a vital tool for the analysis of CH's microbiological communities. Simultaneous analysis of microorganisms belonging to different kingdoms by an in-suspension RNA-FISH approach could represent important progress: it could open the door to the future use of the technique to analyze microbial communities by flow cytometry, which has proven to be a potent tool in environmental microbiology. Thus, in this work, various previously implemented in-suspension RNA-FISH protocols for ex situ analysis of yeast and bacteria were investigated and adapted to allow the simultaneous detection of these types of microorganisms. The factors that can affect the results were investigated in depth, with particular attention to the selection of the fluorochromes used for labelling the probe set. The resulting protocol, involving the EUK516-6-FAM/EUB338-Cy3 probe combination, was validated using artificial consortia and gave positive preliminary results when applied to samples from a real case study: the Paleolithic archaeological site of Escoural Cave (Alentejo, Portugal). This approach represents the first dual-staining in-suspension RNA-FISH protocol developed and applied for the simultaneous investigation of CH biodeteriogenic agents belonging to different kingdoms.

  20. Standardized observation of neighbourhood disorder: does it work in Canada?

    PubMed Central

    2010-01-01

    Background There is a growing body of evidence that where you live is important to your health. Despite numerous previous studies investigating the relationship between neighbourhood deprivation (and structure) and residents' health, the precise nature of this relationship remains unclear. Relatively few investigations have relied on direct observation of neighbourhoods, and those that have were developed primarily in US settings. Evaluating the transferability of such tools to other contexts is an important first step before applying them to the investigation of health and well-being. This study evaluated the performance of a systematic social observation (SSO) tool (adapted from previous studies of American and British neighbourhoods) in a Canadian urban context. Methods This was a mixed-methods study. Quantitative SSO ratings and qualitative descriptions of 176 block faces were obtained in six Toronto neighbourhoods (4 low-income, and 2 middle/high-income) by trained raters. Exploratory factor analysis was conducted with the quantitative SSO ratings. Content analysis consisted of independent coding of qualitative data by three members of the research team to yield common themes and categories. Results Factor analysis identified three factors (physical decay/disorder, social accessibility, recreational opportunities), but only 'physical decay/disorder' reflected previous findings in the literature. Qualitative results (based on raters' fieldwork experiences) revealed the tool's shortcomings in capturing important features of the neighbourhoods under study, and informed interpretation of the quantitative findings. Conclusions This study tested the performance of an SSO tool in a Canadian context, an important initial step before applying it to the study of health and disease. The tool demonstrated important shortcomings when applied to six diverse Toronto neighbourhoods. The study's analyses challenge previously held assumptions (e.g. social 'disorder') regarding neighbourhood social and built environments. For example, neighbourhood 'order' has traditionally been assumed to be synonymous with a certain degree of homogeneity; however, the neighbourhoods under study were characterized by high degrees of heterogeneity and low levels of disorder. Heterogeneity was seen as an appealing feature of a block face. Employing qualitative techniques with SSO represents a unique contribution, enhancing both our understanding of the quantitative ratings obtained and of neighbourhood characteristics that are not currently captured by such instruments. PMID:20146821
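As a rough illustration of the exploratory-factor-analysis step in this record, the sketch below extracts factors from a synthetic ratings matrix (176 block faces by 9 hypothetical observed items) via principal-axis-style eigendecomposition of the correlation matrix with the Kaiser eigenvalue-greater-than-one criterion; the study's actual items and procedure are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic SSO ratings: 176 block faces x 9 observed items, generated so
# that items 0-2, 3-5 and 6-8 each share one of three latent factors.
latent = rng.normal(size=(176, 3))
loadings_true = np.zeros((3, 9))
loadings_true[0, 0:3] = loadings_true[1, 3:6] = loadings_true[2, 6:9] = 1.0
ratings = latent @ loadings_true + 0.3 * rng.normal(size=(176, 9))

# Principal-axis-style extraction: eigendecompose the item correlation
# matrix and retain components with eigenvalue > 1 (Kaiser criterion).
corr = np.corrcoef(ratings, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)        # returned in ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int(np.sum(eigvals > 1.0))
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
```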

  1. Trade-Off Analysis between Concerns Based on Aspect-Oriented Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Laurito, Abelyn Methanie R.; Takada, Shingo

    The identification of functional and non-functional concerns is an important activity during requirements analysis. However, there may be conflicts between the identified concerns, and they must be discovered and resolved through trade-off analysis. Aspect-Oriented Requirements Engineering (AORE) has trade-off analysis as one of its goals, but most AORE approaches do not actually offer support for trade-off analysis; they focus on describing concerns and generating their composition. This paper proposes an approach for trade-off analysis based on AORE using use cases and the Requirements Conflict Matrix (RCM) to represent compositions. RCM shows the positive or negative effect of non-functional concerns over use cases and other non-functional concerns. Our approach is implemented within a tool called E-UCEd (Extended Use Case Editor). We also show the results of evaluating our tool.

  2. Evaluation of medical management during a mass casualty incident exercise: an objective assessment tool to enhance direct observation.

    PubMed

    Ingrassia, Pier Luigi; Prato, Federico; Geddo, Alessandro; Colombo, Davide; Tengattini, Marco; Calligaro, Sara; La Mura, Fabrizio; Franc, Jeffrey Michael; Della Corte, Francesco

    2010-11-01

    Functional exercises represent an important link between disaster planning and disaster response. Although these exercises are widely performed, no standardized method exists for their evaluation. This study describes a simple and objective method to assess medical performance during functional exercise events. An evaluation tool comprising three data fields (triage, clinical maneuvers, and radio usage), accompanied by direct anecdotal observational methods, was used to evaluate a large functional mass casualty incident exercise. Seventeen medical responders managed 112 victims of a simulated building explosion. Although 81% of the patients were assigned the appropriate triage codes, evacuation from the site did not proceed in order of priority. Required maneuvers were performed correctly in 85.2% of airway maneuvers and 78.7% of breathing maneuvers; however, significant under-treatment occurred, possibly due to equipment shortages. Extensive use of radio communication was documented. In this evaluation, the structured markers were informative, but the further information provided by direct observation was invaluable. A three-part tool (triage, medical maneuvers, and radio usage) provides an easily implemented method to evaluate functional mass casualty incident exercises. For best results, it should be used in conjunction with direct observation. The evaluation tool has great potential as a reproducible and internationally recognized tool for evaluating disaster management exercises. Copyright © 2010 Elsevier Inc. All rights reserved.

  3. On the evaluation of segmentation editing tools

    PubMed Central

    Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.

    2014-01-01

    Abstract. Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user’s subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063

  4. Developing a Multidisciplinary Team for Disorders of Sex Development: Planning, Implementation, and Operation Tools for Care Providers

    PubMed Central

    Moran, Mary Elizabeth; Karkazis, Katrina

    2012-01-01

    In the treatment of patients with disorders of sex development (DSD), multidisciplinary teams (MDTs) represent a new standard of care. While DSDs are too complex for care to be delivered effectively without specialized team management, these conditions are often considered to be too rare for their medical management to be a hospital priority. Many specialists involved in DSD care want to create a clinic or team, but there is no available guidance that bridges the gap between a group of like-minded DSD providers who want to improve care and the formation of a functional MDT. This is an important dilemma, and one with serious implications for the future of DSD care. If a network of multidisciplinary DSD teams is to be a reality, those directly involved in DSD care must be given the necessary program planning and team implementation tools. This paper offers a protocol and set of tools to meet this need. We present a 6-step process to team formation, and a sample set of tools that can be used to guide, develop, and evaluate a team throughout the course of its operation. PMID:22792098

  5. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  6. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  7. Conserved meiotic machinery in Glomus spp., a putatively ancient asexual fungal lineage.

    PubMed

    Halary, Sébastien; Malik, Shehre-Banoo; Lildhar, Levannia; Slamovits, Claudio H; Hijri, Mohamed; Corradi, Nicolas

    2011-01-01

    Arbuscular mycorrhizal fungi (AMF) represent an ecologically important and evolutionarily intriguing group of symbionts of land plants, currently thought to have propagated clonally for over 500 Myr. AMF produce multinucleate spores and may exchange nuclei through anastomosis, but meiosis has never been observed in this group. A provocative alternative for their successful and long asexual evolutionary history is that these organisms may have cryptic sex, allowing them to recombine alleles and compensate for deleterious mutations. This is partly supported by reports of recombination among some of their natural populations. We explored this hypothesis by searching for some of the primary tools for a sustainable sexual cycle--the genes whose products are required for proper completion of meiotic recombination in yeast--in the genomes of four AMF and compared them with homologs of representative ascomycete, basidiomycete, chytridiomycete, and zygomycete fungi. Our investigation used molecular and bioinformatic tools to identify homologs of 51 meiotic genes, including seven meiosis-specific genes and other "core meiotic genes" conserved in the genomes of the AMF Glomus diaphanum (MUCL 43196), Glomus irregulare (DAOM-197198), Glomus clarum (DAOM 234281), and Glomus cerebriforme (DAOM 227022). Homology of AMF meiosis-specific genes was verified by phylogenetic analyses with representative fungi, animals (Mus, Hydra), and a choanoflagellate (Monosiga). Together, these results indicate that these supposedly ancient asexual fungi may be capable of undergoing a conventional meiosis; a hypothesis that is consistent with previous reports of recombination within and across some of their populations.

  8. Weather-Related Flood and Landslide Damage: A Risk Index for Italian Regions

    PubMed Central

    Messeri, Alessandro; Morabito, Marco; Messeri, Gianni; Brandani, Giada; Petralli, Martina; Natali, Francesca; Grifoni, Daniele; Crisci, Alfonso; Gensini, Gianfranco; Orlandini, Simone

    2015-01-01

    The frequency of natural hazards has been increasing in recent decades in Europe, and specifically in Mediterranean regions, due to climate change. For example, heavy precipitation events can lead to disasters through their interaction with exposed and vulnerable people and natural systems. Prevention planning is therefore necessary to preserve human health and reduce economic losses. Prevention should mainly be carried out through more adequate land management, supported by the development of an appropriate risk prediction tool based on weather forecasts. The main aim of this study is to investigate the relationship between weather types (WTs) and the frequency of floods and landslides that have caused damage to properties, personal injuries, or deaths in the Italian regions over recent decades. In particular, a specific risk index (WT-FLARI) for each WT was developed at national and regional scale, calibrated for each Italian region and applicable at both annual and seasonal levels. The risk index represents the seasonal and annual vulnerability of each Italian region and indicates that additional preventive actions are necessary for some regions. The results of this study represent a good starting point towards the development of a tool to support policy-makers, local authorities and health agencies in planning actions, mainly in the medium to long term, aimed at weather-damage reduction, an important issue of the World Meteorological Organization's mission. PMID:26714309

  9. Weather-Related Flood and Landslide Damage: A Risk Index for Italian Regions.

    PubMed

    Messeri, Alessandro; Morabito, Marco; Messeri, Gianni; Brandani, Giada; Petralli, Martina; Natali, Francesca; Grifoni, Daniele; Crisci, Alfonso; Gensini, Gianfranco; Orlandini, Simone

    2015-01-01

    The frequency of natural hazards has been increasing in recent decades in Europe, and specifically in Mediterranean regions, due to climate change. For example, heavy precipitation events can lead to disasters through their interaction with exposed and vulnerable people and natural systems. Prevention planning is therefore necessary to preserve human health and reduce economic losses. Prevention should mainly be carried out through more adequate land management, supported by the development of an appropriate risk prediction tool based on weather forecasts. The main aim of this study is to investigate the relationship between weather types (WTs) and the frequency of floods and landslides that have caused damage to properties, personal injuries, or deaths in the Italian regions over recent decades. In particular, a specific risk index (WT-FLARI) for each WT was developed at national and regional scale, calibrated for each Italian region and applicable at both annual and seasonal levels. The risk index represents the seasonal and annual vulnerability of each Italian region and indicates that additional preventive actions are necessary for some regions. The results of this study represent a good starting point towards the development of a tool to support policy-makers, local authorities and health agencies in planning actions, mainly in the medium to long term, aimed at weather-damage reduction, an important issue of the World Meteorological Organization's mission.

  10. Conserved Meiotic Machinery in Glomus spp., a Putatively Ancient Asexual Fungal Lineage

    PubMed Central

    Halary, Sébastien; Malik, Shehre-Banoo; Lildhar, Levannia; Slamovits, Claudio H.; Hijri, Mohamed; Corradi, Nicolas

    2011-01-01

    Arbuscular mycorrhizal fungi (AMF) represent an ecologically important and evolutionarily intriguing group of symbionts of land plants, currently thought to have propagated clonally for over 500 Myr. AMF produce multinucleate spores and may exchange nuclei through anastomosis, but meiosis has never been observed in this group. A provocative alternative for their successful and long asexual evolutionary history is that these organisms may have cryptic sex, allowing them to recombine alleles and compensate for deleterious mutations. This is partly supported by reports of recombination among some of their natural populations. We explored this hypothesis by searching for some of the primary tools for a sustainable sexual cycle—the genes whose products are required for proper completion of meiotic recombination in yeast—in the genomes of four AMF and compared them with homologs of representative ascomycete, basidiomycete, chytridiomycete, and zygomycete fungi. Our investigation used molecular and bioinformatic tools to identify homologs of 51 meiotic genes, including seven meiosis-specific genes and other “core meiotic genes” conserved in the genomes of the AMF Glomus diaphanum (MUCL 43196), Glomus irregulare (DAOM-197198), Glomus clarum (DAOM 234281), and Glomus cerebriforme (DAOM 227022). Homology of AMF meiosis-specific genes was verified by phylogenetic analyses with representative fungi, animals (Mus, Hydra), and a choanoflagellate (Monosiga). Together, these results indicate that these supposedly ancient asexual fungi may be capable of undergoing a conventional meiosis; a hypothesis that is consistent with previous reports of recombination within and across some of their populations. PMID:21876220

  11. Tensor Spectral Clustering for Partitioning Higher-order Network Structures.

    PubMed

    Benson, Austin R; Gleich, David F; Leskovec, Jure

    2015-01-01

    Spectral graph theory-based methods represent an important class of tools for studying the structure of networks. Spectral methods are based on a first-order Markov chain derived from a random walk on the graph and thus they cannot take advantage of important higher-order network substructures such as triangles, cycles, and feed-forward loops. Here we propose a Tensor Spectral Clustering (TSC) algorithm that allows for modeling higher-order network structures in a graph partitioning framework. Our TSC algorithm allows the user to specify which higher-order network structures (cycles, feed-forward loops, etc.) should be preserved by the network clustering. Higher-order network structures of interest are represented using a tensor, which we then partition by developing a multilinear spectral method. Our framework can be applied to discovering layered flows in networks as well as graph anomaly detection, which we illustrate on synthetic networks. In directed networks, a higher-order structure of particular interest is the directed 3-cycle, which captures feedback loops in networks. We demonstrate that our TSC algorithm produces large partitions that cut fewer directed 3-cycles than standard spectral clustering algorithms.
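The first-order spectral method that TSC generalizes can be sketched as standard normalized-Laplacian bipartitioning (illustrative Python with numpy; the tensor/multilinear extension itself is not reproduced here):

```python
import numpy as np

def spectral_bipartition(A):
    """Split a graph in two using the sign of the Fiedler vector of the
    normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)                          # node degrees
    d_inv_sqrt = 1.0 / np.sqrt(d)
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(L)       # ascending eigenvalues
    fiedler = eigvecs[:, 1]                    # 2nd-smallest eigenvalue
    return fiedler >= 0                        # boolean cluster labels

# Two triangles (nodes 0-2 and 3-5) joined by a single bridging edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
labels = spectral_bipartition(A)
```

Because this construction is equivalent to a first-order random walk on the graph, it cannot prefer cuts that preserve triangles or directed 3-cycles, which is precisely the limitation the tensor formulation addresses.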

  12. Tensor Spectral Clustering for Partitioning Higher-order Network Structures

    PubMed Central

    Benson, Austin R.; Gleich, David F.; Leskovec, Jure

    2016-01-01

    Spectral graph theory-based methods represent an important class of tools for studying the structure of networks. Spectral methods are based on a first-order Markov chain derived from a random walk on the graph and thus they cannot take advantage of important higher-order network substructures such as triangles, cycles, and feed-forward loops. Here we propose a Tensor Spectral Clustering (TSC) algorithm that allows for modeling higher-order network structures in a graph partitioning framework. Our TSC algorithm allows the user to specify which higher-order network structures (cycles, feed-forward loops, etc.) should be preserved by the network clustering. Higher-order network structures of interest are represented using a tensor, which we then partition by developing a multilinear spectral method. Our framework can be applied to discovering layered flows in networks as well as graph anomaly detection, which we illustrate on synthetic networks. In directed networks, a higher-order structure of particular interest is the directed 3-cycle, which captures feedback loops in networks. We demonstrate that our TSC algorithm produces large partitions that cut fewer directed 3-cycles than standard spectral clustering algorithms. PMID:27812399

  13. An obesity/cardiometabolic risk reduction disease management program: a population-based approach.

    PubMed

    Villagra, Victor G

    2009-04-01

    Obesity is a critical health concern that has captured the attention of public and private healthcare payers who are interested in controlling costs and mitigating the long-term economic consequences of the obesity epidemic. Population-based approaches to obesity management have been proposed that take advantage of a chronic care model (CCM), including patient self-care, the use of community-based resources, and the realization of care continuity through ongoing communications with patients, information technology, and public policy changes. Payer-sponsored disease management programs represent an important conduit to delivering population-based care founded on similar CCM concepts. Disease management is founded on population-based disease identification, evidence-based care protocols, and collaborative practices between clinicians. While substantial clinician training, technology infrastructure commitments, and financial support at the payer level will be needed for the success of disease management programs in obesity and cardiometabolic risk reduction, these barriers can be overcome with the proper commitment. Disease management programs represent an important tool to combat the growing societal risks of overweight and obesity.

  14. Membrane proteins structures: A review on computational modeling tools.

    PubMed

    Almeida, Jose G; Preto, Antonio J; Koukos, Panagiotis I; Bonvin, Alexandre M J J; Moreira, Irina S

    2017-10-01

    Membrane proteins (MPs) play diverse and important functions in living organisms. They constitute 20% to 30% of the known bacterial, archaean and eukaryotic organisms' genomes. In humans, their importance is emphasized as they represent 50% of all known drug targets. Nevertheless, experimental determination of their three-dimensional (3D) structure has proven to be both time consuming and rather expensive, which has led to the development of computational algorithms to complement the available experimental methods and provide valuable insights. This review highlights the importance of membrane proteins and how computational methods are capable of overcoming challenges associated with their experimental characterization. It covers various MP structural aspects, such as lipid interactions, allostery, and structure prediction, based on methods such as Molecular Dynamics (MD) and Machine-Learning (ML). Recent developments in algorithms, tools and hybrid approaches, together with the increase in both computational resources and the amount of available data have resulted in increasingly powerful and trustworthy approaches to model MPs. Even though MPs are elementary and important in nature, the determination of their 3D structure has proven to be a challenging endeavor. Computational methods provide a reliable alternative to experimental methods. In this review, we focus on computational techniques to determine the 3D structure of MP and characterize their binding interfaces. We also summarize the most relevant databases and software programs available for the study of MPs. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Comparative investigations of manual action representations: evidence that chimpanzees represent the costs of potential future actions involving tools.

    PubMed

    Frey, Scott H; Povinelli, Daniel J

    2012-01-12

    The ability to adjust one's ongoing actions in anticipation of forthcoming task demands is considered strong evidence for the existence of internal action representations. Studies of action selection in tool use reveal that the behaviours that we choose in the present moment differ depending on what we intend to do next. Further, they point to a specialized role for mechanisms within the human cerebellum and dominant left cerebral hemisphere in representing the likely sensory costs of intended future actions. Recently, the question of whether similar mechanisms exist in other primates has received growing, but still limited, attention. Here, we present data that bear on this issue from a species that is a natural user of tools, our nearest living relative, the chimpanzee. In experiment 1, a subset of chimpanzees showed a non-significant tendency for their grip preferences to be affected by anticipation of the demands associated with bringing a tool's baited end to their mouths. In experiment 2, chimpanzees' initial grip preferences were consistently affected by anticipation of the forthcoming movements in a task that involves using a tool to extract a food reward. The partial discrepancy between the results of these two studies is attributed to the ability to accurately represent differences between the motor costs associated with executing the two response alternatives available within each task. These findings suggest that chimpanzees are capable of accurately representing the costs of intended future actions, and using those predictions to select movements in the present even in the context of externally directed tool use.

  16. Comparative investigations of manual action representations: evidence that chimpanzees represent the costs of potential future actions involving tools

    PubMed Central

    Frey, Scott H.; Povinelli, Daniel J.

    2012-01-01

    The ability to adjust one's ongoing actions in anticipation of forthcoming task demands is considered strong evidence for the existence of internal action representations. Studies of action selection in tool use reveal that the behaviours that we choose in the present moment differ depending on what we intend to do next. Further, they point to a specialized role for mechanisms within the human cerebellum and dominant left cerebral hemisphere in representing the likely sensory costs of intended future actions. Recently, the question of whether similar mechanisms exist in other primates has received growing, but still limited, attention. Here, we present data that bear on this issue from a species that is a natural user of tools, our nearest living relative, the chimpanzee. In experiment 1, a subset of chimpanzees showed a non-significant tendency for their grip preferences to be affected by anticipation of the demands associated with bringing a tool's baited end to their mouths. In experiment 2, chimpanzees' initial grip preferences were consistently affected by anticipation of the forthcoming movements in a task that involves using a tool to extract a food reward. The partial discrepancy between the results of these two studies is attributed to the ability to accurately represent differences between the motor costs associated with executing the two response alternatives available within each task. These findings suggest that chimpanzees are capable of accurately representing the costs of intended future actions, and using those predictions to select movements in the present even in the context of externally directed tool use. PMID:22106426

  17. Putting more power in your pocket

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Karena

    Representing the Northeastern Center for Chemical Energy Storage (NECCES), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE energy. The mission of NECCES is to identify the key atomic-scale processes which govern electrode function in rechargeable batteries, over a wide range of time and length scales, via the development and use of novel characterization and theoretical tools, and to use this information to identify and design new battery systems.

  18. Exosomes Secreted by HeLa Cells Shuttle on Their Surface the Plasma Membrane-Associated Sialidase NEU3.

    PubMed

    Paolini, Lucia; Orizio, Flavia; Busatto, Sara; Radeghieri, Annalisa; Bresciani, Roberto; Bergese, Paolo; Monti, Eugenio

    2017-12-05

    Sialidases are glycohydrolases that remove terminal sialic acid residues from oligosaccharides, glycolipids, and glycoproteins. The plasma membrane-associated sialidase NEU3 is involved in the fine-tuning of sialic acid-containing glycans directly on the cell surface and plays relevant roles in important biological phenomena such as cell differentiation, molecular recognition, and cancer transformation. Extracellular vesicles are membranous structures with a diameter of 0.03-1 μm released by cells and can be detected in blood, urine, and culture media. Among extracellular vesicles, exosomes play roles in intercellular communication and maintenance of several physiological and pathological conditions, including cancer, and could represent a useful diagnostic tool for personalized nanomedicine approaches. Using inducible expression of the murine form of NEU3 in HeLa cells, a study of the association of the enzyme with exosomes released in the culture media has been performed. Briefly, NEU3 is associated with highly purified exosomes and localizes on the external leaflet of these nanovesicles, as demonstrated by enzyme activity measurements, Western blot analysis, and dot blot analysis using specific protein markers. On the basis of these results, it is plausible that NEU3 activity on exosome glycans enhances the dynamic biological behavior of these small extracellular vesicles by modifying the negative charge and steric hindrance of their glycocalyx. The presence of NEU3 on the exosomal surface could represent a useful marker for the detection of these nanovesicles and a tool for improving our understanding of the biology of these important extracellular carriers in physiological and pathological conditions.

  19. Surface immuno-functionalisation for the capture and detection of Vibrio species in the marine environment: a new management tool for industrial facilities.

    PubMed

    Laczka, Olivier F; Labbate, Maurizio; Seymour, Justin R; Bourne, David G; Fielder, Stewart S; Doblin, Martina A

    2014-01-01

    Bacteria of the genus Vibrio are a common and environmentally important group within coastal environments and include species pathogenic to aquaculture organisms. Their distribution and abundance are linked to specific environmental parameters, including temperature, salinity and nutrient enrichment. Accurate and efficient detection of Vibrios in environmental samples provides a potentially important indicator of overall ecosystem health while also allowing rapid management responses for species pathogenic to humans or species implicated in disease of economically important aquacultured fish and invertebrates. In this study, we developed a surface immuno-functionalisation protocol, based on an avidin-biotin type covalent binding strategy, allowing specific sandwich-type detection of bacteria from the Vibrio genus. The assay was optimized on 12 diverse Vibrio strains, including species that have implications for aquaculture industries, reaching detection limits between 7×10³ and 3×10⁴ cells mL⁻¹. Current techniques for the detection of total Vibrios rely on laborious or inefficient analyses resulting in delayed management decisions. This work represents a novel approach, providing a rapid, accurate, sensitive and robust tool for quantifying Vibrios directly in industrial systems and in the environment, thereby facilitating rapid management responses.

  20. The In-Space Propulsion Technology Project Low-Thrust Trajectory Tool Suite

    NASA Technical Reports Server (NTRS)

    Dankanich, John W.

    2008-01-01

    The ISPT project released its low-thrust trajectory tool suite in March of 2006. The LTTT suite tools range in capabilities, but represent the state of the art in NASA low-thrust trajectory optimization tools. The tools have all received considerable updates following the initial release, and they are available through their respective development centers or the ISPT project website.

  1. Ergonomic principles and tools for best interdisciplinary psycho-physical stress prevention.

    PubMed

    Dal Cason, Dott Luigi

    2012-01-01

    Psycho-physical stress is a risk in its own right, now formally acknowledged, and it requires increasing attention. Measures for protection against it are reflected in the appropriate application of organizational policies on a human scale, in keeping with "macro-ergonomics". This work surveys several interdisciplinary tools available for the proper prevention of outbreaks of work-related stress. During work, adequate rests are important to prevent work-related physical and mental fatigue. Strategies for maintaining a healthy balance between work periods and work breaks may differ depending on the individual, on subjective habits and on peculiarities of the work environment. Resting does not necessarily mean "going on break": break time is as important as work time. While the latter is regulated, the former is not always clearly defined, though necessary. Knowing one's employment contract is the first step towards exercising the rights relating to periods of suspension from work, and is also essential for high-performance working. Breathing exercises, massage therapy, biofeedback and role-playing are some of the tools used during work breaks to prevent mental and physical fatigue. Finally, music has a rhythm built by alternating strong and weak accents. If the musical notes represent the "vertical" trend of music (melody), figures and pauses, inserted into the rhythmic structure of the measure, regulate the duration of sounds over time and determine the "horizontal" trend of a song. Transferring this concept to work is meant to convey, through a metaphor, the importance of respecting variations in both the vertical and horizontal trends within a work cycle.

  2. Robust Detection of Rare Species Using Environmental DNA: The Importance of Primer Specificity

    PubMed Central

    Wilcox, Taylor M.; McKelvey, Kevin S.; Young, Michael K.; Jane, Stephen F.; Lowe, Winsor H.; Whiteley, Andrew R.; Schwartz, Michael K.

    2013-01-01

    Environmental DNA (eDNA) is being rapidly adopted as a tool to detect rare animals. Quantitative PCR (qPCR) using probe-based chemistries may represent a particularly powerful tool because of the method’s sensitivity, specificity, and potential to quantify target DNA. However, there has been little work examining the performance of these assays in the presence of closely related, sympatric taxa. If related species cause any cross-amplification or interference, false positives and negatives may be generated. These errors can be disastrous if false positives lead to overestimation of the abundance of an endangered species or if false negatives prevent detection of an invasive species. In this study we test factors that influence the specificity and sensitivity of TaqMan MGB assays using co-occurring, closely related brook trout (Salvelinus fontinalis) and bull trout (S. confluentus) as a case study. We found qPCR to be substantially more sensitive than traditional PCR, with a high probability of detection at concentrations as low as 0.5 target copies/µl. We also found that the number and placement of base pair mismatches between the TaqMan MGB assay and non-target templates were important to target specificity, and that specificity was most influenced by base pair mismatches in the primers, rather than in the probe. We found that insufficient specificity can result in both false positive and false negative results, particularly in the presence of abundant related species. Our results highlight the utility of qPCR as a highly sensitive eDNA tool, and underscore the importance of careful assay design. PMID:23555689

  3. Robust detection of rare species using environmental DNA: the importance of primer specificity.

    PubMed

    Wilcox, Taylor M; McKelvey, Kevin S; Young, Michael K; Jane, Stephen F; Lowe, Winsor H; Whiteley, Andrew R; Schwartz, Michael K

    2013-01-01

    Environmental DNA (eDNA) is being rapidly adopted as a tool to detect rare animals. Quantitative PCR (qPCR) using probe-based chemistries may represent a particularly powerful tool because of the method's sensitivity, specificity, and potential to quantify target DNA. However, there has been little work examining the performance of these assays in the presence of closely related, sympatric taxa. If related species cause any cross-amplification or interference, false positives and negatives may be generated. These errors can be disastrous if false positives lead to overestimation of the abundance of an endangered species or if false negatives prevent detection of an invasive species. In this study we test factors that influence the specificity and sensitivity of TaqMan MGB assays using co-occurring, closely related brook trout (Salvelinus fontinalis) and bull trout (S. confluentus) as a case study. We found qPCR to be substantially more sensitive than traditional PCR, with a high probability of detection at concentrations as low as 0.5 target copies/µl. We also found that the number and placement of base pair mismatches between the TaqMan MGB assay and non-target templates were important to target specificity, and that specificity was most influenced by base pair mismatches in the primers, rather than in the probe. We found that insufficient specificity can result in both false positive and false negative results, particularly in the presence of abundant related species. Our results highlight the utility of qPCR as a highly sensitive eDNA tool, and underscore the importance of careful assay design.
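    The finding that specificity depends on the number and placement of primer-template mismatches, especially near the primer's 3' end, can be illustrated with a short sketch. The sequences and the five-base 3' window below are invented for illustration, not taken from the authors' brook/bull trout assay:

```python
def mismatch_positions(primer, template_site):
    """Return 0-based positions where the primer disagrees with the
    (already aligned, equal-length) template binding site."""
    if len(primer) != len(template_site):
        raise ValueError("primer and binding site must be the same length")
    return [i for i, (p, t) in enumerate(zip(primer, template_site)) if p != t]

def three_prime_mismatches(primer, template_site, window=5):
    """Mismatches falling in the primer's 3' end, where they most
    strongly disrupt extension and hence amplification."""
    start = len(primer) - window
    return [i for i in mismatch_positions(primer, template_site) if i >= start]

# Invented example: one internal and one 3'-terminal mismatch
primer = "ACGGTTCAGGTCAA"
nontarget = "ACGATTCAGGTCAT"
```

Applied to this toy pair, `mismatch_positions` finds disagreements at positions 3 and 13, and only the latter falls in the 3' window that matters most for discriminating non-target templates.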

  4. Creating and parameterizing patient-specific deep brain stimulation pathway-activation models using the hyperdirect pathway as an example.

    PubMed

    Gunalan, Kabilar; Chaturvedi, Ashutosh; Howell, Bryan; Duchin, Yuval; Lempka, Scott F; Patriat, Remi; Sapiro, Guillermo; Harel, Noam; McIntyre, Cameron C

    2017-01-01

    Deep brain stimulation (DBS) is an established clinical therapy and computational models have played an important role in advancing the technology. Patient-specific DBS models are now common tools in both academic and industrial research, as well as clinical software systems. However, the exact methodology for creating patient-specific DBS models can vary substantially and important technical details are often missing from published reports. Provide a detailed description of the assembly workflow and parameterization of a patient-specific DBS pathway-activation model (PAM) and predict the response of the hyperdirect pathway to clinical stimulation. Integration of multiple software tools (e.g. COMSOL, MATLAB, FSL, NEURON, Python) enables the creation and visualization of a DBS PAM. An example DBS PAM was developed using 7T magnetic resonance imaging data from a single unilaterally implanted patient with Parkinson's disease (PD). This detailed description implements our best computational practices and most elaborate parameterization steps, as defined from over a decade of technical evolution. Pathway recruitment curves and strength-duration relationships highlight the non-linear response of axons to changes in the DBS parameter settings. Parameterization of patient-specific DBS models can be highly detailed and constrained, thereby providing confidence in the simulation predictions, but at the expense of time demanding technical implementation steps. DBS PAMs represent new tools for investigating possible correlations between brain pathway activation patterns and clinical symptom modulation.

  5. Management Matters: The Library Media Specialist's Management Toolbox

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2004-01-01

    Library media specialists need tools to help them manage the school library media program. The Internet includes a vast array of tools that a library media specialist might find useful. The websites and electronic resources included in this article are only a representative sample and future columns may explore additional tools. All the tools are…

  6. Development and validation of the Vietnamese primary care assessment tool.

    PubMed

    Hoa, Nguyen Thi; Tam, Nguyen Minh; Peersman, Wim; Derese, Anselme; Markuns, Jeffrey F

    2018-01-01

    To adapt the consumer version of the Primary Care Assessment Tool (PCAT) for Vietnam and determine its internal consistency and validity. A quantitative cross-sectional study. 56 communes in 3 representative provinces of central Vietnam. A total of 3289 people who used health care services at a health facility at least once over the past two years. The Vietnamese adult expanded consumer version of the PCAT (VN PCAT-AE) is an instrument for evaluation of primary care in Vietnam with 70 items comprising six scales representing four core primary care domains, and three additional scales representing three derivative domains. Sixteen other items from the original tool were not included in the final instrument, due to problems with missing values, floor or ceiling effects, and item-total correlations. All the retained scales have a Cronbach's alpha above 0.70 except for the subscale of Family Centeredness. The VN PCAT-AE demonstrates adequate internal consistency and validity to be used as an effective tool for measuring the quality of primary care in Vietnam from the consumer perspective. Additional work in the future to optimize valid measurement in all domains consistent with the original version of the tool may be helpful as the primary care system in Vietnam further develops.
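    The internal-consistency criterion applied here (Cronbach's alpha above 0.70) is computed directly from item-level scores. A minimal sketch, using invented scores rather than VN PCAT-AE data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: (k/(k-1)) * (1 - sum(item variances) / variance(totals)).
    `items` holds one list of scores per item, aligned across respondents."""
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Invented 4-point scores for a 3-item scale over 5 respondents
scale = [
    [4, 3, 2, 4, 1],
    [4, 3, 1, 4, 2],
    [3, 3, 2, 4, 1],
]
```

For this toy scale, alpha comes out well above the 0.70 threshold, which is the pattern the authors report for all retained scales except Family Centeredness.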

  7. Accumulating Evidence and Research Organization (AERO) model: a new tool for representing, analyzing, and planning a translational research program.

    PubMed

    Hey, Spencer Phillips; Heilig, Charles M; Weijer, Charles

    2013-05-30

    Maximizing efficiency in drug development is important for drug developers, policymakers, and human subjects. Limited funds and the ethical imperative of risk minimization demand that researchers maximize the knowledge gained per patient-subject enrolled. Yet, despite a common perception that the current system of drug development is beset by inefficiencies, there remain few approaches for systematically representing, analyzing, and communicating the efficiency and coordination of the research enterprise. In this paper, we present the first steps toward developing such an approach: a graph-theoretic tool for representing the Accumulating Evidence and Research Organization (AERO) across a translational trajectory. This initial version of the AERO model focuses on elucidating two dimensions of robustness: (1) the consistency of results among studies with an identical or similar outcome metric; and (2) the concordance of results among studies with qualitatively different outcome metrics. The visual structure of the model is a directed acyclic graph, designed to capture these two dimensions of robustness and their relationship to three basic questions that underlie the planning of a translational research program: What is the accumulating state of total evidence? What has been the translational trajectory? What studies should be done next? We demonstrate the utility of the AERO model with an application to a case study involving the antibacterial agent, moxifloxacin, for the treatment of drug-susceptible tuberculosis. We then consider some possible elaborations for the AERO model and propose a number of ways in which the tool could be used to enhance the planning, reporting, and analysis of clinical trials. The AERO model provides an immediate visual representation of the number of studies done at any stage of research, depicting both the robustness of evidence and the relationship of each study to the larger translational trajectory. 
In so doing, it makes some of the invisible or inchoate properties of the research system explicit, helping to elucidate judgments about the accumulating state of evidence and supporting decision-making for future research.
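    The AERO model's visual structure is a directed acyclic graph of studies. A minimal sketch of how such a trajectory might be represented and ordered, using invented study names and edges rather than the moxifloxacin case study:

```python
from collections import defaultdict

# Hypothetical study graph: an edge (u, v) means study v builds on
# evidence from study u. Names are invented for illustration.
edges = [
    ("preclinical-mouse", "phase-I"),
    ("preclinical-guinea-pig", "phase-I"),
    ("phase-I", "phase-II"),
    ("phase-II", "phase-III"),
]

def topological_order(edges):
    """Kahn's algorithm: a full ordering exists iff the graph is acyclic,
    so this doubles as a check that the trajectory is a valid DAG."""
    succ = defaultdict(list)
    indeg = defaultdict(int)
    nodes = set()
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
        nodes.update((u, v))
    frontier = sorted(n for n in nodes if indeg[n] == 0)
    order = []
    while frontier:
        n = frontier.pop(0)
        order.append(n)
        for m in succ[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                frontier.append(m)
    if len(order) != len(nodes):
        raise ValueError("cycle detected: not a valid study DAG")
    return order
```

The topological order recovers the translational trajectory (preclinical work before phase I, and so on), and nodes with no successors mark the frontier where "what study should be done next?" applies.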

  8. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education

    PubMed Central

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches: (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. 
It appears to be a useful tool to explore such data with possible future implications on healthcare education. It also opens a new direction in medical education informatics research. PMID:25469323

  9. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches: (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. 
It appears to be a useful tool to explore such data with possible future implications on healthcare education. It also opens a new direction in medical education informatics research.
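    The kind of curriculum network built here in Cytoscape, for example approach (i) linking learning outcomes and teaching methods, can be sketched as a bipartite adjacency with a simple coverage count for gap analysis. The activity and outcome names below are invented, not drawn from the studied curriculum:

```python
from collections import defaultdict

# Hypothetical curriculum records linking a teaching activity to the
# learning outcome it addresses (names invented for illustration).
records = [
    ("lecture: cardiology", "outcome: interpret ECG"),
    ("lab: ECG practical", "outcome: interpret ECG"),
    ("lecture: pharmacology", "outcome: prescribe safely"),
]

def build_network(records):
    """Bipartite adjacency: teaching activity -> set of outcomes it addresses."""
    adj = defaultdict(set)
    for activity, outcome in records:
        adj[activity].add(outcome)
    return adj

def coverage(adj):
    """Count how many activities address each outcome; outcomes with a
    count of 0 or 1 are candidates for the gap analysis in approach (iii)."""
    counts = defaultdict(int)
    for outcomes in adj.values():
        for o in outcomes:
            counts[o] += 1
    return dict(counts)
```

An edge list like `records` is also roughly the shape Cytoscape imports for network visualization, so the same data structure serves both the analysis and the visual model.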

  10. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results are the ultimate measure of quality in an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma embodies a scientific-method approach that is empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode and Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. 
Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline, and a standardised approach to problem solving and process optimisation.
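    In the Measure and Analyse phases, variability across replicated runs is typically quantified with summary statistics such as the coefficient of variation, and accuracy as bias against the CRM's certified value. A minimal sketch with invented selenium results (the values and certified value below are illustrative, not WCPVL data):

```python
import statistics

# Hypothetical replicate selenium results (µg/L) for one sample across
# repeated runs, plus an illustrative CRM certified value.
replicates = [101.2, 99.8, 100.5, 98.9, 101.0, 100.3]
crm_certified = 100.0

mean = statistics.mean(replicates)
sd = statistics.stdev(replicates)                    # run-to-run spread
cv_percent = 100 * sd / mean                         # relative SD (%CV)
bias_percent = 100 * (mean - crm_certified) / crm_certified
```

A low %CV indicates the reduced variability the improved method aims for, while the CRM bias checks that tightening precision has not come at the cost of trueness.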

  11. What information is used in treatment decision aids? A systematic review of the types of evidence populating health decision aids.

    PubMed

    Clifford, Amanda M; Ryan, Jean; Walsh, Cathal; McCurtin, Arlene

    2017-02-23

    Patient decision aids (DAs) are support tools designed to provide patients with relevant information to help them make informed decisions about their healthcare. While DAs can be effective in improving patient knowledge and decision quality, it is unknown what types of information and evidence are used to populate such decision tools. Systematic methods were used to identify and appraise the relevant literature and patient DAs published between 2006 and 2015. Six databases (Academic Search Complete, AMED, CINAHL, Biomedical Reference Collection, General Sciences and MEDLINE) and reference list searching were used. Articles evaluating the effectiveness of the DAs were appraised using the Cochrane Risk of Bias tool. The content, quality and sources of evidence in the decision aids were evaluated using the IPDASi-SF and a novel classification system. Findings were synthesised and a narrative analysis was performed on the results. Thirteen studies representing ten DAs met the inclusion criteria. The IPDASI-SF score ranged from 9 to 16 indicating many of the studies met the majority of quality criteria. Sources of evidence were described but reports were sometimes generic or missing important information. The majority of DAs incorporated high quality research evidence including systematic reviews and meta-analyses. Patient and practice evidence was less commonly employed, with only a third of included DAs using these to populate decision aid content. The quality of practice and patient evidence ranged from high to low. Contextual factors were addressed across all DAs to varying degrees and covered a range of factors. This is an initial study examining the information and evidence used to populate DAs. While research evidence and contextual factors are well represented in included DAs, consideration should be given to incorporating high quality information representing all four pillars of evidence based practice when developing DAs. 
Further, patient and expert practice evidence should be acquired rigorously and DAs should report the means by which such evidence is obtained with citations clearly provided.

  12. The 3rd DBCLS BioHackathon: improving life science data integration with Semantic Web technologies.

    PubMed

    Katayama, Toshiaki; Wilkinson, Mark D; Micklem, Gos; Kawashima, Shuichi; Yamaguchi, Atsuko; Nakao, Mitsuteru; Yamamoto, Yasunori; Okamoto, Shinobu; Oouchida, Kenta; Chun, Hong-Woo; Aerts, Jan; Afzal, Hammad; Antezana, Erick; Arakawa, Kazuharu; Aranda, Bruno; Belleau, Francois; Bolleman, Jerven; Bonnal, Raoul Jp; Chapman, Brad; Cock, Peter Ja; Eriksson, Tore; Gordon, Paul Mk; Goto, Naohisa; Hayashi, Kazuhiro; Horn, Heiko; Ishiwata, Ryosuke; Kaminuma, Eli; Kasprzyk, Arek; Kawaji, Hideya; Kido, Nobuhiro; Kim, Young Joo; Kinjo, Akira R; Konishi, Fumikazu; Kwon, Kyung-Hoon; Labarga, Alberto; Lamprecht, Anna-Lena; Lin, Yu; Lindenbaum, Pierre; McCarthy, Luke; Morita, Hideyuki; Murakami, Katsuhiko; Nagao, Koji; Nishida, Kozo; Nishimura, Kunihiro; Nishizawa, Tatsuya; Ogishima, Soichi; Ono, Keiichiro; Oshita, Kazuki; Park, Keun-Joon; Prins, Pjotr; Saito, Taro L; Samwald, Matthias; Satagopam, Venkata P; Shigemoto, Yasumasa; Smith, Richard; Splendiani, Andrea; Sugawara, Hideaki; Taylor, James; Vos, Rutger A; Withers, David; Yamasaki, Chisato; Zmasek, Christian M; Kawamoto, Shoko; Okubo, Kosaku; Asai, Kiyoshi; Takagi, Toshihisa

    2013-02-11

    BioHackathon 2010 was the third in a series of meetings hosted by the Database Center for Life Sciences (DBCLS) in Tokyo, Japan. The overall goal of the BioHackathon series is to improve the quality and accessibility of life science research data on the Web by bringing together representatives from public databases, analytical tool providers, and cyber-infrastructure researchers to jointly tackle important challenges in the area of in silico biological research. The theme of BioHackathon 2010 was the 'Semantic Web', and all attendees gathered with the shared goal of producing Semantic Web data from their respective resources, and/or consuming or interacting with those data using their tools and interfaces. We discussed topics including guidelines for designing semantic data and the interoperability of resources. We consequently developed tools and clients for analysis and visualization. We provide a meeting report from BioHackathon 2010, in which we describe the discussions, decisions, and breakthroughs made as we moved towards compliance with Semantic Web technologies, from source provider, through middleware, to the end-consumer.

  13. The 3rd DBCLS BioHackathon: improving life science data integration with Semantic Web technologies

    PubMed Central

    2013-01-01

    Background BioHackathon 2010 was the third in a series of meetings hosted by the Database Center for Life Sciences (DBCLS) in Tokyo, Japan. The overall goal of the BioHackathon series is to improve the quality and accessibility of life science research data on the Web by bringing together representatives from public databases, analytical tool providers, and cyber-infrastructure researchers to jointly tackle important challenges in the area of in silico biological research. Results The theme of BioHackathon 2010 was the 'Semantic Web', and all attendees gathered with the shared goal of producing Semantic Web data from their respective resources and/or consuming or interacting with those data using their tools and interfaces. We discussed topics including guidelines for designing semantic data and interoperability of resources, and consequently developed tools and clients for analysis and visualization. Conclusion We provide a meeting report from BioHackathon 2010, in which we describe the discussions, decisions, and breakthroughs made as we moved towards compliance with Semantic Web technologies - from source provider, through middleware, to the end-consumer. PMID:23398680

  14. Development, features and application of DIET ASSESS & PLAN (DAP) software in supporting public health nutrition research in Central Eastern European Countries (CEEC).

    PubMed

    Gurinović, Mirjana; Milešević, Jelena; Kadvan, Agnes; Nikolić, Marina; Zeković, Milica; Djekić-Ivanković, Marija; Dupouy, Eleonora; Finglas, Paul; Glibetić, Maria

    2018-01-01

    In order to meet growing public health nutrition challenges in Central Eastern European Countries (CEEC) and Balkan countries, the development of a Research Infrastructure (RI) and the availability of an effective nutrition surveillance system are prerequisites. The building block of this RI is an innovative tool called DIET ASSESS & PLAN (DAP), a platform for standardized and harmonized food consumption data collection, comprehensive dietary intake assessment and nutrition planning. Its unique structure enables application of national food composition databases (FCDBs) from the European food composition exchange platform (28 national FCDBs) developed by EuroFIR (http://www.eurofir.org/) and in addition allows communication with other tools. DAP is used for daily menu and/or long-term diet planning in diverse public sector settings, food design/reformulation, food labelling, nutrient intake assessment and calculation of the dietary diversity indicator, Minimum Dietary Diversity-Women (MDD-W). As a tool validated in different national and international projects, DAP represents an important RI in public health nutrition epidemiology in the CEEC region. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. AgBase: supporting functional modeling in agricultural organisms

    PubMed Central

    McCarthy, Fiona M.; Gresham, Cathy R.; Buza, Teresia J.; Chouvarine, Philippe; Pillai, Lakshmi R.; Kumar, Ranjit; Ozkan, Seval; Wang, Hui; Manda, Prashanti; Arick, Tony; Bridges, Susan M.; Burgess, Shane C.

    2011-01-01

    AgBase (http://www.agbase.msstate.edu/) provides resources to facilitate modeling of functional genomics data and structural and functional annotation of agriculturally important animal, plant, microbe and parasite genomes. The website has been redesigned to improve accessibility and ease of use, including improved search capabilities. Expanded capabilities include new dedicated pages for horse, cat, dog, cotton, rice and soybean. We currently provide 590 240 Gene Ontology (GO) annotations to 105 454 gene products in 64 different species, including GO annotations linked to transcripts represented on agricultural microarrays. For many of these arrays, this provides the only functional annotation available. GO annotations are available for download and we provide comprehensive, species-specific GO annotation files for 18 different organisms. The tools available at AgBase have been expanded and several existing tools improved based upon user feedback. One of seven new tools available at AgBase, GOModeler, supports hypothesis testing from functional genomics data. We host several associated databases and provide genome browsers for three agricultural pathogens. Moreover, we provide comprehensive training resources (including worked examples and tutorials) via links to Educational Resources at the AgBase website. PMID:21075795

  16. Non-Relative Value Unit-Generating Activities Represent One-Fifth of Academic Neuroradiologist Productivity.

    PubMed

    Wintermark, M; Zeineh, M; Zaharchuk, G; Srivastava, A; Fischbein, N

    2016-07-01

    A neuroradiologist's activity includes many tasks beyond interpreting relative value unit-generating imaging studies. Our aim was to test a simple method to record and quantify the non-relative value unit-generating clinical activity represented by consults and clinical conferences, including tumor boards. Four full-time neuroradiologists, working an average of 50% clinical and 50% academic activity, systematically recorded all the non-relative value unit-generating consults and conferences in which they were involved during 3 months, using a simple Web-based application accessible from smartphones, tablets, or computers. The number and type of imaging studies they interpreted during the same period and the associated relative value units were extracted from our billing system. During 3 months, the 4 neuroradiologists working an average of 50% clinical activity interpreted 4241 relative value unit-generating imaging studies, representing 8152 work relative value units. During the same period, they recorded 792 non-relative value unit-generating study reviews as part of consults and conferences (not including reading room consults), representing 19% of the interpreted relative value unit-generating imaging studies. We propose a simple Web-based smartphone app to record and quantify non-relative value unit-generating activities, including consults, clinical conferences, and tumor boards. The quantification of non-relative value unit-generating activities is paramount in this time of a paradigm shift from volume to value. It also represents an important tool for determining staffing levels, which cannot be done on the basis of relative value units alone, considering the amount of time radiologists spend on non-relative value unit-generating activities. It may also influence payment models from medical centers to radiology departments or practices. © 2016 by American Journal of Neuroradiology.

  17. Exploiting CRISPR/Cas systems for biotechnology

    PubMed Central

    Sampson, Timothy R.; Weiss, David S.

    2015-01-01

    The Cas9 endonuclease is the central component of the Type II CRISPR/Cas system, a prokaryotic adaptive restriction system against invading nucleic acids, such as those originating from bacteriophages and plasmids. Recently, this RNA-directed DNA endonuclease has been harnessed to target DNA sequences of interest. Here, we review the development of Cas9 as an important tool not only for editing the genomes of a number of different prokaryotic and eukaryotic species, but also for use as an efficient system for site-specific transcriptional repression or activation. Additionally, a specific Cas9 protein has been observed to target an RNA substrate, suggesting that Cas9 may be programmable to target RNA as well. Cas proteins from other CRISPR/Cas subtypes may also be exploited in this regard. Thus, CRISPR/Cas systems represent an effective and versatile biotechnological tool, which will have a significant impact on future advances in genome engineering. PMID:24323919

  18. Exploiting CRISPR/Cas systems for biotechnology.

    PubMed

    Sampson, Timothy R; Weiss, David S

    2014-01-01

    The Cas9 endonuclease is the central component of the Type II CRISPR/Cas system, a prokaryotic adaptive restriction system against invading nucleic acids, such as those originating from bacteriophages and plasmids. Recently, this RNA-directed DNA endonuclease has been harnessed to target DNA sequences of interest. Here, we review the development of Cas9 as an important tool not only for editing the genomes of a number of different prokaryotic and eukaryotic species, but also for use as an efficient system for site-specific transcriptional repression or activation. Additionally, a specific Cas9 protein has been observed to target an RNA substrate, suggesting that Cas9 may be programmable to target RNA as well. Cas proteins from other CRISPR/Cas subtypes may also be exploited in this regard. Thus, CRISPR/Cas systems represent an effective and versatile biotechnological tool, which will have a significant impact on future advances in genome engineering. © 2014 WILEY Periodicals, Inc.

  19. From Content Knowledge to Community Change: A Review of Representations of Environmental Health Literacy

    PubMed Central

    Gray, Kathleen M.

    2018-01-01

    Environmental health literacy (EHL) is a relatively new framework for conceptualizing how people understand and use information about potentially harmful environmental exposures and their influence on health. As such, information on the characterization and measurement of EHL is limited. This review provides an overview of EHL as presented in peer-reviewed literature and aggregates studies based on whether they represent individual level EHL or community level EHL or both. A range of assessment tools has been used to measure EHL, with many studies relying on pre-/post-assessment; however, a broader suite of assessment tools may be needed to capture community-wide outcomes. This review also suggests that the definition of EHL should explicitly include community change or collective action as an important longer-term outcome and proposes a refinement of previous representations of EHL as a theoretical framework, to include self-efficacy. PMID:29518955

  20. A Psychometric Tool for a Virtual Reality Rehabilitation Approach for Dyslexia.

    PubMed

    Pedroli, Elisa; Padula, Patrizia; Guala, Andrea; Meardi, Maria Teresa; Riva, Giuseppe; Albani, Giovanni

    2017-01-01

    Dyslexia is a chronic problem that affects subjects' lives and often influences their life choices. Standard rehabilitation methods all use a classic paper-and-pencil training format, but these exercises are boring and demanding for children, who may have difficulty completing the treatments. It is important to develop a new rehabilitation program that helps children in a fun and engaging way. A Wii-based game was developed to demonstrate that a short treatment with an action video game, rather than phonological or orthographic training, may improve reading abilities in dyslexic children. According to the results, an approach using cues in the context of a virtual environment may represent a promising tool to improve attentional skills. On the other hand, our results do not demonstrate an immediate effect on reading performance, suggesting that a more prolonged protocol may be a future direction.

  1. Teaching astronomy and astrophysics at the Valencian International University (VIU): Application and use of Virtual Observatory tools

    NASA Astrophysics Data System (ADS)

    Diago, P. D.; Gutiérrez-Soto, J.; Ruiz, J. E.; Solano, E.

    2013-05-01

    The Astronomy and Astrophysics Master, running at the Valencian International University (VIU, http://www.viu.es) since March 2010, is a clear example of how the development of information and communication technologies (ICTs) and new e-learning methods is changing traditional distance learning. In the context of the European Space for Higher Education (ESHE), we present how Virtual Observatory (VO) tools can play an important part in the teaching of astronomy and astrophysics. The tasks described here have been carried out during the last three courses and are representative of the state of the art in astrophysics research. We include a description and a list of learning outcomes for each of the presented tasks. The tasks can be downloaded at the Spanish VO website: http://svo.cab.inta-csic.es/docs/index.php?pagename=Education/VOcases

  2. TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.

    PubMed

    Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D

    2018-05-08

    Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
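
    As an illustrative sketch only (hypothetical area measurements; the actual models and parameters used by TASI are not specified in this record), the mathematical modeling of spheroid dynamics mentioned above can be as simple as fitting an exponential growth model to a segmented-area time series:

```python
import numpy as np

def fit_exponential_growth(times, areas):
    """Fit A(t) = A0 * exp(k * t) via a least-squares line in log-area space.

    Returns (A0, k), where k is the per-unit-time growth rate.
    """
    k, log_a0 = np.polyfit(times, np.log(areas), 1)
    return float(np.exp(log_a0)), float(k)

# Hypothetical segmented spheroid areas (hours, pixels^2)
hours = np.array([0.0, 24.0, 48.0, 72.0])
areas = np.array([1.00e4, 1.35e4, 1.82e4, 2.46e4])
a0, k = fit_exponential_growth(hours, areas)
print(round(a0), round(k, 4))
```

    Fitted parameters such as k can then be compared across experimental conditions with standard statistical tests, which is the spirit of the pipeline the abstract describes.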

  3. Positron Emission Tomography of the Heart

    DOE R&D Accomplishments Database

    Schelbert, H. R.; Phelps, M. E.; Kuhl, D. E.

    1979-01-01

    Positron emission computed tomography (PCT) represents an important new tool for the noninvasive evaluation and, more importantly, quantification of myocardial performance. Most currently available techniques permit assessment of only one aspect of cardiac function, i.e., myocardial perfusion by gamma scintillation camera imaging with Thallium-201 or left ventricular function by echocardiography or radionuclide angiocardiography. With PCT it may become possible to study all three major segments of myocardial performance, i.e., regional blood flow, mechanical function and, most importantly, myocardial metabolism. Each of these segments can either be evaluated separately or in combination. This report briefly describes the principles and technological advantages of the imaging device, reviews currently available radioactive tracers and how they can be employed for the assessment of flow, function and metabolism; and, lastly, discusses possible applications of PCT for the study of cardiac physiology or its potential role in the diagnosis of cardiac disease.

  4. Application of adenosine triphosphate-driven bioluminescence for quantification of plaque bacteria and assessment of oral hygiene in children.

    PubMed

    Fazilat, Shahram; Sauerwein, Rebecca; McLeod, Jennifer; Finlayson, Tyler; Adam, Emilia; Engle, John; Gagneja, Prashant; Maier, Tom; Machida, Curtis A

    2010-01-01

    Dentistry has undergone a shift in caries management toward prevention and improved oral hygiene and diagnosis. Caries prevention now represents one of the most important aspects of modern dental practice. The purpose of this cross-sectional study was to demonstrate the use of adenosine triphosphate- (ATP-) driven bioluminescence as an innovative tool for the rapid chairside enumeration of oral bacteria (including plaque streptococci) and assessment of oral hygiene and caries risk. Thirty-three pediatric patients (7- to 12-year-old males and females) were examined, and plaque specimens, in addition to stimulated saliva, were collected from representative teeth within each quadrant. Oral specimens (n=150 specimens) were assessed by plating on enriched and selective agars, to enumerate total bacteria and streptococci, and subjected to ATP-driven bioluminescence determinations using a luciferase-based assay system. Statistical correlations, linking ATP values to numbers of total bacteria, oral streptococci and mutans streptococci, yielded highly significant r values of 0.854, 0.840, and 0.796, respectively. Our clinical data are consistent with the hypothesis that ATP measurements have a strong statistical association with bacterial numbers in plaque and saliva specimens, including numbers for oral streptococci, and may be used as a potential assessment tool for oral hygiene and caries risk in children.
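
    The reported r values are plain Pearson correlations between paired chairside and laboratory measurements. A minimal sketch with made-up illustrative readings (not the study's data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    return float(np.corrcoef(x, y)[0, 1])

# Hypothetical paired readings: ATP bioluminescence (relative light units)
# versus plate counts (log10 CFU) for plaque specimens.
rlu = [1200, 3400, 560, 8900, 4100, 760, 2300]
log_cfu = [5.1, 6.0, 4.6, 6.8, 6.2, 4.9, 5.7]
print(round(pearson_r(rlu, log_cfu), 3))
```

    A value near 1 indicates a strong positive association, as the study reports for ATP versus bacterial counts.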

  5. Clone DB: an integrated NCBI resource for clone-associated data

    PubMed Central

    Schneider, Valerie A.; Chen, Hsiu-Chuan; Clausen, Cliff; Meric, Peter A.; Zhou, Zhigang; Bouk, Nathan; Husain, Nora; Maglott, Donna R.; Church, Deanna M.

    2013-01-01

    The National Center for Biotechnology Information (NCBI) Clone DB (http://www.ncbi.nlm.nih.gov/clone/) is an integrated resource providing information about and facilitating access to clones, which serve as valuable research reagents in many fields, including genome sequencing and variation analysis. Clone DB represents an expansion and replacement of the former NCBI Clone Registry and has records for genomic and cell-based libraries and clones representing more than 100 different eukaryotic taxa. Records provide details of library construction, associated sequences, map positions and information about resource distribution. Clone DB is indexed in the NCBI Entrez system and can be queried by fields that include organism, clone name, gene name and sequence identifier. Whenever possible, genomic clones are mapped to reference assemblies and their map positions provided in clone records. Clones mapping to specific genomic regions can also be searched for using the NCBI Clone Finder tool, which accepts queries based on sequence coordinates or features such as gene or transcript names. Clone DB makes reports of library, clone and placement data on its FTP site available for download. With Clone DB, users now have available to them a centralized resource that provides them with the tools they will need to make use of these important research reagents. PMID:23193260

  6. Optimizing clinical benefit with targeted treatment in mRCC: "Tumor growth rate" as an alternative clinical endpoint.

    PubMed

    Milella, Michele

    2016-06-01

    Tumor growth rate (TGR), usually defined as the ratio between the slope of tumor growth before the initiation of treatment and the slope of tumor growth during treatment, between the nadir and disease progression, is a measure of the rate at which tumor volume increases over time. In patients with metastatic renal cell carcinoma (mRCC), TGR has emerged as a reliable alternative parameter to allow a quantitative and dynamic evaluation of tumor response. This review presents evidence on the correlation between TGR and treatment outcomes and discusses the potential role of this tool within the treatment scenario of mRCC. Current evidence, albeit of retrospective nature, suggests that TGR might represent a useful tool to assess whether treatment is altering the course of the disease, and has shown to be significantly associated with progression-free survival and overall survival. Therefore, TGR may represent a valuable endpoint for clinical trials evaluating new molecularly targeted therapies. Most importantly, incorporation of TGR in the assessment of individual patients undergoing targeted therapies may help clinicians decide if a given agent is no longer able to control disease growth and whether continuing therapy beyond RECIST progression may still produce clinical benefit. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
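
    As a toy illustration of the slope-ratio definition quoted above (hypothetical volume measurements; published TGR formulations differ in detail, e.g. some report percent volume change per month):

```python
import numpy as np

def growth_slope(times_days, volumes):
    """Least-squares slope of log tumor volume versus time (per day)."""
    return float(np.polyfit(times_days, np.log(volumes), 1)[0])

def tgr_ratio(pre_t, pre_v, on_t, on_v):
    """Ratio of the pre-treatment growth slope to the on-treatment slope
    (nadir to progression), following the definition in the record above."""
    return growth_slope(pre_t, pre_v) / growth_slope(on_t, on_v)

# Hypothetical CT volume series (days, mm^3)
pre = ([0, 30, 60], [100.0, 130.0, 169.0])    # growing before treatment
on = ([60, 90, 120], [169.0, 160.0, 152.0])   # shrinking on treatment
print(tgr_ratio(*pre, *on))
```

    Here the ratio is negative simply because the two slopes have opposite signs (growth before treatment, shrinkage during); in practice the individual slopes are usually reported alongside the ratio.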

  7. Methylotrophy in the thermophilic Bacillus methanolicus, basic insights and application for commodity production from methanol.

    PubMed

    Müller, Jonas E N; Heggeset, Tonje M B; Wendisch, Volker F; Vorholt, Julia A; Brautaset, Trygve

    2015-01-01

    Using methanol as an alternative non-food feedstock for biotechnological production offers several advantages in line with a methanol-based bioeconomy. The Gram-positive, facultatively methylotrophic and thermophilic bacterium Bacillus methanolicus is one of the few described microbial candidates with a potential for the conversion of methanol to value-added products. Its capabilities of producing and secreting the commercially important amino acids L-glutamate and L-lysine at high concentrations at 50 °C have been demonstrated and make B. methanolicus a promising target for developing cell factories for industrial-scale production processes. B. methanolicus uses the ribulose monophosphate cycle for methanol assimilation and represents the first example of plasmid-dependent methylotrophy. Recent genome sequencing of two physiologically different wild-type B. methanolicus strains, MGA3 and PB1, accompanied by transcriptome and proteome analyses, has generated fundamental new insight into the metabolism of the species. In addition, multiple key enzymes representing methylotrophic and biosynthetic pathways have been biochemically characterized. All this, together with the establishment of improved tools for gene expression, has opened opportunities for systems-level metabolic engineering of B. methanolicus. Here, we summarize the current status of its metabolism and biochemistry, the available genetic tools, and its potential use with respect to the overproduction of amino acids.

  8. A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: the Study Design and Implementation Assessment Device (Study DIAD).

    PubMed

    Valentine, Jeffrey C; Cooper, Harris

    2008-06-01

    Assessments of studies meant to evaluate the effectiveness of interventions, programs, and policies can serve an important role in the interpretation of research results. However, evidence suggests that available quality assessment tools have poor measurement characteristics and can lead to opposing conclusions when applied to the same body of studies. These tools tend to (a) be insufficiently operational, (b) rely on arbitrary post-hoc decision rules, and (c) result in a single number to represent a multidimensional construct. In response to these limitations, a multilevel and hierarchical instrument was developed in consultation with a wide range of methodological and statistical experts. The instrument focuses on the operational details of studies and results in a profile of scores instead of a single score to represent study quality. A pilot test suggested that satisfactory between-judge agreement can be obtained using well-trained raters working in naturalistic conditions. Limitations of the instrument are discussed, but these are inherent in making decisions about study quality given incomplete reporting and in the absence of strong, contextually based information about the effects of design flaws on study outcomes. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  9. Communications Effects Server (CES) Model for Systems Engineering Research

    DTIC Science & Technology

    2012-01-31

    [The abstract for this record is garbled extraction residue from an architecture diagram. Recoverable fragments name tool interfaces (Visualization, HLA, DIS, STK), execution kernel modules, third-party visualization, analysis, and text-editor blocks, and a GUI component called the Architect, described as the main network design and visualization environment, with simulations able to interoperate with STK.]

  10. Ecological networks to unravel the routes to horizontal transposon transfers.

    PubMed

    Venner, Samuel; Miele, Vincent; Terzian, Christophe; Biémont, Christian; Daubin, Vincent; Feschotte, Cédric; Pontier, Dominique

    2017-02-01

    Transposable elements (TEs) represent the single largest component of numerous eukaryotic genomes, and their activity and dispersal constitute an important force fostering evolutionary innovation. The horizontal transfer of TEs (HTT) between eukaryotic species is a common and widespread phenomenon that has had a profound impact on TE dynamics and, consequently, on the evolutionary trajectory of many species' lineages. However, the mechanisms promoting HTT remain largely unknown. In this article, we argue that network theory combined with functional ecology provides a robust conceptual framework and tools to delineate how complex interactions between diverse organisms may act in synergy to promote HTTs.

  11. The GTTP Movement: Engaging young minds to the beauty of science and space exploration

    NASA Astrophysics Data System (ADS)

    Doran, Rosa

    2015-03-01

    The Galileo Teacher Training Program (GTTP) is a living legacy of IYA2009. As a cornerstone of this important moment in the history of Astronomy, GTTP has managed to name representatives in over 100 nations and reached over 15000 teachers at a global level. The model used so far ensures sustainability and a fast growing support network. The task at hand is to engage educators in the use of modern tools for science teaching. Building the classroom of tomorrow is a promising path to engage young minds to the beauty of science and space exploration.

  12. Clusters of genetic diseases in Brazil.

    PubMed

    Cardoso, Gabriela Costa; de Oliveira, Marcelo Zagonel; Paixão-Côrtes, Vanessa Rodrigues; Castilla, Eduardo Enrique; Schuler-Faccini, Lavínia

    2018-06-02

    The aim of this paper is to present a database of isolated communities (CENISO) with a high prevalence of genetic disorders or congenital anomalies in Brazil. We used two strategies to identify such communities: (1) a systematic literature review and (2) a "rumor strategy" based on anecdotal accounts. All rumors and reports were validated in a stepwise process. The bibliographical search identified 34 reports, the rumor strategy identified 245 rumors, and 144 were confirmed. A database like the one presented here represents an important tool for the planning of health priorities for rare diseases in low- and middle-income countries with large populations.

  13. Priorities for future research into asthma diagnostic tools: A PAN-EU consensus exercise from the European asthma research innovation partnership (EARIP).

    PubMed

    Garcia-Marcos, L; Edwards, J; Kennington, E; Aurora, P; Baraldi, E; Carraro, S; Gappa, M; Louis, R; Moreno-Galdo, A; Peroni, D G; Pijnenburg, M; Priftis, K N; Sanchez-Solis, M; Schuster, A; Walker, S

    2018-02-01

    The diagnosis of asthma is currently based on clinical history, physical examination and lung function, and to date, there are no accurate objective tests either to confirm the diagnosis or to discriminate between different types of asthma. This consensus exercise reviews the state of the art in asthma diagnosis to identify opportunities for future investment based on the likelihood of their successful development, potential for widespread adoption and their perceived impact on asthma patients. Using a two-stage e-Delphi process and a summarizing workshop, a group of European asthma experts including health professionals, researchers, people with asthma and industry representatives ranked the potential impact of research investment in each technique or tool for asthma diagnosis and monitoring. After a systematic review of the literature, 21 statements were extracted and were the subject of the two-stage Delphi process. Eleven statements were scored 3 or more and were further discussed and ranked in a face-to-face workshop. The three most important diagnostic/predictive tools ranked were as follows: "New biological markers of asthma (eg genomics, proteomics and metabolomics) as a tool for diagnosis and/or monitoring," "Prediction of future asthma in preschool children with reasonable accuracy" and "Tools to measure volatile organic compounds (VOCs) in exhaled breath." © 2018 John Wiley & Sons Ltd.

  14. Helping coaches apply the principles of representative learning design: validation of a tennis specific practice assessment tool.

    PubMed

    Krause, Lyndon; Farrow, Damian; Reid, Machar; Buszard, Tim; Pinder, Ross

    2018-06-01

    Representative Learning Design (RLD) is a framework for assessing the degree to which experimental or practice tasks simulate key aspects of specific performance environments (i.e. competition). The key premise is that when practice replicates the performance environment, skills are more likely to transfer. In applied situations, however, there is currently no simple or quick method for coaches to assess the key concepts of RLD (e.g. during on-court tasks). The aim of this study was to develop a tool for coaches to efficiently assess practice task design in tennis. A consensus-based tool was developed using a 4-round Delphi process with 10 academic and 13 tennis-coaching experts. Expert consensus was reached for the inclusion of seven items, each consisting of two sub-questions related to (i) the task goal and (ii) the relevance of the task to competition performance. The Representative Practice Assessment Tool (RPAT) is proposed for use in assessing and enhancing practice task designs in tennis to increase the functional coupling between information and movement, and to maximise the potential for skill transfer to competition contexts.

  15. A Safety Conundrum Illustrated: Logic, Mathematics, and Science Are Not Enough

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.; Collins, Kristine R.

    2010-01-01

    In an ideal world, conversations about whether a particular system is safe, or whether a particular method or tool enhances safety, would be emotion-free discussions concentrating on the level of safety required, available evidence, and coherent logical, mathematical, or scientific arguments based on that evidence. In the real world, discussions about safety are often not emotion-free. Political and economic arguments may play a bigger role than logical, mathematical, and scientific arguments, and psychological factors may be as important, or even more important, than purely technical factors. This paper illustrates the conundrum that can result from this clash of the ideal and the real by means of an imagined conversation among a collection of fictional characters representing various types of people who may be participating in a safety discussion.

  16. Origins of the ancient constellations: I. The Mesopotamian traditions

    NASA Astrophysics Data System (ADS)

    Rogers, J. H.

    1998-02-01

    In the sky-map of ancient Babylon, constellations had two different roles, and thus developed into two overlapping traditions. One set of constellations represented the gods and their symbols; the other set represented rustic activities and provided a farming calendar. Many constellations were shared by the two traditions, but in some regions of sky there were alternative divine and rustic figures. These figures developed in stages from ~3200 BC to ~500 BC. Of the divine set, the most important (although the last to be finalised) were the twelve zodiacal signs, plus several associated animals (the serpent, crow, eagle, and fish), which were all transmitted to the classical Greek sky-map that we still use today. Conversely, the rustic constellations of workers and tools and animals were not transmitted to the West. However, a few of them may have survived in Bedouin Arab sky-maps of the first millennium AD.

  17. Fungal phytotoxins with potential herbicidal activity: chemical and biological characterization.

    PubMed

    Cimmino, Alessio; Masi, Marco; Evidente, Marco; Superchi, Stefano; Evidente, Antonio

    2015-12-19

    Covering: 2007 to 2015. Fungal phytotoxins are secondary metabolites that play an important role in the induction of disease symptoms by interfering with host plant physiological processes. Although fungal pathogens represent a heavy constraint on agrarian production and on forest and environmental heritage, they can also offer an ecofriendly alternative for managing weeds. Indeed, the phytotoxins produced by weed-pathogenic fungi are an efficient tool for designing natural, safe bioherbicides. Their use could avoid synthetic pesticides, which induce resistance in host plants and leave long-term residues in agricultural products, posing a risk to human and animal health. The isolation and the structural and biological characterization of phytotoxins produced by fungi pathogenic to weeds, including parasitic plants, are described. Structure-activity relationship and mode-of-action studies for some phytotoxins are also reported, to elucidate the herbicidal potential of these promising fungal metabolites.

  18. An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process

    NASA Astrophysics Data System (ADS)

    Nguyen, ThanhDat; Kifor, Claudiu Vasile

    2015-09-01

    DMAIC (Define, Measure, Analyze, Improve, and Control) is an important knowledge-based process used to enhance the quality of other processes. However, DMAIC knowledge is difficult to access: conventional approaches struggle to structure and reuse it, mainly because it is not represented and organized systematically. In this article, we overcome this problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to utilize ontologies to represent the knowledge generated by each DMAIC phase. We build five knowledge bases, one for each DMAIC phase, with the support of the necessary tools and appropriate techniques from the Information Technology area. These knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution, so that existing knowledge can be shared and reused.

  19. Individual-based modeling of ecological and evolutionary processes

    USGS Publications Warehouse

    DeAngelis, Donald L.; Mooij, Wolf M.

    2005-01-01

    Individual-based models (IBMs) allow the explicit inclusion of individual variation in greater detail than do classical differential-equation and difference-equation models. Inclusion of such variation is important for continued progress in ecological and evolutionary theory. We provide a conceptual basis for IBMs by describing five major types of individual variation in IBMs: spatial, ontogenetic, phenotypic, cognitive, and genetic. IBMs are now used in almost all subfields of ecology and evolutionary biology. We map those subfields and look more closely at selected key papers on fish recruitment, forest dynamics, sympatric speciation, metapopulation dynamics, maintenance of diversity, and species conservation. Theorists are currently divided on whether IBMs represent only a practical tool for extending classical theory to more complex situations, or whether individual-based theory represents a radically new research program. We feel that the tension between these two poles of thinking can be a source of creativity in ecology and evolutionary theory.

  20. NEURAL SUBSTRATES OF CUE-REACTIVITY: ASSOCIATION WITH TREATMENT OUTCOMES AND RELAPSE

    PubMed Central

    Courtney, Kelly E.; Schacht, Joseph P.; Hutchison, Kent; Roche, Daniel J.O.; Ray, Lara A.

    2016-01-01

    Given the strong evidence for neurological alterations at the basis of drug dependence, functional magnetic resonance imaging (fMRI) represents an important tool in the clinical neuroscience of addiction. fMRI cue-reactivity paradigms represent an ideal platform to probe the involvement of neurobiological pathways subserving the reward/motivation system in addiction and potentially offer a translational mechanism by which interventions and behavioral predictions can be tested. Thus, this review summarizes the research that has applied fMRI cue-reactivity paradigms to the study of adult substance use disorder treatment responses. Studies utilizing fMRI cue-reactivity paradigms for the prediction of relapse, and as a means to investigate psychosocial and pharmacological treatment effects on cue-elicited brain activation are presented within four primary categories of substances: alcohol, nicotine, cocaine, and opioids. Lastly, suggestions for how to leverage fMRI technology to advance addiction science and treatment development are provided. PMID:26435524

  1. Phylogeny predicts future habitat shifts due to climate change.

    PubMed

    Kuntner, Matjaž; Năpăruş, Magdalena; Li, Daiqin; Coddington, Jonathan A

    2014-01-01

    Taxa may respond differently to climatic changes, depending on phylogenetic or ecological effects, but studies that discern among these alternatives are scarce. Here, we use two species pairs from globally distributed spider clades, each pair representing two lifestyles (generalist, specialist), to test the relative importance of phylogeny versus ecology in predicted responses to climate change. We used a recent phylogenetic hypothesis for nephilid spiders to select four species from two genera (Nephilingis and Nephilengys) that match the above criteria and are fully allopatric but, combined, occupy all subtropical-tropical regions. Based on their records, we modeled each species' niche space and predicted their ecological shifts 20, 40, 60, and 80 years into the future using customized GIS tools and projected climatic changes. Phylogeny better predicts the species' current ecological preferences than do lifestyles. By 2080 all species face dramatic reductions in suitable habitat (54.8-77.1%) and adapt by moving towards higher altitudes and latitudes, although at different tempos. Phylogeny and lifestyle explain simulated habitat shifts in altitude, but phylogeny is the sole best predictor of latitudinal shifts. Models incorporating phylogenetic relatedness are an important additional tool for accurately predicting biotic responses to global change.

  2. Coastal risk forecast system

    NASA Astrophysics Data System (ADS)

    Sabino, André; Poseiro, Pedro; Rodrigues, Armanda; Reis, Maria Teresa; Fortes, Conceição J.; Reis, Rui; Araújo, João

    2018-03-01

    The run-up and overtopping by sea waves are two of the main processes that threaten coastal structures, leading to flooding, destruction of both property and the environment, and harm to people. To build early warning systems, the consequences and associated risks in the affected areas must be evaluated. It is also important to understand how these two types of spatial information integrate with sensor data sources and the risk assessment methodology. This paper describes the relationship between consequences and risk maps, their role in risk management, and how the HIDRALERTA system integrates both aspects in its risk methodology. It describes a case study for Praia da Vitória Port, Terceira Island, Azores, Portugal, showing that the main innovations in this system are twofold: it represents the overtopping flow and consequent flooding, which are critical for coastal and port areas protected by maritime structures, and it also works as a risk assessment tool, which is extremely important for long-term planning and decision-making. Moreover, the implementation of the system considers possible known variability issues, enabling changes in its behaviour as needs arise. This system has the potential to become a useful tool for the management of coastal and port areas, due to its capacity to effectively issue warnings and assess risks.

  4. Analyzing Study of Path loss Propagation Models in Wireless Communications at 0.8 GHz

    NASA Astrophysics Data System (ADS)

    Kadhim Hoomod, Haider; Al-Mejibli, Intisar; Issa Jabboory, Abbas

    2018-05-01

    Path loss propagation models are an important tool in wireless network planning, allowing network planners to optimize the distribution of cell towers and meet expected service-level requirements. However, each path loss propagation model is designed to predict path loss in a particular environment and may be inaccurate in other environments. In this research, different propagation models (the Hata, ECC-33, Ericsson, and COST-231 models) were analyzed and compared against measured data. The measured data represent the signal strength of two cell towers operating at 0.8 GHz and placed in two different environments, obtained by drive tests. The first tower, in AL-Habebea, represents an urban environment (a high-density region), and the second, in the AL-Hindea district, represents a rural environment (a low-density region). The analysis and comparison show that the Hata and Ericsson models deviate only slightly from the real measurements in the urban environment, and that the Hata model generally gives the better prediction in the rural environment.
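To make concrete what such a model computes, the classical urban Okumura-Hata median path loss formula (valid for roughly 150-1500 MHz, so it covers the 0.8 GHz band considered here) can be sketched as below. The tower height, handset height, and distances are illustrative values, not parameters taken from the study:

```python
import math

def hata_urban_path_loss(f_mhz, h_base_m, h_mobile_m, d_km):
    """Okumura-Hata median path loss (dB) for a small/medium city.

    Valid roughly for f: 150-1500 MHz, base antenna 30-200 m,
    mobile antenna 1-10 m, distance 1-20 km.
    """
    # Mobile-antenna height correction for a small/medium city.
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    return (69.55
            + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m)
            - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

# Illustrative link: 800 MHz carrier, 30 m tower, 1.5 m handset.
for d in (1.0, 2.0, 5.0):
    print(f"{d:.0f} km: {hata_urban_path_loss(800, 30, 1.5, d):.1f} dB")
```

Comparing predictions like these against drive-test measurements, as the study does, shows how far a model tuned for one environment drifts in another.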

  5. Development and validation of the Vietnamese primary care assessment tool

    PubMed Central

    2018-01-01

    Objective: To adapt the consumer version of the Primary Care Assessment Tool (PCAT) for Vietnam and determine its internal consistency and validity. Design: A quantitative cross-sectional study. Setting: 56 communes in 3 representative provinces of central Vietnam. Participants: A total of 3289 people who had used health care services at a health facility at least once over the past two years. Results: The Vietnamese adult expanded consumer version of the PCAT (VN PCAT-AE) is an instrument for evaluation of primary care in Vietnam with 70 items comprising six scales representing four core primary care domains, and three additional scales representing three derivative domains. Sixteen other items from the original tool were not included in the final instrument, due to problems with missing values, floor or ceiling effects, and item-total correlations. All the retained scales have a Cronbach’s alpha above 0.70 except for the subscale of Family Centeredness. Conclusions: The VN PCAT-AE demonstrates adequate internal consistency and validity to be used as an effective tool for measuring the quality of primary care in Vietnam from the consumer perspective. Additional work in the future to optimize valid measurement in all domains consistent with the original version of the tool may be helpful as the primary care system in Vietnam further develops. PMID:29324851
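The internal-consistency threshold cited here (Cronbach's alpha above 0.70) follows from a standard formula over item variances. The sketch below uses invented response data purely for illustration:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    `items` is a list of per-item score lists, one inner list per item,
    all the same length (one entry per respondent).
    """
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    # Per-respondent total score across all items.
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical 3-item scale answered by 5 respondents (made-up data).
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
print(f"alpha = {cronbach_alpha(items):.2f}")
```

A scale whose items all move together yields an alpha near 1; values above 0.70 are the conventional bar for adequate internal consistency.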

  6. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    NASA Astrophysics Data System (ADS)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  7. Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis.

    PubMed

    Bergeron, Mathieu; Lortie, Catherine L; Guitton, Matthieu J

    2015-01-01

    Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have repeatedly been suggested as possible tools to aid the rehabilitation process, no systematic study had been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and protocols involving virtual reality for the treatment of this pathology, and evaluation of the symptoms is often not standardized. However, our results unveil a clear effect of virtual reality-based rehabilitation on patients' symptoms, as assessed by objective tools such as the DHI (a mean decrease of 27 points), changing the perceived handicap from a moderate to a mild impact on life. Furthermore, we detected a relationship between the duration of exposure to virtual reality environments and the magnitude of the therapeutic effect, suggesting that virtual reality treatments should last at least 150 minutes of cumulative exposure to ensure positive outcomes. Virtual reality offers a pleasant and safe environment for the patient. Future studies should standardize evaluation tools, further document putative side effects, compare virtual reality to conventional physical therapy, and evaluate the economic costs and benefits of such strategies.
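For context, Dizziness Handicap Inventory (DHI) totals range from 0 to 100 and are commonly bucketed into mild, moderate, and severe perceived handicap. The cut-offs below (mild up to 30, moderate 31-60, severe above 60) are the widely cited interpretation, not values taken from this review:

```python
def dhi_severity(score):
    """Classify a DHI total (0-100) using commonly cited cut-offs:
    mild <= 30, moderate 31-60, severe > 60."""
    if not 0 <= score <= 100:
        raise ValueError("DHI total must be between 0 and 100")
    if score <= 30:
        return "mild"
    if score <= 60:
        return "moderate"
    return "severe"

# A 27-point mean decrease can move a patient across a band boundary,
# e.g. from a moderate to a mild perceived handicap (illustrative score):
before, after = 52, 52 - 27
print(dhi_severity(before), "->", dhi_severity(after))  # prints: moderate -> mild
```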

  8. Systematic review of surveillance systems and methods for early detection of exotic, new and re-emerging diseases in animal populations.

    PubMed

    Rodríguez-Prieto, V; Vicente-Rubiano, M; Sánchez-Matamoros, A; Rubio-Guerri, C; Melero, M; Martínez-López, B; Martínez-Avilés, M; Hoinville, L; Vergne, T; Comin, A; Schauer, B; Dórea, F; Pfeiffer, D U; Sánchez-Vizcaíno, J M

    2015-07-01

    In this globalized world, the spread of new, exotic and re-emerging diseases has become one of the most important threats to animal production and public health. This systematic review analyses conventional and novel early detection methods applied to surveillance. In all, 125 scientific documents were considered for this study. Exotic (n = 49) and re-emerging (n = 27) diseases constituted the most frequently represented health threats. In addition, the majority of studies were related to zoonoses (n = 66). The approaches found in the review could be divided into surveillance modalities, both active (n = 23) and passive (n = 5), and tools and methodologies that support surveillance activities (n = 57). Combinations of surveillance modalities and tools (n = 40) were also found. Risk-based approaches were very common (n = 60), especially in the papers describing tools and methodologies (n = 50). The main applications, benefits and limitations of each approach were extracted from the papers. This information will be very useful for informing the development of tools to facilitate the design of cost-effective surveillance strategies. Thus, the current literature review provides key information about the advantages, disadvantages, limitations and potential application of methodologies for the early detection of new, exotic and re-emerging diseases.

  10. Integrating Computational Science Tools into a Thermodynamics Course

    ERIC Educational Resources Information Center

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…

  11. Strengths and weaknesses of temporal stability analysis for monitoring and estimating grid-mean soil moisture in a high-intensity irrigated agricultural landscape

    NASA Astrophysics Data System (ADS)

    Ran, Youhua; Li, Xin; Jin, Rui; Kang, Jian; Cosh, Michael H.

    2017-01-01

    Monitoring and estimating grid-mean soil moisture is very important for assessing many hydrological, biological, and biogeochemical processes and for validating remotely sensed surface soil moisture products. Temporal stability analysis (TSA) is a valuable tool for identifying a small number of representative sampling points to estimate the grid-mean soil moisture content. This analysis was evaluated and improved using high-quality surface soil moisture data that were acquired by a wireless sensor network in a high-intensity irrigated agricultural landscape in an arid region of northwestern China. The performance of the TSA was limited in areas where the representative error was dominated by random events, such as irrigation events. This shortcoming can be effectively mitigated by using a stratified TSA (STSA) method, proposed in this paper. In addition, the following methods were proposed for rapidly and efficiently identifying representative sampling points when using TSA. (1) Instantaneous measurements can be used to identify representative sampling points to some extent; however, the error resulting from this method is significant when validating remotely sensed soil moisture products. Thus, additional representative sampling points should be considered to reduce this error. (2) The calibration period can be determined from the time span of the full range of the grid-mean soil moisture content during the monitoring period. (3) The representative error is sensitive to the number of calibration sampling points, especially when only a few representative sampling points are used. Multiple sampling points are recommended to reduce data loss and improve the likelihood of representativeness at two scales.
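In the Vachaud-style temporal stability analysis this abstract builds on, each point's relative difference from the spatial mean is tracked over time; points whose mean relative difference (MRD) is near zero and whose spread is small are taken as representative of the grid mean. A minimal sketch with invented moisture readings (not the network's data):

```python
def temporal_stability_rank(theta):
    """Rank sampling points by temporal stability.

    `theta[t][i]` is the soil moisture at time t, sampling point i.
    Returns (point index, MRD, std of relative differences) tuples,
    most representative point first.
    """
    n_points = len(theta[0])
    # Relative difference of each point from the spatial mean at each time.
    rel = []
    for row in theta:
        mean_t = sum(row) / n_points
        rel.append([(x - mean_t) / mean_t for x in row])

    stats = []
    for i in range(n_points):
        series = [r[i] for r in rel]
        mrd = sum(series) / len(series)
        std = (sum((d - mrd) ** 2 for d in series) / len(series)) ** 0.5
        stats.append((i, mrd, std))
    # Rank points whose relative difference is both small and steady first.
    return sorted(stats, key=lambda s: (s[1] ** 2 + s[2] ** 2) ** 0.5)

# Hypothetical readings: 4 times x 3 points (volumetric moisture, m3/m3).
theta = [
    [0.20, 0.25, 0.30],
    [0.18, 0.24, 0.28],
    [0.22, 0.26, 0.33],
    [0.19, 0.25, 0.29],
]
best = temporal_stability_rank(theta)[0]
print(f"most representative point: {best[0]} (MRD = {best[1]:+.3f})")
```

Random disturbances such as irrigation events inflate the spread term for otherwise representative points, which is the weakness the stratified TSA proposed in the paper addresses.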

  12. Green Tool

    EPA Pesticide Factsheets

    The Green Tool represents infiltration-based stormwater control practices. It allows modelers to select a BMP type, channel shape and BMP unit dimensions, outflow control devices, and infiltration method. The program generates an HSPF-formatted FTABLE.

  13. Evaluating the Effectiveness of Web-based Climate Resilience Decision Support Tools: Insights from Coastal New Jersey

    NASA Astrophysics Data System (ADS)

    Brady, M.; Lathrop, R.; Auermuller, L. M.; Leichenko, R.

    2016-12-01

    Despite the recent surge of Web-based decision support tools designed to promote resiliency in U.S. coastal communities, to date there has been no systematic study of their effectiveness. This study demonstrates a method to evaluate important aspects of the effectiveness of four Web map tools used in coastal New Jersey and designed to promote consideration of climate risk information in local decision-making and planning. In summer 2015, the research team conducted in-depth phone interviews with users of one regulatory and three non-regulatory Web map tools using a semi-structured questionnaire. The interview and analysis design drew from a combination of effectiveness evaluation approaches developed in software and information usability, program evaluation, and management information system (MIS) research. Effectiveness assessment results were further analyzed and discussed in terms of a conceptual hierarchy of system objectives defined by the respective tool developer and user organizations represented in the study. Insights from the interviews suggest that users rely on Web tools as a supplement to desktop and analog map sources because they provide relevant and up-to-date information in a highly accessible and mobile format. The users also reported relying on multiple information sources and comparing digital and analog sources for decision support. However, with respect to this decision support benefit, users were constrained by accessibility factors such as lack of awareness and training with some tools, lack of salient information such as planning time horizons associated with future flood scenarios, and environmental factors such as mandates restricting some users to regulatory tools. Perceptions of Web tool credibility seem favorable overall, but factors including system design imperfections and inconsistencies in data and information across platforms limited trust, highlighting a need for better coordination between tools. Contributions of the study include user feedback on web-tool system designs, consistent with collaborative methods for enhancing usability, and a systematic look at effectiveness that includes both user perspectives and consideration of developer and organizational objectives.

  14. Interactive visualization of public health indicators to support policymaking: An exploratory study

    PubMed Central

    Zakkar, Moutasem; Sedig, Kamran

    2017-01-01

    Purpose: The purpose of this study is to examine the use of interactive visualizations to represent data/information related to social determinants of health and public health indicators, and to investigate the benefits of such visualizations for health policymaking. Methods: The study developed a prototype for an online interactive visualization tool that represents the social determinants of health. The study participants explored and used the tool, which was evaluated using the informal user experience evaluation method: prospective users use and play with the tool, and their feedback is collected through interviews. Results: Using visualizations to represent and interact with health indicators has advantages over traditional representation techniques that do not allow users to interact with the information. Communicating healthcare indicators to policymakers is a complex task because of the complexity of the indicators, the diversity of audiences, and different audience needs. This complexity can lead to misinterpretation, which occurs when users of the health data ignore or do not know why, where, and how the data were produced, or where and how they can be used. Conclusions: Public health policymaking is a complex process, and data is only one element among others needed in this complex process. Researchers and healthcare organizations should conduct a strategic evaluation to assess the usability of interactive visualizations and decision support tools before investing in these tools. Such an evaluation should take into consideration the cost, ease of use, learnability, and efficiency of those tools, and the factors that influence policymaking. PMID:29026455

  15. Heterogeneity in cervical spine assessment in paediatric trauma: A survey of physicians' knowledge and application at a paediatric major trauma centre.

    PubMed

    Buckland, Aaron J; Bressan, Silvia; Jowett, Helen; Johnson, Michael B; Teague, Warwick J

    2016-10-01

    Evidence-based decision-making tools are widely used to guide cervical spine assessment in adult trauma patients. Similar tools validated for use in injured children are lacking. A paediatric-specific approach is appropriate given important differences in cervical spine anatomy, mechanism of spinal injury and concerns over ionising radiation in children. The present study aims to survey physicians' knowledge and application of cervical spine assessment in injured children. A cross-sectional survey of physicians actively engaged in trauma care within a paediatric trauma centre was undertaken. Participation was voluntary and responses de-identified. The survey comprised 20 questions regarding initial assessment, imaging, immobilisation and perioperative management. Physicians' responses were compared with available current evidence. Sixty-seven physicians (28% registrars, 17% fellows and 55.2% consultants) participated. Physicians rated altered mental state, intoxication and distracting injury as the most important contraindications to cervical spine clearance in children. Fifty-four per cent considered adequate plain imaging to be 3-view cervical spine radiographs (anterior-posterior, lateral and odontoid), whereas 30% considered CT the most sensitive modality for detecting unstable cervical spine injuries. Physicians' responses reflected marked heterogeneity regarding semi-rigid cervical collars and what constitutes cervical spine 'clearance'. Greater consensus existed for perioperative precautions in this setting. Physicians actively engaged in paediatric trauma care demonstrate marked heterogeneity in their knowledge and application of cervical spine assessment. This is compounded by a lack of paediatric-specific evidence and definitions, involvement of multiple specialties and staff turnover within busy departments. A validated decision-making tool for cervical spine assessment will represent an important advance in paediatric trauma.
© 2016 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  16. Use of 16S rRNA gene for identification of a broad range of clinically relevant bacterial pathogens

    DOE PAGES

    Srinivasan, Ramya; Karaoz, Ulas; Volegova, Marina; ...

    2015-02-06

    According to World Health Organization statistics of 2011, infectious diseases remain among the top five causes of mortality worldwide. However, despite sophisticated research tools for microbial detection, rapid and accurate molecular diagnostics for identification of infection in humans have not been extensively adopted. Time-consuming culture-based methods remain at the forefront of clinical microbial detection. The 16S rRNA gene, a molecular marker for identification of bacterial species, is ubiquitous to members of this domain and, thanks to ever-expanding databases of sequence information, a useful tool for bacterial identification. In this study, we assembled an extensive repository of clinical isolates (n = 617), representing 30 medically important pathogenic species and originally identified using traditional culture-based or non-16S molecular methods. This strain repository was used to systematically evaluate the ability of the 16S rRNA gene to provide species-level identification. To enable the most accurate species-level classification given the paucity of sequence data accumulated in public databases, we built a Naïve Bayes classifier representing a diverse set of high-quality sequences from medically important bacterial organisms. We show that, for species identification, a model-based approach is superior to an alignment-based method. Overall, between 16S gene-based and clinical identities, our study shows a genus-level concordance rate of 96% and a species-level concordance rate of 87.5%. We point to multiple cases of probable clinical misidentification with traditional culture-based identification across a wide range of gram-negative rods and gram-positive cocci, as well as common gram-negative cocci.
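The model-based approach described here is, in spirit, a Naïve Bayes classifier over short subsequences (k-mers) of the 16S gene, in the style popularized by the RDP classifier. The toy sketch below uses invented two-species training sequences and 4-mers rather than the 8-mers typical of real classifiers:

```python
import math
from collections import Counter

def kmers(seq, k=4):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def train(training, k=4):
    """`training` maps species name -> list of reference sequences.
    Returns per-species Laplace-smoothed log-probabilities of each k-mer."""
    models = {}
    for species, seqs in training.items():
        counts = Counter(km for s in seqs for km in kmers(s, k))
        total = sum(counts.values())
        vocab = 4 ** k  # number of possible DNA k-mers, for smoothing
        models[species] = {
            "default": math.log(1 / (total + vocab)),
            "logp": {km: math.log((c + 1) / (total + vocab))
                     for km, c in counts.items()},
        }
    return models

def classify(models, seq, k=4):
    """Return the species whose k-mer model scores the query highest."""
    def score(m):
        return sum(m["logp"].get(km, m["default"]) for km in kmers(seq, k))
    return max(models, key=lambda sp: score(models[sp]))

# Invented toy references; real classifiers train on curated 16S databases.
training = {
    "species_A": ["ACGTACGTACGGTTACGT", "ACGTACGGTTACGTACGA"],
    "species_B": ["TTGCAATTGGCCAATTGC", "TTGCCAATTGCAATTGGC"],
}
models = train(training)
print(classify(models, "ACGTACGTACGT"))
```

Alignment-free scoring of this kind stays robust when reference databases are sparse, which is one reason the study found the model-based approach superior to alignment for species-level calls.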

  17. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  18. Uncertainty Analysis of Coupled Socioeconomic-Cropping Models: Building Confidence in Climate Change Decision-Support Tools for Local Stakeholders

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.

    2015-12-01

    While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models. We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model. However, the analysis of climate variability's impacts on agroecosystems includes uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, standard model calibration, validation, and uncertainty interval estimation techniques used for physically-based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics as well as of the overall coupled model.
This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.
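
    The uncertainty-analysis workflow described above can be illustrated with a toy Monte Carlo propagation: an uncertain socioeconomic parameter is sampled, run through a simplified coupled yield-income loop, and summarized as a percentile interval. Everything here (functional forms, parameter ranges, variable names) is invented for illustration, not taken from the study's models:

```python
import random

def crop_yield(rainfall, fertilizer):
    """Toy biophysical response: yield rises with rain and fertilizer, saturating."""
    return 2.0 * rainfall / (rainfall + 100.0) + 0.5 * fertilizer / (fertilizer + 50.0)

def coupled_run(rainfall, price_sensitivity, years=5):
    """Toy coupled loop: each year's income feeds back into next year's fertilizer use."""
    fertilizer, incomes = 20.0, []
    for _ in range(years):
        y = crop_yield(rainfall, fertilizer)
        income = y * 100.0
        fertilizer = max(0.0, fertilizer + price_sensitivity * income - 10.0)
        incomes.append(income)
    return incomes[-1]

random.seed(0)
# Propagate uncertainty in the socioeconomic parameter through the coupled model:
samples = sorted(coupled_run(rainfall=300.0, price_sensitivity=random.uniform(0.02, 0.1))
                 for _ in range(500))
low, high = samples[int(0.05 * 500)], samples[int(0.95 * 500)]
print(f"90% uncertainty interval on final income: [{low:.1f}, {high:.1f}]")
```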

  19. State-Targeted Funding and Technical Assistance to Increase Access to Medication Treatment for Opioid Use Disorder.

    PubMed

    Abraham, Amanda J; Andrews, Christina M; Grogan, Colleen M; Pollack, Harold A; D'Aunno, Thomas; Humphreys, Keith; Friedmann, Peter D

    2018-04-01

    As the United States grapples with an opioid epidemic, expanding access to effective treatment for opioid use disorder is a major public health priority. Identifying effective policy tools that can be used to expand access to care is critically important. This article examines the relationship between state-targeted funding and technical assistance and adoption of three medications for treating opioid use disorder: oral naltrexone, injectable naltrexone, and buprenorphine. This study draws from the 2013-2014 wave of the National Drug Abuse Treatment System Survey, a nationally representative, longitudinal study of substance use disorder treatment programs. The sample includes data from 695 treatment programs (85.5% response rate) and representatives from single-state agencies in 49 states and Washington, D.C. (98% response rate). Logistic regression was used to examine the relationships of single-state agency targeted funding and technical assistance to availability of opioid use disorder medications among treatment programs. State-targeted funding was associated with increased program-level adoption of oral naltrexone (adjusted odds ratio [AOR]=3.14, 95% confidence interval [CI]=1.49-6.60, p=.004) and buprenorphine (AOR=2.47, 95% CI=1.31-4.67, p=.006). Buprenorphine adoption was also correlated with state technical assistance to support medication provision (AOR=1.18, 95% CI=1.00-1.39, p=.049). State-targeted funding for medications may be a viable policy lever for increasing access to opioid use disorder medications. Given the historically low rates of opioid use disorder medication adoption in treatment programs, single-state agency targeted funding is a potentially important tool to reduce mortality and morbidity associated with opioid disorders and misuse.
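
    The effect sizes reported above are adjusted odds ratios from multivariable logistic regression; the calculation behind an (unadjusted) odds ratio and its 95% confidence interval can be sketched from a 2x2 table using the Woolf log method. The counts below are hypothetical, not the survey's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table (Woolf log method):
    a = funded & adopted, b = funded & not, c = unfunded & adopted, d = unfunded & not."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical program counts, for illustration only:
or_, lo, hi = odds_ratio_ci(60, 40, 30, 70)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```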

  20. Use of 16S rRNA Gene for Identification of a Broad Range of Clinically Relevant Bacterial Pathogens

    PubMed Central

    Srinivasan, Ramya; Karaoz, Ulas; Volegova, Marina; MacKichan, Joanna; Kato-Maeda, Midori; Miller, Steve; Nadarajan, Rohan; Brodie, Eoin L.; Lynch, Susan V.

    2015-01-01

    According to World Health Organization statistics of 2011, infectious diseases remain in the top five causes of mortality worldwide. However, despite sophisticated research tools for microbial detection, rapid and accurate molecular diagnostics for identification of infection in humans have not been extensively adopted. Time-consuming culture-based methods remain at the forefront of clinical microbial detection. The 16S rRNA gene, a molecular marker for identification of bacterial species, is ubiquitous to members of this domain and, thanks to ever-expanding databases of sequence information, a useful tool for bacterial identification. In this study, we assembled an extensive repository of clinical isolates (n = 617), representing 30 medically important pathogenic species and originally identified using traditional culture-based or non-16S molecular methods. This strain repository was used to systematically evaluate the utility of the 16S rRNA gene for species-level identification. To enable the most accurate species-level classification despite the paucity of sequence data accumulated in public databases, we built a Naïve Bayes classifier representing a diverse set of high-quality sequences from medically important bacterial organisms. We show that for species identification, a model-based approach is superior to an alignment-based method. Overall, between 16S gene based and clinical identities, our study shows a genus-level concordance rate of 96% and a species-level concordance rate of 87.5%. We point to multiple cases of probable clinical misidentification with traditional culture-based identification across a wide range of gram-negative rods and gram-positive cocci as well as common gram-negative cocci. PMID:25658760
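
    A Naïve Bayes classifier over 16S sequences, in the spirit of the model-based approach described, can be sketched as a k-mer bag-of-words model with add-one smoothing. The reference fragments, species names, and k below are toy assumptions, far shorter than real curated 16S sequences:

```python
import math
from collections import Counter

def kmers(seq, k=4):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def train(labeled_seqs, k=4):
    """Per-species k-mer counts (simplified Naive Bayes training)."""
    model = {}
    for species, seq in labeled_seqs:
        model.setdefault(species, Counter()).update(kmers(seq, k))
    return model

def classify(seq, model, k=4):
    """Assign the species maximizing the add-one-smoothed log-likelihood of the read's k-mers."""
    best, best_score = None, -math.inf
    for species, counts in model.items():
        total, vocab = sum(counts.values()), len(counts) + 1
        score = sum(math.log((counts[km] + 1) / (total + vocab)) for km in kmers(seq, k))
        if score > best_score:
            best, best_score = species, score
    return best

# Hypothetical toy 16S fragments (real references would be full-length curated sequences):
refs = [("E. coli", "ATTGAACGCTGGCGGCAGGC"), ("S. aureus", "GCGAACGGACGAGAAGCTTG")]
model = train(refs, k=4)
print(classify("ATTGAACGCTGG", model, k=4))
```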

  1. The Peach v2.0 release: high-resolution linkage mapping and deep resequencing improve chromosome-scale assembly and contiguity.

    PubMed

    Verde, Ignazio; Jenkins, Jerry; Dondini, Luca; Micali, Sabrina; Pagliarani, Giulia; Vendramin, Elisa; Paris, Roberta; Aramini, Valeria; Gazza, Laura; Rossini, Laura; Bassi, Daniele; Troggio, Michela; Shu, Shengqiang; Grimwood, Jane; Tartarini, Stefano; Dettori, Maria Teresa; Schmutz, Jeremy

    2017-03-11

    The availability of the peach genome sequence has fostered relevant research in peach and related Prunus species enabling the identification of genes underlying important horticultural traits as well as the development of advanced tools for genetic and genomic analyses. The first release of the peach genome (Peach v1.0) represented a high-quality WGS (Whole Genome Shotgun) chromosome-scale assembly with high contiguity (contig L50 214.2 kb), large portions of mapped sequences (96%) and high base accuracy (99.96%). The aim of this work was to improve the quality of the first assembly by increasing the portion of mapped and oriented sequences, correcting misassemblies and improving the contiguity and base accuracy using high-throughput linkage mapping and deep resequencing approaches. Four linkage maps with 3,576 molecular markers were used to improve the portion of mapped and oriented sequences (from 96.0% and 85.6% of Peach v1.0 to 99.2% and 98.2% of v2.0, respectively) and enabled a more detailed identification of discernible misassemblies (10.4 Mb in total). The deep resequencing approach fixed 859 homozygous SNPs (Single Nucleotide Polymorphisms) and 1,347 homozygous indels. Moreover, the assembled NGS contigs enabled the closing of 212 gaps with an improvement in the contig L50 of 19.2%. The improved high-quality peach genome assembly (Peach v2.0) represents a valuable tool for the analysis of the genetic diversity, domestication, and as a vehicle for genetic improvement of peach and related Prunus species. Moreover, the important phylogenetic position of peach and the absence of recent whole genome duplication (WGD) events make peach a pivotal species for comparative genomics studies aiming at elucidating plant speciation and diversification processes.
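
    The contig L50 figure quoted above is a length-weighted contiguity statistic of the assembly. A minimal sketch of how such a value is computed from a list of contig lengths (this length is more commonly labelled N50; some assembly reports, as here, use L50 for the same quantity):

```python
def n50(contig_lengths):
    """Length at which half the assembly is contained in contigs this long or
    longer (commonly called N50; reported as contig L50 in this abstract)."""
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

# Toy contig lengths: total 1000; 400 + 300 = 700 covers half, so the value is 300.
print(n50([100, 200, 300, 400]))
```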

  2. Virtual Guidance Ultrasound: A Tool to Obtain Diagnostic Ultrasound for Remote Environments

    NASA Technical Reports Server (NTRS)

    Caine, Timothy L.; Martin, David S.; Matz, Timothy; Lee, Stuart M. C.; Stenger, Michael B.; Platts, Steven H.

    2012-01-01

    Astronauts currently acquire ultrasound images on the International Space Station with the assistance of real-time remote guidance from an ultrasound expert in Mission Control. Remote guidance will not be feasible when significant communication delays exist during exploration missions beyond low-Earth orbit. For example, there may be as much as a 20- minute delay in communications between the Earth and Mars. Virtual-guidance, a pre-recorded audio-visual tutorial viewed in real-time, is a viable modality for minimally trained scanners to obtain diagnostically-adequate images of clinically relevant anatomical structures in an autonomous manner. METHODS: Inexperienced ultrasound operators were recruited to perform carotid artery (n = 10) and ophthalmic (n = 9) ultrasound examinations using virtual guidance as their only instructional tool. In the carotid group, each untrained operator acquired two-dimensional, pulsed, and color Doppler of the carotid artery. In the ophthalmic group, operators acquired representative images of the anterior chamber of the eye, retina, optic nerve, and nerve sheath. Ultrasound image quality was evaluated by independent imaging experts. RESULTS: Eight of the 10 carotid studies were judged to be diagnostically adequate. With one exception, the quality of the ophthalmic images was adequate to excellent. CONCLUSION: Diagnostically-adequate carotid and ophthalmic ultrasound examinations can be obtained by untrained operators with instruction only from an audio/video tutorial viewed in real time while scanning. This form of quick-response guidance, which can be developed for other ultrasound examinations, represents an opportunity to acquire important medical and scientific information for NASA flight surgeons and researchers when trained medical personnel are not present.
Further, virtual guidance will allow untrained personnel to autonomously obtain important medical information in remote locations on Earth where communication is difficult or absent.

  3. Advances in Landslide Nowcasting: Evaluation of a Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia Bach; Peters-Lidard, Christa; Adler, Robert; Hong, Yang; Kumar, Sujay; Lerner-Lam, Arthur

    2011-01-01

    The increasing availability of remotely sensed data offers a new opportunity to address landslide hazard assessment at larger spatial scales. A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that may experience landslide activity. This system combines a calculation of static landslide susceptibility with satellite-derived rainfall estimates and uses a threshold approach to generate a set of nowcasts that classify potentially hazardous areas. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale near real-time landslide hazard assessment efforts, it requires several modifications before it can be fully realized as an operational tool. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and hazard at the regional scale. This case study calculates a regional susceptibility map using remotely sensed and in situ information and a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America. The susceptibility map is evaluated with a regional rainfall intensity duration triggering threshold and results are compared with the global algorithm framework for the same event. Evaluation of this regional system suggests that this empirically based approach provides one plausible way to approach some of the data and resolution issues identified in the global assessment. The presented methodology is straightforward to implement, improves upon the global approach, and allows for results to be transferable between regions. The results also highlight several remaining challenges, including the empirical nature of the algorithm framework and adequate information for algorithm validation. Conclusions suggest that integrating additional triggering factors such as soil moisture may help to improve algorithm performance.
The regional algorithm scenario represents an important step forward in advancing regional and global-scale landslide hazard assessment.
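
    A rainfall intensity-duration triggering threshold of the kind evaluated here typically takes the power-law form I = alpha * D^(-beta). A minimal sketch, with illustrative coefficients close to Caine's classic global curve rather than the regional values fitted in the study:

```python
def exceeds_threshold(intensity_mm_per_hr, duration_hr, alpha=14.82, beta=0.39):
    """True if a rainfall event exceeds the intensity-duration threshold
    I = alpha * D**(-beta). Default coefficients are illustrative (close to
    Caine's 1980 global curve); a regional system would fit its own."""
    return intensity_mm_per_hr >= alpha * duration_hr ** (-beta)

print(exceeds_threshold(10.0, 6.0))  # threshold at 6 h is ~7.4 mm/h, so exceeded
print(exceeds_threshold(2.0, 6.0))
```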

  4. Risk Reduction and Training using Simulation Based Tools - 12180

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Irin P.

    2012-07-01

    Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications include expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques.
For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they are important steps toward the use of simulation-based tools in the nuclear domain, the applications described here represent only a small cross section of possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domains, this may mean creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)
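
    The predictive value of such process M and S tools comes from running a work evolution many times against uncertain task durations and reading off schedule risk. A minimal sketch, with invented step durations rather than any real shipyard or refueling data:

```python
import random

def simulate_process(steps, trials=1000, seed=1):
    """Toy Monte Carlo process simulation: total duration of sequential steps,
    each with (min, max) hours sampled uniformly. A real M and S tool would
    model resources, queues, and dependencies, but the idea is the same."""
    rng = random.Random(seed)
    totals = sorted(sum(rng.uniform(lo, hi) for lo, hi in steps) for _ in range(trials))
    return totals[len(totals) // 2], totals[int(0.95 * trials)]  # median, 95th percentile

steps = [(2, 4), (1, 3), (5, 9)]  # hypothetical evolution steps (hours)
median, p95 = simulate_process(steps)
print(f"median {median:.1f} h, 95th percentile {p95:.1f} h")
```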

  5. A Hybrid Approach to Data Assimilation for Reconstructing the Evolution of Mantle Dynamics

    NASA Astrophysics Data System (ADS)

    Zhou, Quan; Liu, Lijun

    2017-11-01

    Quantifying past mantle dynamic processes represents a major challenge in understanding the temporal evolution of the solid earth. Mantle convection modeling with data assimilation is one of the most powerful tools to investigate the dynamics of plate subduction and mantle convection. Although various data assimilation methods, both forward and inverse, have been created, these methods all have limitations in their capabilities to represent the real earth. Pure forward models tend to miss important mantle structures due to the incorrect initial condition and thus may lead to incorrect mantle evolution. In contrast, pure tomography-based models cannot effectively resolve the fine slab structure and would fail to predict important subduction-zone dynamic processes. Here we propose a hybrid data assimilation approach that combines the unique power of the sequential and adjoint algorithms, which can properly capture the detailed evolution of the downgoing slab and the tomographically constrained mantle structures, respectively. We apply this new method to reconstructing mantle dynamics below the western U.S. while considering large lateral viscosity variations. By comparing this result with those from several existing data assimilation methods, we demonstrate that the hybrid modeling approach recovers the realistic 4-D mantle dynamics the best.

  6. Safety of human papillomavirus vaccines: a review.

    PubMed

    Stillo, Michela; Carrillo Santisteve, Paloma; Lopalco, Pier Luigi

    2015-05-01

    Between 2006 and 2009, two different human papillomavirus (HPV) vaccines were licensed for use: a quadrivalent (qHPVv) and a bivalent (bHPVv) vaccine. Since 2008, HPV vaccination programmes have been implemented in the majority of the industrialized countries. Since 2013, HPV vaccination has been part of the national programs of 66 countries including almost all countries in North America and Western Europe. Despite all the efforts made by individual countries, coverage rates are lower than expected. Vaccine safety represents one of the main concerns associated with the lack of acceptance of HPV vaccination both in the European Union/European Economic Area and elsewhere. Safety data published on bivalent and quadrivalent HPV vaccines, both in pre-licensure and post-licensure phase, are reviewed. Based on the latest scientific evidence, both HPV vaccines seem to be safe. Nevertheless, public concern and rumors about adverse events (AE) represent an important barrier to overcome in order to increase vaccine coverage. Passive surveillance of AEs is an important tool for detecting safety signals, but it should be complemented by activities aimed at assessing the real cause of all suspect AEs. Improved vaccine safety surveillance is the first step for effective communication based on scientific evidence.

  7. Analysis of the production and transaction costs of forest carbon offset projects in the USA.

    PubMed

    Galik, Christopher S; Cooley, David M; Baker, Justin S

    2012-12-15

    Forest carbon offset project implementation costs, comprising both production and transaction costs, could present an important barrier to private landowner participation in carbon offset markets. These costs likewise represent a largely undocumented component of forest carbon offset potential. Using a custom spreadsheet model and accounting tool, this study examines the implementation costs of different forest offset project types operating in different forest types under different accounting and sampling methodologies. Sensitivity results are summarized concisely through response surface regression analysis to illustrate the relative effect of project-specific variables on total implementation costs. Results suggest that transaction costs may represent a relatively small percentage of total project implementation costs - generally less than 25% of the total. Results also show that carbon accounting methods, specifically the method used to establish project baseline, may be among the most important factors in driving implementation costs on a per-ton-of-carbon-sequestered basis, dramatically increasing variability in both transaction and production costs. This suggests that accounting could be a large driver in the financial viability of forest offset projects, with transaction costs likely being of largest concern to those projects at the margin. Copyright © 2012 Elsevier Ltd. All rights reserved.
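
    The split between production and transaction costs on a per-ton-of-carbon basis, central to the analysis above, can be sketched as simple accounting. All figures below are hypothetical, not values from the study's spreadsheet model:

```python
def cost_per_ton(production_cost, transaction_cost, tons_sequestered):
    """Total implementation cost per ton sequestered, and the transaction
    share of total cost (hypothetical accounting, for illustration)."""
    total = production_cost + transaction_cost
    return total / tons_sequestered, transaction_cost / total

per_ton, txn_share = cost_per_ton(production_cost=90_000,
                                  transaction_cost=22_500,
                                  tons_sequestered=15_000)
print(f"${per_ton:.2f}/tCO2, transaction share {txn_share:.0%}")
```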

  8. A Hybrid Forward-Adjoint Data Assimilation Method for Reconstructing the Temporal Evolution of Mantle Dynamics

    NASA Astrophysics Data System (ADS)

    Zhou, Q.; Liu, L.

    2017-12-01

    Quantifying past mantle dynamic processes represents a major challenge in understanding the temporal evolution of the solid earth. Mantle convection modeling with data assimilation is one of the most powerful tools to investigate the dynamics of plate subduction and mantle convection. Although various data assimilation methods, both forward and inverse, have been created, these methods all have limitations in their capabilities to represent the real earth. Pure forward models tend to miss important mantle structures due to the incorrect initial condition and thus may lead to incorrect mantle evolution. In contrast, pure tomography-based models cannot effectively resolve the fine slab structure and would fail to predict important subduction-zone dynamic processes. Here we propose a hybrid data assimilation method that combines the unique power of the sequential and adjoint algorithms, which can properly capture the detailed evolution of the downgoing slab and the tomographically constrained mantle structures, respectively. We apply this new method to reconstructing mantle dynamics below the western U.S. while considering large lateral viscosity variations. By comparing this result with those from several existing data assimilation methods, we demonstrate that the hybrid modeling approach recovers the realistic 4-D mantle dynamics the best.

  9. Assessing research activity and capacity of community-based organizations: development and pilot testing of an instrument.

    PubMed

    Humphries, Debbie L; Carroll-Scott, Amy; Mitchell, Leif; Tian, Terry; Choudhury, Shonali; Fiellin, David A

    2014-01-01

    Although awareness of the importance of the research capacity of community-based organizations (CBOs) is growing, a uniform framework of the research capacity domains within CBOs has not yet been developed. Our objective was to develop a framework and instrument (the Community REsearch Activity assessment Tool [CREAT]) for assessing the research activity and capacity of CBOs that incorporates awareness of the different data collection and analysis priorities of CBOs. We conducted a review of existing tools for assessing research capacity to identify key capacity domains. Instrument items were developed through an iterative process with CBO representatives and community researchers. The CREAT was then pilot tested with 30 CBOs. The four primary domains of the CREAT framework include 1) organizational support for research, 2) generalizable experiences, 3) research specific experiences, and 4) funding. Organizations reported a high prevalence of activities in the research-specific experiences domain, including conducting literature reviews (70%), use of research terminology (83%), and primary data collection (100%). Respondents see research findings as important to improve program and service delivery, and to seek funds for new programs and services. Funders, board members, and policymakers are the most important dissemination audiences. The work reported herein advances the field of CBO research capacity by developing a systematic framework for assessing research activity and capacity relevant to the work of CBOs, and by developing and piloting an instrument to assess activity in these domains.

  10. Creating and parameterizing patient-specific deep brain stimulation pathway-activation models using the hyperdirect pathway as an example

    PubMed Central

    Gunalan, Kabilar; Chaturvedi, Ashutosh; Howell, Bryan; Duchin, Yuval; Lempka, Scott F.; Patriat, Remi; Sapiro, Guillermo; Harel, Noam; McIntyre, Cameron C.

    2017-01-01

    Background: Deep brain stimulation (DBS) is an established clinical therapy and computational models have played an important role in advancing the technology. Patient-specific DBS models are now common tools in both academic and industrial research, as well as clinical software systems. However, the exact methodology for creating patient-specific DBS models can vary substantially and important technical details are often missing from published reports. Objective: Provide a detailed description of the assembly workflow and parameterization of a patient-specific DBS pathway-activation model (PAM) and predict the response of the hyperdirect pathway to clinical stimulation. Methods: Integration of multiple software tools (e.g. COMSOL, MATLAB, FSL, NEURON, Python) enables the creation and visualization of a DBS PAM. An example DBS PAM was developed using 7T magnetic resonance imaging data from a single unilaterally implanted patient with Parkinson’s disease (PD). This detailed description implements our best computational practices and most elaborate parameterization steps, as defined from over a decade of technical evolution. Results: Pathway recruitment curves and strength-duration relationships highlight the non-linear response of axons to changes in the DBS parameter settings. Conclusion: Parameterization of patient-specific DBS models can be highly detailed and constrained, thereby providing confidence in the simulation predictions, but at the expense of time demanding technical implementation steps. DBS PAMs represent new tools for investigating possible correlations between brain pathway activation patterns and clinical symptom modulation. PMID:28441410
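
    The strength-duration relationships mentioned in the results are classically summarized by the Lapicque relation, in which the threshold stimulus amplitude falls hyperbolically with pulse width. A sketch with illustrative rheobase and chronaxie values, not parameters from the study:

```python
def threshold_current(pulse_width_us, rheobase_ma=1.0, chronaxie_us=150.0):
    """Lapicque strength-duration relation: threshold amplitude as a function
    of pulse width. The rheobase and chronaxie here are illustrative defaults,
    not fitted values from the patient-specific model."""
    return rheobase_ma * (1.0 + chronaxie_us / pulse_width_us)

# Shorter pulses need larger amplitudes; at the chronaxie the threshold is 2x rheobase.
for pw in (60, 150, 450):
    print(pw, round(threshold_current(pw), 2))
```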

  11. The Zebrafish Xenograft Platform: Evolution of a Novel Cancer Model and Preclinical Screening Tool.

    PubMed

    Wertman, Jaime; Veinotte, Chansey J; Dellaire, Graham; Berman, Jason N

    2016-01-01

    Animal xenografts of human cancers represent a key preclinical tool in the field of cancer research. While mouse xenografts have long been the gold standard, investigators have begun to use zebrafish (Danio rerio) xenotransplantation as a relatively rapid, robust and cost-effective in vivo model of human cancers. There are several important methodological considerations in the design of an informative and efficient zebrafish xenotransplantation experiment. Various transgenic fish strains have been created that facilitate microscopic observation, ranging from the completely transparent casper fish to the Tg(fli1:eGFP) fish that expresses fluorescent GFP protein in its vascular tissue. While human cancer cell lines have been used extensively in zebrafish xenotransplantation studies, several reports have also used primary patient samples as the donor material. The zebrafish is ideally suited for transplanting primary patient material by virtue of the relatively low number of cells required for each embryo (between 50 and 300 cells), the absence of an adaptive immune system in the early zebrafish embryo, and the short experimental timeframe (5-7 days). Following xenotransplantation into the fish, cells can be tracked using in vivo or ex vivo measures of cell proliferation and migration, facilitated by fluorescence or human-specific protein expression. Importantly, assays have been developed that allow for the reliable detection of in vivo human cancer cell growth or inhibition following administration of drugs of interest. The zebrafish xenotransplantation model is a unique and effective tool for the study of cancer cell biology.

  12. Triad Issue Paper: Using Geophysical Tools to Develop the Conceptual Site Model

    EPA Pesticide Factsheets

    This technology bulletin explains how hazardous-waste site professionals can use geophysical tools to provide information about subsurface conditions to create a more representative conceptual site model (CSM).

  13. Risk Importance Measures in the Design and Operation of Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrbanic, I.; Samanta, P.; Basic, I.

    This monograph presents and discusses risk importance measures as quantified by the probabilistic risk assessment (PRA) models of nuclear power plants (NPPs) developed according to the current standards and practices. Usually, PRA tools calculate risk importance measures related to a single "basic event" representing a particular failure mode. This is, then, reflected in many current PRA applications. The monograph focuses on the concept of "component-level" importance measures that take into account different failure modes of the component, including common-cause failures (CCFs). In the opening sections the role of risk assessment in the safety analysis of an NPP is introduced and a discussion given of "traditional", mainly deterministic, design principles which have been established to assign a level of importance to a particular system, structure or component. This is followed by an overview of the main risk importance measures for risk increase and risk decrease from current PRAs. Basic relations which exist among the measures are shown. Some of the current practical applications of risk importance measures from the field of NPP design, operation and regulation are discussed. The core of the monograph provides a discussion on the theoretical background and practical aspects of the main risk importance measures at the level of "component" as modeled in a PRA, starting from the simplest case, a single basic event, and going toward more complex cases with multiple basic events and involvements in CCF groups. The intent is to express the component-level importance measures via the importance measures and probabilities of the underlying single basic events, which are the inputs readily available from a PRA model and its results. Formulas are derived and discussed for some typical cases. The formulas and their results are demonstrated through some practical examples, done by means of a simplified PRA model developed in and run by the RiskSpectrum tool, which are presented in the appendices.
The monograph concludes with a discussion of the limitations of risk importance measures and a summary of the component-level importance cases evaluated.
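
    The basic-event importance measures the monograph builds on, such as Fussell-Vesely (FV), Risk Achievement Worth (RAW) and Risk Reduction Worth (RRW), can be sketched over a minimal-cutset risk model. The cutsets and probabilities below are hypothetical; real PRA tools such as RiskSpectrum evaluate these at full scale:

```python
def top_risk(cutsets, probs):
    """Rare-event approximation: sum over minimal cutsets of the product
    of their basic-event probabilities."""
    result = 0.0
    for cs in cutsets:
        p = 1.0
        for e in cs:
            p *= probs[e]
        result += p
    return result

def importance(event, cutsets, probs):
    """FV, RAW and RRW of a single basic event in a toy cutset model."""
    base = top_risk(cutsets, probs)
    up = top_risk(cutsets, {**probs, event: 1.0})    # event failed for certain
    down = top_risk(cutsets, {**probs, event: 0.0})  # event perfectly reliable
    return {"FV": (base - down) / base, "RAW": up / base, "RRW": base / down}

# Hypothetical system: fails if A fails, or if both B and C fail.
cutsets = [("A",), ("B", "C")]
probs = {"A": 1e-3, "B": 1e-2, "C": 1e-2}
print(importance("A", cutsets, probs))
```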

  14. The role of syntax in complex networks: Local and global importance of verbs in a syntactic dependency network

    NASA Astrophysics Data System (ADS)

    Čech, Radek; Mačutek, Ján; Žabokrtský, Zdeněk

    2011-10-01

    Syntax of natural language has been the focus of linguistics for decades. Complex network theory, one of the newer research tools, opens new perspectives on syntax properties of the language. Despite numerous partial achievements, some fundamental problems remain unsolved. Specifically, although statistical properties typical for complex networks can be observed in all syntactic networks, the impact of syntax itself on these properties is still unclear. The aim of the present study is to shed more light on the role of syntax in the syntactic network structure. In particular, we concentrate on the impact of the syntactic function of a verb in the sentence on the complex network structure. Verbs play the decisive role in the sentence structure (“local” importance). From this fact we hypothesize the importance of verbs in the complex network (“global” importance). The importance of a verb in the complex network is assessed by the number of links which are directed from the node representing the verb to other nodes in the network. Six languages (Catalan, Czech, Dutch, Hungarian, Italian, Portuguese) were used for testing the hypothesis.
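
    The "global" importance of a verb, as defined above, is its out-degree: the number of links directed from the verb's node to other nodes. A minimal sketch over an invented mini-network (real syntactic dependency networks are built from treebanks):

```python
from collections import defaultdict

# Toy syntactic dependency network: edges point from governor to dependent
# (a hypothetical mini-corpus, for illustration only).
edges = [
    ("reads", "she"), ("reads", "book"), ("book", "old"),
    ("writes", "he"), ("writes", "letter"), ("writes", "quickly"),
]
out_degree = defaultdict(int)
for head, dep in edges:
    out_degree[head] += 1

verbs = {"reads", "writes"}
# "Global" importance of verbs: their out-degree in the network.
verb_degrees = {w: d for w, d in out_degree.items() if w in verbs}
print(verb_degrees)
```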

  15. Monitoring an Online Course with the GISMO Tool: A Case Study

    ERIC Educational Resources Information Center

    Mazza, Riccardo; Botturi, Luca

    2007-01-01

    This article presents GISMO, a novel, open source, graphic student-tracking tool integrated into Moodle. GISMO represents a further step in information visualization applied to education, and also a novelty in the field of learning management systems applications. The visualizations of the tool, its uses and the benefits it can bring are…

  16. Data-Driven Modeling and Rendering of Force Responses from Elastic Tool Deformation

    PubMed Central

    Rakhmatov, Ruslan; Ogay, Tatyana; Jeon, Seokhee

    2018-01-01

    This article presents a new data-driven model design for rendering force responses from elastic tool deformation. The new design incorporates a six-dimensional input describing the initial position of the contact as well as the state of the tool deformation. The input-output relationship of the model is represented by a radial basis functions network, which was optimized based on training data collected from real tool-surface contact. Since the input space of the model is represented in the local coordinate system of a tool, the model is independent of recording and rendering devices and can easily be deployed to an existing simulator. The model also supports complex interactions, such as self- and multi-contact collisions. In order to assess the proposed data-driven model, we built a custom data acquisition setup and developed a proof-of-concept rendering simulator. The simulator was evaluated through numerical and psychophysical experiments with four different real tools. The numerical evaluation demonstrated the soundness of the proposed model, while the user study showed the force feedback of the proposed simulator to be realistic. PMID:29342964
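
    The input-output mapping described above can be illustrated with a minimal radial basis function interpolator. This is a one-dimensional toy (penetration depth to force) with invented training data; the paper's actual model uses a six-dimensional input, real tool-surface recordings, and an optimized network.

```python
# Toy sketch of an RBF network mapping a deformation state to a force
# response. 1-D Gaussian-kernel interpolation with invented data; the
# paper's model is six-dimensional and trained on real contact data.
import math

def gaussian(r, eps=1.0):
    return math.exp(-(eps * r) ** 2)

def fit_rbf(xs, ys, eps=1.0):
    """Solve for kernel weights by Gaussian elimination (exact interpolation)."""
    n = len(xs)
    A = [[gaussian(abs(xs[i] - xs[j]), eps) for j in range(n)] for i in range(n)]
    b = list(ys)
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n                              # back substitution
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * w[c] for c in range(r + 1, n))
        w[r] = (b[r] - s) / A[r][r]
    return w

def predict(x, xs, w, eps=1.0):
    return sum(wi * gaussian(abs(x - xi), eps) for wi, xi in zip(w, xs))

depths = [0.0, 1.0, 2.0, 3.0]   # training penetration depths (toy)
forces = [0.0, 0.8, 2.5, 6.0]   # measured force responses (toy)
w = fit_rbf(depths, forces)
```

    Because the weights solve the interpolation system exactly, the model reproduces the training forces at the training depths and smoothly blends between them elsewhere.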

  17. Representing clinical guidelines in UML: a comparative study.

    PubMed

    Hederman, Lucy; Smutek, Daniel; Wade, Vincent; Knape, Thomas

    2002-01-01

    Clinical guidelines can be represented using models, such as GLIF, specifically designed for healthcare guidelines. This paper demonstrates that they can also be modelled using a mainstream business modelling language such as UML. The paper presents a guideline in GLIF and as UML activity diagrams, and then presents a mapping of GLIF primitives to UML. The potential benefits of using a mainstream modelling language are outlined. These include availability of advanced modelling tools, transfer between modelling tools, and automation via business workflow technology.

  18. Psychometric validation of the Chinese version of the Johns Hopkins Fall Risk Assessment Tool for older Chinese inpatients.

    PubMed

    Zhang, Junhong; Wang, Min; Liu, Yu

    2016-10-01

    To culturally adapt and evaluate the reliability and validity of the Chinese version of the Johns Hopkins Fall Risk Assessment Tool among older inpatients in mainland China. Patient falls are an important safety consideration within hospitals among older inpatients. Nurses need specific risk assessment tools for older inpatients to reliably identify at-risk populations and guide interventions that highlight fixable risk factors for falls and consequent injuries. In China, a few tools have been developed to measure fall risk. However, they lack the solid psychometric development necessary to establish their validity and reliability, and they are not widely used for elderly inpatients. A cross-sectional study. Convenience sampling was used to recruit 201 older inpatients from two tertiary-level hospitals in Beijing and Xiamen, China. The Johns Hopkins Fall Risk Assessment Tool was translated using forward and backward translation procedures and was administered to these 201 older inpatients. Reliability of the tool was calculated by inter-rater reliability and Cronbach's alpha. Validity was analysed through the content validity index and construct validity. The inter-rater reliability of the Chinese version of the Johns Hopkins Fall Risk Assessment Tool was 97·14% agreement, with a Cohen's kappa of 0·903. Cronbach's α was 0·703. The content validity index was 0·833. Two factors, representing intrinsic and extrinsic risk factors, were extracted that together explained 58·89% of the variance. This study provided evidence that the Johns Hopkins Fall Risk Assessment Tool is an acceptable, valid and reliable tool to identify older inpatients at risk of falls and falls with injury. Further psychometric testing on criterion validity and evaluation of its advanced utility in geriatric clinical settings are warranted. The Chinese version of the Johns Hopkins Fall Risk Assessment Tool may be useful for health care personnel to identify older Chinese inpatients at risk of falls and falls with injury. © 2016 John Wiley & Sons Ltd.
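
    The internal-consistency statistic reported above, Cronbach's alpha, is straightforward to compute from item-level scores. A sketch with invented toy ratings, not the study's data:

```python
# Toy sketch of Cronbach's alpha, the internal-consistency statistic
# the study reports: alpha = k/(k-1) * (1 - sum of item variances /
# variance of total scores). The ratings below are invented.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per respondent."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

items = [
    [2, 4, 3, 5, 4],   # item 1 scores across 5 respondents (toy)
    [3, 5, 3, 4, 4],   # item 2
    [2, 5, 4, 5, 3],   # item 3
]
alpha = cronbach_alpha(items)   # about 0.862 for this toy data
```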

  19. Regression Tree-Based Methodology for Customizing Building Energy Benchmarks to Individual Commercial Buildings

    NASA Astrophysics Data System (ADS)

    Kaskhedikar, Apoorva Prakash

    According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to the medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities, and significant correlations were then identified between EUIs and CBECS variables. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. That tool relies on standard linear regression methods, which are only able to handle continuous variables. The proposed model uses data mining techniques and was found to perform slightly better than Portfolio Manager. The broader impact of the proposed benchmarking methodology is that it allows important categorical variables to be identified and then incorporated in a local, as opposed to a global, model framework for EUI pertinent to the building type. The ability to identify and rank the important variables is of great importance in the practical implementation of benchmarking tools, which rely on query-based building and HVAC variable filters specified by the user.
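
    The coefficient-of-variation evaluation mentioned above can be sketched directly. The usual form in energy benchmarking is the CV of the RMSE of predicted versus actual EUI; the EUI values below are invented, and whether the thesis used exactly this normalization is an assumption.

```python
# Toy sketch of CV(RMSE), a common goodness-of-fit metric in building
# energy benchmarking: root-mean-square prediction error divided by the
# mean of the actual values. EUI values are invented, and this exact
# normalization is an assumption about the thesis's metric.
import math

def cv_rmse(actual, predicted):
    n = len(actual)
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return rmse / (sum(actual) / n)

eui_actual    = [55.0, 72.0, 60.0, 90.0]   # kBtu/ft^2/yr (toy values)
eui_predicted = [58.0, 70.0, 63.0, 85.0]
cv = cv_rmse(eui_actual, eui_predicted)    # about 0.05, i.e. 5% error
```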

  20. Red marrow and blood dosimetry in 131I treatment of metastatic thyroid carcinoma: pre-treatment versus in-therapy results

    NASA Astrophysics Data System (ADS)

    Giostra, A.; Richetta, E.; Pasquino, M.; Miranti, A.; Cutaia, C.; Brusasco, G.; Pellerito, R. E.; Stasi, M.

    2016-06-01

    Treatment with radioiodine is a standard procedure for patients with well-differentiated thyroid cancer, but the main approach to the therapy is still empiric, consisting of the administration of fixed activities. A predictive individualized dosimetric study may represent an important tool for physicians to determine the best activity to prescribe. The aim of this work is to compare red marrow and blood absorbed dose values obtained in the pre-treatment (PT) dosimetry phase with those obtained in the in-treatment (IT) dosimetry phase, in order to estimate the predictive power of PT trial doses and to determine whether they can be used as a decision-making tool to safely administer higher 131I activity and potentially increase the efficacy of treatment. The PT and IT dosimetry for 50 patients has been evaluated using three different dosimetric approaches. In all three approaches, blood and red marrow doses are calculated as the sum of two components: the dose from 131I activity in the blood and the dose from 131I activity located in the remainder of the body (i.e. the blood and whole-body contributions to the total dose). PT and IT dose values to blood and red marrow appear to be well correlated irrespective of the dosimetric approach used. Linear regression analyses of PT and IT total doses, for blood and red marrow, and of the whole-body contribution to these doses, showed consistent best-fit slope and correlation coefficient values of approximately 0.9 and 0.6, respectively; analyses of the blood dose contribution to the total doses also yielded similar values for the best-fit slope but with correlation coefficient values of approximately 0.4, reflecting the greater variance in these dose estimates.
These findings suggest that pre-treatment red marrow dose assessments may represent an important tool to personalize metastatic thyroid cancer treatment, removing the constraints of a fixed activity approach and permitting potentially more effective higher 131I activities to be safely used in-treatment.
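
    The linear-regression comparison of PT and IT doses reduces to a least-squares slope and a Pearson correlation coefficient. A sketch with invented dose pairs; the study's actual patient data are not reproduced here:

```python
# Toy sketch of the statistics the study reports for PT-vs-IT dose
# comparisons: the least-squares slope and the Pearson correlation.
# The dose pairs below are invented, not the study's measurements.
import math

def slope_and_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sxx, sxy / math.sqrt(sxx * syy)

pt = [0.10, 0.22, 0.35, 0.41, 0.58]   # PT blood doses in Gy (toy values)
it = [0.12, 0.20, 0.33, 0.45, 0.55]   # IT blood doses in Gy (toy values)
b, r = slope_and_r(pt, it)            # slope near 1 and r near 1 here
```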

  1. The use of tools for learning science in small groups

    NASA Astrophysics Data System (ADS)

    Valdes, Rosa Maria

    2000-10-01

    "Hands-on" learning through the use of tools or manipulatives representative of science concepts has long been an important component of the middle school science curriculum. However, scarce research exists on the impact of tool use on the learning of science concepts, particularly on the processes involved in such learning. This study investigated how the use of tools by students engaged in small group discussion about the concept of electrical resistance, and the explanations that accompany such use, leads to improved understanding of the concept. Specifically, the main hypothesis of the study was that students who observe explanations by their high-ability peers accompanied by accurate tool use, and who are highly engaged in these explanations, would show learning gains. Videotaped interactions of students working in small groups to solve tasks on electricity were coded using scales that measured the accuracy of the tool use, the accuracy of the explanations presented, and the level of engagement of target students. The data of 48 students whose knowledge of the concept of resistance was initially low, and who were also determined to be low achievers as shown by their scores on a set of pretests, were analyzed. Quantitative and qualitative analyses showed that students who observed their peers give explanations using tools and who were at least moderately engaged made gains in their understanding of resistance. Specifically, the results of regression analyses showed that both the level of accuracy of a high-ability peer's explanation and the target student's level of engagement in the explanation significantly predicted target students' outcome scores. The number of presentations offered by a high-ability peer also significantly predicted outcome scores. Case study analyses of six students found that students who improved their scores the most from pretest to posttest had high-ability peers who tended to be verbal and who gave numerous explanations, whereas students who improved the least had high-ability peers who gave no explanations at all. Important implications of this study for teaching are that (1) teachers should group students heterogeneously and should monitor students' small groups to ensure that students are producing content-oriented discussion, and (2) students should be allowed to manipulate tools that allow experimentation as they build understandings and communicate abstract ideas.

  2. Extrusomes in ciliates: diversification, distribution, and phylogenetic implications.

    PubMed

    Rosati, Giovanna; Modeo, Letizia

    2003-01-01

    Exocytosis is, in all likelihood, an important communication method among microbes. Ciliates are highly differentiated and specialized micro-organisms for which versatile and/or sophisticated exocytotic organelles may represent important adaptive tools. Thus, in ciliates, we find a broad range of different extrusomes, i.e. ejectable membrane-bound organelles. Structurally simple extrusomes, like mucocysts and cortical granules, are widespread in different taxa within the phylum. In each case, they play the roles required by the ecological needs of the organisms. We also find a number of more elaborate extrusomes, whose distribution within the phylum is more limited and in some way related to phylogenetic affinities. Herein we provide a survey of the literature and our data on selected extrusomes in ciliates. Their morphology, distribution, and possible function are discussed, and the possible phylogenetic implications of their diversity are considered.

  3. A new software tool for 3D motion analyses of the musculo-skeletal system.

    PubMed

    Leardini, A; Belvedere, C; Astolfi, L; Fantozzi, S; Viceconti, M; Taddei, F; Ensini, A; Benedetti, M G; Catani, F

    2006-10-01

    Many clinical and biomechanical research studies, particularly in orthopaedics, nowadays involve forms of movement analysis. Gait analysis, video-fluoroscopy of joint replacement, pre-operative planning, surgical navigation, and standard radiostereometry all require tools for easy access to three-dimensional graphical representations of rigid segment motion. Relevant data from this variety of sources need to be organised in structured forms. Registration, integration, and synchronisation of segment position data are additional necessities. With this aim, the present work exploits the features of a software tool recently developed within an EU-funded project ('Multimod') in a series of different research studies. Standard and advanced gait analysis on a normal subject, in vivo fluoroscopy-based three-dimensional motion of a replaced knee joint, patellar and ligament tracking on a knee specimen by a surgical navigation system, and the stem-to-femur migration pattern in a patient who underwent total hip replacement were analysed with standard techniques and were all represented by this innovative software tool. Segment pose data were eventually obtained from these different techniques, and were successfully imported and organised in a hierarchical tree within the tool. Skeletal bony segments, prosthesis component models and ligament links were registered successfully to corresponding marker position data for effective three-dimensional animations. These were shown in various combinations, in different views, from different perspectives, according to possible specific research interests. Bioengineering and medical professionals would be much facilitated in the interpretation of the motion analysis measurements necessary in their research fields, and would therefore benefit from this software tool.

  4. A Novel Tool Improves Existing Estimates of Recent Tuberculosis Transmission in Settings of Sparse Data Collection.

    PubMed

    Kasaie, Parastu; Mathema, Barun; Kelton, W David; Azman, Andrew S; Pennington, Jeff; Dowdy, David W

    2015-01-01

    In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission ("recent transmission proportion"), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional 'n-1' approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the 'n-1' technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the 'n-1' model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models' performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data.
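
    The traditional 'n-1' estimate that the regression tools are benchmarked against is a one-line calculation: each cluster of n genotypically matched cases is assumed to contain one reactivation (index) case and n-1 recently transmitted cases. The cluster sizes below are illustrative:

```python
# Sketch of the traditional 'n-1' estimate of the recent transmission
# proportion: each cluster of n matched cases contributes n-1 recently
# transmitted cases; unclustered (unique) isolates contribute none.
# Cluster sizes are invented for illustration.

def n_minus_one(cluster_sizes):
    """Recent transmission proportion under the 'n-1' method."""
    total = sum(cluster_sizes)
    recent = sum(n - 1 for n in cluster_sizes if n > 1)
    return recent / total

# 10 cases: clusters of 4 and 3, plus three unique (unclustered) isolates
sizes = [4, 3, 1, 1, 1]
proportion = n_minus_one(sizes)   # (3 + 2) / 10 = 0.5
```

    The paper's point is that this estimate is biased downward when sampling is incomplete, since missed cases break apparent clusters; the regression tools correct for study coverage and duration.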

  5. A Novel Tool Improves Existing Estimates of Recent Tuberculosis Transmission in Settings of Sparse Data Collection

    PubMed Central

    Kasaie, Parastu; Mathema, Barun; Kelton, W. David; Azman, Andrew S.; Pennington, Jeff; Dowdy, David W.

    2015-01-01

    In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission (“recent transmission proportion”), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional ‘n-1’ approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the ‘n-1’ technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the ‘n-1’ model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models’ performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data. PMID:26679499

  6. Family doctors' views of pharmaceutical sales representatives: assessment scale development.

    PubMed

    Kersnik, Janko; Klemenc-Ketis, Zalika; Petek-Ster, Marija; Tusek-Bunc, Ksenija; Poplas-Susic, Tonka; Kolsek, Marko

    2011-08-01

    Prescribing patterns depend on physicians' attitudes and subjective norms towards prescribing a particular drug, as well as on their personal experience with it. Physicians are also affected by their interactions with the pharmaceutical industry. The objectives were to develop a scale for the assessment of pharmaceutical sales representatives (PSRs) by family doctors (FDs) and to determine the factors underlying their evaluation. Cross-sectional anonymous postal study. We included a random sample of 250 Slovenian FDs. Settings: Slovenian FDs' surgeries. The scores of various items regarding FDs' assessment of PSRs on a 7-point Likert scale. We received 163 responses (65.2% response rate). The most important characteristic of PSRs, as rated by respondents on the scale from 1 to 7, was that they did not mislead when presenting product information. The second most important characteristic was the ability to provide objective information about the product. The three most important characteristics, as rated by the respondents themselves, were 'Shows good knowledge on the promoted subject', 'Provides objective product information' and 'Makes brief and exact visits'. Cronbach's alpha of the composite scale was 0.844. Factor analysis revealed three PSR factors: selling skills, communicating skills and sense of trustworthiness. FDs evaluate PSRs mainly by their managerial skills and trustworthiness. The scale proved to be a reliable tool for the assessment of PSRs by FDs.

  7. The astronomy of Andean myth: The history of a cosmology

    NASA Astrophysics Data System (ADS)

    Sullivan, William F.

    It is shown that Andean myth, on one level, represents a technical language recording astronomical observations of precession and, at the same time, a historical record of simultaneous social and celestial transformations. Topographic and architectural terms of Andean myth are interpreted as a metaphor for the organization of, and locations on, the celestial sphere. Via ethnoastronomical data, mythical animals are identified as stars and placed on the celestial sphere according to their topographical location. Tested in the planetarium, these arrays generate clusters of dates: 200 B.C. and 650 A.D. Analysis of the names of Wiraqocha and Manco Capac indicates that they represent Saturn and Jupiter and that their mythical meeting represents their conjunction in 650 A.D. The astronomy of Andean myth is then used as a historical tool to examine how the Andean priest-astronomers recorded the simultaneous creation of the ayllu and of this distinctive astronomical system about 200 B.C. The idea that the agricultural ayllu, with its double descent system stressing the importance of paternity, represents a transformation of society from an earlier matrilineal/horticultural era is examined in light of the sexual imagery employed in myth. Wiraqocha's androgyny and the division of the celestial sphere into male (ecliptic) and female (celestial equator = earth) domains are interpreted as cosmological validations of the new social structure.

  8. Determination of Plastic Properties of a Material by Spherical Indentation Based on the Representative Stress Approach

    NASA Astrophysics Data System (ADS)

    Budiarsa, I. N.; Gde Antara, I. N.; Dharma, Agus; Karnata, I. N.

    2018-04-01

    Under an indentation, the material undergoes a complex deformation. One of the most effective ways to analyse indentation has been the representative stress method. The concept, coupled with finite element (FE) modelling, has been used successfully in analysing sharp indenters. It is of great importance to extend this method to spherical indentation and the associated hardness systems. One particular case is the Rockwell B test, where the hardness is determined by two points on the P-h curve of a spherical indenter. In this case, an established link between material parameters and P-h curves can naturally lead to direct hardness estimation from the material parameters (e.g. the yield stress (σy) and work hardening coefficient (n)). This could provide a useful tool for both research and industrial applications. Two methods to predict the P-h curve in spherical indentation were established: one using a C1-C2 polynomial equation approach and the other a depth-based approach. Both approaches were successful. An effective method of representing the P-h curves using a normalized representative stress concept was established. The concept and methodology developed are used to predict hardness (HRB) values of materials through direct analysis and are validated with experimental data on selected steel samples.

  9. Evaluating the State of Water Management in the Rio Grande/Bravo Basin

    NASA Astrophysics Data System (ADS)

    Ortiz Partida, Jose Pablo; Sandoval-Solis, Samuel; Diaz Gomez, Romina

    2017-04-01

    Water resource modeling tools have been developed for many different regions and sub-basins of the Rio Grande/Bravo (RGB). Each of these tools has specific objectives, whether it is to explore drought mitigation alternatives, conflict resolution, climate change evaluation, tradeoff and economic synergies, water allocation, reservoir operations, or collaborative planning. However, there has not been an effort to integrate different available tools, or to link models developed for specific reaches into a more holistic watershed decision-support tool. This project outlines promising next steps to meet long-term goals of improved decision support tools and modeling. We identify, describe, and synthesize water resources management practices in the RGB basin and available water resources models and decision support tools that represent the RGB and the distribution of water for human and environmental uses. The extant body of water resources modeling is examined from a perspective of environmental water needs and water resources management, and thereby allows subsequent prioritization of future research and monitoring needs for the development of river system modeling tools. This work communicates the state of the RGB science to diverse stakeholders, researchers, and decision-makers. The products of this project represent a planning tool to support an integrated water resources management framework to maximize economic and social welfare without compromising vital ecosystems.

  10. NASTRAN analysis of Tokamak vacuum vessel using interactive graphics

    NASA Technical Reports Server (NTRS)

    Miller, A.; Badrian, M.

    1978-01-01

    Isoparametric quadrilateral and triangular elements were used to represent the vacuum vessel shell structure. For toroidally symmetric loadings, MPCs were employed across model boundaries and rigid format 24 was invoked. Nonsymmetric loadings required the use of the cyclic symmetry analysis available with rigid format 49. NASTRAN served as an important analysis tool in the Tokamak design effort by providing a reliable means for assessing structural integrity. Interactive graphics were employed in the finite element model generation and in the post-processing of results. It was felt that model generation and checkout with interactive graphics reduced the modelling effort and debugging man-hours significantly.

  11. Artifacts on electroencephalograms may influence the amplitude-integrated EEG classification: a qualitative analysis in neonatal encephalopathy.

    PubMed

    Hagmann, Cornelia Franziska; Robertson, Nicola Jayne; Azzopardi, Denis

    2006-12-01

    This is a case report and a descriptive study demonstrating that artifacts are common during long-term recording of amplitude-integrated electroencephalograms and may lead to erroneous classification of the amplitude-integrated electroencephalogram trace. Artifacts occurred in 12% of 200 hours of recording time sampled from a representative sample of 20 infants with neonatal encephalopathy. Artifacts derived from electrical or movement interference occurred with similar frequency; both types of artifacts influenced the voltage and width of the amplitude-integrated electroencephalogram band. This is important knowledge, especially if the amplitude-integrated electroencephalogram is used as a selection tool for neuroprotection intervention studies.

  12. Using a Foundational Ontology for Reengineering a Software Enterprise Ontology

    NASA Astrophysics Data System (ADS)

    Perini Barcellos, Monalessa; de Almeida Falbo, Ricardo

    The knowledge about software organizations is considerably relevant to software engineers. The use of a common vocabulary for representing useful knowledge about the software organizations involved in software projects is important for several reasons, such as to support knowledge reuse and to allow communication and interoperability between tools. Domain ontologies can be used to define a common vocabulary for the sharing and reuse of knowledge about some domain. Foundational ontologies can be used for evaluating and re-designing domain ontologies, giving them real-world semantics. This paper presents an evaluation of a Software Enterprise Ontology that was reengineered using the Unified Foundational Ontology (UFO) as a basis.

  13. Which Way Is the Flow?

    NASA Technical Reports Server (NTRS)

    Kao, David

    1999-01-01

    The line integral convolution (LIC) technique has been known to be an effective tool for depicting flow patterns in a given vector field. There have been many extensions to make it run faster and reveal useful flow information such as velocity magnitude, motion, and direction. There are also extensions to unsteady flows and 3D vector fields. Surprisingly, none of these extensions automatically highlight flow features, which often represent the most important and interesting physical flow phenomena. In this sketch, a method for highlighting flow direction in LIC images is presented. The method gives an intuitive impression of flow direction in the given vector field and automatically reveals saddle points in the flow.

  14. Fitchi: haplotype genealogy graphs based on the Fitch algorithm.

    PubMed

    Matschiner, Michael

    2016-04-15

    In population genetics and phylogeography, haplotype genealogy graphs are important tools for the visualization of population structure based on sequence data. In this type of graph, node sizes are often drawn in proportion to haplotype frequencies and edge lengths represent the minimum number of mutations separating adjacent nodes. I here present Fitchi, a new program that produces publication-ready haplotype genealogy graphs based on the Fitch algorithm. Availability: http://www.evoinformatics.eu/fitchi.htm. Contact: michaelmatschiner@mac.com. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
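
    The Fitch small-parsimony count underlying such graphs (the minimum number of mutations needed to explain observed states on a tree) can be sketched in a few lines. The tree below is an invented toy; Fitchi itself builds full genealogy graphs from sequence alignments.

```python
# Toy sketch of the Fitch small-parsimony algorithm: a bottom-up pass
# over a rooted tree that counts the minimum number of state changes
# (mutations). Leaves are single-character states; internal nodes are
# pairs of subtrees. The tree below is invented, not Fitchi output.

def fitch(node):
    """Return (candidate state set, minimum mutation count) for a subtree."""
    if isinstance(node, str):         # leaf: observed state, zero changes
        return {node}, 0
    left, right = node
    sl, cl = fitch(left)
    sr, cr = fitch(right)
    inter = sl & sr
    if inter:                         # states agree: no extra mutation
        return inter, cl + cr
    return sl | sr, cl + cr + 1       # disagreement: one mutation charged

tree = ((("A", "A"), ("C", "A")), ("G", "G"))
states, mutations = fitch(tree)       # 2 mutations explain these tips
```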

  15. How to distinguish various components of the SHG signal recorded from the solid/liquid interface?

    NASA Astrophysics Data System (ADS)

    Gassin, Pierre-Marie; Martin-Gassin, Gaelle; Prelot, Benedicte; Zajac, Jerzy

    2016-11-01

    Second harmonic generation (SHG) may be an important tool to probe buried solid/liquid interfaces because of its inherent surface sensitivity. A detailed interpretation of dye adsorption onto a Si-SiO2 wafer is not straightforward because both adsorbent and adsorbate contribute to the overall SHG signal. The polarization-resolved SHG analysis shows that the adsorbent and adsorbate contributions are out of phase by π/2 in the present system. The surface nonlinear susceptibility χ(2) is thus a complex tensor whose real part is related to the adsorbent contribution and whose imaginary part to that of the adsorbate.

  16. Biological Tools to Study the Effects of Environmental Contaminants at the Feto–Maternal Interface

    PubMed Central

    Mannelli, Chiara; Ietta, Francesca; Avanzati, Anna Maria; Skarzynski, Dariusz

    2015-01-01

    The identification of reproductive toxicants is a major scientific challenge for human health. Prenatal life is the most vulnerable and important time span of human development. For obvious ethical reasons, in vivo models cannot be used in human pregnancy, and animal models do not perfectly reflect human physiology. This review describes the in vitro test models representative of the human feto–maternal interface and the effects of environmental chemicals with estrogen-like activity, mainly bisphenol A and para-nonylphenol, with a particular emphasis on the effects at low, nontoxic doses similar to concentrations commonly detected in the population. PMID:26740808

  17. Immunohistochemical Pitfalls: Common Mistakes in the Evaluation of Lynch Syndrome.

    PubMed

    Markow, Michael; Chen, Wei; Frankel, Wendy L

    2017-12-01

    At least 15% of colorectal cancers diagnosed in the United States are deficient in mismatch repair mechanisms. Most of these are sporadic, but approximately 3% of colorectal cancers result from germline alterations in mismatch repair genes and represent Lynch syndrome. It is critical to identify patients with Lynch syndrome to institute appropriate screening and surveillance for patients and their families. Exclusion of Lynch syndrome in sporadic cases is equally important because it reduces anxiety for patients and prevents excessive spending on unnecessary surveillance. Immunohistochemistry is one of the most widely used screening tools for identifying patients with Lynch syndrome. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Translational benchmark risk analysis

    PubMed Central

    Piegorsch, Walter W.

    2010-01-01

    Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283
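    As a sketch of the benchmark paradigm: the benchmark dose (BMD) is the dose at which extra risk over background reaches a chosen benchmark response (BMR). Under a one-hit dose-response model the BMD has a closed form; the model choice and parameters here are illustrative, not from the paper:

```python
import math

def extra_risk(dose, background=0.05, beta=0.8):
    """Extra risk over background under a one-hit model:
    P(d) = 1 - (1 - background) * exp(-beta * d)."""
    p = 1 - (1 - background) * math.exp(-beta * dose)
    return (p - background) / (1 - background)

def benchmark_dose(bmr=0.10, beta=0.8):
    """Dose at which extra risk equals the BMR: BMD = -ln(1 - BMR) / beta."""
    return -math.log(1 - bmr) / beta

bmd = benchmark_dose()  # dose producing 10% extra risk in this toy model
```

    Translating the method to non-toxicological endpoints amounts to swapping in a dose-response model appropriate to the new risk setting while keeping this BMD/BMR structure.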

  19. Probing the structure of RecA-DNA filaments. Advantages of a fluorescent guanine analog.

    PubMed

    Singleton, Scott F; Roca, Alberto I; Lee, Andrew M; Xiao, Jie

    2007-04-23

    The RecA protein of Escherichia coli plays crucial roles in DNA recombination and repair, as well as various aspects of bacterial pathogenicity. The formation of a RecA-ATP-ssDNA complex initiates all RecA activities, and yet a complete structural and mechanistic description of this filament has remained elusive. An analysis of RecA-DNA interactions was performed using fluorescently labeled oligonucleotides. A direct comparison was made between fluorescein and several fluorescent nucleosides. The fluorescent guanine analog 6-methylisoxanthopterin (6MI) demonstrated significant advantages over the other fluorophores and represents an important new tool for characterizing RecA-DNA interactions.

  20. Open-WiSe: a solar powered wireless sensor network platform.

    PubMed

    González, Apolinar; Aquino, Raúl; Mata, Walter; Ochoa, Alberto; Saldaña, Pedro; Edwards, Arthur

    2012-01-01

    Because battery-powered nodes are required in wireless sensor networks and energy consumption represents an important design consideration, alternate energy sources are needed to provide more effective and optimal function. The main goal of this work is to present an energy-harvesting wireless sensor network platform, the Open Wireless Sensor node (WiSe). The design and implementation of the solar-powered wireless platform are described, including the hardware architecture, firmware, and a POSIX real-time kernel. A sleep and wake-up strategy was implemented to prolong the lifetime of the wireless sensor network. The platform was developed as a tool for researchers investigating wireless sensor networks and for system integrators.
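    The effect of such a sleep/wake strategy on lifetime can be sketched with a simple duty-cycle energy budget; the current figures are illustrative placeholders, not Open-WiSe measurements:

```python
def battery_lifetime_hours(capacity_mAh, duty_cycle,
                           i_active_mA=20.0, i_sleep_mA=0.02):
    """Estimated node lifetime under a sleep/wake schedule.

    duty_cycle is the fraction of time the node is awake; the average
    current is the duty-weighted mix of active and sleep currents.
    """
    i_avg = duty_cycle * i_active_mA + (1 - duty_cycle) * i_sleep_mA
    return capacity_mAh / i_avg

# A 1% duty cycle stretches a 2000 mAh battery from ~100 h to many months.
always_on = battery_lifetime_hours(2000, 1.0)
duty_1pct = battery_lifetime_hours(2000, 0.01)
```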

  1. Software tool for mining liquid chromatography/multi-stage mass spectrometry data for comprehensive glycerophospholipid profiling.

    PubMed

    Hein, Eva-Maria; Bödeker, Bertram; Nolte, Jürgen; Hayen, Heiko

    2010-07-30

    Electrospray ionization mass spectrometry (ESI-MS) has emerged as an indispensable tool in the field of lipidomics. Despite the growing interest in lipid analysis, there are only a few software tools available for data evaluation, compared, for example, to proteomics applications. This makes comprehensive lipid analysis a complex challenge. Thus, a computational tool for harnessing the raw data from liquid chromatography/mass spectrometry (LC/MS) experiments was developed in this study and is available from the authors on request. The Profiler-Merger-Viewer tool is a software package for automatic processing of raw data from data-dependent experiments, measured by high-performance liquid chromatography hyphenated to electrospray ionization hybrid linear ion trap Fourier transform mass spectrometry (FTICR-MS and Orbitrap) in single- and multi-stage mode. The software contains three parts: processing of the raw data by Profiler for lipid identification, summarizing of replicate measurements by Merger, and visualization of all relevant data (chromatograms as well as mass spectra) for validation of the results by Viewer. The tool is easily accessible, since it is implemented in Java and uses Microsoft Excel (XLS) as its output format. The motivation was to develop a tool which supports and accelerates manual data evaluation (identification and relative quantification) significantly but does not perform a complete data analysis within a black-box system. The software's mode of operation, usage and options are demonstrated on the basis of a lipid extract of baker's yeast (S. cerevisiae). In this study, we focused on three important representatives of lipids: glycerophospholipids, lyso-glycerophospholipids and free fatty acids. Copyright 2010 John Wiley & Sons, Ltd.
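    At the core of automatic lipid identification from accurate-mass LC/MS data is matching measured m/z values against theoretical masses within a ppm tolerance. A minimal sketch of that step (the two database entries are illustrative, not the tool's actual database):

```python
def match_lipids(measured_mz, database, tol_ppm=5.0):
    """Assign measured m/z values to database entries within a ppm window."""
    hits = []
    for mz in measured_mz:
        for name, theoretical in database.items():
            # mass error in parts per million relative to the theoretical mass
            if abs(mz - theoretical) / theoretical * 1e6 <= tol_ppm:
                hits.append((mz, name))
    return hits

db = {"PC(34:1) [M+H]+": 760.5851, "PE(34:1) [M+H]+": 718.5381}
hits = match_lipids([760.5853, 500.0000], db)  # only the first value matches
```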

  2. Methods for transition toward computer assisted cognitive examination.

    PubMed

    Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A

    2015-01-01

    We present a software framework which enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introduction of new methods to clinics requires basic tools for collection and communication of collected data. Our aim was to develop tools that, with minimal interference, offer new opportunities for the enhancement of current interview-based cognitive examinations. We suggest methods and discuss the process by which established cognitive tests can be adapted for data collection through digitization by pen-enabled tablets. We discuss a number of methods for evaluation of collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for assessment of cognitive impairment. The toolbox is a research platform; it represents a foundation for further collaborative development by the wider research community and enthusiasts. It is free to download and use, and open-source. We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings. The tools provide the first step in a possible transition toward standardized mental state examination using computers.
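    One example of the objective process measures that digitized pen input makes possible (and that visual inspection of a finished drawing cannot capture) is the kinematics of the pen trace. A minimal sketch with hypothetical (x, y, t) samples; this is not CogExTools code:

```python
import math

def stroke_metrics(samples):
    """Simple process features from a digitized pen trace of (x, y, t) tuples:
    total path length and mean drawing speed."""
    length = sum(math.dist(samples[i][:2], samples[i + 1][:2])
                 for i in range(len(samples) - 1))
    duration = samples[-1][2] - samples[0][2]
    return {"path_length": length, "mean_speed": length / duration}

trace = [(0, 0, 0.0), (3, 4, 0.5), (6, 8, 1.0)]  # two 3-4-5 segments in 1 s
m = stroke_metrics(trace)
```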

  3. GEOGLAM Crop Assessment Tool: Adapting from global agricultural monitoring to food security monitoring

    NASA Astrophysics Data System (ADS)

    Humber, M. L.; Becker-Reshef, I.; Nordling, J.; Barker, B.; McGaughey, K.

    2014-12-01

    The GEOGLAM Crop Monitor's Crop Assessment Tool was released in August 2013 in support of the GEOGLAM Crop Monitor's objective to develop transparent, timely crop condition assessments in primary agricultural production areas, highlighting potential hotspots of stress/bumper crops. The Crop Assessment Tool allows users to view satellite derived products, best available crop masks, and crop calendars (created in collaboration with GEOGLAM Crop Monitor partners), then in turn submit crop assessment entries detailing the crop's condition, drivers, impacts, trends, and other information. Although the Crop Assessment Tool was originally intended to collect data on major crop production at the global scale, the types of data collected are also relevant to the food security and rangelands monitoring communities. In line with the GEOGLAM Countries at Risk philosophy of "foster[ing] the coordination of product delivery and capacity building efforts for national and regional organizations, and the development of harmonized methods and tools", a modified version of the Crop Assessment Tool is being developed for the USAID Famine Early Warning Systems Network (FEWS NET). As a member of the Countries at Risk component of GEOGLAM, FEWS NET provides agricultural monitoring, timely food security assessments, and early warnings of potential significant food shortages focusing specifically on countries at risk of food security emergencies. While the FEWS NET adaptation of the Crop Assessment Tool focuses on crop production in the context of food security rather than large scale production, the data collected is nearly identical to the data collected by the Crop Monitor. If combined, the countries monitored by FEWS NET and GEOGLAM Crop Monitor would encompass over 90 countries representing the most important regions for crop production and food security.

  4. Speech and language support: How physicians can identify and treat speech and language delays in the office setting

    PubMed Central

    Moharir, Madhavi; Barnett, Noel; Taras, Jillian; Cole, Martha; Ford-Jones, E Lee; Levin, Leo

    2014-01-01

    Failure to recognize and intervene early in speech and language delays can lead to multifaceted and potentially severe consequences for early child development and later literacy skills. While routine evaluations of speech and language during well-child visits are recommended, there is no standardized (office) approach to facilitate this. Furthermore, extensive wait times for speech and language pathology consultation represent valuable lost time for the child and family. Using speech and language expertise, and paediatric collaboration, key content for an office-based tool was developed. The tool aimed to help physicians achieve three main goals: early and accurate identification of speech and language delays as well as children at risk for literacy challenges; appropriate referral to speech and language services when required; and teaching and, thus, empowering parents to create rich and responsive language environments at home. Using this tool, in combination with the Canadian Paediatric Society’s Read, Speak, Sing and Grow Literacy Initiative, physicians will be better positioned to offer practical strategies to caregivers to enhance children’s speech and language capabilities. The tool represents a strategy to evaluate speech and language delays. It depicts age-specific linguistic/phonetic milestones and suggests interventions. The tool represents a practical interim treatment while the family is waiting for formal speech and language therapy consultation. PMID:24627648

  5. Molecular markers in bladder cancer: Novel research frontiers.

    PubMed

    Sanguedolce, Francesca; Cormio, Antonella; Bufo, Pantaleo; Carrieri, Giuseppe; Cormio, Luigi

    2015-01-01

    Bladder cancer (BC) is a heterogeneous disease encompassing distinct biologic features that lead to extremely different clinical behaviors. In the last 20 years, great efforts have been made to predict disease outcome and response to treatment by developing risk assessment calculators based on multiple standard clinical-pathological factors, as well as by testing several molecular markers. Unfortunately, risk assessment calculators alone fail to accurately assess a single patient's prognosis and response to different treatment options. Several molecular markers easily assessable by routine immunohistochemical techniques hold promise for becoming widely available and cost-effective tools for a more reliable risk assessment, but none have yet entered routine clinical practice. Current research is therefore moving towards (i) identifying novel molecular markers; (ii) testing old and new markers in homogeneous patients' populations receiving homogeneous treatments; (iii) generating a multimarker panel that could be easily, and thus routinely, used in clinical practice; (iv) developing novel risk assessment tools, possibly combining standard clinical-pathological factors with molecular markers. This review analyses the emerging body of literature concerning novel biomarkers, ranging from genetic changes to altered expression of a huge variety of molecules, potentially involved in BC outcome and response to treatment. Findings suggest that some of these indicators, such as serum circulating tumor cells and tissue mitochondrial DNA, seem to be easily assessable and provide reliable information. Other markers, such as the phosphoinositide-3-kinase (PI3K)/AKT (serine-threonine kinase)/mTOR (mammalian target of rapamycin) pathway and epigenetic changes in DNA methylation seem to not only have prognostic/predictive value but also, most importantly, represent valuable therapeutic targets. 
Finally, there is increasing evidence that the development of novel risk assessment tools combining standard clinical-pathological factors with molecular markers represents a major quest in managing this poorly predictable disease.

  6. Predictive Mining of Time Series Data

    NASA Astrophysics Data System (ADS)

    Java, A.; Perlman, E. S.

    2002-05-01

    All-sky monitors are a relatively new development in astronomy, and their data represent a largely untapped resource. Proper utilization of this resource could lead to important discoveries not only in the physics of variable objects, but in how one observes such objects. We discuss the development of a Java toolbox for astronomical time series data. Rather than using methods conventional in astronomy (e.g., power spectrum and cross-correlation analysis) we employ rule discovery techniques commonly used in analyzing stock-market data. By clustering patterns found within the data, rule discovery allows one to build predictive models, allowing one to forecast when a given event might occur or whether the occurrence of one event will trigger a second. We have tested the toolbox and accompanying display tool on datasets (representing several classes of objects) from the RXTE All Sky Monitor. We use these datasets to illustrate the methods and functionality of the toolbox. We have found predictive patterns in several ASM datasets. We also discuss problems faced in the development process, particularly the difficulties of dealing with discretized and irregularly sampled data. A possible application would be in scheduling target of opportunity observations where the astronomer wants to observe an object when a certain event or series of events occurs. By combining such a toolbox with an automatic, Java query tool which regularly gathers data on objects of interest, the astronomer or telescope operator could use the real-time datastream to efficiently predict the occurrence of (for example) a flare or other event. By combining the toolbox with dynamic time warping data-mining tools, one could predict events which may happen on variable time scales.
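    The rule-discovery idea can be sketched in miniature: discretize the series into symbols, then count which symbol follows each short pattern, so that frequent (pattern → next) pairs act as predictive rules. This is a simplification under stated assumptions (equal-width bins, fixed-length antecedents), not the toolbox's actual algorithm:

```python
from collections import Counter, defaultdict

def discover_rules(series, n_bins=3, window=3):
    """Discretize a time series and count pattern -> next-symbol rules."""
    lo, hi = min(series), max(series)
    step = (hi - lo) / n_bins or 1.0
    symbols = [min(int((x - lo) / step), n_bins - 1) for x in series]
    rules = defaultdict(Counter)
    for i in range(len(symbols) - window):
        pattern = tuple(symbols[i:i + window])
        rules[pattern][symbols[i + window]] += 1
    return rules

# A repeating flare-like signal: the rule (low, low, high) -> low emerges.
data = [1, 1, 9, 1, 1, 9, 1, 1, 9, 1]
rules = discover_rules(data)
```

    Forecasting then amounts to looking up the current window's pattern and reading off its most frequent successor.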

  7. Rapid review of cognitive screening instruments in MCI: proposal for a process-based approach modification of overlapping tasks in select widely used instruments.

    PubMed

    Díaz-Orueta, Unai; Blanco-Campal, Alberto; Burke, Teresa

    2018-05-01

    Background: A detailed neuropsychological assessment plays an important role in the diagnostic process of Mild Cognitive Impairment (MCI). However, available brief cognitive screening tests for this clinical population are administered and interpreted based mainly, or exclusively, on total achievement scores. This score-based approach can lead to erroneous clinical interpretations unless we also pay attention to test-taking behavior or to the type of errors committed during test performance. The goal of the current study is to perform a rapid review of the literature regarding cognitive screening tools for dementia in primary and secondary care; this includes revisiting previously published systematic reviews of screening tools for dementia, an extensive database search, and analysis of individual references cited in selected studies. A subset of representative screening tools for dementia was identified that covers as many cognitive functions as possible. How these screening tools overlap with each other (in terms of the cognitive domains being measured and the method used to assess them) was examined, and a series of process-based approach (PBA) modifications for these overlapping features was proposed, so that the changes recommended in relation to one particular cognitive task could be extrapolated to other screening tools. It is expected that future versions of cognitive screening tests, modified using a PBA, will highlight the benefits of attending to qualitative features of test performance when trying to identify subtle features suggestive of MCI and/or dementia.

  8. Integrating ecosystem services analysis into scenario planning practice: accounting for street tree benefits with i-Tree valuation in Central Texas.

    PubMed

    Hilde, Thomas; Paterson, Robert

    2014-12-15

    Scenario planning continues to gain momentum in the United States as an effective process for building consensus on long-range community plans and creating regional visions for the future. However, efforts to integrate more sophisticated information into the analytical framework to help identify important ecosystem services have lagged in practice. This is problematic because understanding the tradeoffs of land consumption patterns on ecological integrity is central to mitigating the environmental degradation caused by land use change and new development. In this paper we describe how an ecosystem services valuation model, i-Tree, was integrated into a mainstream scenario planning software tool, Envision Tomorrow, to assess the benefits of public street trees for alternative future development scenarios. The tool is then applied to development scenarios from the City of Hutto, TX, a Central Texas Sustainable Places Project demonstration community. The integrated tool represents a methodological improvement for scenario planning practice, offers a way to incorporate ecosystem services analysis into mainstream planning processes, and serves as an example of how open source software tools can expand the range of issues available for community and regional planning consideration, even in cases where community resources are limited. The tool also offers room for future improvements; feasible options include canopy analysis of various future land use typologies, as well as a generalized street tree model for broader U.S. application. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Machinability of Minor Wooden Species before and after Modification with Thermo-Vacuum Technology

    PubMed Central

    Sandak, Jakub; Goli, Giacomo; Cetera, Paola; Sandak, Anna; Cavalli, Alberto; Todaro, Luigi

    2017-01-01

    The influence of the thermal modification process on wood machinability was investigated with four minor species of low economic importance. A set of representative experimental samples was machined to the form of disks with sharp and dull tools. The resulting surface quality was visually evaluated by a team of experts according to the American standard procedure ASTM D-1666-87. The objective quantification of the surface quality was also done by means of a three-dimensional (3D) surface scanner for the whole range of grain orientations. Visual assessment and 3D surface analysis showed good agreement in terms of conclusions. The best quality of the wood surface was obtained when machining thermally modified samples. The positive effect of the material modification was apparent when cutting deodar cedar, black pine and black poplar in unfavorable conditions (i.e., against the grain). The difference was much smaller for an easily machinable species such as Italian alder. The use of dull tools resulted in the worst surface quality. Thermal modification has shown a very positive effect when machining with dull tools, leading to a relevant increase in final surface smoothness. PMID:28772480
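    The smoothness metric typically reported in such studies, the arithmetic mean roughness Ra, is the mean absolute deviation of the surface profile from its mean line. A minimal sketch with hypothetical height samples (not measurements from the study):

```python
def roughness_ra(profile):
    """Arithmetic mean roughness Ra: mean absolute deviation of the
    surface profile heights from their mean line."""
    mean_line = sum(profile) / len(profile)
    return sum(abs(z - mean_line) for z in profile) / len(profile)

# A profile oscillating +/-2 um about its mean line has Ra = 2 um;
# a perfectly flat profile has Ra = 0.
ra_rough = roughness_ra([2.0, -2.0, 2.0, -2.0])
ra_flat = roughness_ra([5.0, 5.0, 5.0])
```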

  10. Reading the lesson: eliciting requirements for a mammography training application

    NASA Astrophysics Data System (ADS)

    Hartswood, M.; Blot, L.; Taylor, P.; Anderson, S.; Procter, R.; Wilkinson, L.; Smart, L.

    2009-02-01

    Demonstrations of a prototype training tool were used to elicit requirements for an intelligent training system for screening mammography. The prototype allowed senior radiologists (mentors) to select cases from a distributed database of images to meet the specific training requirements of junior colleagues (trainees) and then provided automated feedback in response to trainees' attempts at interpretation. The tool was demonstrated to radiologists and radiographers working in the breast screening service at four evaluation sessions. Participants highlighted ease of selecting cases that can deliver specific learning objectives as important for delivering effective training. To usefully structure a large data set of training images, we undertook a classification exercise of mentor-authored free-text 'learning points' attached to training cases obtained from two screening centres (n=333, n=129 respectively). We were able to adduce a hierarchy of abstract categories representing classes of lesson that groups of cases were intended to convey (e.g. Temporal change, Misleading juxtapositions, Position of lesion, Typical/Atypical presentation, and so on). In this paper we present the method used to devise this classification, the classification scheme itself, initial user feedback, and our plans to incorporate it into a software tool to aid case selection.

  11. Machinability of Minor Wooden Species before and after Modification with Thermo-Vacuum Technology.

    PubMed

    Sandak, Jakub; Goli, Giacomo; Cetera, Paola; Sandak, Anna; Cavalli, Alberto; Todaro, Luigi

    2017-01-28

    The influence of the thermal modification process on wood machinability was investigated with four minor species of low economic importance. A set of representative experimental samples was machined to the form of disks with sharp and dull tools. The resulting surface quality was visually evaluated by a team of experts according to the American standard procedure ASTM D-1666-87. The objective quantification of the surface quality was also done by means of a three-dimensional (3D) surface scanner for the whole range of grain orientations. Visual assessment and 3D surface analysis showed good agreement in terms of conclusions. The best quality of the wood surface was obtained when machining thermally modified samples. The positive effect of the material modification was apparent when cutting deodar cedar, black pine and black poplar in unfavorable conditions (i.e., against the grain). The difference was much smaller for an easily machinable species such as Italian alder. The use of dull tools resulted in the worst surface quality. Thermal modification has shown a very positive effect when machining with dull tools, leading to a relevant increase in final surface smoothness.

  12. A Non-technical User-Oriented Display Notation for XACML Conditions

    NASA Astrophysics Data System (ADS)

    Stepien, Bernard; Felty, Amy; Matwin, Stan

    Ideally, access control to resources in complex IT systems ought to be handled by the business decision makers who own a given resource (e.g., the pay and benefits section of an organization should decide and manage the access rules to the payroll system). To make this happen, the security and database communities need to develop vendor-independent access management tools, usable by decision makers rather than technical personnel detached from a given business function. We have developed and implemented such a tool, based on XACML. XACML is an important emerging standard for managing complex access control applications. As a formal notation, based on an XML schema representing the grammar of a given application, XACML is precise and non-ambiguous. But this very property puts it out of reach of non-technical users. We propose a new notation for displaying and editing XACML rules that is independent of XML, and we develop an editor for it. Our notation combines a tree representation of logical expressions with an accessible natural language layer. Our early experience indicates that such rules can be grasped by non-technical users wishing to develop and control rules for accessing their own resources.
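    The combination the authors describe (a tree of logical expressions rendered through a natural-language surface layer) can be sketched as follows; the operator set and the sample rule are hypothetical, not the paper's notation or XACML syntax itself:

```python
def to_natural_language(node):
    """Render a nested (op, operands...) condition tree as English."""
    if isinstance(node, tuple):
        op, *args = node
        if op in ("and", "or"):
            # logical connectives: recurse and join the rendered operands
            return "(" + f" {op} ".join(to_natural_language(a) for a in args) + ")"
        left, right = args
        words = {"==": "is", ">=": "is at least", "in": "is one of"}
        return f"{left} {words[op]} {right}"
    return str(node)

rule = ("and", ("==", "department", "payroll"),
               ("or", (">=", "grade", 7), ("==", "role", "manager")))
text = to_natural_language(rule)
```

    An editor built on this idea works in the opposite direction too: the user manipulates the tree, and the XML serialization is generated from it rather than edited by hand.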

  13. Laser Surface Modification of H13 Die Steel using Different Laser Spot Sizes

    NASA Astrophysics Data System (ADS)

    Aqida, S. N.; Naher, S.; Brabazon, D.

    2011-05-01

    This paper presents a laser surface modification process of AISI H13 tool steel using three laser spot sizes, with the aim of achieving reduced grain size and surface roughness. A Rofin DC-015 diffusion-cooled CO2 slab laser was used to process AISI H13 tool steel samples. Samples of 10 mm diameter were sectioned to 100 mm length in order to process a predefined circumferential area. The parameters selected for examination were laser peak power, overlap percentage and pulse repetition frequency (PRF). A metallographic study and image analysis were done to measure the grain size, and the modified surface roughness was measured using a two-dimensional surface profilometer. From the metallographic study, the smallest grain sizes measured in the laser-modified surface were between 0.51 μm and 2.54 μm. The minimum surface roughness, Ra, recorded was 3.0 μm. This surface roughness of the modified die steel is similar to the surface quality of cast products. The correlation between grain size and hardness followed the Hall-Petch relationship. The potential for increased surface hardness represents an important means of extending tooling life.
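    The Hall-Petch relationship referred to above predicts hardness rising with the inverse square root of grain size, H = H0 + k/√d. A sketch using the grain sizes reported in the abstract, but with illustrative constants (H0 and k are not fitted to the H13 data):

```python
import math

def hall_petch_hardness(d_um, h0=400.0, k=300.0):
    """Hall-Petch relation H = H0 + k / sqrt(d): hardness rises as grain
    size d (micrometres) falls. h0 and k are illustrative constants."""
    return h0 + k / math.sqrt(d_um)

fine = hall_petch_hardness(0.51)    # smallest grain size in the abstract
coarse = hall_petch_hardness(2.54)  # largest grain size in the abstract
# Finer grains -> harder surface, which is why laser refinement matters.
```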

  14. Design and evaluation of a software prototype for participatory planning of environmental adaptations.

    PubMed

    Eriksson, J; Ek, A; Johansson, G

    2000-03-01

    A software prototype to support the planning process for adapting home and work environments for people with physical disabilities was designed and later evaluated. The prototype exploits low-cost three-dimensional (3-D) graphics products in the home computer market. The essential features of the prototype are: interactive rendering with optional hardware acceleration, interactive walk-throughs, direct manipulation tools for moving objects and measuring distances, and import of 3-D-objects from a library. A usability study was conducted, consisting of two test sessions (three weeks apart) and a final interview. The prototype was then tested and evaluated by representatives of future users: five occupational therapist students, and four persons with physical disability, with no previous experience of the prototype. Emphasis in the usability study was placed on the prototype's efficiency and learnability. We found that it is possible to realise a planning tool for environmental adaptations, both regarding usability and technical efficiency. The usability evaluation confirms our findings from previous case studies, regarding the relevance and positive attitude towards this kind of planning tool. Although the prototype was found to be satisfactorily efficient for the basic tasks, the paper presents several suggestions for improvement of future prototype versions.

  15. A new spiral dental implant: a tool for oral rehabilitation of difficult cases

    PubMed Central

    BALAN, I.; CALCATERRA, R.; LAURITANO, D.; GRECCHI, E.; CARINCI, F.

    2017-01-01

    SUMMARY Spiral dental implant (SDI) is an implant with a conical internal helix that confers the characteristics of self-drilling, self-tapping, and self-bone-condensing. These properties offer better control during insertion of the SDI, giving high primary stabilization even in poor-quality bone. A shorter diameter of SDI results in reduced drilling during insertion and consequently less trauma and minimal bone loss. To address the research purpose, the investigators designed a retrospective cohort study. The study population was composed of 25 patients, 11 males and 14 females, treated by Dr. Balan with 187 SDIs positioned in the mandible and maxilla. The implants were placed during the years 2013 to 2014 in Dr. Balan's clinic. All patients underwent the same surgical protocol. Several variables were investigated: demographic (age and gender), anatomic (upper/lower jaw and tooth site), implant (length, diameter and type), edentulism (partial or total), and comorbid health status (i.e., hypothyroidism, periodontitis, hypertension, diabetes, presence of cancer, heart disease, hepatitis and rheumatologic disease). The Pearson chi-square test was used to investigate variables, and p < 0.05 was considered statistically significant. Statistical analysis showed that females have a higher probability of implant failure than males (p = 0.014). Concomitant pathologies were another important factor in implant success: cancer was the strongest negative risk factor, with a failure rate of 50%, followed by heart disease (15%) and diabetes (3.7%). SDIs are reliable tools for difficult cases of oral rehabilitation. They have a high success and survival rate, which means stable results over time. No differences were detected among SDI lengths or implant/crown ratios. In addition, the insertion of SDIs in banked bone can be performed without adverse effects. Finally, flapless and computed tomography-planned surgery does not significantly improve the clinical outcome of SDIs in complex rehabilitation. Cancer is the most important variable to consider when a patient seeks oral rehabilitation, because of its high risk of failure. PMID:29285328
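    The Pearson chi-square test used in the study can be sketched for a 2x2 failure-by-sex table; the counts below are hypothetical (the abstract reports p = 0.014 but not the underlying table), and the 1-degree-of-freedom p-value is computed exactly via the complementary error function:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]] plus its
    1-df p-value, p = erfc(sqrt(chi2 / 2))."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical failure counts by sex (not the study's raw data):
#            failed  survived
# female       12       88
# male          2       85
chi2, p = chi2_2x2(12, 88, 2, 85)
# p < 0.05 here, so this table would be judged statistically significant.
```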

  16. A new spiral dental implant: a tool for oral rehabilitation of difficult cases.

    PubMed

    Balan, I; Calcaterra, R; Lauritano, D; Grecchi, E; Carinci, F

    2017-01-01

    Spiral dental implant (SDI) is an implant with a conical internal helix that confers the characteristics of self-drilling, self-tapping, and self-bone-condensing. These properties offer better control during insertion of the SDI, giving high primary stabilization even in poor-quality bone. A shorter diameter of SDI results in reduced drilling during insertion and consequently less trauma and minimal bone loss. To address the research purpose, the investigators designed a retrospective cohort study. The study population was composed of 25 patients, 11 males and 14 females, treated by Dr. Balan with 187 SDIs positioned in the mandible and maxilla. The implants were placed during the years 2013 to 2014 in Dr. Balan's clinic. All patients underwent the same surgical protocol. Several variables were investigated: demographic (age and gender), anatomic (upper/lower jaw and tooth site), implant (length, diameter and type), edentulism (partial or total), and comorbid health status (i.e., hypothyroidism, periodontitis, hypertension, diabetes, presence of cancer, heart disease, hepatitis and rheumatologic disease). The Pearson chi-square test was used to investigate variables, and p < 0.05 was considered statistically significant. Statistical analysis showed that females have a higher probability of implant failure than males (p = 0.014). Concomitant pathologies were another important factor in implant success: cancer was the strongest negative risk factor, with a failure rate of 50%, followed by heart disease (15%) and diabetes (3.7%). SDIs are reliable tools for difficult cases of oral rehabilitation. They have a high success and survival rate, which means stable results over time. No differences were detected among SDI lengths or implant/crown ratios. In addition, the insertion of SDIs in banked bone can be performed without adverse effects. Finally, flapless and computed tomography-planned surgery does not significantly improve the clinical outcome of SDIs in complex rehabilitation. Cancer is the most important variable to consider when a patient seeks oral rehabilitation, because of its high risk of failure.

  17. Framing the conversation: use of PRECIS-2 ratings to advance understanding of pragmatic trial design domains.

    PubMed

    Lipman, Paula Darby; Loudon, Kirsty; Dluzak, Leanora; Moloney, Rachael; Messner, Donna; Stoney, Catherine M

    2017-11-10

    There continues to be debate about what constitutes a pragmatic trial and how it is distinguished from more traditional explanatory trials. The NIH Pragmatic Trials Collaborative Project, which includes five trials and a coordinating unit, has adopted the Pragmatic-Explanatory Continuum Indicator Summary (PRECIS-2) instrument. The purpose of the study was to collect PRECIS-2 ratings at two points in time to assess whether the tool was sensitive to change in trial design, and to explore with investigators the rationale for rating shifts. A mixed-methods design included sequential collection and analysis of quantitative data (PRECIS-2 ratings) and qualitative data. Ratings were collected at two annual, in-person project meetings, and subsequent interviews conducted with investigators were recorded, transcribed, and coded using NVivo 11 Pro for Windows. Rating shifts were coded as (1) an actual change (reflecting a change in procedure or protocol), (2) primarily a rating shift reflecting rater variability, or (3) a theme reflecting important concepts about the tool and/or pragmatic trial design. Based on PRECIS-2 ratings, each trial was highly pragmatic at the planning phase and remained so 1 year later in the early phases of trial implementation. Over half of the 45 paired ratings for the nine PRECIS-2 domains indicated a rating change from Time 1 to Time 2 (N = 24, 53%). Of the 24 rating changes, only three represented a true change in the design of the trial. Analysis of the rationales for rating shifts identified critical themes associated with the tool or with pragmatic trial design more generally. Each trial contributed one or more relevant comments, with Eligibility, Flexibility of Adherence, and Follow-up each accounting for more than one. PRECIS-2 has proved useful for "framing the conversation" about trial design among members of the Pragmatic Trials Collaborative Project. 
Our findings suggest that design elements assessed by the PRECIS-2 tool may represent mostly stable decisions. Overall, there has been a positive response to using PRECIS-2 to guide conversations around trial design, and the project's focus on the use of the tool by this group of early adopters has provided valuable feedback to inform future trainings on the tool.

  18. River dolphins can act as population trend indicators in degraded freshwater systems.

    PubMed

    Turvey, Samuel T; Risley, Claire L; Barrett, Leigh A; Yujiang, Hao; Ding, Wang

    2012-01-01

    Conservation attention on charismatic large vertebrates such as dolphins is often supported by the suggestion that these species represent surrogates for wider biodiversity, or act as indicators of ecosystem health. However, their capacity to act as indicators of patterns or trends in regional biodiversity has rarely been tested. An extensive new dataset of >300 last-sighting records for the Yangtze River dolphin or baiji and two formerly economically important fishes, the Yangtze paddlefish and Reeves' shad, all of which are probably now extinct in the Yangtze, was collected during an interview survey of fishing communities across the middle-lower Yangtze drainage. Untransformed last-sighting date frequency distributions for these species show similar decline curves over time, and the linear gradients of transformed last-sighting date series are not significantly different from each other, demonstrating that these species experienced correlated population declines in both timing and rate. Although species may be expected to respond differently at the population level even in highly degraded ecosystems, highly vulnerable (e.g. migratory) species can display very similar responses to extrinsic threats, even if they represent otherwise very different taxonomic, biological and ecological groupings. Monitoring the status of river dolphins or other megafauna therefore has the potential to provide wider information on the status of other threatened components of sympatric freshwater biotas, and so represents a potentially important monitoring tool for conservation management. We also show that interview surveys can provide robust quantitative data on the relative population dynamics of different species.
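    The gradient comparison described above can be sketched numerically: fit a least-squares line to each species' transformed last-sighting series and compare slopes. The years and counts below are invented for illustration; they do not reproduce the study's records.

```python
# Least-squares slope of a series; used here to compare decline gradients
# between two hypothetical last-sighting series.

def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

years = [2000, 2002, 2004, 2006]
baiji = [2.0, 1.5, 1.0, 0.5]        # hypothetical transformed sighting frequencies
paddlefish = [2.1, 1.6, 1.1, 0.6]   # a second, similarly declining series

# Similar (negative) gradients indicate correlated declines.
print(slope(years, baiji), slope(years, paddlefish))
```

A formal test of the kind reported in the abstract would also compare the slopes' standard errors; this sketch only computes the point estimates.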

  19. Musculoskeletal impairment survey in Rwanda: Design of survey tool, survey methodology, and results of the pilot study (a cross sectional survey)

    PubMed Central

    Atijosan, Oluwarantimi; Kuper, Hannah; Rischewski, Dorothea; Simms, Victoria; Lavy, Christopher

    2007-01-01

    Background: Musculoskeletal impairment (MSI) is an important cause of morbidity and mortality worldwide, especially in developing countries. Prevalence studies for MSI in the developing world have used varying methodologies and are seldom directly comparable. This study aimed to develop a new tool to screen for and diagnose MSI and to pilot test the methodology for a national survey in Rwanda. Methods: A 7-question screening tool to identify cases of MSI was developed through literature review and discussions with healthcare professionals. To validate the tool, trained rehabilitation technicians screened 93 previously identified gold standard 'cases' and 86 'non cases'. Sensitivity, specificity and positive predictive value were calculated. A standardised examination protocol was developed to determine the aetiology and diagnosis of MSI for those who fail the screening test. For the national survey in Rwanda, multistage cluster random sampling, with probability proportional to size procedures, will be used for selection of a cross-sectional, nationally representative sample of the population. Households to be surveyed will be chosen through compact segment sampling and all individuals within chosen households will be screened. A pilot survey of 680 individuals was conducted using the protocol. Results: The screening tool demonstrated 99% sensitivity and 97% specificity for MSI, and a positive predictive value of 98%. During the pilot study 468 out of 680 eligible subjects (69%) were screened. 45 diagnoses were identified in 38 persons who were cases of MSI. The subjects were grouped into categories based on diagnostic subgroups of congenital (1), traumatic (17), infective (2), neurological (6) and other acquired (19). They were also separated into mild (42.1%), moderate (42.1%) and severe (15.8%) cases, using an operational definition derived from the World Health Organisation's International Classification of Functioning, Disability and Health. 
Conclusion: The screening tool had good sensitivity and specificity and was appropriate for use in a national survey. The pilot study showed that the survey protocol was appropriate for measuring the prevalence of MSI in Rwanda. This survey is an important step to building a sound epidemiological understanding of MSI, to enable appropriate health service planning. PMID:17391509
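    The validation figures quoted above (sensitivity, specificity, positive predictive value) follow from the standard 2x2 confusion-table definitions. The counts below are illustrative stand-ins chosen to be close to the reported rates, not the study's raw data.

```python
# Standard screening-test metrics from true/false positive and negative counts.

def screening_metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # proportion of true cases correctly flagged
    specificity = tn / (tn + fp)   # proportion of non-cases correctly passed
    ppv = tp / (tp + fp)           # proportion of flagged subjects who are true cases
    return sensitivity, specificity, ppv

# Hypothetical counts for 93 cases and 86 non-cases:
sens, spec, ppv = screening_metrics(tp=92, fn=1, tn=83, fp=3)
print(round(sens, 2), round(spec, 2), round(ppv, 2))  # prints 0.99 0.97 0.97
```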

  20. A proposed-standard format to represent and distribute tomographic models and other earth spatial data

    NASA Astrophysics Data System (ADS)

    Postpischl, L.; Morelli, A.; Danecek, P.

    2009-04-01

    Formats used to represent (and distribute) tomographic earth models differ considerably and are rarely self-consistent. In fact, each earth scientist, or research group, uses specific conventions to encode the various parameterizations used to describe, e.g., seismic wave speed or density in three dimensions, and complete information is often found in related documents or publications (if available at all) only. As a consequence, use of various tomographic models from different authors requires considerable effort, is more cumbersome than it should be and prevents widespread exchange and circulation within the community. We propose a format, based on modern web standards, able to represent different (grid-based) model parameterizations within the same simple text-based environment, easy to write, to parse, and to visualise. The aim is the creation of self-describing data-structures, both human and machine readable, that are automatically recognised by general-purpose software agents, and easily imported in the scientific programming environment. We think that the adoption of such a representation as a standard for the exchange and distribution of earth models can greatly ease their usage and enhance their circulation, both among fellow seismologists and among a broader non-specialist community. The proposed solution uses semantic web technologies, fully fitting the current trends in data accessibility. It is based on JSON (JavaScript Object Notation), a plain-text, human-readable lightweight computer data interchange format, which adopts a hierarchical name-value model for representing simple data structures and associative arrays (called objects). Our implementation allows integration of large datasets with metadata (authors, affiliations, bibliographic references, units of measure etc.) into a single resource. 
It is equally suited to representing other geo-referenced volumetric quantities, beyond tomographic models, as well as (structured and unstructured) computational meshes. This approach can exploit the capabilities of the web browser as a computing platform: a series of in-page quick tools for comparative analysis between models will be presented, as well as visualisation techniques for tomographic layers in Google Maps and Google Earth. We are working on tools for conversion into common scientific formats such as netCDF, to allow easy visualisation in GEON-IDV or GMT.
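    The self-describing, name-value structure described above can be sketched as a small JSON record parsed with any standard library. Every field name in the record below is an illustrative assumption, not the authors' actual schema.

```python
import json

# Hypothetical self-describing tomographic model record: metadata, a grid
# description, and the gridded values, all in one JSON object.
model_text = """
{
  "metadata": {
    "authors": ["A. Example"],
    "units": "km/s",
    "reference": "doi:example"
  },
  "grid": {
    "type": "regular",
    "latitude":  {"start": -90.0, "step": 2.0, "count": 3},
    "longitude": {"start": 0.0,   "step": 2.0, "count": 3}
  },
  "values": [[8.0, 8.1, 8.2],
             [8.1, 8.2, 8.3],
             [8.2, 8.3, 8.4]]
}
"""

model = json.loads(model_text)        # any general-purpose JSON parser works
assert model["metadata"]["units"] == "km/s"
nlat = model["grid"]["latitude"]["count"]
assert len(model["values"]) == nlat   # the grid block describes the data shape
```

Because the record carries its own metadata and grid description, a consumer needs no side documentation to interpret the values, which is the point of a self-describing format.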

  1. Presenting Numerical Modelling of Explosive Volcanic Eruption to a General Public

    NASA Astrophysics Data System (ADS)

    Demaria, C.; Todesco, M.; Neri, A.; Blasi, G.

    2001-12-01

    Numerical modeling of explosive volcanic eruptions has been widely applied, during the last decades, to study the dispersion of pyroclastic flows along a volcano's flanks and to evaluate their impact on urban areas. Results from these transient multi-phase and multi-component simulations are often reproduced in the form of computer animations representing the spatial and temporal evolution of relevant flow variables (such as temperature or particle concentration). Despite being a sophisticated technical tool for analyzing and sharing modeling results within the scientific community, these animations look much like colorful cartoons of an erupting volcano and are especially suited to being shown to a general public. Thanks to their particular appeal, and to the large interest usually aroused by exploding volcanoes, these animations have been presented several times on television and in magazines and are currently displayed in a permanent exhibition at the Vesuvius Observatory in Naples. This work represents an effort to produce an accompanying tool for these animations, capable of explaining to a large audience the scientific meaning of what could otherwise look like a graphical exercise. For research aimed at the study of dangerous explosive volcanoes, improving the general understanding of scientific results plays an important role in risk perception. An educated population has a better chance of following appropriate behavior, i.e., behavior that could lead, in the long run, to a reduction of the potential risk. In this sense, correct dissemination of scientific results, while improving the population's confidence in the scientific community, should belong among the strategies adopted to mitigate volcanic risk. 
Given the relevance of this long-term goal, this work represents an interdisciplinary effort, combining scientific expertise with specific competences from modern communication science and risk-perception studies.

  2. Anomaly Detection for Next-Generation Space Launch Ground Operations

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.

    2010-01-01

    NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
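    The data-driven idea behind IMS-style anomaly detection can be sketched in a few lines: store vectors of nominal sensor readings as the "knowledge base", then score new data by its distance to the closest stored vector. The real Inductive Monitoring System clusters its training data first; this nearest-neighbour version is a deliberate simplification, and all sensor values are invented.

```python
import math

def distance(a, b):
    # Euclidean distance between two sensor-reading vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(nominal_samples):
    # "Knowledge base" of in-family behaviour (here, just the raw samples;
    # IMS proper would cluster them into ranges).
    return list(nominal_samples)

def score(knowledge_base, sample):
    # Deviation from nominal: distance to the nearest in-family vector.
    # Larger values mean the sample is further "out of family".
    return min(distance(sample, ref) for ref in knowledge_base)

kb = train([(20.0, 1.0), (21.0, 1.1), (22.0, 1.2)])  # e.g. temperature, pressure
print(score(kb, (21.5, 1.15)))  # close to nominal: small distance
print(score(kb, (40.0, 3.00)))  # out of family: large distance
```

Thresholding this score during monitoring is what turns the distance into an anomaly flag.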

  3. Disentangling representations of shape and action components in the tool network.

    PubMed

    Wang, Xiaoying; Zhuang, Tonghe; Shen, Jiasi; Bi, Yanchao

    2018-05-30

    Shape and manner of use are two key components of our knowledge about tools. Viewing tools preferentially activates a frontoparietal and occipitotemporal network, with dorsal regions implicated in the computation of tool-related actions and ventral areas in shape representation. Because shape and manner of manipulation are highly correlated for daily tools, whether they are independently represented in different regions remains inconclusive. In the current study, we collected fMRI data while participants viewed blocks of pictures of four daily tools (i.e., paintbrush, corkscrew, screwdriver, razor) in which shape and action (manner of manipulation for functional use) were orthogonally manipulated, to tease apart these two dimensions. Behavioral similarity judgments tapping object shape and finer aspects of actions (i.e., manner of motion, magnitude of arm movement, configuration of hand) were also collected to further disentangle the representation of object shape and different action components. Information analysis and representational similarity analysis were conducted on regional neural activation patterns of the tool-preferring network. In both analyses, the bilateral lateral occipitotemporal cortex showed robust shape representations but could not effectively distinguish between tool-use actions. The frontal and precentral regions represented kinematic action components, whereas the left parietal region (in the information analyses) exhibited coding of both shape and tool-use action. By teasing apart shape and action components, we found both dissociation and association between them within the tool network. Taken together, our study disentangles representations of object shape from finer tool-use action components in the tool network, revealing the potentially dissociable roles different tool-preferring regions play in tool processing. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Quantitative assessment of alkali-reactive aggregate mineral content through XRD using polished sections as a supplementary tool to RILEM AAR-1 (petrographic method)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castro, Nelia, E-mail: nelia.castro@ntnu.no; Sorensen, Bjorn E.; Broekmans, Maarten A.T.M.

    The mineral content of 5 aggregate samples from 4 different countries, including reactive and non-reactive aggregate types, was assessed quantitatively by X-ray diffraction (XRD) using polished sections. Additionally, electron probe microanalyzer (EPMA) mapping and cathodoluminescence (CL) were used to characterize the opal-CT identified in one of the aggregate samples. Critical review of results from polished sections against traditionally powdered specimens has demonstrated that, for fine-grained rocks without preferred orientation, the assessment of mineral content by XRD using polished sections may represent an advantage over traditional powder specimens. Comparison of data on mineral content and silica speciation with expansion data from the PARTNER project confirmed that the presence of opal-CT plays an important role in the reactivity of one of the studied aggregates. Used as a complementary tool to RILEM AAR-1, the methodology suggested in this paper has the potential to improve the strength of the petrographic method.

  5. GRACE, time-varying gravity, Earth system dynamics and climate change

    NASA Astrophysics Data System (ADS)

    Wouters, B.; Bonin, J. A.; Chambers, D. P.; Riva, R. E. M.; Sasgen, I.; Wahr, J.

    2014-11-01

    Continuous observations of temporal variations in the Earth's gravity field have recently become available at an unprecedented resolution of a few hundreds of kilometers. The gravity field is a product of the Earth's mass distribution, and these data—provided by the satellites of the Gravity Recovery And Climate Experiment (GRACE)—can be used to study the exchange of mass both within the Earth and at its surface. Since the launch of the mission in 2002, GRACE data has evolved from being an experimental measurement needing validation from ground truth, to a respected tool for Earth scientists representing a fixed bound on the total change and is now an important tool to help unravel the complex dynamics of the Earth system and climate change. In this review, we present the mission concept and its theoretical background, discuss the data and give an overview of the major advances GRACE has provided in Earth science, with a focus on hydrology, solid Earth sciences, glaciology and oceanography.

  6. Culture and Demography: From Reluctant Bedfellows to Committed Partners

    PubMed Central

    Bachrach, Christine A.

    2015-01-01

    Demography and culture have had a long but ambivalent relationship. Cultural influences are widely recognized as important for demographic outcomes, but are often “backgrounded” in demographic research. I argue that progress towards a more successful integration is feasible and suggest a network model of culture as a potential tool. The network model bridges both traditional (holistic and institutional) and contemporary (tool kit) models of culture used in the social sciences and offers a simple vocabulary for the diverse set of cultural concepts such as attitudes, beliefs and norms, and quantitative measures of how culture is organized. The proposed model conceptualizes culture as a nested network of meanings which are represented by schemas that range in complexity from simple concepts to multifaceted cultural models. I illustrate the potential value of the model using accounts of the cultural changes underpinning the transformation of marriage in the U.S. and point to developments in the social, cognitive and computational sciences that could facilitate the application of the model in empirical demographic research. PMID:24338643

  7. Image-based deep learning for classification of noise transients in gravitational wave detectors

    NASA Astrophysics Data System (ADS)

    Razzano, Massimiliano; Cuoco, Elena

    2018-05-01

    The detection of gravitational waves has inaugurated the era of gravitational astronomy and opened new avenues for the multimessenger study of cosmic sources. Thanks to their sensitivity, the Advanced LIGO and Advanced Virgo interferometers will probe a much larger volume of space and expand the capability of discovering new gravitational wave emitters. The characterization of these detectors is a primary task in order to recognize the main sources of noise and optimize the sensitivity of interferometers. Glitches are transient noise events that can impact the data quality of the interferometers and their classification is an important task for detector characterization. Deep learning techniques are a promising tool for the recognition and classification of glitches. We present a classification pipeline that exploits convolutional neural networks to classify glitches starting from their time-frequency evolution represented as images. We evaluated the classification accuracy on simulated glitches, showing that the proposed algorithm can automatically classify glitches on very fast timescales and with high accuracy, thus providing a promising tool for online detector characterization.

  8. Ensuring Sample Quality for Biomarker Discovery Studies - Use of ICT Tools to Trace Biosample Life-cycle.

    PubMed

    Riondino, Silvia; Ferroni, Patrizia; Spila, Antonella; Alessandroni, Jhessica; D'Alessandro, Roberta; Formica, Vincenzo; Della-Morte, David; Palmirotta, Raffaele; Nanni, Umberto; Roselli, Mario; Guadagni, Fiorella

    2015-01-01

    The growing demand for personalized medicine has marked the transition from empirical medicine to molecular medicine, aimed at predicting safer and more effective medical treatment for every patient while minimizing adverse effects. This shift has emphasized the importance of biomarker discovery studies and has led sample availability to assume a crucial role in biomedical research. Accordingly, a great interest in Biological Bank science has grown concomitantly. In biobanks, biological material and its accompanying data are collected, handled and stored in accordance with standard operating procedures (SOPs) and existing legislation. Sample quality is ensured by adherence to SOPs, and the whole sample life-cycle can be recorded by innovative tracking systems employing information technology (IT) tools for monitoring storage conditions and characterizing vast amounts of data. All the above will ensure proper sample exchangeability among research facilities and will represent the starting point of all future personalized medicine-based clinical trials. Copyright© 2015, International Institute of Anticancer Research (Dr. John G. Delinasios), All rights reserved.

  9. Automatically assisting human memory: a SenseCam browser.

    PubMed

    Doherty, Aiden R; Moulin, Chris J A; Smeaton, Alan F

    2011-10-01

    SenseCams have many potential applications as tools for lifelogging, including the possibility of use as a memory rehabilitation tool. Given that a SenseCam can log hundreds of thousands of images per year, it is critical that these be presented to the viewer in a manner that supports the aims of memory rehabilitation. In this article we report a software browser constructed with the aim of using the characteristics of memory to organise SenseCam images into a form that makes the wealth of information stored on SenseCam more accessible. To enable a large amount of visual information to be easily and quickly assimilated by a user, we apply a series of automatic content analysis techniques to structure the images into "events", suggest their relative importance, and select representative images for each. This minimises effort when browsing and searching. We provide anecdotes on use of such a system and emphasise the need for SenseCam images to be meaningfully sorted using such a browser.

  10. Investigation of priorities in water quality management based on correlations and variations.

    PubMed

    Boyacıoğlu, Hülya; Gündogdu, Vildan; Boyacıoğlu, Hayal

    2013-04-15

    The development of water quality assessment strategies investigating spatial and temporal changes caused by natural and anthropogenic phenomena is an important tool in management practices. This paper used cluster analysis, the water quality index method, sensitivity analysis and canonical correlation analysis to investigate priorities in pollution control activities. Data sets representing 22 surface water quality parameters were subject to analysis. Results revealed that organic pollution was a serious threat to overall water quality in the region. In addition, oil and grease, lead and mercury were the critical variables violating the standard. In contrast to inorganic variables, organic and physical-inorganic chemical parameters were influenced by variations in physical conditions (discharge, temperature). This study showed that information produced from the variations and correlations in water quality data sets can help identify priorities in water management activities. Moreover, statistical techniques and index methods are useful tools in the data-to-information transformation process. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Immunogenomics of gastrointestinal nematode infection in ruminants - breeding for resistance to produce food sustainably and safely.

    PubMed

    Sweeney, T; Hanrahan, J P; Ryan, M T; Good, B

    2016-09-01

    Gastrointestinal nematode (GIN) infection of ruminants represents a major health and welfare challenge for livestock producers worldwide. The emergence of anthelmintic resistance in important GIN species and the associated animal welfare concerns have stimulated interest in the development of alternative and more sustainable strategies aimed at the effective management of the impact of GINs. These integrative strategies include selective breeding using genetic/genomic tools, grazing management, biological control, nutritional supplementation, vaccination and targeted selective treatment. In this review, the logic of selecting for "resistance" to GIN infection as opposed to "resilience" or "tolerance" is discussed. This is followed by a review of the potential application of immunogenomics to genetic selection for animals that have the capacity to withstand the impact of GIN infection. Advances in relevant genomic technologies are highlighted together with how these tools can be advanced to support the integration of immunogenomic information into ruminant breeding programmes. © 2016 John Wiley & Sons Ltd.

  12. Commentary on two classroom observation systems: moving toward a shared understanding of effective teaching.

    PubMed

    Connor, Carol McDonald

    2013-12-01

    In this commentary, I make five points: first, that designing observation systems that actually predict students' outcomes is challenging; second, that systems that capture the complex and dynamic nature of the classroom learning environment are more likely to meet this challenge; third, that observation tools are most useful when developed to serve a particular purpose and put to that purpose; fourth, that technology can help; and fifth, that there are policy implications for valid and reliable classroom observation tools. The two observation systems presented in this special issue represent an important step forward and a move toward policy that promises to make a true difference in what is defined as high quality and effective teaching, what it looks like in the classroom, and how these practices can be more widely disseminated so that all children, including those attending under-resourced schools, can experience effective instruction, academic success and the lifelong accomplishment that follows. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  13. New advances in lower gastrointestinal bleeding management with embolotherapy

    PubMed Central

    Ierardi, Anna Maria; Urbano, Josè; De Marchi, Giuseppe; Micieli, Camilla; Duka, Ejona; Iacobellis, Francesca; Fontana, Federico

    2016-01-01

    Lower gastrointestinal bleeding (LGIB) is associated with high morbidity and mortality. Embolization is currently proposed as the first step in the treatment of acute, life-threatening LGIB when an endoscopic approach is not possible or is unsuccessful. As with most procedures performed in an emergency setting, time is a significant factor influencing outcome. Modern tools permit identifying and reaching the bleeding site faster than two-dimensional angiography. Non-selective cone-beam CT arteriography can identify a damaged vessel. Moreover, sophisticated software able to detect the vessel may facilitate direct placement of a microcatheter into the culprit vessel without the need for sequential angiography. A further important aspect is the use of an appropriate embolization technique and a safe and effective embolic agent. Current evidence shows that the use of detachable coils (with or without a triaxial system) and liquid embolics has proven advantages compared with other embolic agents. The present article analyses these modern tools, which make embolization of acute LGIB safer and more effective. PMID:26764281

  14. IntellEditS: intelligent learning-based editor of segmentations.

    PubMed

    Harrison, Adam P; Birkbeck, Neil; Sofka, Michal

    2013-01-01

    Automatic segmentation techniques, despite demonstrating excellent overall accuracy, can often produce inaccuracies in local regions. As a result, correcting segmentations remains an important task that is often laborious, especially when done manually for 3D datasets. This work presents a powerful tool called Intelligent Learning-Based Editor of Segmentations (IntellEditS) that minimizes user effort and further improves segmentation accuracy. The tool partners interactive learning with an energy-minimization approach to editing. Based on interactive user input, a discriminative classifier is trained and applied to the edited 3D region to produce soft voxel labeling. The labels are integrated into a novel energy functional along with the existing segmentation and image data. Unlike the state of the art, IntellEditS is designed to correct segmentation results represented not only as masks but also as meshes. In addition, IntellEditS accepts intuitive boundary-based user interactions. The versatility and performance of IntellEditS are demonstrated on both MRI and CT datasets consisting of varied anatomical structures and resolutions.

  15. ESTEEM: A Novel Framework for Qualitatively Evaluating and Visualizing Spatiotemporal Embeddings in Social Media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arendt, Dustin L.; Volkova, Svitlana

    Analyzing and visualizing large amounts of social media communications and contrasting short-term conversation changes over time and geo-locations is extremely important for commercial and government applications. Earlier approaches for large-scale text stream summarization used dynamic topic models and trending words. Instead, we rely on text embeddings: low-dimensional word representations in a continuous vector space where similar words are embedded near each other. This paper presents ESTEEM, a novel tool for visualizing and evaluating spatiotemporal embeddings learned from streaming social media texts. Our tool allows users to monitor and analyze query words and their closest neighbors with an interactive interface. We used state-of-the-art techniques to learn embeddings and developed a visualization to represent dynamically changing relations between words in social media over time and other dimensions. This is the first interactive visualization of streaming text representations learned from social media texts that also allows users to contrast differences across multiple dimensions of the data.
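    The "closest neighbors of a query word" operation that such a tool visualizes reduces to cosine similarity in the embedding space. The words and vectors below are toy values invented for illustration, not learned embeddings.

```python
import math

# Toy embedding table: 3-dimensional vectors for three words.
embeddings = {
    "storm":     [0.9, 0.1, 0.0],
    "hurricane": [0.8, 0.2, 0.1],
    "election":  [0.0, 0.9, 0.4],
}

def cosine(a, b):
    # Cosine similarity: dot product over the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def neighbours(query, k=1):
    # Rank all other words by similarity to the query word.
    scores = [(w, cosine(embeddings[query], v))
              for w, v in embeddings.items() if w != query]
    return sorted(scores, key=lambda s: -s[1])[:k]

print(neighbours("storm"))  # "hurricane" ranks above "election"
```

Tracking how these neighbour lists shift across time slices or regions is the spatiotemporal part of the analysis.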

  16. GRACE, time-varying gravity, Earth system dynamics and climate change.

    PubMed

    Wouters, B; Bonin, J A; Chambers, D P; Riva, R E M; Sasgen, I; Wahr, J

    2014-11-01

    Continuous observations of temporal variations in the Earth's gravity field have recently become available at an unprecedented resolution of a few hundred kilometers. The gravity field is a product of the Earth's mass distribution, and these data - provided by the satellites of the Gravity Recovery And Climate Experiment (GRACE) - can be used to study the exchange of mass both within the Earth and at its surface. Since the launch of the mission in 2002, GRACE data have evolved from an experimental measurement needing validation against ground truth into a respected resource for Earth scientists, one that places firm bounds on total mass change, and are now an important tool for helping to unravel the complex dynamics of the Earth system and climate change. In this review, we present the mission concept and its theoretical background, discuss the data, and give an overview of the major advances GRACE has provided in Earth science, with a focus on hydrology, solid Earth sciences, glaciology and oceanography.

  17. Multidirectional Translation of Environmental Health Science in Community Settings: The Case of Oxidative Stress Pathways.

    PubMed

    Sampson, Natalie R; Tetteh, Myra M; Schulz, Amy J; Ramirez, Erminia; Wilkins, Donele; de Majo, Ricardo; Mentz, Graciela; Johnson-Lawrence, Vicki

    2016-01-01

    Translation of environmental health science in vulnerable communities is particularly important to promote public health and reduce health inequities. We describe a structured, multidirectional process used to develop a suite of health promotion tools (e.g., fact sheets, video, maps) documenting patterning of local air pollution sources and availability of antioxidant-rich foods in Detroit, Michigan as factors that jointly affect oxidative stress (OS). OS underlies many pathological processes associated with air pollution, including asthma, metabolic syndrome, cancer, diabetes, and obesity. This translational effort involved a 2-year dialogue among representatives from community-based and environmental organizations, health service providers, and academic researchers. This dialogue led to development of tools, as well as new opportunities to inform related policies and research. Through this example, we highlight how collaborative partnerships can enhance multidirectional dialogue to inform translation of environmental health science by promoting consideration of multilevel risk factors, local priorities and context, and diverse audiences.

  18. Prioritizing therapeutic targets using patient-derived xenograft models

    PubMed Central

    Lodhia, K.A; Hadley, A; Haluska, P; Scott, C.L

    2015-01-01

    Effective systemic treatment of cancer relies on the delivery of agents with optimal therapeutic potential. The molecular age of medicine has provided genomic tools that can identify a large number of potential therapeutic targets in individual patients, heralding the promise of personalized treatment. However, determining which potential targets actually drive tumor growth and should be prioritized for therapy is challenging. Indeed, reliable molecular matches of target and therapeutic agent have been stringently validated in the clinic for only a small number of targets. Patient-derived xenografts (PDX) are tumor models developed in immunocompromised mice using tumor procured directly from the patient. As patient surrogates, PDX models represent a powerful tool for addressing individualized therapy. Challenges include humanizing the immune system of PDX models and ensuring high quality molecular annotation, in order to maximise insights for the clinic. Importantly, PDX can be sampled repeatedly and in parallel, to reveal clonal evolution, which may predict mechanisms of drug resistance and inform therapeutic strategy design. PMID:25783201

  19. [Ischemic origin of diabetic foot disease. Epidemiology, difficulties of diagnosis, options for prevention and revascularization].

    PubMed

    Kolossváry, Endre; Bánsághi, Zoltán; Szabó, Gábor Viktor; Járai, Zoltán; Farkas, Katalin

    2017-02-01

    "Diabetic foot" as a definition covers a multifactorial clinical condition. According to recent epidemiological data, the role of lower limb ischemia is becoming more influential than other pathological causes such as neuropathy, infection, and bone or soft tissue deformity. In diabetes, vascular disease leads to an increased risk of leg ulcers and minor or major amputations. The traditional diagnostic tools for recognizing peripheral arterial disease have limited value because of diabetes-specific clinical manifestations. Access to vascular centers with special expertise and diagnostic tools is a prerequisite for efficient diagnosis and timely recognition of peripheral arterial disease. In the treatment of the diabetic foot of ischemic origin, beyond effective medical treatment, revascularization (open vascular surgery or endovascular procedures) is of paramount importance for the prevention of limb loss. Vascular teams of vascular specialists, vascular surgeons and interventional radiologists in dedicated centers, working in multidisciplinary cooperation with other professions, represent a key public health resource for effective prevention. Orv. Hetil., 2017, 158(6), 203-211.

  20. The Assessment of Bipolar Disorder in Children and Adolescents

    PubMed Central

    Youngstrom, Eric A.; Freeman, Andrew J.; Jenkins, Melissa McKeown

    2010-01-01

    The overarching goal of this review is to examine the current best evidence for assessing bipolar disorder in children and adolescents and provide a comprehensive, evidence-based approach to diagnosis. Evidence-based assessment strategies are organized around the “3 Ps” of clinical assessment: Predict important criteria or developmental trajectories, Prescribe a change in treatment choice, and inform Process of treating the youth and his/her family. The review characterizes bipolar disorder in youths - specifically addressing bipolar diagnoses and clinical subtypes; then provides an actuarial approach to assessment - using prevalence of disorder, risk factors, and questionnaires; discusses treatment thresholds; and identifies practical measures of process and outcomes. The clinical tools and risk factors selected for inclusion in this review represent the best empirical evidence in the literature. By the end of the review, clinicians will have a framework and set of clinically useful tools with which to effectively make evidence-based decisions regarding the diagnosis of bipolar disorder in children and adolescents. PMID:19264268
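
    The actuarial step this review describes rests on Bayesian updating: a base rate (prior probability) is combined with a diagnostic likelihood ratio from a questionnaire or risk factor to yield a posterior probability, which is then compared against treatment thresholds. A minimal sketch of that arithmetic, with made-up numbers rather than the review's actual risk estimates:

```python
def update_probability(prior, likelihood_ratio):
    """Convert a prior probability to odds, apply a diagnostic
    likelihood ratio (DLR), and convert back to a probability --
    the arithmetic behind the probability nomogram used in
    actuarial (evidence-based) assessment."""
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical numbers for illustration: a 5% clinic base rate and a
# positive screen carrying a DLR of 9 (neither value is from the paper).
posterior = update_probability(0.05, 9.0)
print(round(posterior, 3))  # 0.321
```

    Chaining updates for multiple independent risk factors is the same operation applied repeatedly.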

  1. Can we use high precision metal isotope analysis to improve our understanding of cancer?

    PubMed

    Larner, Fiona

    2016-01-01

    High precision natural isotope analyses are widely used in geosciences to trace elemental transport pathways. The use of this analytical tool is increasing in nutritional and disease-related research. In recent months, a number of groups have shown the potential this technique has in providing new observations for various cancers when applied to trace metal metabolism. The deconvolution of isotopic signatures, however, relies on mathematical models and geochemical data, which are not representative of the system under investigation. In addition to relevant biochemical studies of protein-metal isotopic interactions, technological development both in terms of sample throughput and detection sensitivity of these elements is now needed to translate this novel approach into a mainstream analytical tool. Following this, essential background healthy population studies must be performed, alongside observational, cross-sectional disease-based studies. Only then can the sensitivity and specificity of isotopic analyses be tested alongside currently employed methods, and important questions such as the influence of cancer heterogeneity and disease stage on isotopic signatures be addressed.
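
    The deconvolution of isotopic signatures mentioned above typically starts from mass-balance mixing models borrowed from geochemistry. A minimal two-endmember sketch (illustrative per-mil values, not measured biomedical data) solves for the fraction contributed by one source:

```python
def mixing_fraction(delta_mix, delta_a, delta_b):
    """Two-endmember mixing: delta_mix = f * delta_a + (1 - f) * delta_b.
    Solve for f, the fraction contributed by endmember A.
    (A deliberate simplification: real deconvolutions must also weight
    endmembers by element concentration and propagate uncertainties.)"""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# Illustrative per-mil isotope values, not data from any study.
f = mixing_fraction(delta_mix=0.4, delta_a=1.0, delta_b=0.0)
print(f)  # 0.4
```

    The abstract's caveat applies directly here: the answer is only as good as how representative delta_a and delta_b are of the biological system under investigation.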

  2. Models and Simulations as a Service: Exploring the Use of Galaxy for Delivering Computational Models

    PubMed Central

    Walker, Mark A.; Madduri, Ravi; Rodriguez, Alex; Greenstein, Joseph L.; Winslow, Raimond L.

    2016-01-01

    We describe the ways in which Galaxy, a web-based reproducible research platform, can be used for web-based sharing of complex computational models. Galaxy allows users to seamlessly customize and run simulations on cloud computing resources, a concept we refer to as Models and Simulations as a Service (MaSS). To illustrate this application of Galaxy, we have developed a tool suite for simulating a high spatial-resolution model of the cardiac Ca2+ spark that requires supercomputing resources for execution. We also present tools for simulating models encoded in the SBML and CellML model description languages, thus demonstrating how Galaxy’s reproducible research features can be leveraged by existing technologies. Finally, we demonstrate how the Galaxy workflow editor can be used to compose integrative models from constituent submodules. This work represents an important and, to our knowledge, novel approach to making computational simulations more accessible to the broader scientific community. PMID:26958881

  3. Development of the ARISTOTLE webware for cloud-based rarefied gas flow modeling

    NASA Astrophysics Data System (ADS)

    Deschenes, Timothy R.; Grot, Jonathan; Cline, Jason A.

    2016-11-01

    Rarefied gas dynamics are important for a wide variety of applications. An improvement in the ability of general users to predict these gas flows will enable optimization of current processes and the discovery of future ones. Despite this potential, most rarefied simulation software is designed by and for experts in the community. This has resulted in low adoption of the methods outside of the immediate RGD community. This paper outlines an ongoing effort to create a rarefied gas dynamics simulation tool that can be used by a general audience. The tool leverages a direct simulation Monte Carlo (DSMC) library that is available to the entire community and a web-based simulation process that will enable all users to take advantage of high performance computing capabilities. First, the DSMC library and simulation architecture are described. Then the DSMC library is used to predict a number of representative transient gas flows that are applicable to the rarefied gas dynamics community. The paper closes with a summary and future direction.

  4. Culture and demography: from reluctant bedfellows to committed partners.

    PubMed

    Bachrach, Christine A

    2014-02-01

    Demography and culture have had a long but ambivalent relationship. Cultural influences are widely recognized as important for demographic outcomes but are often "backgrounded" in demographic research. I argue that progress toward a more successful integration is feasible and suggest a network model of culture as a potential tool. The network model bridges both traditional (holistic and institutional) and contemporary (tool kit) models of culture used in the social sciences and offers a simple vocabulary for a diverse set of cultural concepts, such as attitudes, beliefs, and norms, as well as quantitative measures of how culture is organized. The proposed model conceptualizes culture as a nested network of meanings represented by schemas that range in complexity from simple concepts to multifaceted cultural models. I illustrate the potential value of a model using accounts of the cultural changes underpinning the transformation of marriage in the United States and point to developments in the social, cognitive, and computational sciences that could facilitate the application of the model in empirical demographic research.

  5. Commentary on Two Classroom Observation Systems: Moving Toward a Shared Understanding of Effective Teaching

    PubMed Central

    Connor, Carol McDonald

    2016-01-01

    In this commentary, I make five points: that designing observation systems that actually predict students’ outcomes is challenging; second that systems that capture the complex and dynamic nature of the classroom learning environment are more likely to be able to meet this challenge; three, that observation tools are most useful when developed to serve a particular purpose and are put to that purpose; four that technology can help; and five, there are policy implications for valid and reliable classroom observation tools. The two observation systems presented in this special issue represent an important step forward and a move toward policy that promises to make a true difference in what is defined as high quality and effective teaching, what it looks like in the classroom, and how these practices can be more widely disseminated so that all children, including those attending under-resourced schools, can experience effective instruction, academic success and the lifelong accomplishment that follows. PMID:24341927

  6. Toward a view-oriented approach for aligning RDF-based biomedical repositories.

    PubMed

    Anguita, A; García-Remesal, M; de la Iglesia, D; Graf, N; Maojo, V

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The need for complementary access to multiple RDF databases has fostered new lines of research, but also entailed new challenges due to data representation disparities. While several approaches for RDF-based database integration have been proposed, those focused on schema alignment have become the most widely adopted. All state-of-the-art solutions for aligning RDF-based sources resort to a simple technique inherited from legacy relational database integration methods. This technique - known as element-to-element (e2e) mappings - is based on establishing 1:1 mappings between single primitive elements - e.g. concepts, attributes, relationships - belonging to the source and target schemas. However, due to the intrinsic nature of RDF - a representation language based on defining tuples <subject, predicate, object> - one may find RDF elements whose semantics vary dramatically when combined into a view involving other RDF elements, i.e., they depend on their context. Such elements cannot be adequately represented in the target schema with the traditional e2e approach, which cannot address this issue without explicitly modifying the target ontology and thus lacks the expressiveness required to reflect the intended semantics in the alignment information. Objectives: To enhance existing RDF schema alignment techniques with a mechanism for properly representing elements with context-dependent semantics, thus enabling users to perform more expressive alignments, including scenarios that cannot be adequately addressed by the existing approaches. Methods: Instead of establishing 1:1 correspondences between single primitive elements of the schemas, we propose adopting a view-based approach, which establishes mapping relationships between RDF subgraphs - the equivalent of views in traditional databases - rather than between single schema elements. This enables users to represent scenarios defined by context-dependent RDF elements that cannot be properly represented with the currently existing approaches. Results: We developed a software tool implementing our view-based strategy. The tool is currently being used in the European Commission funded p-medicine project, which aims to create a technological framework that integrates clinical and genomic data to facilitate the development of personalized drugs and therapies for cancer, based on the genetic profile of the patient. We used our tool to integrate different RDF-based databases - including repositories of clinical trials and DICOM images - using the Health Data Ontology Trunk (HDOT) ontology as the target schema. Conclusions: The importance of database integration methods and tools in biomedical research has been widely recognized. Modern research in this area - e.g. identification of disease biomarkers, or design of personalized therapies - relies heavily on the availability of a technical framework enabling researchers to uniformly access disparate repositories. We present a method and a tool implementing a novel alignment approach specifically designed to support and enhance the integration of RDF-based data sources at the schema (metadata) level. This approach provides greater expressiveness than other existing solutions and can resolve heterogeneity scenarios that cannot be properly represented using other state-of-the-art techniques.
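
    The view-based idea can be illustrated with a toy subgraph matcher: a mapping is anchored to a pattern of triples (a "view") rather than to a single schema element. The vocabulary below is invented for illustration and is not the HDOT or p-medicine schema; triples are plain tuples and '?'-prefixed terms are variables.

```python
def match(pattern, graph, binding=None):
    """Find all variable bindings under which a subgraph pattern
    (triples whose terms may be '?'-prefixed variables) occurs in the
    graph -- i.e. match the whole 'view', not one element."""
    binding = binding or {}
    if not pattern:
        return [binding]
    first, rest = pattern[0], pattern[1:]
    results = []
    for triple in graph:
        b = dict(binding)  # copy so sibling matches stay independent
        ok = True
        for p_term, g_term in zip(first, triple):
            if p_term.startswith("?"):
                # bind the variable, or check consistency if already bound
                if b.setdefault(p_term, g_term) != g_term:
                    ok = False
                    break
            elif p_term != g_term:
                ok = False
                break
        if ok:
            results.extend(match(rest, graph, b))
    return results

# Toy graph: 'value' only means "tumor size" when its subject is typed
# SizeMeasurement -- a context-dependent element that a 1:1 (e2e)
# mapping of 'value' alone cannot capture.
GRAPH = [
    ("m1", "rdf:type", "SizeMeasurement"),
    ("m1", "value", "3.2"),
    ("m2", "rdf:type", "WeightMeasurement"),
    ("m2", "value", "70"),
]
VIEW = [("?m", "rdf:type", "SizeMeasurement"), ("?m", "value", "?v")]
print(match(VIEW, GRAPH))  # [{'?m': 'm1', '?v': '3.2'}]
```

    Mapping the whole VIEW to a target concept picks out only the size value, which is exactly the context sensitivity an element-level mapping loses.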

  7. Interpolity exchange of basalt tools facilitated via elite control in Hawaiian archaic states

    PubMed Central

    Kirch, Patrick V.; Mills, Peter R.; Lundblad, Steven P.; Sinton, John; Kahn, Jennifer G.

    2012-01-01

    Ethnohistoric accounts of late precontact Hawaiian archaic states emphasize the independence of chiefly controlled territories (ahupua‘a) based on an agricultural, staple economy. However, elite control of unevenly distributed resources, such as high-quality volcanic rock for adze production, may have provided an alternative source of economic power. To test this hypothesis we used nondestructive energy-dispersive X-ray fluorescence (ED-XRF) analysis of 328 lithic artifacts from 36 archaeological features in the Kahikinui district, Maui Island, to geochemically characterize the source groups. This process was followed by a limited sampling using destructive wavelength-dispersive X-ray fluorescence (WD-XRF) analysis to more precisely characterize certain nonlocal source groups. Seventeen geochemical groups were defined, eight of which represent extra-Maui Island sources. Although the majority of stone tools were derived from Maui Island sources (71%), a significant quantity (27%) of tools derived from extraisland sources, including the large Mauna Kea quarry on Hawai‘i Island as well as quarries on O‘ahu, Moloka‘i, and Lāna‘i islands. Importantly, tools quarried from extralocal sources are found in the highest frequency in elite residential features and in ritual contexts. These results suggest a significant role for a wealth economy based on the control and distribution of nonagricultural goods and resources during the rise of the Hawaiian archaic states. PMID:22203984

  8. GEOquery: a bridge between the Gene Expression Omnibus (GEO) and BioConductor.

    PubMed

    Davis, Sean; Meltzer, Paul S

    2007-07-15

    Microarray technology has become a standard molecular biology tool. Experimental data have been generated on a huge number of organisms, tissue types, treatment conditions and disease states. The Gene Expression Omnibus (Barrett et al., 2005), developed by the National Center for Biotechnology Information (NCBI) at the National Institutes of Health, is a repository of nearly 140,000 gene expression experiments. The BioConductor project (Gentleman et al., 2004) is an open-source and open-development software project built in the R statistical programming environment (R Development Core Team, 2005) for the analysis and comprehension of genomic data. The tools contained in the BioConductor project represent many state-of-the-art methods for the analysis of microarray and genomics data. We have developed a software tool that allows access to the wealth of information within GEO directly from BioConductor, eliminating many of the formatting and parsing problems that have made such analyses labor-intensive in the past. The software, called GEOquery, effectively establishes a bridge between GEO and BioConductor. Easy access to GEO data from BioConductor will likely lead to new analyses of GEO data using novel and rigorous statistical and bioinformatic tools. Facilitating analyses and meta-analyses of microarray data will increase the efficiency with which biologically important conclusions can be drawn from published genomic data. GEOquery is available as part of the BioConductor project.
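
    GEOquery itself is an R/BioConductor package, but the parsing problem it solves is easy to illustrate: GEO serves records in the line-oriented SOFT format, where '^' lines open an entity and '!' lines carry attribute = value pairs (real files also contain '#' column descriptions and data tables). A minimal, illustrative parser for just the entity and attribute lines:

```python
def parse_soft(text):
    """Minimal parser for the line-oriented SOFT format used by GEO:
    '^' lines open an entity, '!' lines are 'key = value' attributes.
    A sketch of the format only -- GEOquery does far more, including
    parsing the embedded expression data tables."""
    entities = []
    for line in text.splitlines():
        if line.startswith("^"):
            kind, _, name = line[1:].partition(" = ")
            entities.append({"type": kind, "name": name, "attrs": {}})
        elif line.startswith("!") and entities:
            key, _, value = line[1:].partition(" = ")
            entities[-1]["attrs"][key] = value
    return entities

# A tiny invented SOFT fragment (not a real GEO record).
SAMPLE = """^SAMPLE = GSM12345
!Sample_title = liver, control replicate 1
!Sample_organism_ch1 = Homo sapiens"""
records = parse_soft(SAMPLE)
print(records[0]["name"], "-", records[0]["attrs"]["Sample_title"])
```

    Turning such records into analysis-ready data structures is the "bridge" the abstract describes.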

  9. A Usability Study of Users’ Perceptions Toward a Multimedia Computer-Assisted Learning Tool for Neuroanatomy

    PubMed Central

    Gould, Douglas J.; Terrell, Mark A.; Fleming, Jo

    2015-01-01

    This usability study evaluated users’ perceptions of a multimedia prototype for a new e-learning tool: Anatomy of the Central Nervous System: A Multimedia Course. Usability testing is a collection of formative evaluation methods that inform the developmental design of e-learning tools to maximize user acceptance, satisfaction, and adoption. Sixty-two study participants piloted the prototype and completed a usability questionnaire designed to measure two usability properties: program need and program applicability. Statistical analyses were used to test the hypothesis that the multimedia prototype was well designed and highly usable: that it was 1) highly needed across a spectrum of educational contexts, 2) highly applicable in supporting the pedagogical processes of teaching and learning neuroanatomy, and 3) highly usable by all types of users. Three independent variables represented user differences: level of expertise (faculty vs. student), age, and gender. Analysis of the results supports the research hypotheses that the prototype was designed well for different types of users in various educational contexts and for supporting the pedagogy of neuroanatomy. In addition, the results suggest that the multimedia program will be most useful as a neuroanatomy review tool for health-professions students preparing for licensing or board exams. This study demonstrates the importance of integrating quality properties of usability with principles of human learning during the instructional design process for multimedia products. PMID:19177405

  10. Population Genetic Structure and Colonisation History of the Tool-Using New Caledonian Crow

    PubMed Central

    Abdelkrim, Jawad; Hunt, Gavin R.; Gray, Russell D.; Gemmell, Neil J.

    2012-01-01

    New Caledonian crows exhibit considerable variation in tool making between populations. Here, we present the first study of the species’ genetic structure over its geographical distribution. We collected feathers from crows on mainland Grande Terre, the inshore island of Toupéti, and the nearby island of Maré where it is believed birds were introduced after European colonisation. We used nine microsatellite markers to establish the genotypes of 136 crows from these islands and classical population genetic tools as well as Approximate Bayesian Computations to explore the distribution of genetic diversity. We found that New Caledonian crows most likely separate into three main distinct clusters: Grande Terre, Toupéti and Maré. Furthermore, Toupéti and Maré crows represent a subset of the genetic diversity observed on Grande Terre, confirming their mainland origin. The genetic data are compatible with a colonisation of Maré taking place after European colonisation around 1900. Importantly, we observed (1) moderate, but significant, genetic differentiation across Grande Terre, and (2) that the degree of differentiation between populations on the mainland increases with geographic distance. These data indicate that despite individual crows’ potential ability to disperse over large distances, most gene flow occurs over short distances. The temporal and spatial patterns described provide a basis for further hypothesis testing and investigation of the geographical variation observed in the tool skills of these crows. PMID:22590576
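
    The classical population-genetic summaries used in studies like this start from allele frequencies; one of the simplest is expected heterozygosity at a locus, H_e = 1 - sum(p_i^2), which drops when an island subset carries only part of the source population's diversity. A sketch with invented allele counts (not the study's microsatellite data):

```python
def expected_heterozygosity(allele_counts):
    """Expected heterozygosity at one locus: 1 - sum(p_i^2), where the
    p_i are allele frequencies. Lower values indicate reduced genetic
    diversity, e.g. after a founder event on an island."""
    total = sum(allele_counts)
    return 1.0 - sum((c / total) ** 2 for c in allele_counts)

# Invented allele counts at one microsatellite locus: a diverse
# mainland sample vs. a founder-reduced island subset.
mainland = [40, 30, 20, 10]
island = [70, 30]
print(round(expected_heterozygosity(mainland), 3))  # 0.7
print(round(expected_heterozygosity(island), 3))    # 0.42
```

    The pattern reported for Toupéti and Maré (a subset of Grande Terre's diversity) would show up as exactly this kind of drop, averaged over loci.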

  11. Geocoded data structures and their applications to Earth science investigations

    NASA Technical Reports Server (NTRS)

    Goldberg, M.

    1984-01-01

    A geocoded data structure is a means for digitally representing a geographically referenced map or image. The characteristics of representative cellular, linked, and hybrid geocoded data structures are reviewed. The data processing requirements of Earth science projects at the Goddard Space Flight Center and the basic tools of geographic data processing are described. Specific ways that new geocoded data structures can be used to adapt these tools to scientists' needs are presented. These include: expanding analysis and modeling capabilities; simplifying the merging of data sets from diverse sources; and saving computer storage space.
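
    A cellular geocoded structure can be illustrated with run-length encoding, a classic way to compress raster rows while keeping them easy to merge and analyze. A minimal sketch with an invented land-cover row:

```python
def rle_encode(row):
    """Run-length encode one raster row (a common cellular geocoded
    structure): store (value, run_length) pairs instead of every cell."""
    runs = []
    for cell in row:
        if runs and runs[-1][0] == cell:
            runs[-1][1] += 1
        else:
            runs.append([cell, 1])
    return runs

def rle_decode(runs):
    """Expand (value, run_length) pairs back into the full row."""
    return [value for value, length in runs for _ in range(length)]

# One row of a land-cover grid: W = water, F = forest, U = urban.
row = list("WWWWFFFFFFUU")
runs = rle_encode(row)
print(runs)                     # [['W', 4], ['F', 6], ['U', 2]]
assert rle_decode(runs) == row  # lossless, with far fewer entries
```

    The storage saving grows with the size of homogeneous regions, which is one way such structures save computer storage space when merging large map layers.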

  12. Spatial clustering of high load ocular Chlamydia trachomatis infection in trachoma: a cross-sectional population-based study.

    PubMed

    Last, Anna; Burr, Sarah; Alexander, Neal; Harding-Esch, Emma; Roberts, Chrissy H; Nabicassa, Meno; Cassama, Eunice Teixeira da Silva; Mabey, David; Holland, Martin; Bailey, Robin

    2017-07-31

    Chlamydia trachomatis (Ct) is the most common cause of bacterial sexually transmitted infection and infectious cause of blindness (trachoma) worldwide. Understanding the spatial distribution of Ct infection may enable us to identify populations at risk and improve our understanding of Ct transmission. In this study, we sought to investigate the spatial distribution of Ct infection and the clinical features associated with high Ct load in trachoma-endemic communities on the Bijagós Archipelago (Guinea Bissau). We collected 1507 conjunctival samples and corresponding detailed clinical data during a cross-sectional population-based geospatially representative trachoma survey. We used droplet digital PCR to estimate Ct load on conjunctival swabs. Geostatistical tools were used to investigate clustering of ocular Ct infections. Spatial clusters (independent of age and gender) of individuals with high Ct loads were identified using local indicators of spatial association. We did not detect clustering of individuals with low load infections. These data suggest that infections with high bacterial load may be important in Ct transmission. These geospatial tools may be useful in the study of ocular Ct transmission dynamics and as part of trachoma surveillance post-treatment, to identify clusters of infection and thresholds of Ct load that may be important foci of re-emergent infection in communities. © FEMS 2017.
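
    One common local indicator of spatial association is local Moran's I (the abstract does not name the exact statistic used, so this is a generic sketch): each site gets a score that is large and positive when its value resembles its neighbours' values. A minimal version with invented loads and binary adjacency weights:

```python
def local_morans_i(values, neighbors):
    """Local Moran's I for each site i: I_i = (z_i / m2) * sum_j w_ij * z_j,
    where z are mean-centred values, m2 is their variance, and w_ij = 1
    for listed neighbours. Positive I_i means site i resembles its
    neighbours (a high-high or low-low cluster); negative means a
    spatial outlier."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    m2 = sum(d * d for d in z) / n
    return [z[i] / m2 * sum(z[j] for j in neighbors[i]) for i in range(n)]

# Five villages on a line; neighbours are the adjacent indices.
loads = [9.0, 8.0, 1.0, 1.0, 1.0]  # invented Ct loads, not study data
adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
scores = local_morans_i(loads, adjacency)
print([round(s, 2) for s in scores])  # [1.47, 0.59, -0.22, 1.32, 0.66]
```

    Sites 0 and 1 form the kind of high-load cluster the survey flags; site 2 sits between regimes and scores negatively. Significance testing (e.g. by permutation) would be needed before calling any site a cluster.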

  13. Realising the Potential of Urine and Saliva as Diagnostic Tools in Sport and Exercise Medicine.

    PubMed

    Lindsay, Angus; Costello, Joseph T

    2017-01-01

    Accurate monitoring of homeostatic perturbations following various psychophysiological stressors is essential in sports and exercise medicine. Various biomarkers are routinely used as monitoring tools in both clinical and elite sport settings. Blood collection and muscle biopsies, both invasive in nature, are considered the gold standard for the analysis of these biomarkers in exercise science. Exploring non-invasive methods of collecting and analysing biomarkers that are capable of providing accurate information regarding exercise-induced physiological and psychological stress is of obvious practical importance. This review describes the potential benefits, and the limitations, of using saliva and urine to ascertain biomarkers capable of identifying important stressors that are routinely encountered before, during, or after intense or unaccustomed exercise, competition, over-training, and inappropriate recovery. In particular, we focus on urinary and saliva biomarkers that have previously been used to monitor muscle damage, inflammation, cardiovascular stress, oxidative stress, hydration status, and brain distress. Evidence is provided from a range of empirical studies suggesting that urine and saliva are both capable of identifying various stressors. Although additional research regarding the efficacy of using urine and/or saliva to indicate the severity of exercise-induced psychophysiological stress is required, it is likely that these non-invasive biomarkers will represent "the future" in sports and exercise medicine.

  14. A diagnostic approach for cerebral palsy in the genomic era.

    PubMed

    Lee, Ryan W; Poretti, Andrea; Cohen, Julie S; Levey, Eric; Gwynn, Hilary; Johnston, Michael V; Hoon, Alexander H; Fatemi, Ali

    2014-12-01

    An ongoing challenge in children presenting with motor delay/impairment early in life is to identify neurogenetic disorders with a clinical phenotype, which can be misdiagnosed as cerebral palsy (CP). To help distinguish patients in these two groups, conventional magnetic resonance imaging of the brain has been of great benefit in "unmasking" many of these genetic etiologies and has provided important clues to differential diagnosis in others. Recent advances in molecular genetics such as chromosomal microarray and next-generation sequencing have further revolutionized the understanding of etiology by more precisely classifying these disorders with a molecular cause. In this paper, we present a review of neurogenetic disorders masquerading as cerebral palsy evaluated at one institution. We have included representative case examples of children presenting with dyskinetic, spastic, and ataxic phenotypes, with the intent to highlight the time-honored approach of using clinical tools of history and examination to focus the subsequent etiologic search with advanced neuroimaging modalities and molecular genetic tools. A precise diagnosis of these masqueraders and their differentiation from CP is important in terms of therapy, prognosis, and family counseling. In summary, this review serves as a continued call to remain vigilant for current and other to-be-discovered neurogenetic masqueraders of cerebral palsy, thereby optimizing care for patients and their families.

  15. A Diagnostic Approach for Cerebral Palsy in the Genomic Era

    PubMed Central

    Lee, Ryan W.; Poretti, Andrea; Cohen, Julie S.; Levey, Eric; Gwynn, Hilary; Johnston, Michael V.; Hoon, Alexander H.; Fatemi, Ali

    2014-01-01

    An ongoing challenge in children presenting with motor delay/impairment early in life is to identify neurogenetic disorders with a clinical phenotype which can be misdiagnosed as cerebral palsy (CP). To help distinguish patients in these two groups, conventional magnetic resonance imaging (MRI) of the brain has been of great benefit in “unmasking” many of these genetic etiologies and has provided important clues to differential diagnosis in others. Recent advances in molecular genetics such as chromosomal microarray and next generation sequencing have further revolutionized the understanding of etiology by more precisely classifying these disorders with a molecular cause. In this paper, we present a review of neurogenetic disorders masquerading as cerebral palsy evaluated at one institution. We have included representative case examples of children presenting with dyskinetic, spastic and ataxic phenotypes, with the intent to highlight the time-honored approach of using clinical tools of history and examination to focus the subsequent etiologic search with advanced neuroimaging modalities and molecular genetic tools. A precise diagnosis of these masqueraders and their differentiation from CP is important in terms of therapy, prognosis, and family counseling. In summary, this review serves as a continued call to remain vigilant for current and other to-be-discovered neurogenetic masqueraders of cerebral palsy, thereby optimizing care for patients and their families. PMID:25280894

  16. Motor-Iconicity of Sign Language Does Not Alter the Neural Systems Underlying Tool and Action Naming

    ERIC Educational Resources Information Center

    Emmorey, Karen; Grabowski, Thomas; McCullough, Stephen; Damasio, Hannah; Ponto, Laurie; Hichwa, Richard; Bellugi, Ursula

    2004-01-01

    Positron emission tomography was used to investigate whether the motor-iconic basis of certain forms in American Sign Language (ASL) partially alters the neural systems engaged during lexical retrieval. Most ASL nouns denoting tools and ASL verbs referring to tool-based actions are produced with a handshape representing the human hand holding a…

  17. Wiki Tools in the Preparation and Support of e-Learning Courses

    ERIC Educational Resources Information Center

    Jancarik, Antonin; Jancarikova, Katerina

    2010-01-01

    Wiki tools, which became known mainly thanks to the Wikipedia encyclopedia, represent quite a new phenomenon on the Internet. The work presented here deals with three areas connected to a possible use of wiki tools for the preparation of an e-learning course. To what extent does Wikipedia.com contain terms necessary for scientific lectures at the…

  18. Terrace Layout Using a Computer Assisted System

    USDA-ARS?s Scientific Manuscript database

    Development of a web-based terrace design tool based on the MOTERR program is presented, along with representative layouts for conventional and parallel terrace systems. Using digital elevation maps and geographic information systems (GIS), this tool utilizes personal computers to rapidly construct ...

  19. Influence diagrams as oil spill decision science tools

    EPA Science Inventory

    Making inferences on risks to ecosystem services (ES) from ecological crises can be more reliably handled using decision science tools. Influence diagrams (IDs) are probabilistic networks that explicitly represent the decisions related to a problem and evidence of their influence...

  20. Respiratory mechanics to understand ARDS and guide mechanical ventilation.

    PubMed

    Mauri, Tommaso; Lazzeri, Marta; Bellani, Giacomo; Zanella, Alberto; Grasselli, Giacomo

    2017-11-30

    As precision medicine is becoming a standard of care in selecting tailored rather than average treatments, physiological measurements might represent the first step in applying personalized therapy in the intensive care unit (ICU). A systematic assessment of respiratory mechanics in patients with the acute respiratory distress syndrome (ARDS) could represent a step in this direction, for two main reasons. On the one hand, respiratory mechanics are a powerful physiological method to understand the severity of this syndrome in each single patient. Decreased respiratory system compliance, for example, is associated with low end-expiratory lung volume and more severe lung injury. On the other hand, respiratory mechanics might guide protective mechanical ventilation settings. Improved gravitationally dependent regional lung compliance could support the selection of positive end-expiratory pressure and maximize alveolar recruitment. Moreover, the association between driving airway pressure and mortality in ARDS patients potentially underlines the importance of sizing tidal volume on respiratory system compliance rather than on predicted body weight. The present review article aims to describe the main alterations of respiratory mechanics in ARDS as a potent bedside tool to understand severity and guide mechanical ventilation settings, thus representing a readily available clinical resource for ICU physicians.
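The bedside quantities discussed above follow from two standard formulas: driving pressure is plateau pressure minus PEEP, and static compliance is tidal volume divided by driving pressure. A minimal sketch (the patient values below are hypothetical, not from the article):

```python
# Standard static respiratory mechanics formulas (illustrative sketch):
#   driving pressure dP (cmH2O) = Pplat - PEEP
#   static compliance Crs (ml/cmH2O) = VT / dP

def driving_pressure(p_plat_cmH2O: float, peep_cmH2O: float) -> float:
    """Driving pressure in cmH2O: plateau pressure minus PEEP."""
    return p_plat_cmH2O - peep_cmH2O

def static_compliance(vt_ml: float, p_plat_cmH2O: float, peep_cmH2O: float) -> float:
    """Static respiratory system compliance in ml/cmH2O."""
    return vt_ml / driving_pressure(p_plat_cmH2O, peep_cmH2O)

if __name__ == "__main__":
    # Hypothetical patient: VT 420 ml, Pplat 20 cmH2O, PEEP 8 cmH2O
    dp = driving_pressure(20, 8)          # 12 cmH2O
    crs = static_compliance(420, 20, 8)   # 35 ml/cmH2O
    print(dp, crs)
```

Sizing tidal volume on compliance rather than predicted body weight amounts to holding `dp` below a target while letting `vt_ml` vary with `crs`.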

  1. Promoting innovative business in the fishery sector in West Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Nurhayati, A.; Aisah, I.; Supriatna, A. K.

    2018-04-01

    West Java represents an important fisheries production center in Indonesia, owing to its abundant capture and aquaculture resources. However, the intrinsic characteristics of fish products, such as being perishable, voluminous, and seasonal, currently prevent fisheries from making a significant economic contribution to the province. Accordingly, this research aimed to analyze and identify leverage factors that will lead to fishery-based innovative business in West Java. The data used in this research were primary and secondary, collected through surveys involving 30 respondents representing fish processors and the same number representing consumers. A Focus Group Discussion (FGD) was also carried out to verify the collected data. The analytical tool adopted in this research was the fishery triangle product model. Based on the analyses, the factors influencing the success of a fishery innovative business in West Java, Indonesia were found to be, consecutively: the existence of derivative products, product processing innovativeness, product price competitiveness, market place, and promotion. Based on the fishery triangle product model, it was found that fish onboard handling, post-harvest handling, and processing were in the development stage, and these production nodes therefore need particularly high attention.

  2. Assembling networks of microbial genomes using linear programming.

    PubMed

    Holloway, Catherine; Beiko, Robert G

    2010-11-20

    Microbial genomes exhibit complex sets of genetic affinities due to lateral genetic transfer. Assessing the relative contributions of parent-to-offspring inheritance and gene sharing is a vital step in understanding the evolutionary origins and modern-day function of an organism, but recovering and showing these relationships is a challenging problem. We have developed a new approach that uses linear programming to find between-genome relationships, by treating tables of genetic affinities (here, represented by transformed BLAST e-values) as an optimization problem. Validation trials on simulated data demonstrate the effectiveness of the approach in recovering and representing vertical and lateral relationships among genomes. Application of the technique to a set comprising Aquifex aeolicus and 75 other thermophiles showed an important role for large genomes as 'hubs' in the gene sharing network, and suggested that genes are preferentially shared between organisms with similar optimal growth temperatures. We were also able to discover distinct and common genetic contributors to each sequenced representative of genus Pseudomonas. The linear programming approach we have developed can serve as an effective inference tool in its own right, and can be an efficient first step in a more-intensive phylogenomic analysis.
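The abstract above describes turning tables of transformed BLAST e-values into an optimization problem over between-genome links. The paper solves this with linear programming; the toy sketch below (all genome names and e-values invented) shows the affinity transform and, for a four-genome instance, brute-forces the constrained edge selection in place of an LP solver:

```python
import itertools
import math

# Toy stand-in for the approach described above (all numbers hypothetical):
# transform pairwise BLAST e-values into affinities (-log10), then pick the
# edge set maximizing total affinity subject to a per-genome degree cap.
# The paper formulates this as a linear program; here we simply enumerate.

evalues = {           # symmetric pairwise best-hit e-values
    ("A", "B"): 1e-50, ("A", "C"): 1e-10,
    ("A", "D"): 1e-3,  ("B", "C"): 1e-40,
    ("B", "D"): 1e-2,  ("C", "D"): 1e-30,
}
affinity = {pair: -math.log10(e) for pair, e in evalues.items()}

def best_edge_set(affinity, max_degree=2):
    """Exhaustively find the max-affinity edge subset with bounded degree."""
    edges = list(affinity)
    best, best_score = frozenset(), 0.0
    for k in range(len(edges) + 1):
        for subset in itertools.combinations(edges, k):
            degree = {}
            for u, v in subset:
                degree[u] = degree.get(u, 0) + 1
                degree[v] = degree.get(v, 0) + 1
            if degree and max(degree.values()) > max_degree:
                continue
            score = sum(affinity[e] for e in subset)
            if score > best_score:
                best, best_score = frozenset(subset), score
    return best, best_score

network, score = best_edge_set(affinity)
```

On real data the exponential enumeration is infeasible, which is precisely why the authors use linear programming.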

  3. Speech and language support: How physicians can identify and treat speech and language delays in the office setting.

    PubMed

    Moharir, Madhavi; Barnett, Noel; Taras, Jillian; Cole, Martha; Ford-Jones, E Lee; Levin, Leo

    2014-01-01

    Failure to recognize and intervene early in speech and language delays can lead to multifaceted and potentially severe consequences for early child development and later literacy skills. While routine evaluations of speech and language during well-child visits are recommended, there is no standardized (office) approach to facilitate this. Furthermore, extensive wait times for speech and language pathology consultation represent valuable lost time for the child and family. Using speech and language expertise and paediatric collaboration, key content for an office-based tool was developed, with three aims: early and accurate identification of speech and language delays as well as children at risk for literacy challenges; appropriate referral to speech and language services when required; and teaching and, thus, empowering parents to create rich and responsive language environments at home. Using this tool, in combination with the Canadian Paediatric Society's Read, Speak, Sing and Grow Literacy Initiative, physicians will be better positioned to offer practical strategies to caregivers to enhance children's speech and language capabilities. The tool represents a strategy to evaluate speech and language delays. It depicts age-specific linguistic/phonetic milestones and suggests interventions. The tool represents a practical interim treatment while the family is waiting for formal speech and language therapy consultation.

  4. enoLOGOS: a versatile web tool for energy normalized sequence logos

    PubMed Central

    Workman, Christopher T.; Yin, Yutong; Corcoran, David L.; Ideker, Trey; Stormo, Gary D.; Benos, Panayiotis V.

    2005-01-01

    enoLOGOS is a web-based tool that generates sequence logos from various input sources. Sequence logos have become a popular way to graphically represent DNA and amino acid sequence patterns from a set of aligned sequences. Each position of the alignment is represented by a column of stacked symbols with its total height reflecting the information content in this position. Currently, the available web servers are able to create logo images from a set of aligned sequences, but none of them generates weighted sequence logos directly from energy measurements or other sources. With the advent of high-throughput technologies for estimating the contact energy of different DNA sequences, tools that can create logos directly from binding affinity data are useful to researchers. enoLOGOS generates sequence logos from a variety of input data, including energy measurements, probability matrices, alignment matrices, count matrices and aligned sequences. Furthermore, enoLOGOS can represent the mutual information of different positions of the consensus sequence, a unique feature of this tool. Another web interface for our software, C2H2-enoLOGOS, generates logos for the DNA-binding preferences of the C2H2 zinc-finger transcription factor family members. enoLOGOS and C2H2-enoLOGOS are accessible over the web at . PMID:15980495
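The column height described above, the information content, is the quantity any logo generator computes from an alignment. A minimal sketch of that calculation for DNA (the example sequences are invented; the small-sample correction is omitted for brevity):

```python
import math
from collections import Counter

def column_information(sequences):
    """Per-column information content (bits) for aligned DNA sequences.

    IC = 2 - H, where H = -sum(p * log2(p)) over the observed bases and
    2 bits is the maximum entropy of a 4-letter alphabet.
    """
    ics = []
    for i in range(len(sequences[0])):
        counts = Counter(seq[i] for seq in sequences)
        total = sum(counts.values())
        entropy = -sum((n / total) * math.log2(n / total)
                       for n in counts.values())
        ics.append(2.0 - entropy)
    return ics

# Toy alignment: column 1 is perfectly conserved (T), so its stack would
# reach the full 2-bit height; columns 0 and 2 are partially conserved.
ic = column_information(["ATG", "ATG", "ATA", "CTG"])
```

enoLOGOS generalizes this by letting the per-symbol weights come from energy measurements or probability matrices rather than only from counts.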

  5. Identifying Understudied Nuclear Reactions by Text-mining the EXFOR Experimental Nuclear Reaction Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirdt, J.A.; Brown, D.A., E-mail: dbrown@bnl.gov

    The EXFOR library contains the largest collection of experimental nuclear reaction data available as well as the data's bibliographic information and experimental details. We text-mined the REACTION and MONITOR fields of the ENTRYs in the EXFOR library in order to identify understudied reactions and quantities. Using the results of the text-mining, we created an undirected graph from the EXFOR datasets with each graph node representing a single reaction and quantity and graph links representing the various types of connections between these reactions and quantities. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. We use various graph theoretical tools to identify important yet understudied reactions and quantities in EXFOR. Although we identified a few cross sections relevant for shielding applications and isotope production, mostly we identified charged particle fluence monitor cross sections. As a side effect of this work, we learn that our abstract graph is typical of other real-world graphs.

  6. New Aspects of Gene-Silencing for the Treatment of Cardiovascular Diseases

    PubMed Central

    Koenig, Olivia; Walker, Tobias; Perle, Nadja; Zech, Almuth; Neumann, Bernd; Schlensak, Christian; Wendel, Hans-Peter; Nolte, Andrea

    2013-01-01

    Coronary heart disease (CHD), mainly caused by atherosclerosis, represents the single leading cause of death in industrialized countries. Besides the classical interventional therapies, new applications for treatment of vascular wall pathologies are appearing on the horizon. RNA interference (RNAi) represents a novel therapeutic strategy due to sequence-specific gene-silencing through the use of small interfering RNA (siRNA). The modulation of gene expression by short RNAs provides a powerful tool to theoretically silence any disease-related or disease-promoting gene of interest. In this review we outline the RNAi mechanisms, the currently used delivery systems and their possible applications to the cardiovascular system. In particular, the optimization of the targeting and transfection procedures could enhance the efficiency of siRNA delivery drastically and might open the way to clinical applicability. The findings of recent years may point the way to new innovative therapies and could play an important role in treating CHD in the future. PMID:24276320

  7. Transcriptomic resources for environmental risk assessment: a case study in the Venice lagoon.

    PubMed

    Milan, M; Pauletto, M; Boffo, L; Carrer, C; Sorrentino, F; Ferrari, G; Pavan, L; Patarnello, T; Bargelloni, L

    2015-02-01

    The development of new resources to evaluate the environmental status is becoming increasingly important, representing a key challenge for ocean and coastal management. Recently, the employment of transcriptomics in aquatic toxicology has led to increasing initiatives proposing to integrate eco-toxicogenomics in the evaluation of marine ecosystem health. However, several technical issues need to be addressed before introducing genomics as a reliable tool in regulatory ecotoxicology. The Venice lagoon constitutes an excellent case, in which the assessment of environmental risks derived from the nearby industrial activities represents a crucial task. In this context, the potential role of genomics to assist environmental monitoring was investigated through the definition of reliable gene expression markers associated with chemical contamination in Manila clams, and their subsequent employment for the classification of Venice lagoon areas. Overall, the present study addresses key issues to evaluate the future outlooks of genomics in environmental monitoring and risk assessment. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Late-Onset Hepatic Veno-Occlusive Disease after Allografting: Report of Two Cases with Atypical Clinical Features Successfully Treated with Defibrotide.

    PubMed

    Castellino, Alessia; Guidi, Stefano; Dellacasa, Chiara Maria; Gozzini, Antonella; Donnini, Irene; Nozzoli, Chiara; Manetta, Sara; Aydin, Semra; Giaccone, Luisa; Festuccia, Moreno; Brunello, Lucia; Maffini, Enrico; Bruno, Benedetto; David, Ezio; Busca, Alessandro

    2018-01-01

    Hepatic Veno-Occlusive Disease (VOD) is a potentially severe complication of hematopoietic stem cell transplantation (HSCT). Here we report two patients receiving an allogeneic HSCT who developed late onset VOD with atypical clinical features. The two patients presented with only a few risk factors, namely, advanced acute leukemia, a myeloablative busulphan-containing regimen and received grafts from an unrelated donor. The first patient did not experience painful hepatomegaly and weight gain and both patients showed only a mild elevation in total serum bilirubin level. Most importantly, the two patients developed clinical signs beyond day 21 post-HSCT. Hepatic transjugular biopsy confirmed the diagnosis of VOD. Intravenous defibrotide was promptly started leading to a marked clinical improvement. Based on our experience, liver biopsy may represent a useful diagnostic tool when the clinical features of VOD are ambiguous. Early therapeutic intervention with defibrotide represents a crucial issue for the successful outcome of patients with VOD.

  9. Late-Onset Hepatic Veno-Occlusive Disease after Allografting: Report of Two Cases with Atypical Clinical Features Successfully Treated with Defibrotide

    PubMed Central

    Castellino, Alessia; Guidi, Stefano; Dellacasa, Chiara Maria; Gozzini, Antonella; Donnini, Irene; Nozzoli, Chiara; Manetta, Sara; Aydin, Semra; Giaccone, Luisa; Festuccia, Moreno; Brunello, Lucia; Maffini, Enrico; Bruno, Benedetto; David, Ezio; Busca, Alessandro

    2018-01-01

    Hepatic Veno-Occlusive Disease (VOD) is a potentially severe complication of hematopoietic stem cell transplantation (HSCT). Here we report two patients receiving an allogeneic HSCT who developed late onset VOD with atypical clinical features. The two patients presented with only a few risk factors, namely, advanced acute leukemia, a myeloablative busulphan-containing regimen and received grafts from an unrelated donor. The first patient did not experience painful hepatomegaly and weight gain and both patients showed only a mild elevation in total serum bilirubin level. Most importantly, the two patients developed clinical signs beyond day 21 post-HSCT. Hepatic transjugular biopsy confirmed the diagnosis of VOD. Intravenous defibrotide was promptly started leading to a marked clinical improvement. Based on our experience, liver biopsy may represent a useful diagnostic tool when the clinical features of VOD are ambiguous. Early therapeutic intervention with defibrotide represents a crucial issue for the successful outcome of patients with VOD. PMID:29326798

  10. Representing causal knowledge in environmental policy interventions: Advantages and opportunities for qualitative influence diagram applications.

    PubMed

    Carriger, John F; Dyson, Brian E; Benson, William H

    2018-01-15

    This article develops and explores a methodology for using qualitative influence diagrams in environmental policy and management to support decision making efforts that minimize risk and increase resiliency. Influence diagrams are representations of the conditional aspects of a problem domain. Their graphical properties are useful for structuring causal knowledge relevant to policy interventions and can be used to enhance inference and inclusivity of multiple viewpoints. Qualitative components of influence diagrams are beneficial tools for identifying and examining the interactions among the critical variables in complex policy development and implementation. Policy interventions on social-environmental systems can be intuitively diagrammed for representing knowledge of critical relationships among economic, environmental, and social attributes. Examples relevant to coastal resiliency issues in the U.S. Gulf Coast region are developed to illustrate model structures for developing qualitative influence diagrams useful for clarifying important policy intervention issues and enhancing transparency in decision making. This article is protected by copyright. All rights reserved.
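A qualitative influence diagram of the kind described above can be captured as a set of signed directed edges, and the qualitative effect of an intervention read off by multiplying signs along a causal path. A toy sketch (the variable names and signs are invented for illustration, not taken from the article):

```python
# Toy qualitative influence diagram (variables and signs are invented).
# Each edge carries a qualitative sign: +1 ("tends to increase") or
# -1 ("tends to decrease"). The net qualitative effect along a causal
# path is the product of the edge signs.

edges = {
    ("wetland_restoration", "storm_buffering"): +1,
    ("storm_buffering", "flood_damage"): -1,
    ("flood_damage", "community_resilience"): -1,
}

def path_effect(path):
    """Product of edge signs along a path of variable names."""
    sign = 1
    for u, v in zip(path, path[1:]):
        sign *= edges[(u, v)]
    return sign

# Restoration (+) buffering (-) damage (-) resilience: net effect is +1,
# i.e. the intervention qualitatively increases resilience on this path.
effect = path_effect(["wetland_restoration", "storm_buffering",
                      "flood_damage", "community_resilience"])
```

Full quantitative influence diagrams attach conditional probability tables and utilities to the same graph structure.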

  11. Effective, homogeneous and transient interference with cytosine methylation in plant genomic DNA by zebularine

    PubMed Central

    Baubec, Tuncay; Pecinka, Ales; Rozhon, Wilfried; Mittelsten Scheid, Ortrun

    2009-01-01

    Covalent modification by methylation of cytosine residues represents an important epigenetic hallmark. While sequence analysis after bisulphite conversion allows correlative analyses with single-base resolution, functional analysis by interference with DNA methylation is less precise, due to the complexity of methylation enzymes and their targets. A cytidine analogue, 5-azacytidine, is frequently used as an inhibitor of DNA methyltransferases, but its rapid degradation in aqueous solution is problematic for culture periods of longer than a few hours. Application of zebularine, a more stable cytidine analogue with a similar mode of action that is successfully used as a methylation inhibitor in Neurospora and mammalian tumour cell lines, can significantly reduce DNA methylation in plants in a dose-dependent and transient manner independent of sequence context. Demethylation is connected with transcriptional reactivation and partial decondensation of heterochromatin. Zebularine represents a promising new and versatile tool for investigating the role of DNA methylation in plants with regard to transcriptional control, maintenance and formation of (hetero-) chromatin. PMID:18826433

  12. Identifying Understudied Nuclear Reactions by Text-mining the EXFOR Experimental Nuclear Reaction Library

    NASA Astrophysics Data System (ADS)

    Hirdt, J. A.; Brown, D. A.

    2016-01-01

    The EXFOR library contains the largest collection of experimental nuclear reaction data available as well as the data's bibliographic information and experimental details. We text-mined the REACTION and MONITOR fields of the ENTRYs in the EXFOR library in order to identify understudied reactions and quantities. Using the results of the text-mining, we created an undirected graph from the EXFOR datasets with each graph node representing a single reaction and quantity and graph links representing the various types of connections between these reactions and quantities. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. We use various graph theoretical tools to identify important yet understudied reactions and quantities in EXFOR. Although we identified a few cross sections relevant for shielding applications and isotope production, mostly we identified charged particle fluence monitor cross sections. As a side effect of this work, we learn that our abstract graph is typical of other real-world graphs.
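The abstract describes building an undirected graph of reaction/quantity nodes and applying graph-theoretical tools to flag important nodes. One such tool is degree centrality, sketched below on an invented toy graph (the nodes and links are illustrative, not actual EXFOR data):

```python
from collections import defaultdict

# Toy undirected graph in the spirit described above (nodes and links are
# invented): each node is a "reaction quantity" label; links mark shared
# REACTION/MONITOR connections. Degree centrality flags "hub" nodes, e.g.
# widely used monitor cross sections.

links = [
    ("27Al(n,a) SIG", "56Fe(n,p) SIG"),
    ("27Al(n,a) SIG", "197Au(n,g) SIG"),
    ("27Al(n,a) SIG", "235U(n,f) SIG"),
    ("56Fe(n,p) SIG", "197Au(n,g) SIG"),
]

adjacency = defaultdict(set)
for u, v in links:
    adjacency[u].add(v)
    adjacency[v].add(u)

def degree_centrality(adjacency):
    """Fraction of other nodes each node is directly linked to."""
    n = len(adjacency)
    return {node: len(nbrs) / (n - 1) for node, nbrs in adjacency.items()}

centrality = degree_centrality(adjacency)
hub = max(centrality, key=centrality.get)
```

A node with high centrality but few EXFOR ENTRYs of its own would be a candidate "important yet understudied" quantity.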

  13. Automated Performance Prediction of Message-Passing Parallel Programs

    NASA Technical Reports Server (NTRS)

    Block, Robert J.; Sarukkai, Sekhar; Mehra, Pankaj; Woodrow, Thomas S. (Technical Monitor)

    1995-01-01

    The increasing use of massively parallel supercomputers to solve large-scale scientific problems has generated a need for tools that can predict scalability trends of applications written for these machines. Much work has been done to create simple models that represent important characteristics of parallel programs, such as latency, network contention, and communication volume. But many of these methods still require substantial manual effort to represent an application in the model's format. The MK toolkit described in this paper is the result of an on-going effort to automate the formation of analytic expressions of program execution time, with a minimum of programmer assistance. In this paper we demonstrate the feasibility of our approach, by extending previous work to detect and model communication patterns automatically, with and without overlapped computations. The predictions derived from these models agree, within reasonable limits, with execution times of programs measured on the Intel iPSC/860 and Paragon. Further, we demonstrate the use of MK in selecting optimal computational grain size and studying various scalability metrics.

  14. Searching RNA motifs and their intermolecular contacts with constraint networks.

    PubMed

    Thébault, P; de Givry, S; Schiex, T; Gaspin, C

    2006-09-01

    Searching RNA gene occurrences in genomic sequences is a task whose importance has been renewed by the recent discovery of numerous functional RNAs, often interacting with other ligands. Even though several programs exist for RNA motif search, none can represent and solve the problem of searching for occurrences of RNA motifs in interaction with other molecules. We present a constraint network formulation of this problem. RNAs are represented as structured motifs that can occur on more than one sequence and which are related together by possible hybridization. The implemented tool MilPat is used to search for several sRNA families in genomic sequences. Results show that MilPat allows efficient searching for interacting motifs in large genomic sequences and offers a simple and extensible framework to solve such problems. New and known sRNAs are identified as H/ACA candidates in Methanocaldococcus jannaschii. http://carlit.toulouse.inra.fr/MilPaT/MilPat.pl.

  15. [D.lgsl. 625/1994--Protection against carcinogenic agents: the obligation to educate].

    PubMed

    Fucci, P; Anselmi, E; Bracci, C; Comba, P

    1997-01-01

    According to act 626/1994, employers have the duty to inform and train workers and their representatives. The implementation of training activities requires the following points: planning the training program according to the needs of the target population, use of methods aimed at promoting learning and the adoption of safe behaviour, and setting-up of evaluation tools. The disciplines of risk perception, risk communication and adult training may provide useful contributions in this frame. In light of the preliminary experiences in this field, the importance of the following items for workers, workers' representatives and employers is emphasized: probabilistic causality models; the role of cognitive and emotional factors in the learning process; the definition of carcinogens according to national and international organisations; the meaning of TLVs with respect to carcinogenic exposure; interactions between carcinogens in the case of multiple exposures; risk evaluation; preventive measures; and the transfer of carcinogenic risk from the workplace to the domestic environment due to lack of compliance with basic hygienic rules such as the proper use of work clothes.

  16. Sperm Cell Population Dynamics in Ram Semen during the Cryopreservation Process

    PubMed Central

    Ramón, Manuel; Pérez-Guzmán, M. Dolores; Jiménez-Rabadán, Pilar; Esteso, Milagros C.; García-Álvarez, Olga; Maroto-Morales, Alejandro; Anel-López, Luis; Soler, Ana J.; Fernández-Santos, M. Rocío; Garde, J. Julián

    2013-01-01

    Background Sperm cryopreservation has become an indispensable tool in biology. Initially, studies were aimed towards the development of efficient freezing protocols in different species that would allow for an efficient storage of semen samples for long periods of time, ensuring its viability. Nowadays, it is widely known that an important individual component exists in the cryoresistance of semen, and efforts are aimed at identifying those sperm characteristics that may allow us to predict this cryoresistance. This knowledge would lead, ultimately, to the design of freezing protocols optimized for the sperm characteristics of each male. Methodology/Principal Findings We have evaluated the changes that occur in the sperm head dimensions throughout the cryopreservation process. We have found three different patterns of response, each one related to a different sperm quality at thawing. We have been able to characterize males based on these patterns. For each male, its pattern remained constant among different ejaculates. This would imply that males always respond in the same way to freezing, giving even more importance to this sperm feature. Conclusions/Significance Changes in the sperm head during the cryopreservation process have proven useful for identifying the freezing ability of males' semen. We suggest that analyses of these response patterns would represent an important tool to characterize the cryoresistance of males when implemented within breeding programs. We also propose follow-up experiments to examine the outcomes of the use of different freezing protocols depending on the pattern of response of males. PMID:23544054

  17. Development of an Efficient Genome Editing Tool in Bacillus licheniformis Using CRISPR-Cas9 Nickase.

    PubMed

    Li, Kaifeng; Cai, Dongbo; Wang, Zhangqian; He, Zhili; Chen, Shouwen

    2018-03-15

    Bacillus strains are important industrial bacteria that can produce various biochemical products. However, low transformation efficiencies and a lack of effective genome editing tools have hindered their widespread application. Recently, clustered regularly interspaced short palindromic repeat (CRISPR)-Cas9 techniques have been utilized in many organisms as genome editing tools because of their high efficiency and easy manipulation. In this study, an efficient genome editing method was developed for Bacillus licheniformis using a CRISPR-Cas9 nickase integrated into the genome of B. licheniformis DW2 with overexpression driven by the P43 promoter. The yvmC gene was deleted using the CRISPR-Cas9n technique with homology arms of 1.0 kb as a representative example, and an efficiency of 100% was achieved. In addition, two genes were simultaneously disrupted with an efficiency of 11.6%, and the large DNA fragment bacABC (42.7 kb) was deleted with an efficiency of 79.0%. Furthermore, the heterologous reporter gene aprN, which codes for nattokinase in Bacillus subtilis, was inserted into the chromosome of B. licheniformis with an efficiency of 76.5%. The activity of nattokinase in the DWc9nΔ7/pP43SNT-SsacC strain reached 59.7 fibrinolytic units (FU)/ml, which was 25.7% higher than that of DWc9n/pP43SNT-SsacC. Finally, the engineered strain DWc9nΔ7 (Δepr ΔwprA Δmpr ΔaprE Δvpr ΔbprA ΔbacABC), with multiple disrupted genes, was constructed using the CRISPR-Cas9n technique. Taken together, we have developed an efficient genome editing tool based on CRISPR-Cas9n in B. licheniformis. This tool could be applied to strain improvement for future research. IMPORTANCE As important industrial bacteria, Bacillus strains have attracted significant attention due to their production of biological products. However, genetic manipulation of these bacteria is difficult.
The CRISPR-Cas9 system has been applied to genome editing in some bacteria, and CRISPR-Cas9n was proven to be an efficient and precise tool in previous reports. The significance of our research is the development of an efficient, more precise, and systematic genome editing method for single-gene deletion, multiple-gene disruption, large DNA fragment deletion, and single-gene integration in Bacillus licheniformis via Cas9 nickase. We also applied this method to the genetic engineering of the host strain for protein expression. Copyright © 2018 American Society for Microbiology.

  18. Visual analysis of large heterogeneous social networks by semantic and structural abstraction.

    PubMed

    Shen, Zeqian; Ma, Kwan-Liu; Eliassi-Rad, Tina

    2006-01-01

    Social network analysis is an active area of study beyond sociology. It uncovers the invisible relationships between actors in a network and provides understanding of social processes and behaviors. It has become an important technique in a variety of application areas such as the Web, organizational studies, and homeland security. This paper presents a visual analytics tool, OntoVis, for understanding large, heterogeneous social networks, in which nodes and links could represent different concepts and relations, respectively. These concepts and relations are related through an ontology (also known as a schema). OntoVis is named such because it uses information in the ontology associated with a social network to semantically prune a large, heterogeneous network. In addition to semantic abstraction, OntoVis also allows users to do structural abstraction and importance filtering to make large networks manageable and to facilitate analytic reasoning. All these unique capabilities of OntoVis are illustrated with several case studies.
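The semantic abstraction described above amounts to pruning a heterogeneous network by node type, using the types defined in the ontology. A minimal sketch (the node names, types, and edges are invented for illustration):

```python
# Minimal sketch of semantic abstraction (names and types are invented):
# keep only nodes whose ontology type is in the requested set, and drop
# every edge touching a pruned node.

nodes = {                       # node -> ontology type
    "alice": "person", "bob": "person",
    "acme": "organization", "report42": "document",
}
edges = [("alice", "bob"), ("alice", "acme"),
         ("bob", "report42"), ("acme", "report42")]

def semantic_prune(nodes, edges, keep_types):
    """Return the subgraph induced by nodes of the kept ontology types."""
    kept = {n for n, t in nodes.items() if t in keep_types}
    return kept, [(u, v) for u, v in edges if u in kept and v in kept]

# Abstract away documents, keeping only people and organizations.
kept_nodes, kept_edges = semantic_prune(nodes, edges,
                                        {"person", "organization"})
```

Structural abstraction and importance filtering would then operate on this reduced graph rather than on the full heterogeneous network.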

  19. Tobacco point‐of‐purchase promotion: examining tobacco industry documents

    PubMed Central

    Lavack, Anne M; Toth, Graham

    2006-01-01

    In the face of increasing media restrictions around the world, point‐of‐purchase promotion (also called point‐of‐sale merchandising, and frequently abbreviated as POP or POS) is now one of the most important tools that tobacco companies have for promoting tobacco products. Using tobacco industry documents, this paper demonstrates that tobacco companies have used point‐of‐purchase promotion in response to real or anticipated advertising restrictions. Their goal was to secure dominance in the retail setting, and this was achieved through well‐trained sales representatives who offered contracts for promotional incentive programmes to retailers, which included the use of point‐of‐sale displays and merchandising fixtures. Audit programmes played an important role in ensuring contract enforcement and compliance with a variety of tobacco company incentive programmes. Tobacco companies celebrated their merchandising successes, in recognition of the stiff competition that existed among tobacco companies for valuable retail display space. PMID:16998172

  20. Tobacco point-of-purchase promotion: examining tobacco industry documents.

    PubMed

    Lavack, Anne M; Toth, Graham

    2006-10-01

    In the face of increasing media restrictions around the world, point-of-purchase promotion (also called point-of-sale merchandising, and frequently abbreviated as POP or POS) is now one of the most important tools that tobacco companies have for promoting tobacco products. Using tobacco industry documents, this paper demonstrates that tobacco companies have used point-of-purchase promotion in response to real or anticipated advertising restrictions. Their goal was to secure dominance in the retail setting, and this was achieved through well-trained sales representatives who offered contracts for promotional incentive programmes to retailers, which included the use of point-of-sale displays and merchandising fixtures. Audit programmes played an important role in ensuring contract enforcement and compliance with a variety of tobacco company incentive programmes. Tobacco companies celebrated their merchandising successes, in recognition of the stiff competition that existed among tobacco companies for valuable retail display space.

  1. Psychosocial issues in space: future challenges.

    PubMed

    Sandal, G M

    2001-06-01

    As the duration of space flights increases and crews become more heterogeneous, psychosocial factors are likely to play an increasingly important role in determining mission success. The operation of the International Space Station and the planning of interplanetary missions represent important future challenges for how to select, train and monitor crews. So far, empirical evidence about psychological factors in space is based on simulations and on personnel in analog environments (i.e., polar expeditions, submarines). It is apparent that any attempt to transfer findings from these environments to space requires a thorough analysis of the human behavior specific to each setting. Recommendations for research include the effects of multi-nationality on crew interaction, the development of tension within crews and between crews and Mission Control, and the prediction of critical phases in adaptation over time. Selection of interpersonally compatible crews, pre-mission team training and the implementation of tools for self-monitoring of psychological parameters may help ensure that crews maintain performance as mission requirements change.

  2. Emerging Agricultural Biotechnologies for Sustainable Agriculture and Food Security.

    PubMed

    Anderson, Jennifer A; Gipmans, Martijn; Hurst, Susan; Layton, Raymond; Nehra, Narender; Pickett, John; Shah, Dilip M; Souza, Thiago Lívio P O; Tripathi, Leena

    2016-01-20

    As global populations continue to increase, agricultural productivity will be challenged to keep pace without overtaxing important environmental resources. A dynamic and integrated approach will be required to solve global food insecurity and position agriculture on a trajectory toward sustainability. Genetically modified (GM) crops enhanced through modern biotechnology represent an important set of tools that can promote sustainable agriculture and improve food security. Several emerging biotechnology approaches were discussed in a recent symposium organized at the 13th IUPAC International Congress of Pesticide Chemistry meeting in San Francisco, CA, USA. This paper summarizes the innovative research and several of the new and emerging technologies within the field of agricultural biotechnology that were presented during the symposium. This discussion highlights how agricultural biotechnology fits within the context of sustainable agriculture and improved food security and can be used in support of further development and adoption of beneficial GM crops.

  3. Introduction to the use of regression models in epidemiology.

    PubMed

    Bender, Ralf

    2009-01-01

    Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effect of potential confounders into account. Regression methods can be applied in all epidemiologic study designs so that they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed in dependence on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrating examples from cancer research.
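    The adjustment idea described above can be illustrated numerically. The following sketch (hypothetical data, plain NumPy in place of a statistics package) shows how including a confounder as a covariate in a multiple linear regression moves the effect estimate from a biased crude value toward the true one:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500

    # Hypothetical data: a confounder influences both exposure and outcome.
    confounder = rng.normal(size=n)
    exposure = 0.8 * confounder + rng.normal(size=n)
    outcome = 1.0 * exposure + 2.0 * confounder + rng.normal(size=n)

    # Crude estimate: regress outcome on exposure alone (with intercept).
    X_crude = np.column_stack([np.ones(n), exposure])
    beta_crude, *_ = np.linalg.lstsq(X_crude, outcome, rcond=None)

    # Adjusted estimate: include the confounder as a covariate.
    X_adj = np.column_stack([np.ones(n), exposure, confounder])
    beta_adj, *_ = np.linalg.lstsq(X_adj, outcome, rcond=None)

    print(f"crude effect:    {beta_crude[1]:.2f}")  # inflated by confounding
    print(f"adjusted effect: {beta_adj[1]:.2f}")    # near the true value 1.0
    ```

    The same logic carries over to the logistic, Cox and Poisson models named in the abstract; only the link function and outcome scale change.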

  4. Three-dimensional Finite Element Modelling of Composite Slabs for High Speed Rails

    NASA Astrophysics Data System (ADS)

    Mlilo, Nhlanganiso; Kaewunruen, Sakdirat

    2017-12-01

    Currently, precast steel-concrete composite slabs are being considered on railway bridges as a viable replacement for timber sleepers. However, due to their nature and the loading conditions, their behaviour is often complex. Present knowledge of the behaviour of precast steel-concrete composite slabs subjected to rail loading is limited. FEA is an important tool used to simulate real-life behaviour and is widely accepted in many disciplines of engineering as an alternative to experimental test methods, which are often costly and time consuming. This paper seeks to detail the FEM of precast steel-concrete slabs subjected to standard in-service loading in high-speed rail, with focus on the importance of accurately defining the material properties, element types, mesh sizes, contacts, interactions and boundary conditions that will give results representative of real-life behaviour. Initial finite element models show very good results, confirming the accuracy of the modelling procedure.

  5. Web-based learning resources - new opportunities for competency development.

    PubMed

    Moen, Anne; Nygård, Kathrine A; Gauperaa, Torunn

    2009-01-01

    Creating web-based learning environments holds great promise for on-the-job training and competence development in nursing. The web-based learning environment was designed and customized by four professional development nurses. We interviewed five RNs who pilot-tested the web-based resource. Our findings give some insight into how the web-based design tool is perceived and utilized, and how content is represented in the learning environment. From a competency development perspective, practicing authentic tasks in a web-based learning environment can be useful for training skills and keeping up important routines. The approach found in this study also needs careful consideration. Emphasizing routines and skills can be important for reducing variation and ensuring more streamlined practice in institution-wide quality improvement efforts. How the emphasis on routines and skills plays out for the individual's overall professional development needs further careful study.

  6. Growth assessment in diagnosis of Fetal Growth Restriction. Review.

    PubMed

    Albu, A R; Horhoianu, I A; Dumitrascu, M C; Horhoianu, V

    2014-06-15

    The assessment of fetal growth represents a fundamental step towards the identification of the true growth restricted fetus that is associated to important perinatal morbidity and mortality. The possible ways of detecting abnormal fetal growth are taken into consideration in this review and their strong and weak points are discussed. An important debate still remains about how to discriminate between the physiologically small fetus that does not require special surveillance and the truly growth restricted fetus who is predisposed to perinatal complications, even if its parameters are above the cut-off limits established. In this article, we present the clinical tools of fetal growth assessment: Symphyseal-Fundal Height (SFH) measurement, the fetal ultrasound parameters widely taken into consideration when discussing fetal growth: Abdominal Circumference (AC) and Estimated Fetal Weight (EFW); several types of growth charts and their characteristics: populational growth charts, standard growth charts, individualized growth charts, customized growth charts and growth trajectories.

  7. Host and Toxoplasma gondii genetic and non-genetic factors influencing the development of ocular toxoplasmosis: A systematic review.

    PubMed

    Fernández, Carolina; Jaimes, Jesús; Ortiz, María Camila; Ramírez, Juan David

    2016-10-01

    Toxoplasmosis is a cosmopolitan infection caused by the apicomplexan parasite Toxoplasma gondii. This infectious disease is widely distributed across the world, and cats play an important role in its spread. The symptomatology caused by this parasite is diverse, but ocular affectation emerges as the most important clinical phenotype. We therefore conducted a systematic review of the current knowledge of ocular toxoplasmosis, from the genetic diversity of the pathogen to the treatment available for this infection. This review updates the scientific community regarding the genetic diversity of the parasite, the genetic factors of the host, the molecular pathogenesis and its association with disease, the available diagnostic tools and the available treatment of patients undergoing ocular toxoplasmosis, and is intended to encourage researchers to deploy cutting-edge investigations across this field.

  8. The sweet tooth of biopharmaceuticals: importance of recombinant protein glycosylation analysis.

    PubMed

    Lingg, Nico; Zhang, Peiqing; Song, Zhiwei; Bardor, Muriel

    2012-12-01

    Biopharmaceuticals currently represent the fastest growing sector of the pharmaceutical industry, mainly driven by a rapid expansion in the manufacture of recombinant protein-based drugs. Glycosylation is the most prominent post-translational modification occurring on these protein drugs. It constitutes one of the critical quality attributes that requires thorough analysis for optimal efficacy and safety. This review examines the functional importance of glycosylation of recombinant protein drugs, illustrated using three examples of protein biopharmaceuticals: IgG antibodies, erythropoietin and glucocerebrosidase. Current analytical methods are reviewed as solutions for qualitative and quantitative measurements of glycosylation to monitor quality target product profiles of recombinant glycoprotein drugs. Finally, we propose a framework for designing the quality target product profile of recombinant glycoproteins and planning workflow for glycosylation analysis with the selection of available analytical methods and tools. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. The Representation of Object-Directed Action and Function Knowledge in the Human Brain

    PubMed Central

    Chen, Quanjing; Garcea, Frank E.; Mahon, Bradford Z.

    2016-01-01

    The appropriate use of everyday objects requires the integration of action and function knowledge. Previous research suggests that action knowledge is represented in frontoparietal areas while function knowledge is represented in temporal lobe regions. Here we used multivoxel pattern analysis to investigate the representation of object-directed action and function knowledge while participants executed pantomimes of familiar tool actions. A novel approach for decoding object knowledge was used in which classifiers were trained on one pair of objects and then tested on a distinct pair; this permitted a measurement of classification accuracy over and above object-specific information. Region of interest (ROI) analyses showed that object-directed actions could be decoded in tool-preferring regions of both parietal and temporal cortex, while no independently defined tool-preferring ROI showed successful decoding of object function. However, a whole-brain searchlight analysis revealed that while frontoparietal motor and peri-motor regions are engaged in the representation of object-directed actions, medial temporal lobe areas in the left hemisphere are involved in the representation of function knowledge. These results indicate that both action and function knowledge are represented in a topographically coherent manner that is amenable to study with multivariate approaches, and that the left medial temporal cortex represents knowledge of object function. PMID:25595179
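    The cross-decoding approach described here (train a classifier on one pair of objects, test on a distinct pair) can be sketched with a toy simulation. The nearest-centroid classifier and all data below are illustrative assumptions, not the study's actual MVPA pipeline:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_voxels, n_trials = 50, 40

    # Hypothetical voxel patterns: two action classes (e.g. "twist" vs
    # "squeeze"), each measured for two different object pairs. A shared
    # action signal is embedded in both pairs; each pair adds its own offset.
    action_signal = {0: rng.normal(size=n_voxels), 1: rng.normal(size=n_voxels)}

    def make_trials(pair_offset):
        X, y = [], []
        for label in (0, 1):
            for _ in range(n_trials):
                X.append(action_signal[label] + pair_offset
                         + rng.normal(scale=1.0, size=n_voxels))
                y.append(label)
        return np.array(X), np.array(y)

    X_train, y_train = make_trials(rng.normal(scale=0.5, size=n_voxels))  # pair 1
    X_test, y_test = make_trials(rng.normal(scale=0.5, size=n_voxels))    # pair 2

    # Nearest-centroid classifier trained on pair 1, tested on pair 2, so any
    # above-chance accuracy reflects action information generalizing across
    # object identity -- the "over and above object-specific information" idea.
    centroids = np.array([X_train[y_train == k].mean(axis=0) for k in (0, 1)])
    pred = np.argmin(((X_test[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
    accuracy = (pred == y_test).mean()
    print(f"cross-object decoding accuracy: {accuracy:.2f}")
    ```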

  10. Mapping and spatiotemporal analysis tool for hydrological data: Spellmap

    USDA-ARS?s Scientific Manuscript database

    Lack of data management and analyses tools is one of the major limitations to effectively evaluate and use large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...

  11. A CROSS-SPECIES APPROACH TO USING GENOMICS TOOLS IN AQUATIC TOXICOLOGY

    EPA Science Inventory

    Microarray technology has proven to be a useful tool for analyzing the transcriptome of various organisms representing conditions such as disease states, developmental stages, and responses to chemical exposure. Most commercially available arrays are limited to organisms that ha...

  12. CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY

    EPA Science Inventory

    The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...

  13. Comparing genome versus proteome-based identification of clinical bacterial isolates.

    PubMed

    Galata, Valentina; Backes, Christina; Laczny, Cédric Christian; Hemmrich-Stanisak, Georg; Li, Howard; Smoot, Laura; Posch, Andreas Emanuel; Schmolke, Susanne; Bischoff, Markus; von Müller, Lutz; Plum, Achim; Franke, Andre; Keller, Andreas

    2018-05-01

    Whole-genome sequencing (WGS) is gaining importance in the analysis of bacterial cultures derived from patients with infectious diseases. Existing computational tools for WGS-based identification have, however, been evaluated on previously defined data, relying thereby unwarily on the available taxonomic information. Here, we newly sequenced 846 clinical gram-negative bacterial isolates representing multiple distinct genera and compared the performance of five tools (CLARK, Kaiju, Kraken, DIAMOND/MEGAN and TUIT). To establish a faithful 'gold standard', the expert-driven taxonomy was compared with identifications based on matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS) analysis. Additionally, the tools were also evaluated using a data set of 200 Staphylococcus aureus isolates. CLARK and Kraken (with k = 31) performed best, with 626 (100%) and 193 (99.5%) correct species classifications for the gram-negative and S. aureus isolates, respectively. Moreover, CLARK and Kraken demonstrated the highest mean F-measure values (85.5/87.9% and 94.4/94.7% for the two data sets, respectively) in comparison with DIAMOND/MEGAN (71 and 85.3%), Kaiju (41.8 and 18.9%) and TUIT (34.5 and 86.5%). Finally, CLARK, Kaiju and Kraken outperformed the other tools by a factor of 30 to 170 in terms of runtime. We conclude that the application of nucleotide-based tools using k-mers, e.g. CLARK or Kraken, allows for accurate and fast taxonomic characterization of bacterial isolates from WGS data. Hence, our results suggest WGS-based genotyping to be a promising alternative to MS-based biotyping in clinical settings. Moreover, we suggest that complementary information should be used for the evaluation of taxonomic classification tools, as public databases may suffer from suboptimal annotations.
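    The k-mer matching that underlies tools such as CLARK and Kraken can be sketched in a few lines. The toy "database", reads and single-label F-measure below are illustrative assumptions, not the actual algorithms or data of the study:

    ```python
    def kmers(seq, k=4):
        """Set of overlapping k-mers in a DNA sequence."""
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    # Hypothetical reference 'database': one sequence per taxon.
    references = {
        "taxonA": "ATGCGTACGTTAGCATGCAA",
        "taxonB": "TTGGCCAATTGGCCAATTGG",
    }
    ref_kmers = {taxon: kmers(seq) for taxon, seq in references.items()}

    def classify(read, k=4):
        """Assign a read to the taxon sharing the most k-mers with it."""
        shared = {t: len(kmers(read, k) & km) for t, km in ref_kmers.items()}
        return max(shared, key=shared.get)

    # Toy evaluation: one read per reference, each carrying a single mutation.
    reads = [("ATGCGTACGTTAGCATGCTA", "taxonA"),
             ("TTGGCCAATTGACCAATTGG", "taxonB")]
    predictions = [classify(r) for r, _ in reads]
    tp = sum(p == t for p, (_, t) in zip(predictions, reads))
    precision = recall = tp / len(reads)            # single-label toy case
    f_measure = 2 * precision * recall / (precision + recall)
    print(predictions, f"F-measure: {f_measure:.2f}")
    ```

    Real classifiers use much longer k-mers (the paper's best setting is k = 31) and compressed index structures, but the shared-k-mer scoring idea is the same.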

  14. Representing the Margins: Multimodal Performance as a Tool for Critical Reflection and Pedagogy

    ERIC Educational Resources Information Center

    Darvin, Ron

    2015-01-01

    This article discusses how drama as a multimodal performance can be a powerful means to represent marginalized identities and to stimulate critical thought among teachers and learners about material conditions of existence and social inequalities.

  15. Deriving the Characteristic Scale for Effectively Monitoring Heavy Metal Stress in Rice by Assimilation of GF-1 Data with the WOFOST Model

    PubMed Central

    Huang, Zhi; Liu, Xiangnan; Jin, Ming; Ding, Chao; Jiang, Jiale; Wu, Ling

    2016-01-01

    Accurate monitoring of heavy metal stress in crops is of great importance to assure agricultural productivity and food security, and remote sensing is an effective tool to address this problem. However, given that Earth observation instruments provide data at multiple scales, the choice of scale for use in such monitoring is challenging. This study focused on identifying the characteristic scale for effectively monitoring heavy metal stress in rice using the dry weight of roots (WRT) as the representative characteristic, which was obtained by assimilation of GF-1 data with the World Food Studies (WOFOST) model. We explored and quantified the effect of the important state variable LAI (leaf area index) at various spatial scales on the simulated rice WRT to find the critical scale for heavy metal stress monitoring using the statistical characteristics. Furthermore, a ratio analysis based on the varied heavy metal stress levels was conducted to identify the characteristic scale. Results indicated that the critical threshold for investigating the rice WRT in monitoring studies of heavy metal stress was larger than 64 m but smaller than 256 m. This finding represents a useful guideline for choosing the most appropriate imagery. PMID:26959033

  16. Safety of human papillomavirus vaccines: a review

    PubMed Central

    Stillo, Michela; Carrillo Santisteve, Paloma; Lopalco, Pier Luigi

    2015-01-01

    Introduction: Between 2006 and 2009, two different human papillomavirus virus (HPV) vaccines were licensed for use: a quadrivalent (qHPVv) and a bivalent (bHPVv) vaccine. Since 2008, HPV vaccination programmes have been implemented in the majority of the industrialized countries. Since 2013, HPV vaccination has been part of the national programs of 66 countries including almost all countries in North America and Western Europe. Despite all the efforts made by individual countries, coverage rates are lower than expected. Vaccine safety represents one of the main concerns associated with the lack of acceptance of HPV vaccination both in the European Union/European Economic Area and elsewhere. Areas covered: Safety data published on bivalent and quadrivalent HPV vaccines, both in pre-licensure and post-licensure phase, are reviewed. Expert opinion: Based on the latest scientific evidence, both HPV vaccines seem to be safe. Nevertheless, public concern and rumors about adverse events (AE) represent an important barrier to overcome in order to increase vaccine coverage. Passive surveillance of AEs is an important tool for detecting safety signals, but it should be complemented by activities aimed at assessing the real cause of all suspect AEs. Improved vaccine safety surveillance is the first step for effective communication based on scientific evidence. PMID:25689872

  17. Tools and Techniques for Basin-Scale Climate Change Assessment

    NASA Astrophysics Data System (ADS)

    Zagona, E.; Rajagopalan, B.; Oakley, W.; Wilson, N.; Weinstein, P.; Verdin, A.; Jerla, C.; Prairie, J. R.

    2012-12-01

    The Department of Interior's WaterSMART Program seeks to secure and stretch water supplies to benefit future generations and identify adaptive measures to address climate change. Under WaterSMART, Basin Studies are comprehensive water studies to explore options for meeting projected imbalances in water supply and demand in specific basins. Such studies could be most beneficial with application of recent scientific advances in climate projections, stochastic simulation, operational modeling and robust decision-making, as well as computational techniques to organize and analyze many alternatives. A new integrated set of tools and techniques to facilitate these studies includes the following components: Future supply scenarios are produced by the Hydrology Simulator, which uses non-parametric K-nearest neighbor resampling techniques to generate ensembles of hydrologic traces based on historical data, optionally conditioned on long paleo-reconstructed data using various Markov Chain techniques. Resampling can also be conditioned on climate change information, e.g., downscaled GCM projections, to capture increased variability; spatial and temporal disaggregation is also provided. The simulations produced are ensembles of hydrologic inputs to the RiverWare operations/infrastructure decision modeling software. Alternative demand scenarios can be produced with the Demand Input Tool (DIT), an Excel-based tool that allows modifying future demands by groups such as states; sectors, e.g., agriculture, municipal, energy; and hydrologic basins. The demands can be scaled at future dates or changes ramped over specified time periods. Resulting data is imported directly into the decision model.
Different model files can represent infrastructure alternatives, and different Policy Sets represent alternative operating policies, including options for noticing when conditions point to unacceptable vulnerabilities, which trigger dynamically executed changes in operations or other options. The over-arching Study Manager provides a graphical tool to create combinations of future supply scenarios, demand scenarios, infrastructure and operating policy alternatives; each scenario is executed as an ensemble of RiverWare runs, driven by the hydrologic supply. The Study Manager sets up and manages multiple executions on multi-core hardware. The sizeable outputs are typically direct model results or post-processed indicators of performance based on model outputs. Post-processing statistical analysis of the outputs is possible using the Graphical Policy Analysis Tool or other statistical packages. Several Basin Studies undertaken have used RiverWare to evaluate future scenarios. The Colorado River Basin Study, the most complex and extensive to date, has taken advantage of these tools and techniques to generate supply scenarios, produce alternative demand scenarios and to set up and execute the many combinations of supplies, demands, policies, and infrastructure alternatives. The tools and techniques will be described with example applications.
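    The K-nearest-neighbor resampling used by the Hydrology Simulator can be sketched as follows. The function, rank-based weighting and historical record below are illustrative assumptions, not the actual WaterSMART implementation:

    ```python
    import numpy as np

    def knn_resample(flows, n_years, k=5, seed=0):
        """Generate a synthetic annual-flow trace by K-nearest-neighbor resampling.

        Each new year is drawn from the successors of the k historical years
        whose flow is closest to the current year, weighted by nearness rank
        (a common choice in nonparametric streamflow simulation).
        """
        rng = np.random.default_rng(seed)
        weights = 1.0 / np.arange(1, k + 1)
        weights /= weights.sum()                   # rank-based kernel
        trace = [rng.choice(flows)]
        for _ in range(n_years - 1):
            # k historical years (excluding the last) nearest to the current value
            dist = np.abs(flows[:-1] - trace[-1])
            neighbors = np.argsort(dist)[:k]
            pick = rng.choice(neighbors, p=weights)
            trace.append(flows[pick + 1])          # successor of the chosen year
        return np.array(trace)

    # Hypothetical historical record of annual flows (arbitrary units).
    historical = np.array([12.0, 15.5, 9.8, 14.2, 11.1, 16.3, 10.5, 13.7, 12.9, 15.0])
    synthetic = knn_resample(historical, n_years=25)
    print(synthetic[:5], synthetic.mean().round(1))
    ```

    Because each simulated year is the historical successor of a nearby year, the synthetic trace reproduces observed values and year-to-year persistence without assuming a parametric distribution.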

  18. A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public

    PubMed Central

    Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

    2016-01-01

    The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge. PMID:27045314

  20. Functional brain areas associated with manipulation of a prehensile tool: a PET study.

    PubMed

    Tsuda, Hayato; Aoki, Tomoko; Oku, Naohiko; Kimura, Yasuyuki; Hatazawa, Jun; Kinoshita, Hiroshi

    2009-09-01

    Using PET, brain areas representing the use of a well-learned tool (chopsticks) were investigated in 10 normal common users. The experimental task was to hold the tool in their right hand and use it to pick up and transport a small pin from a table. Data for the same task performed using only the fingers were also obtained as a control. The results showed an extensive overlap in activated areas with and without the use of the tool. The tool-use prehension, compared to the finger prehension, was associated with higher activities in the caudal-ventral premotor, dorsal premotor, superior parietal, posterior intraparietal, middle temporal gyrus, and primary sensory, occipital cortices, and the cerebellum. These are thus considered to be the human cortical and subcortical substrates representing the use of the tool studied. The activity of the posterior intraparietal area was negatively correlated with the number of drops of the pin, whereas occipital activity was positively correlated with the same error parameter. The caudal-ventral premotor and posterior intraparietal areas are together known to be involved in tool use-related modulation in peripersonal space. The correlation results suggest that this modulation depends on the level of performance. The coactivated left middle temporal gyrus further suggests that familiarity with a tool as well as the knowledge about its usage plays a role in peripersonal space modulation. Superior parietal activation, along with occipital activation, indicates the involvement of visual-spatial attention in the tool use, possibly reflecting the effect of interaction between the prehension (task) and the tool. 2009 Wiley-Liss, Inc.

  1. STS 135 Landing

    NASA Image and Video Library

    2017-12-08

    Goddard's Ritsko Wins 2011 SAVE Award The winner of the 2011 SAVE Award is Matthew Ritsko, a Goddard financial manager. His tool lending library would track and enable sharing of expensive space-flight tools and hardware after projects no longer need them. This set of images represents the types of tools used at NASA. To read more go to: www.nasa.gov/topics/people/features/ritsko-save.html Exploration Systems Project Manager Mike Weiss speaks about a Hubble Servicing Mission hand tool, developed at Goddard. Credit: NASA/GSFC/Debbie McCallum

  2. Report on New Methods for Representing and Interacting with Qualitative Geographic Information, Stage 2: Task Group 3: Social-focused Use Case

    DTIC Science & Technology

    2014-06-30

    lesson learned through exploring current data with the ForceNet tool is that the tool (as implemented thus far) is able to give analysts a big ...including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and...Twitter data and on the development and implementation of tools to support this task; these include a Group Builder, a Force-directed Graph tool, and a

  3. Basket Studies: Redefining Clinical Trials in the Era of Genome-Driven Oncology.

    PubMed

    Tao, Jessica J; Schram, Alison M; Hyman, David M

    2018-01-29

    Understanding a tumor's detailed molecular profile has become increasingly necessary to deliver the standard of care for patients with advanced cancer. Innovations in both tumor genomic sequencing technology and the development of drugs that target molecular alterations have fueled recent gains in genome-driven oncology care. "Basket studies," or histology-agnostic clinical trials in genomically selected patients, represent one important research tool to continue making progress in this field. We review key aspects of genome-driven oncology care, including the purpose and utility of basket studies, biostatistical considerations in trial design, genomic knowledgebase development, and patient matching and enrollment models, which are critical for translating our genomic knowledge into clinically meaningful outcomes.

  4. Foreword: Biomonitoring Equivalents special issue.

    PubMed

    Meek, M E; Sonawane, B; Becker, R A

    2008-08-01

    The challenge of interpreting results of biomonitoring for environmental chemicals in humans is highlighted in this Foreword to the Biomonitoring Equivalents (BEs) special issue of Regulatory Toxicology and Pharmacology. There is a pressing need to develop risk-based tools in order to empower scientists and health professionals to interpret and communicate the significance of human biomonitoring data. The BE approach, which integrates dosimetry and risk assessment methods, represents an important advancement on the path toward achieving this objective. The articles in this issue, developed as a result of an expert panel meeting, present guidelines for derivation of BEs, guidelines for communication using BEs and several case studies illustrating application of the BE approach for specific substances.

  5. Open-WiSe: A Solar Powered Wireless Sensor Network Platform

    PubMed Central

    González, Apolinar; Aquino, Raúl; Mata, Walter; Ochoa, Alberto; Saldaña, Pedro; Edwards, Arthur

    2012-01-01

    Because battery-powered nodes are required in wireless sensor networks and energy consumption represents an important design consideration, alternate energy sources are needed to provide more effective and optimal function. The main goal of this work is to present an energy-harvesting wireless sensor network platform, the Open Wireless Sensor node (WiSe). The design and implementation of the solar-powered wireless platform is described, including the hardware architecture, firmware, and a POSIX real-time kernel. A sleep and wake-up strategy was implemented to prolong the lifetime of the wireless sensor network. This platform was developed as a tool for researchers investigating wireless sensor networks and for system integrators. PMID:22969396

  6. Copper-Catalyzed Alkoxycarbonylation of Alkanes with Alcohols.

    PubMed

    Li, Yahui; Wang, Changsheng; Zhu, Fengxiang; Wang, Zechao; Dixneuf, Pierre H; Wu, Xiao-Feng

    2017-04-10

    Esters are important chemicals widely used in various areas, and alkoxycarbonylation represents one of the most powerful tools for their synthesis. In this communication, a new copper-catalyzed carbonylative procedure for the synthesis of aliphatic esters from cycloalkanes and alcohols was developed. Through direct activation of the Csp3-H bond of alkanes and with alcohols as the nucleophiles, the desired esters were prepared in moderate-to-good yields. Paraformaldehyde could also be applied for in situ alcohol generation by radical trapping, and moderate yields of the corresponding esters could be produced. Notably, this is the first report on copper-catalyzed alkoxycarbonylation of alkanes. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Parsing Citations in Biomedical Articles Using Conditional Random Fields

    PubMed Central

    Zhang, Qing; Cao, Yong-Gang; Yu, Hong

    2011-01-01

    Citations are used ubiquitously in biomedical full-text articles and play an important role for representing both the rhetorical structure and the semantic content of the articles. As a result, text mining systems will significantly benefit from a tool that automatically extracts the content of a citation. In this study, we applied the supervised machine-learning algorithms Conditional Random Fields (CRFs) to automatically parse a citation into its fields (e.g., Author, Title, Journal, and Year). With a subset of html format open-access PubMed Central articles, we report an overall 97.95% F1-score. The citation parser can be accessed at: http://www.cs.uwm.edu/~qing/projects/cithit/index.html. PMID:21419403
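    The kind of per-token features typically fed to a CRF for citation parsing can be sketched as follows. The feature set and example citation are illustrative assumptions, not the study's actual feature templates:

    ```python
    import re

    def token_features(tokens, i):
        """Features for token i of a citation string, of the kind typically
        fed to a CRF sequence labeller (hypothetical feature set)."""
        tok = tokens[i]
        return {
            "lower": tok.lower(),
            "is_capitalized": tok[:1].isupper(),
            "is_year": bool(re.fullmatch(r"(19|20)\d{2}", tok.strip("().,;"))),
            "has_digit": any(c.isdigit() for c in tok),
            "is_initial": bool(re.fullmatch(r"[A-Z]\.", tok)),
            # Neighbouring tokens let the CRF exploit sequence context.
            "prev": tokens[i - 1].lower() if i > 0 else "<START>",
            "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "<END>",
        }

    citation = ("Zhang Q. , Cao Y. , Yu H. (2011) "
                "Parsing Citations in Biomedical Articles").split()
    features = [token_features(citation, i) for i in range(len(citation))]
    print(features[8])  # features for the year token "(2011)"
    ```

    A trained CRF would map each feature dictionary to a field label (Author, Year, Title, ...), using both the features and the learned transitions between labels.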

  8. Methodological update in Medicina Intensiva.

    PubMed

    García Garmendia, J L

    2018-04-01

    Research in the critically ill is complicated by the heterogeneity of patients, the difficulty of achieving representative sample sizes, and the number of variables simultaneously involved. However, the quantity and quality of records are high, as is the relevance of the variables used, such as survival. Methodological tools have evolved to offer new perspectives and analysis models that allow relevant information to be extracted from the data that accompany the critically ill patient. The need for training in methodology and the interpretation of results is an important challenge for intensivists who wish to stay updated on research developments and clinical advances in Intensive Medicine. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  9. The Role of Education as a Tool in Transmitting Cultural Stereotypes Words (Formal's): The Case of "Kerem and Asli" Story

    ERIC Educational Resources Information Center

    Bulut, Mesut; Bars, Mehmet Emin

    2013-01-01

    Folk literature is an important educational tool for both the individual and society; it plays an important role in the transmission of culture between generations and is an important element of social culture. Folk tales, a product of folk literature, are one of the major…

  10. Barrel organ of plate tectonics - a new tool for outreach and education

    NASA Astrophysics Data System (ADS)

    Broz, Petr; Machek, Matěj; Šorm, Zdar

    2016-04-01

    Plate tectonics is the major geological concept explaining the dynamics and structure of Earth's outer shell, the lithosphere. In plate tectonic theory, processes in the Earth's lithosphere and its dynamics are driven by the relative motion and interaction of lithospheric plates. The geologically most active regions on Earth often correlate with lithospheric plate boundaries. Thus, to explain the evolution of the Earth's surface, mountain building, volcanism, and the origin of earthquakes, it is important to understand processes at the plate boundaries. However, these processes usually require significant periods of time to take effect, so their entire cycles cannot be directly observed in nature by humans. This poses a challenge for scientists studying these processes, but also for teachers and popularizers trying to explain them to students and to the general public. To overcome this problem, we developed a mechanical model of plate tectonics enabling demonstration of the most important processes associated with plate tectonics in real time. The mechanical model is a wooden box, more specifically a special type of barrel organ, with hand-painted backdrops on the front side. These backdrops are divided into several components representing geodynamic processes associated with plate tectonics: convective currents in the mantle, sea-floor spreading, subduction of oceanic crust under continental crust, partial melting and volcanism associated with subduction, the formation of magnetic stripes, the ascent of a mantle plume through the mantle, volcanic activity associated with hot spots, and the formation and degradation of volcanic islands on a moving lithospheric plate. All components are set in motion by a handle controlled by a human operator, and the scene is illuminated with colored lights controlled automatically by an electric device embedded in the box. 
Operation of the model may be seen at www.geologyinexperiments.com, where additional pictures and details about the construction are available. This mechanical model represents a unique outreach tool for presenting processes that normally take eons to occur to students and the public in an accessible and entertaining way, and for attracting their attention to the most important concept in geology.

  11. Construction of high quality Gateway™ entry libraries and their application to yeast two-hybrid for the monocot model plant Brachypodium distachyon.

    PubMed

    Cao, Shuanghe; Siriwardana, Chamindika L; Kumimoto, Roderick W; Holt, Ben F

    2011-05-19

    Monocots, especially the temperate grasses, represent some of the most agriculturally important crops for both current food needs and future biofuel development. Because most of the agriculturally important grass species are difficult to study (e.g., they often have large, repetitive genomes and can be difficult to grow in laboratory settings), developing genetically tractable model systems is essential. Brachypodium distachyon (hereafter Brachypodium) is an emerging model system for the temperate grasses. To fully realize the potential of this model system, publicly accessible discovery tools are essential. High quality cDNA libraries that can be readily adapted for multiple downstream purposes are a needed resource. Additionally, yeast two-hybrid (Y2H) libraries are an important discovery tool for protein-protein interactions and are not currently available for Brachypodium. We describe the creation of two high quality, publicly available Gateway™ cDNA entry libraries and their derived Y2H libraries for Brachypodium. The first entry library represents cloned cDNA populations from both short day (SD, 8/16-h light/dark) and long day (LD, 20/4-h light/dark) grown plants, while the second library was generated from hormone treated tissues. Both libraries have extensive genome coverage (~5 × 10⁷ primary clones each) and average clone lengths of ~1.5 kb. These entry libraries were then used to create two recombination-derived Y2H libraries. Initial proof-of-concept screens demonstrated that a protein with known interaction partners could readily re-isolate those partners, as well as novel interactors. Accessible community resources are a hallmark of successful biological model systems. Brachypodium has the potential to be a broadly useful model system for the grasses, but still requires many of these resources. 
The Gateway™ compatible entry libraries created here will facilitate studies for multiple user-defined purposes and the derived Y2H libraries can be immediately applied to large scale screening and discovery of novel protein-protein interactions. All libraries are freely available for distribution to the research community.

  12. Measurement and Research Tools. Symposium 37. [AHRD Conference, 2001].

    ERIC Educational Resources Information Center

    2001

    This symposium on measurement and research tools consists of three presentations. "An Examination of the Multiple Intelligences Developmental Assessment Scales (MIDAS)" (Albert Wiswell et al.) explores MIDAS's psychometric saliency. Findings indicate this instrument represents an incomplete attempt to develop a valid assessment of…

  13. CADMIO: computer aided design for medical information objects.

    PubMed

    Minarelli, D V; Ferri, F; Pisanelli, D M; Ricci, F L; Tittarelli, F

    1995-01-01

    The growth of computational capability and graphic software tools, nowadays available in an integrated manner within development environments, permits the realization of tool kits capable of handling information that is complex and of different kinds, such as typical medical information. This has given a great impulse to the creation of electronic medical folders, which offer new and stimulating functionality with respect to the usual paper document [1]. In the present work, we propose a tool capable of defining a multimedia electronic medical folder and representing its architecture through a layout formed on the basis of the particular data types to be handled. This tool provides an integrated view of data that, even though close in a cognitive sense, are often stored and represented separately in practice. Different approaches to browsing are given within the system; thus the user can personalize the way the information stored in the folder is viewed, or can let the system guide the browsing.

  14. Insightful problem solving and creative tool modification by captive nontool-using rooks.

    PubMed

    Bird, Christopher D; Emery, Nathan J

    2009-06-23

    The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and wild tool use.

  15. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  16. Interchangeable end effector tools utilized on the protoflight manipulator arm

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A subset of teleoperator end effector tools was designed, fabricated, delivered, and successfully demonstrated on the Marshall Space Flight Center (MSFC) protoflight manipulator arm (PFMA). The tools delivered included a rotary power tool with interchangeable collets and two fluid coupling mate/demate tools, one for a Fairchild coupling and the other for a Purolator coupling. An electrical interface connector was also provided for the rotary power tool. A tool set, from which the subset was selected, for performing on-orbit satellite maintenance was identified and conceptually designed. Maintenance requirements were synthesized, evaluated, and prioritized to develop design requirements for a set of end effector tools representative of those needed to provide on-orbit maintenance of satellites to be flown in the 1986 to 2000 timeframe.

  17. Development and psychometric testing of the childhood obesity perceptions (COP) survey among African American caregivers: A tool for obesity prevention program planning.

    PubMed

    Alexander, Dayna S; Alfonso, Moya L; Cao, Chunhua

    2016-12-01

    Currently, public health practitioners are analyzing the role that caregivers play in childhood obesity efforts. Assessing African American caregivers' perceptions of childhood obesity in rural communities is an important prevention effort. This article's objective is to describe the development and psychometric testing of a survey tool to assess childhood obesity perceptions among African American caregivers in a rural setting, which can be used for obesity prevention program development or evaluation. The Childhood Obesity Perceptions (COP) survey was developed to reflect the multidimensional nature of childhood obesity, including risk factors, health complications, weight status, built environment, and obesity prevention strategies. A 97-item survey was pretested and piloted with the priority population. After pretesting and piloting, the survey was reduced to 59 items and administered to 135 African American caregivers. An exploratory factor analysis (EFA) and a confirmatory factor analysis (CFA) were conducted to test how well the survey items represented the underlying Social Cognitive Theory constructs. Twenty items were removed from the original 59-item survey, and acceptable internal consistency of the six factors (α=0.70-0.85) was documented for all scales in the final COP instrument. CFA resulted in a less than adequate fit; however, a multivariate Lagrange multiplier test identified modifications to improve the model fit. The COP survey represents a promising approach as a potentially comprehensive assessment for the implementation or evaluation of childhood obesity programs. Copyright © 2016 Elsevier Ltd. All rights reserved.
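    Internal consistency figures like the α = 0.70-0.85 reported above are conventionally Cronbach's alpha, computed per scale from the item score variances. A minimal sketch of that computation (the item scores below are invented for illustration, not COP data):

    ```python
    # Cronbach's alpha for one scale:
    #   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    # where k is the number of items in the scale.

    def variance(xs):
        """Sample variance (n - 1 denominator)."""
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    def cronbach_alpha(items):
        """items: list of per-item score lists, one list per item, equal length."""
        k = len(items)
        totals = [sum(scores) for scores in zip(*items)]   # total score per respondent
        item_var = sum(variance(it) for it in items)
        return k / (k - 1) * (1 - item_var / variance(totals))

    items = [
        [3, 4, 2, 5, 4, 3],   # item 1 scores across six hypothetical respondents
        [2, 4, 2, 4, 5, 3],   # item 2
        [3, 5, 1, 4, 4, 2],   # item 3
    ]
    print(round(cronbach_alpha(items), 2))  # → 0.91
    ```

    Values in the 0.70-0.85 range, as reported for the six COP factors, are generally read as acceptable-to-good internal consistency.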

  18. Gametic embryogenesis and haploid technology as valuable support to plant breeding.

    PubMed

    Germanà, Maria Antonietta

    2011-05-01

    Plant breeding is focused on continuously increasing crop production to meet the needs of an ever-growing world population, improving food quality to ensure a long and healthy life, and addressing the problems of global warming and environmental pollution, together with the challenges of developing novel sources of biofuels. The breeders' search for novel genetic combinations, with which to select plants with improved traits to satisfy both farmers and consumers, is endless. About half of the dramatic increase in crop yield obtained in the second half of the last century has been achieved thanks to the results of genetic improvement, while the residual advance has been due to enhanced management techniques (pest and disease control, fertilization, and irrigation). Biotechnologies provide powerful tools for plant breeding, and among these, tissue culture, particularly haploid and doubled haploid technology, can effectively help to select superior plants. In fact, haploids (Hs), which are plants with the gametophytic chromosome number, and doubled haploids (DHs), which are haploids that have undergone chromosome duplication, represent a particularly attractive biotechnological method to accelerate plant breeding. Currently, haploid technology, making possible through gametic embryogenesis the single-step development of complete homozygous lines from heterozygous parents, has already had a huge impact on the agricultural systems of many agronomically important crops, representing an integral part of their improvement programmes. The aim of this review was to provide some background, recent advances, and future prospects on the employment of haploid technology through gametic embryogenesis as a powerful tool to support plant breeding.

  19. Detection of changes in gene regulatory patterns, elicited by perturbations of the Hsp90 molecular chaperone complex, by visualizing multiple experiments with an animation

    PubMed Central

    2011-01-01

    Background To make sense out of gene expression profiles, such analyses must be pushed beyond the mere listing of affected genes. For example, if a group of genes persistently display similar changes in expression levels under particular experimental conditions, and the proteins encoded by these genes interact and function in the same cellular compartments, this could be taken as a very strong indicator of co-regulated protein complexes. One of the key requirements is having appropriate tools to detect such regulatory patterns. Results We have analyzed the global adaptations in gene expression patterns in the budding yeast when the Hsp90 molecular chaperone complex is perturbed either pharmacologically or genetically. We integrated these results with publicly accessible expression, protein-protein interaction and intracellular localization data. But most importantly, all experimental conditions were simultaneously and dynamically visualized with an animation. This critically facilitated the detection of patterns of gene expression changes that suggested underlying regulatory networks that a standard analysis by pairwise comparison and clustering could not have revealed. Conclusions The results of the animation-assisted detection of changes in gene regulatory patterns make predictions about the potential roles of Hsp90 and its co-chaperone p23 in regulating whole sets of genes. The simultaneous dynamic visualization of microarray experiments, represented in networks built by integrating one's own experimental with publicly accessible data, represents a powerful discovery tool that allows the generation of new interpretations and hypotheses. PMID:21672238

  20. Determining the direct upland hydrological contribution area of estuarine wetlands using Arc/GIS tools

    EPA Science Inventory

    The delineation of a polygon layer representing the direct upland runoff contribution to esturine wetland polygons can be a useful tool in estuarine wetland assessment. However, the traditional methods of watershed delineation using pour points and digital elevation models (DEMs)...

  1. Microcomputer-Based Intelligent Tutoring Systems: An Assessment.

    ERIC Educational Resources Information Center

    Schaffer, John William

    Computer-assisted instruction, while familiar to most teachers, has failed to become an effective self-motivating instructional tool. Developments in artificial intelligence, however, have provided new and better tools for exploring human knowledge acquisition and utilization. Expert system technology represents one of the most promising of these…

  2. A Survey of FDG- and Amyloid-PET Imaging in Dementia and GRADE Analysis

    PubMed Central

    Daniela, Perani; Orazio, Schillaci; Alessandro, Padovani; Mariano, Nobili Flavio; Leonardo, Iaccarino; Pasquale Anthony, Della Rosa; Giovanni, Frisoni; Carlo, Caltagirone

    2014-01-01

    PET-based tools can improve the early diagnosis of Alzheimer's disease (AD) and the differential diagnosis of dementia. The importance of identifying individuals at risk of developing dementia among people with subjective cognitive complaints or mild cognitive impairment has clinical, social, and therapeutic implications. Within the two major classes of AD biomarkers currently identified, that is, markers of pathology and neurodegeneration, amyloid- and FDG-PET imaging represent decisive tools for their measurement. As a consequence, these PET tools have been recognized to be of crucial value in the recent guidelines for the early diagnosis of AD and other dementia conditions. The reference-based recommendations, however, draw on a large PET imaging literature based on visual assessment methods, which greatly reduce sensitivity and specificity and lack a clear cut-off between normal and pathological findings. PET imaging can instead be assessed using parametric or voxel-wise analyses by comparing the subject's scan with a normative data set, significantly increasing the diagnostic accuracy. This paper is a survey of the relevant literature on FDG- and amyloid-PET imaging aimed at providing the value of quantification for the early and differential diagnosis of AD. This allowed a meta-analysis and GRADE analysis revealing high values for PET imaging that might be useful in considering recommendations. PMID:24772437
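    The voxel-wise comparison against a normative data set described above typically reduces to a z-score per voxel. A minimal sketch of that step, with toy numbers standing in for spatially normalized uptake values (real pipelines first register all scans to a common template):

    ```python
    # Voxel-wise z-scores: compare one subject's scan against a set of normative
    # scans, voxel by voxel. All values here are toy numbers, not real PET data.

    def voxelwise_z(subject, normative):
        """normative: list of scans (each a list of voxel values); subject: one scan."""
        n = len(normative)
        z = []
        for v in range(len(subject)):
            vals = [scan[v] for scan in normative]
            mean = sum(vals) / n
            sd = (sum((x - mean) ** 2 for x in vals) / (n - 1)) ** 0.5
            z.append((subject[v] - mean) / sd)
        return z

    normative = [[1.0, 1.2, 0.9], [1.1, 1.0, 1.0], [0.9, 1.1, 1.1]]  # 3 scans, 3 voxels
    subject = [1.0, 0.6, 1.0]        # markedly reduced uptake in voxel 1
    z = voxelwise_z(subject, normative)
    hypometabolic = [i for i, s in enumerate(z) if s < -2.0]  # simple threshold
    print(hypometabolic)             # → [1]
    ```

    Thresholding the resulting z-map (here at z < -2) flags candidate hypometabolic voxels; this is what lets quantitative analysis replace the reader-dependent visual cut-off.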

  3. An assessment of survey measures used across key epidemiologic studies of United States Gulf War I Era Veterans

    PubMed Central

    2013-01-01

    Over the past two decades, 12 large epidemiologic studies and 2 registries have focused on U.S. veterans of the 1990–1991 Gulf War Era. We conducted a review of these studies’ research tools to identify existing gaps and overlaps of efforts to date, and to advance development of the next generation of Gulf War Era survey tools. Overall, we found that many of the studies used similar instruments. Questions regarding exposures were more similar across studies than other domains, while neurocognitive and psychological tools were the most variable. Many studies focused on self-reported survey results, with a range of validation practices. However, physical exams, biomedical assessments, and specimen storage were not common. This review suggests that while research may be able to pool data from past surveys, future surveys need to consider how their design can yield data comparable with previous surveys. Additionally, data that incorporate recent technologies in specimen and genetic analyses would greatly enhance such survey data. When combined with existing data on deployment-related exposures and post-deployment health conditions, longitudinal follow-up of existing studies within this collaborative framework could represent an important step toward improving the health of veterans. PMID:23302181

  4. Development and Application of Camelid Molecular Cytogenetic Tools

    PubMed Central

    Avila, Felipe; Das, Pranab J.; Kutzler, Michelle; Owens, Elaine; Perelman, Polina; Rubes, Jiri; Hornak, Miroslav; Johnson, Warren E.

    2014-01-01

    Cytogenetic chromosome maps offer molecular tools for genome analysis and clinical cytogenetics and are of particular importance for species with difficult karyotypes, such as camelids (2n = 74). Building on the available human–camel zoo-fluorescence in situ hybridization (FISH) data, we developed the first cytogenetic map for the alpaca (Lama pacos, LPA) genome by isolating and identifying 151 alpaca bacterial artificial chromosome (BAC) clones corresponding to 44 specific genes. The genes were mapped by FISH to 31 alpaca autosomes and the sex chromosomes; 11 chromosomes had 2 markers, which were ordered by dual-color FISH. The STS gene mapped to Xpter/Ypter, demarcating the pseudoautosomal region, whereas no markers were assigned to chromosomes 14, 21, 22, 28, and 36. The chromosome-specific markers were applied in clinical cytogenetics to identify LPA20, the major histocompatibility complex (MHC)-carrying chromosome, as a part of an autosomal translocation in a sterile male llama (Lama glama, LGL; 2n = 73,XY). FISH with LPAX BACs and LPA36 paints, as well as comparative genomic hybridization, were also used to investigate the origin of the minute chromosome, an abnormally small LPA36 in infertile female alpacas. This collection of cytogenetically mapped markers represents a new tool for camelid clinical cytogenetics and has applications for the improvement of the alpaca genome map and sequence assembly. PMID:23109720

  5. Analysis of outcomes in radiation oncology: An integrated computational platform

    PubMed Central

    Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.

    2009-01-01

    Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with clinical outcomes and other patient-related clinical factors. In addition, the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785
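    Radiobiological evaluation of a plan, as in component (4) above, often reduces a dose-volume histogram to a single figure of merit such as the generalized equivalent uniform dose, gEUD = (Σᵢ vᵢ dᵢᵃ)^(1/a). A minimal sketch under made-up DVH values (the volume-effect parameter a is organ-specific and is not taken from the article):

    ```python
    # Generalized equivalent uniform dose (gEUD) from a differential DVH:
    #   gEUD = (sum_i v_i * d_i**a) ** (1/a),
    # where v_i are fractional volumes receiving dose d_i and a is the
    # volume-effect parameter (a = 1 gives the mean dose; large a weights hot spots).

    def geud(dvh, a):
        """dvh: list of (dose_gy, fractional_volume) bins; a: volume-effect parameter."""
        total = sum(v for _, v in dvh)                 # normalize in case bins don't sum to 1
        return sum((v / total) * d ** a for d, v in dvh) ** (1.0 / a)

    dvh = [(10.0, 0.2), (30.0, 0.5), (50.0, 0.3)]      # toy differential DVH
    print(round(geud(dvh, 1.0), 1))   # a = 1: reduces to the mean dose, 32.0 Gy
    print(round(geud(dvh, 10.0), 1))  # large a: pulled toward the hottest bins (serial organ)
    ```

    Feeding each plan's DVH through such a model yields a biologically weighted dose summary that can then be correlated with the outcome data stored in the platform's database.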

  6. A new method for studying population genetics of cyst nematodes based on Pool-Seq and genomewide allele frequency analysis.

    PubMed

    Mimee, Benjamin; Duceppe, Marc-Olivier; Véronneau, Pierre-Yves; Lafond-Lapalme, Joël; Jean, Martine; Belzile, François; Bélair, Guy

    2015-11-01

    Cyst nematodes are important agricultural pests responsible for billions of dollars of losses each year. Plant resistance is the most effective management tool, but it requires close monitoring of population genetics. Current technologies for pathotyping and genotyping cyst nematodes are time-consuming, expensive and imprecise. In this study, we capitalized on the reproduction mode of cyst nematodes to develop a simple population genetic analysis pipeline based on genotyping-by-sequencing and Pool-Seq. This method yielded thousands of SNPs and allowed us to study the relationships between populations of different origins or pathotypes. Validation of the method on well-characterized populations also demonstrated that it was a powerful and accurate tool for population genetics. The genomewide allele frequencies of 23 populations of golden nematode, from nine countries and representing the five known pathotypes, were compared. A clear separation of the pathotypes and fine genetic relationships between and among global populations were obtained using this method. In addition to being powerful, this tool has proven to be very time- and cost-efficient and could be applied to other cyst nematode species. © 2015 Her Majesty the Queen in Right of Canada Molecular Ecology Resources © 2015 John Wiley & Sons Ltd Reproduced with the permission of the Minister of Agriculture and Agri-food.
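    In Pool-Seq, each population is sequenced as one pooled sample, so the per-SNP allele frequency is estimated directly from read counts. A minimal sketch of that step and of a simple genomewide distance between two populations (toy counts, not the golden nematode data; the distance measure here is a plain mean absolute difference, not the statistic used in the study):

    ```python
    # Pool-Seq allele frequencies: alt_reads / total_reads per SNP, per population.
    # All counts below are invented for illustration.

    def allele_freqs(counts):
        """counts: list of (alt_reads, total_reads) per SNP -> frequency per SNP."""
        return [alt / tot for alt, tot in counts]

    def mean_abs_diff(f1, f2):
        """A simple genomewide distance between two allele-frequency profiles."""
        return sum(abs(a - b) for a, b in zip(f1, f2)) / len(f1)

    pop_a = allele_freqs([(12, 40), (30, 60), (5, 50)])   # one pooled population
    pop_b = allele_freqs([(36, 40), (33, 60), (45, 50)])  # another pooled population
    print(mean_abs_diff(pop_a, pop_b))
    ```

    Repeating this over thousands of SNPs and all population pairs yields the distance matrix from which relationships between origins and pathotypes can be clustered.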

  7. New trends in radiology workstation design

    NASA Astrophysics Data System (ADS)

    Moise, Adrian; Atkins, M. Stella

    2002-05-01

    In radiology workstation design, the race to add more features is morphing into an iterative, user-centric design focused on ergonomics and usability. The extent of the feature list used to be one of the most significant factors in a Picture Archiving and Communication System (PACS) vendor's ability to sell a radiology workstation. Not anymore: the feature set is now very much the same among the major players in the PACS market. How these features work together is what distinguishes different radiology workstations. Integration (with the PACS/Radiology Information System (RIS), with 3D tools, reporting tools, etc.), usability (user-specific preferences, advanced display protocols, smart activation of tools, etc.), and efficiency (the output a radiologist can generate with the workstation) are now core factors in selecting a workstation. This paper discusses these new trends in radiology workstation design. We demonstrate the importance of the interaction between the PACS vendor (software engineers) and the customer (radiologists) during radiology workstation design. We focus on iterative aspects of workstation development, such as the presentation of early prototypes to as many representative users as possible during the software development cycle, and present the results of a survey of 8 radiologists on designing a radiology workstation.

  8. First Studies for the Development of Computational Tools for the Design of Liquid Metal Electromagnetic Pumps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maidana, Carlos O.; Nieminen, Juha E.

    Liquid alloy systems have a high degree of thermal conductivity, far superior to ordinary nonmetallic liquids, and inherently high densities and electrical conductivities. This results in the use of these materials for specific heat conducting and dissipation applications in the nuclear and space sectors. Uniquely, they can be used to conduct heat and electricity between nonmetallic and metallic surfaces. The motion of liquid metals in strong magnetic fields generally induces electric currents, which, while interacting with the magnetic field, produce electromagnetic forces. Electromagnetic pumps exploit the fact that liquid metals are conducting fluids capable of carrying currents, which is a source of electromagnetic fields useful for pumping and diagnostics. The coupling between the electromagnetic and thermo-fluid mechanical phenomena, and the determination of the pump's geometry and electrical configuration, give rise to complex engineering magnetohydrodynamics problems. The development of tools to model, characterize, design, and build liquid metal thermomagnetic systems for space, nuclear, and industrial applications is of primordial importance and represents a cross-cutting technology that can provide unique design and development capabilities as well as a better understanding of the physics behind the magnetohydrodynamics of liquid metals. Here, first studies for the development of computational tools for the design of liquid metal electromagnetic pumps are discussed.

  9. First Studies for the Development of Computational Tools for the Design of Liquid Metal Electromagnetic Pumps

    DOE PAGES

    Maidana, Carlos O.; Nieminen, Juha E.

    2017-02-01

    Liquid alloy systems have a high degree of thermal conductivity, far superior to ordinary nonmetallic liquids, and inherently high densities and electrical conductivities. This results in the use of these materials for specific heat conducting and dissipation applications in the nuclear and space sectors. Uniquely, they can be used to conduct heat and electricity between nonmetallic and metallic surfaces. The motion of liquid metals in strong magnetic fields generally induces electric currents, which, while interacting with the magnetic field, produce electromagnetic forces. Electromagnetic pumps exploit the fact that liquid metals are conducting fluids capable of carrying currents, which is a source of electromagnetic fields useful for pumping and diagnostics. The coupling between the electromagnetic and thermo-fluid mechanical phenomena, and the determination of the pump's geometry and electrical configuration, give rise to complex engineering magnetohydrodynamics problems. The development of tools to model, characterize, design, and build liquid metal thermomagnetic systems for space, nuclear, and industrial applications is of primordial importance and represents a cross-cutting technology that can provide unique design and development capabilities as well as a better understanding of the physics behind the magnetohydrodynamics of liquid metals. Here, first studies for the development of computational tools for the design of liquid metal electromagnetic pumps are discussed.

  10. Transgenic barley: a prospective tool for biotechnology and agriculture.

    PubMed

    Mrízová, Katarína; Holasková, Edita; Öz, M Tufan; Jiskrová, Eva; Frébort, Ivo; Galuszka, Petr

    2014-01-01

    Barley (Hordeum vulgare L.) is one of the founder crops of agriculture, and today it is the fourth most important cereal grain worldwide. Barley is used as malt in the brewing and distilling industry, as an additive for animal feed, and as a component of various foods and bread for human consumption. Progress in stable genetic transformation of barley ensures a potential for improvement of its agronomic performance or use of barley in various biotechnological and industrial applications. Recently, barley grain has been successfully used in molecular farming as a promising bioreactor adapted for production of human therapeutic proteins or animal vaccines. In addition to the development of reliable transformation technologies, an extensive amount of barley genetic resources and tools such as sequence data, microarrays, genetic maps, and databases has been generated. The current status of barley transformation technologies, including gene transfer techniques, targets, and progeny stabilization; recent trials for improvement of agricultural traits and performance of barley, especially in relation to increased biotic and abiotic stress tolerance; and the potential use of barley grain as a protein production platform are reviewed in this study. Overall, barley represents a promising tool for both agricultural and biotechnological transgenic approaches, and is considered an ancient but rediscovered crop serving as a model industrial platform for molecular farming. © 2013 Elsevier Inc. All rights reserved.

  11. Web tools for effective retrieval, visualization, and evaluation of cardiology medical images and records

    NASA Astrophysics Data System (ADS)

    Masseroli, Marco; Pinciroli, Francesco

    2000-12-01

    To provide easy retrieval, integration and evaluation of multimodal cardiology images and data in a web browser environment, distributed application technologies and Java programming were used to implement a client-server architecture based on software agents. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. The client side is a Java applet running in a web browser and providing a friendly medical user interface to perform queries on patient and medical test data and to integrate and properly visualize the various query results. A set of tools based on the Java Advanced Imaging API enables users to process and analyze the retrieved cardiology images, and quantify their features in different regions of interest. The platform independence of Java technology makes the developed prototype easy to manage in a centralized form and to deploy at any site with an intranet or internet connection. By giving healthcare providers effective tools for comprehensively querying, visualizing and evaluating cardiology medical images and records in all locations where they may need them, i.e. emergency rooms, operating theaters, wards, or even outpatient clinics, the developed prototype represents an important aid in providing more efficient diagnoses and medical treatments.

  12. Prescription and over-the-counter medications tool kit (April, 2011 version)

    DOT National Transportation Integrated Search

    2011-04-01

    This Toolkit is a compilation of policies, procedures, forms, and training resources that represent the best practices being used throughout the U.S. by a variety of transit systems. It does not represent all of the effective means that transit syste...

  13. ArrayInitiative - a tool that simplifies creating custom Affymetrix CDFs

    PubMed Central

    2011-01-01

    Background Probes on a microarray represent a frozen view of a genome and are quickly outdated when new sequencing studies extend our knowledge, resulting in significant measurement error when analyzing any microarray experiment. There are several bioinformatics approaches to improve probe assignments, but without in-house programming expertise, standardizing these custom array specifications as a usable file (e.g. as Affymetrix CDFs) is difficult, owing mostly to the complexity of the specification file format. However, without correctly standardized files there is a significant barrier for testing competing analysis approaches since this file is one of the required inputs for many commonly used algorithms. The need to test combinations of probe assignments and analysis algorithms led us to develop ArrayInitiative, a tool for creating and managing custom array specifications. Results ArrayInitiative is a standalone, cross-platform, rich client desktop application for creating correctly formatted, custom versions of manufacturer-provided (default) array specifications, requiring only minimal knowledge of the array specification rules and file formats. Users can import default array specifications, import probe sequences for a default array specification, design and import a custom array specification, export any array specification to multiple output formats, export the probe sequences for any array specification and browse high-level information about the microarray, such as version and number of probes. The initial release of ArrayInitiative supports the Affymetrix 3' IVT expression arrays we currently analyze, but as an open source application, we hope that others will contribute modules for other platforms. Conclusions ArrayInitiative allows researchers to create new array specifications, in a standard format, based upon their own requirements. This makes it easier to test competing design and analysis strategies that depend on probe definitions. 
Since the custom array specifications are easily exported to the manufacturer's standard format, researchers can analyze these customized microarray experiments using established software tools, such as those available in Bioconductor. PMID:21548938

  14. Landslide inventory maps: New tools for an old problem

    NASA Astrophysics Data System (ADS)

    Guzzetti, Fausto; Mondini, Alessandro Cesare; Cardinali, Mauro; Fiorucci, Federica; Santangelo, Michele; Chang, Kang-Tsung

    2012-04-01

    Landslides are present on all continents, and play an important role in the evolution of landscapes. They also represent a serious hazard in many areas of the world. Despite their importance, we estimate that landslide maps cover less than 1% of the slopes in the landmasses, and systematic information on the type, abundance, and distribution of landslides is lacking. Preparing landslide maps is important to document the extent of landslide phenomena in a region, to investigate the distribution, types, pattern, recurrence and statistics of slope failures, to determine landslide susceptibility, hazard, vulnerability and risk, and to study the evolution of landscapes dominated by mass-wasting processes. Conventional methods for the production of landslide maps rely chiefly on the visual interpretation of stereoscopic aerial photography, aided by field surveys. These methods are time consuming and resource intensive. New and emerging techniques based on satellite, airborne, and terrestrial remote sensing technologies promise to facilitate the production of landslide maps, reducing the time and resources required for their compilation and systematic update. In this work, we first outline the principles for landslide mapping, and we review the conventional methods for the preparation of landslide maps, including geomorphological, event, seasonal, and multi-temporal inventories. Next, we examine recent and new technologies for landslide mapping, considering (i) the exploitation of very-high-resolution digital elevation models to analyze surface morphology, (ii) the visual interpretation and semi-automatic analysis of different types of satellite images, including panchromatic, multispectral, and synthetic aperture radar images, and (iii) tools that facilitate landslide field mapping. 
Next, we discuss the advantages and the limitations of the new remote sensing data and technology for the production of geomorphological, event, seasonal, and multi-temporal inventory maps. We conclude by arguing that the new tools will help to improve the quality of landslide maps, with positive effects on all derivative products and analyses, including erosion studies and landscape modeling, susceptibility and hazard assessments, and risk evaluations.
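
One common building block of the semi-automatic multispectral analysis mentioned above is flagging vegetation loss between pre- and post-event images, since fresh landslides typically strip vegetation. A hedged sketch (the NDVI-drop rule and the 0.3 threshold are illustrative assumptions, not the authors' method):

```python
# Candidate landslide pixels via NDVI drop between two acquisitions.
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from reflectances."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def candidate_landslide(pre, post, threshold=0.3):
    """pre/post are (nir, red) reflectance pairs; the threshold is
    illustrative and would be tuned per scene in practice."""
    return ndvi(*pre) - ndvi(*post) > threshold

# Vegetated pixel (high NDVI) that becomes bare soil after the event:
hit = candidate_landslide(pre=(0.5, 0.1), post=(0.3, 0.25))  # True
```

In practice such spectral flags are combined with morphology from high-resolution DEMs and checked by visual interpretation.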

  15. Exploring physics concepts among novice teachers through CMAP tools

    NASA Astrophysics Data System (ADS)

    Suprapto, N.; Suliyanah; Prahani, B. K.; Jauhariyah, M. N. R.; Admoko, S.

    2018-03-01

    Concept maps are graphical tools for organising, elaborating and representing knowledge. Through the CmapTools software, the understanding and hierarchical structuring of physics concepts among novice teachers can be explored. The software helps physics teachers indicate a physics context, focus questions, parking lots, cross-links, branching, hierarchy, and propositions. Using an exploratory quantitative study, a total of 13 concept maps on different physics topics created by novice physics teachers were analysed. The main differences between the lecturer's and peer-teachers' scoring were also illustrated. The study offered some implications, especially for physics educators, on how to determine the hierarchical structure of physics concepts, to construct a physics focus question, and to see how a concept in one domain of knowledge represented on the map is related to a concept in another domain shown on the map.
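
Concept maps are commonly scored on the structural features the abstract lists (propositions, hierarchy, cross-links). A minimal sketch in the style of Novak and Gowin's widely cited weighting; the weights shown are the commonly cited ones, and the paper's own lecturer/peer rubric may differ:

```python
def concept_map_score(propositions, hierarchy_levels, cross_links, examples):
    """Structural concept-map score: each valid proposition scores 1,
    each level of hierarchy 5, each valid cross-link 10, each example 1
    (Novak-and-Gowin-style weights, assumed here for illustration)."""
    return (1 * propositions +
            5 * hierarchy_levels +
            10 * cross_links +
            1 * examples)

# A small hypothetical map: 12 valid propositions, 3 hierarchy levels,
# 2 cross-links between domains, 4 examples.
score = concept_map_score(12, 3, 2, 4)  # 12 + 15 + 20 + 4 = 51
```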

  16. Systematically Extracting Metal- and Solvent-Related Occupational Information from Free-Text Responses to Lifetime Occupational History Questionnaires

    PubMed Central

    Friesen, Melissa C.; Locke, Sarah J.; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A.; Purdue, Mark; Colt, Joanne S.

    2014-01-01

    Objectives: Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants’ jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Methods: Our study population comprised 2408 subjects, reporting 11991 jobs, from a case–control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. 
Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert’s independently assigned probability ratings to evaluate whether we missed identifying possibly exposed jobs. Results: Our process added exposure variables for 52 occupation groups, 43 industry groups, and 46 task/tool/chemical scenarios to the data set of OH responses. Across all four agents, we identified possibly exposed task/tool/chemical exposure scenarios in 44–51% of the jobs in possibly exposed occupations. Possibly exposed task/tool/chemical exposure scenarios were found in a nontrivial 9–14% of the jobs not in possibly exposed occupations, suggesting that our process identified important information that would not be captured using occupation alone. Our extraction process was sensitive: for jobs where our extraction of OH responses identified no exposure scenarios and for which the sole source of information was the OH responses, only 0.1% were assessed as possibly exposed to TCE by the expert. Conclusions: Our systematic extraction of OH information found useful information in the task/chemicals/tools responses that was relatively easy to extract and that was not available from the occupational or industry information. The extracted variables can be used as inputs in the development of decision rules, especially for jobs where no additional information, such as job- and industry-specific questionnaires, is available. PMID:24590110
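
The core extraction step described above (scanning free-text task/tool/chemical responses against a priori key-word lists for each exposure scenario) can be sketched as follows; the keyword list and job record are invented for illustration, not taken from the study:

```python
# Flag a job's free-text responses against a scenario's key phrases.
# Stems like "degreas" catch "degrease", "degreasing", etc.
TCE_KEYWORDS = ["degreas", "trichloroethylene", "tce", "vapor degrease"]

def flag_scenario(response, keywords=TCE_KEYWORDS):
    """Return True if any a priori keyword/stem appears in the text."""
    text = response.lower()
    return any(k in text for k in keywords)

job = {"task": "Degreasing metal parts before painting",
       "chemicals": "solvents, thinners"}
exposed = any(flag_scenario(v) for v in job.values())  # True ("degreas")
```

The resulting binary variables would then be added to each work-history record as inputs to the decision rules.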

  17. Coupled behavior of shape memory alloy-based morphing spacecraft radiators: experimental assessment and analysis

    NASA Astrophysics Data System (ADS)

    Bertagne, C.; Walgren, P.; Erickson, L.; Sheth, R.; Whitcomb, J.; Hartl, D.

    2018-06-01

    Thermal control is an important aspect of spacecraft design, particularly in the case of crewed vehicles, which must maintain a precise internal temperature at all times in spite of significant variations in the external thermal environment and internal heat loads. Future missions beyond low Earth orbit will require radiator systems with high turndown ratios, defined as the ratio between the maximum and minimum heat rejection rates achievable by the radiator system. Current radiators are only able to achieve turndown ratios of 3:1, far less than the 12:1 turndown ratio requirement expected for future missions. An innovative morphing radiator concept uses the temperature-induced phase transformation of shape memory alloy (SMA) materials to achieve turndown ratios that are predicted to exceed 12:1 via substantial geometric reconfiguration. Developing mathematical and computational models of these morphing radiators is challenging due to the strong two-way thermomechanical coupling not present in traditional fixed-geometry radiators and not widely considered in the literature. Although existing simulation tools are capable of analyzing the behavior of some thermomechanically coupled structures, general problems involving radiation and deformation cannot be modeled using publicly available codes due to the complexity of modeling spatially evolving boundary fields. This paper provides important insight into the operational response of SMA-based morphing radiators by employing computational tools developed to overcome previous shortcomings. Several example problems are used to demonstrate the novel radiator concept. Additionally, a prototype morphing radiator was designed, fabricated, and tested in a thermal environment compatible with mission operations. An associated finite element model of the prototype was developed and executed. 
Model predictions of radiator performance generally agree with the experimental data, giving confidence that the tools developed are able to accurately represent the thermomechanical coupling present in morphing radiators and that such tools will be useful in future designs.
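
The turndown ratio defined above is simply the ratio of maximum to minimum heat rejection; a minimal sketch with hypothetical wattages chosen to reproduce the 3:1 and 12:1 figures cited:

```python
def turndown_ratio(q_max, q_min):
    """Ratio of maximum to minimum achievable heat rejection."""
    if q_min <= 0:
        raise ValueError("minimum heat rejection must be positive")
    return q_max / q_min

# Current fixed-geometry radiators achieve about 3:1; the requirement
# for future missions is 12:1 (wattage values are hypothetical).
current = turndown_ratio(q_max=1500.0, q_min=500.0)  # 3.0
target = turndown_ratio(q_max=1500.0, q_min=125.0)   # 12.0
```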

  18. IGES transformer and NURBS in grid generation

    NASA Technical Reports Server (NTRS)

    Yu, Tzu-Yi; Soni, Bharat K.

    1993-01-01

    In the field of grid generation and CAD/CAM, there are numerous geometry output formats which require the designer to spend a great deal of time manipulating geometrical entities in order to achieve a useful sculptured geometrical description for grid generation. In this process, there is also a danger of losing fidelity of the geometry under consideration. This stresses the importance of a standard geometry definition for the communication link between various CAD/CAM and grid systems. The IGES (Initial Graphics Exchange Specification) file is a widely used communication standard between CAD/CAM and analysis tools. Scientists at NASA research centers, including NASA Ames, NASA Langley, NASA Lewis, and NASA Marshall, recognized this importance and therefore in 1992 formed the 'NASA-IGES' committee, which defines a subset of the standard IGES. This committee stresses the importance of, and encourages the CFD community toward, using the standard IGES file for the interface between CAD/CAM and CFD analysis. Two of the IGES entities, the NURBS curve (Entity 126) and the NURBS surface (Entity 128), have many useful geometric properties, such as the convex hull property, local control, and affine invariance; moreover, widely utilized analytical geometries can be accurately represented using NURBS. This is important in today's grid generation tools because of the emphasis on interactive design. To satisfy geometry transformation between CAD/CAM systems and the grid generation field, CAGI (Computer Aided Geometry Design) was developed, which includes geometry transformation, geometry manipulation and geometry generation as well as a user interface. This paper presents the successful development of an IGES file transformer and the application of the NURBS definition in grid generation.
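
The convenience of NURBS entities comes from evaluating curve points as weighted combinations of control points through the Cox-de Boor basis recursion. A hedged sketch of that evaluation; the control points, knot vector, and weights below are illustrative, not from the paper:

```python
def basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0:
        left = (u - knots[i]) / d1 * basis(i, p - 1, u, knots)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + p + 1] - u) / d2 * basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl, weights, knots, p):
    """Evaluate a 2D NURBS curve point at parameter u."""
    num = [0.0, 0.0]
    den = 0.0
    for i, (pt, w) in enumerate(zip(ctrl, weights)):
        b = basis(i, p, u, knots) * w
        num[0] += b * pt[0]
        num[1] += b * pt[1]
        den += b
    return (num[0] / den, num[1] / den)

# Quadratic curve, 3 control points, clamped knots: with unit weights
# this reduces to a Bezier arc, so u = 0.5 lands at (1.0, 1.0).
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
knots = [0, 0, 0, 1, 1, 1]
pt = nurbs_point(0.5, ctrl, [1.0, 1.0, 1.0], knots, p=2)  # (1.0, 1.0)
```

Non-unit weights would make the curve rational, which is what lets Entity 126 represent conics such as circular arcs exactly.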

  19. Figure Text Extraction in Biomedical Literature

    PubMed Central

    Kim, Daehyun; Yu, Hong

    2011-01-01

    Background Figures are ubiquitous in biomedical full-text articles, and they represent important biomedical knowledge. However, the sheer volume of biomedical publications has made it necessary to develop computational approaches for accessing figures. Therefore, we are developing the Biomedical Figure Search engine (http://figuresearch.askHERMES.org) to allow bioscientists to access figures efficiently. Since text frequently appears in figures, automatically extracting such text may assist the task of mining information from figures. Little research, however, has been conducted exploring text extraction from biomedical figures. Methodology We first evaluated an off-the-shelf Optical Character Recognition (OCR) tool on its ability to extract text from figures appearing in biomedical full-text articles. We then developed a Figure Text Extraction Tool (FigTExT) to improve the performance of the OCR tool for figure text extraction through the use of three innovative components: image preprocessing, character recognition, and text correction. We first developed image preprocessing to enhance image quality and to improve text localization. Then we adapted the off-the-shelf OCR tool on the improved text localization for character recognition. Finally, we developed and evaluated a novel text correction framework by taking advantage of figure-specific lexicons. Results/Conclusions The evaluation on 382 figures (9,643 figure texts in total) randomly selected from PubMed Central full-text articles shows that FigTExT performed with 84% precision, 98% recall, and 90% F1-score for text localization and with 62.5% precision, 51.0% recall and 56.2% F1-score for figure text extraction. When limiting figure texts to those judged by domain experts to be important content, FigTExT performed with 87.3% precision, 68.8% recall, and 77% F1-score. 
FigTExT significantly improved the performance of the off-the-shelf OCR tool we used, which on its own performed with 36.6% precision, 19.3% recall, and 25.3% F1-score for text extraction. In addition, our results show that FigTExT can extract texts that do not appear in figure captions or other associated text, further suggesting the potential utility of FigTExT for improving figure search. PMID:21249186
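
The F1-scores reported above are the harmonic mean of precision and recall; a quick check reproduces the reported figures:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

loc_f1 = f1(0.84, 0.98)    # text localization: 84% P, 98% R -> ~0.90
ext_f1 = f1(0.625, 0.510)  # figure text extraction -> ~0.562
```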

  20. Figure text extraction in biomedical literature.

    PubMed

    Kim, Daehyun; Yu, Hong

    2011-01-13

    Figures are ubiquitous in biomedical full-text articles, and they represent important biomedical knowledge. However, the sheer volume of biomedical publications has made it necessary to develop computational approaches for accessing figures. Therefore, we are developing the Biomedical Figure Search engine (http://figuresearch.askHERMES.org) to allow bioscientists to access figures efficiently. Since text frequently appears in figures, automatically extracting such text may assist the task of mining information from figures. Little research, however, has been conducted exploring text extraction from biomedical figures. We first evaluated an off-the-shelf Optical Character Recognition (OCR) tool on its ability to extract text from figures appearing in biomedical full-text articles. We then developed a Figure Text Extraction Tool (FigTExT) to improve the performance of the OCR tool for figure text extraction through the use of three innovative components: image preprocessing, character recognition, and text correction. We first developed image preprocessing to enhance image quality and to improve text localization. Then we adapted the off-the-shelf OCR tool on the improved text localization for character recognition. Finally, we developed and evaluated a novel text correction framework by taking advantage of figure-specific lexicons. The evaluation on 382 figures (9,643 figure texts in total) randomly selected from PubMed Central full-text articles shows that FigTExT performed with 84% precision, 98% recall, and 90% F1-score for text localization and with 62.5% precision, 51.0% recall and 56.2% F1-score for figure text extraction. When limiting figure texts to those judged by domain experts to be important content, FigTExT performed with 87.3% precision, 68.8% recall, and 77% F1-score. 
FigTExT significantly improved the performance of the off-the-shelf OCR tool we used, which on its own performed with 36.6% precision, 19.3% recall, and 25.3% F1-score for text extraction. In addition, our results show that FigTExT can extract texts that do not appear in figure captions or other associated text, further suggesting the potential utility of FigTExT for improving figure search.

  1. Development of the Veritas plot and its application in cardiac surgery: an evidence-synthesis graphic tool for the clinician to assess multiple meta-analyses reporting on a common outcome.

    PubMed

    Panesar, Sukhmeet S; Rao, Christopher; Vecht, Joshua A; Mirza, Saqeb B; Netuveli, Gopalakrishnan; Morris, Richard; Rosenthal, Joe; Darzi, Ara; Athanasiou, Thanos

    2009-10-01

    Meta-analyses may be prone to generating misleading results because of a paucity of experimental studies (especially in surgery); publication bias; and heterogeneity in study design, intervention and the patient population of included studies. When investigating a specific clinical or scientific question on which several relevant meta-analyses may have been published, value judgments must be applied to determine which analysis represents the most robust evidence. These value judgments should be specifically acknowledged. We designed the Veritas plot to explicitly explore important elements of quality and to facilitate decision-making by highlighting specific areas in which meta-analyses are found to be deficient. Furthermore, as a graphic tool, it may be more intuitive than when similar data are presented in a tabular or text format. The Veritas plot is an adaptation of the radar plot, a graphic tool for the description of multiattribute data. Key elements of meta-analytical quality such as heterogeneity, publication bias and study design are assessed. Existing qualitative methods such as the Assessment of Multiple Systematic Reviews (AMSTAR) tool have been incorporated in addition to important considerations when interpreting surgical meta-analyses such as the year of publication and population characteristics. To demonstrate the potential of the Veritas plot to inform clinical practice, we apply the Veritas plot to the meta-analytical literature comparing the incidence of 30-day stroke in off-pump coronary artery bypass surgery and conventional coronary artery bypass surgery. We demonstrate that a visually stimulating and practical evidence-synthesis tool can direct the clinician and scientist to a particular meta-analytical study to inform clinical practice. The Veritas plot is also cumulative and allowed us to assess the quality of evidence over time. 
We have presented a practical graphic application for scientists and clinicians to identify and interpret variability in meta-analyses. Although further validation of the Veritas plot is required, it may have the potential to contribute to the implementation of evidence-based practice.
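
The geometry behind a radar-style plot such as the Veritas plot is straightforward: each quality axis gets an equally spaced angle, and a normalized score becomes a polygon vertex along that axis. A minimal sketch; the axis names and scores are invented, not from the paper:

```python
import math

def radar_vertices(scores):
    """Map scores in [0, 1] to polygon vertices on evenly spaced axes."""
    n = len(scores)
    verts = []
    for k, s in enumerate(scores):
        theta = 2 * math.pi * k / n  # angle of axis k
        verts.append((s * math.cos(theta), s * math.sin(theta)))
    return verts

# Hypothetical quality scores for one meta-analysis:
axes = ["heterogeneity", "publication bias", "AMSTAR", "study design"]
scores = [1.0, 0.5, 0.75, 0.25]
poly = radar_vertices(scores)  # first axis lies along +x: (1.0, 0.0)
```

A plotting library would then close and shade the polygon; larger, more regular polygons indicate more robust evidence.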

  2. Establishing sustainable GHG inventory systems in African countries for Agriculture and Land Use, Land-use Change and Forestry (LULUCF)

    NASA Astrophysics Data System (ADS)

    Wirth, T. C.; Troxler, T.

    2015-12-01

    As signatories to the United Nations Framework Convention on Climate Change (UNFCCC), developing countries are required to produce greenhouse gas (GHG) inventories every two years. For many developing countries, including many of those in Africa, this is a significant challenge as it requires establishing a robust and sustainable GHG inventory system. In order to help support these efforts, the U.S. Environmental Protection Agency (EPA) has worked in collaboration with the UNFCCC to assist African countries in establishing sustainable GHG inventory systems and generating high-quality inventories on a regular basis. The sectors we have focused on for these GHG inventory capacity building efforts in Africa are Agriculture and Land Use, Land-use Change and Forestry (LULUCF), as these tend to represent a significant portion of the countries' GHG emissions profiles, and the data requirements and methodologies are often more complex than for other sectors. To support these efforts, the U.S. EPA has provided technical assistance in understanding the methods in the IPCC Guidelines, assembling activity data and emission factors, including developing land-use maps for representing a country's land base, and implementing the calculations. EPA has also supported development of various tools, such as a Template Workbook that helps countries build the institutional arrangements and strong documentation necessary for generating GHG inventories on a regular basis, as well as perform other procedures identified by IPCC Good Practice Guidance such as quality assurance/quality control, key category analysis and archiving. Another tool used in these projects, which helps countries implement the methods from the IPCC Guidelines for the Agriculture and LULUCF sectors, is the Agriculture and Land Use (ALU) tool. 
    This tool helps countries assemble the activity data and emission factors, including supporting the import of GIS maps, and apply the equations from the IPCC Guidelines to estimate the carbon stock changes and emissions of non-CO2 GHGs for all land uses and management practices identified in the IPCC Guidelines at the Tier 1 or Tier 2 level.
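
The generic Tier 1 pattern from the IPCC Guidelines that tools like ALU apply is: emissions = activity data × emission factor. A minimal sketch; the herd size and emission factor below are hypothetical, not a country's actual inventory inputs:

```python
def tier1_emissions(activity, emission_factor):
    """IPCC Tier 1 pattern: emissions = activity data x emission factor."""
    return activity * emission_factor

# e.g. enteric fermentation from cattle:
# head of cattle x kg CH4 per head per year (hypothetical values)
ch4_kg = tier1_emissions(activity=100_000, emission_factor=41.0)
ch4_gg = ch4_kg / 1.0e6  # convert kg to Gg: 4.1 Gg CH4 per year
```

Tier 2 follows the same structure but replaces default emission factors with country-specific ones.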

  3. The predicted secretome and transmembranome of the poultry red mite Dermanyssus gallinae.

    PubMed

    Schicht, Sabine; Qi, Weihong; Poveda, Lucy; Strube, Christina

    2013-09-11

    The worldwide distributed hematophagous poultry red mite Dermanyssus gallinae (De Geer, 1778) is one of the most important pests of poultry. Even though 35 acaricide compounds are available, control of D. gallinae remains difficult due to acaricide resistances as well as food safety regulations. The current study was carried out to identify putative excretory/secretory (pES) proteins of D. gallinae, since these proteins play an important role in the host-parasite interaction and therefore represent potential targets for the development of novel intervention strategies. Additionally, putative transmembrane proteins (pTM) of D. gallinae were analyzed, as representatives of this protein group also serve as promising targets for new control strategies. D. gallinae pES and pTM protein prediction was based on putative protein sequences of whole transcriptome data, which was parsed to different bioinformatical servers (SignalP, SecretomeP, TMHMM and TargetP). Subsequently, pES and pTM protein sequences were functionally annotated by different computational tools. Computational analysis of the D. gallinae proteins identified 3,091 pES (5.6%) and 7,361 pTM proteins (13.4%). A significant proportion of pES proteins are considered to be involved in blood feeding and digestion, such as salivary proteins, proteases, lipases and carbohydrases. The cysteine proteases cathepsin D and L as well as legumain, enzymes that cleave hemoglobin during blood digestion in the closely related ticks, represented 6 of the top-30 BLASTP matches of the poultry red mite's secretome. Identified pTM proteins may be involved in many important biological processes including cell signaling, transport of membrane-impermeable molecules and cell recognition. Ninjurin-like proteins, whose functions in mites are still unknown, represent the most frequently occurring pTM. The current study is the first to provide a mite's secretome as well as transmembranome and provides valuable insights into D. 
gallinae pES and pTM proteins operating in different metabolic pathways. Identifying a variety of molecules putatively involved in blood feeding may significantly contribute to the development of new therapeutic targets or vaccines against this poultry pest.

  4. The predicted secretome and transmembranome of the poultry red mite Dermanyssus gallinae

    PubMed Central

    2013-01-01

    Background The worldwide distributed hematophagous poultry red mite Dermanyssus gallinae (De Geer, 1778) is one of the most important pests of poultry. Even though 35 acaricide compounds are available, control of D. gallinae remains difficult due to acaricide resistances as well as food safety regulations. The current study was carried out to identify putative excretory/secretory (pES) proteins of D. gallinae, since these proteins play an important role in the host-parasite interaction and therefore represent potential targets for the development of novel intervention strategies. Additionally, putative transmembrane proteins (pTM) of D. gallinae were analyzed, as representatives of this protein group also serve as promising targets for new control strategies. Methods D. gallinae pES and pTM protein prediction was based on putative protein sequences of whole transcriptome data, which was parsed to different bioinformatical servers (SignalP, SecretomeP, TMHMM and TargetP). Subsequently, pES and pTM protein sequences were functionally annotated by different computational tools. Results Computational analysis of the D. gallinae proteins identified 3,091 pES (5.6%) and 7,361 pTM proteins (13.4%). A significant proportion of pES proteins are considered to be involved in blood feeding and digestion, such as salivary proteins, proteases, lipases and carbohydrases. The cysteine proteases cathepsin D and L as well as legumain, enzymes that cleave hemoglobin during blood digestion in the closely related ticks, represented 6 of the top-30 BLASTP matches of the poultry red mite’s secretome. Identified pTM proteins may be involved in many important biological processes including cell signaling, transport of membrane-impermeable molecules and cell recognition. Ninjurin-like proteins, whose functions in mites are still unknown, represent the most frequently occurring pTM. 
    Conclusion The current study is the first to provide a mite’s secretome as well as transmembranome and provides valuable insights into D. gallinae pES and pTM proteins operating in different metabolic pathways. Identifying a variety of molecules putatively involved in blood feeding may significantly contribute to the development of new therapeutic targets or vaccines against this poultry pest. PMID:24020355

  5. Representing Farmer Irrigation Decisions in Northern India: Model Development from the Bottom Up.

    NASA Astrophysics Data System (ADS)

    O'Keeffe, J.; Buytaert, W.; Brozovic, N.; Mijic, A.

    2017-12-01

    The plains of northern India are among the most intensely populated and irrigated regions of the world. Sustaining water demand has been made possible by exploiting the vast and hugely productive aquifers underlying the Indo-Gangetic basin. However, an increasing demand from a growing population and highly variable socio-economic and environmental conditions mean present resources may not be sustainable, resulting in water security becoming one of India's biggest challenges. Unless solutions which take into consideration the region's evolving anthropogenic and environmental conditions are found, the sustainability of India's water resources looks bleak. Understanding water user decisions and their potential outcomes is important for development of suitable water resource management options. Computational models are commonly used to assist water use decision making, and typically represent natural processes well. The inclusion of human decision making, however, one of the dominant drivers of change, has lagged behind. Improved representation of irrigation water user behaviour within models provides more accurate, relevant information for irrigation management. This research conceptualizes and proceduralizes observed farmer irrigation practices, highlighting feedbacks between the environment and livelihood. It is developed using a bottom-up approach, informed through field experience and stakeholder interaction in Uttar Pradesh, northern India. Real-world insights are incorporated through collected information, creating a realistic representation of field conditions and providing a useful tool for policy analysis and water management. The modelling framework is applied to four districts. Results suggest predicted future climate will have little direct impact on water resources, crop yields or farmer income. In addition, increased abstraction may be sustainable in some areas under carefully managed conditions. 
By simulating dynamic decision making, feedbacks and interactions between water users, irrigation officials, agricultural practices, and external influences such as energy pricing and farming subsidies, this work highlights the importance of directly including water user behaviour in policy making and operational tools, which will help achieve water and livelihood security.
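The kind of rule-based farmer behaviour described above can be caricatured in a few lines. This is a minimal illustrative sketch, not the paper's model: the cost threshold, the pumping-cost rule, and the linear water-production function are all invented for illustration.

```python
# Illustrative sketch (not the paper's model): a rule-based farmer agent choosing
# seasonal groundwater pumping from water-table depth and energy price.

def irrigation_decision(water_table_m, energy_price, crop_water_need_mm):
    """Return planned irrigation (mm); thresholds are hypothetical."""
    pump_cost = water_table_m * energy_price   # deeper water costs more to lift
    if pump_cost > 50.0:                       # too expensive: deficit-irrigate
        return 0.5 * crop_water_need_mm
    return crop_water_need_mm                  # otherwise meet full crop demand

def crop_yield(applied_mm, need_mm, max_yield_t_ha=4.0):
    """Simple linear water-production function (toy)."""
    return max_yield_t_ha * min(1.0, applied_mm / need_mm)

applied = irrigation_decision(water_table_m=12.0, energy_price=5.0,
                              crop_water_need_mm=400)
print(applied, crop_yield(applied, 400))   # deficit irrigation halves the yield
```

Coupling many such agents to a shared aquifer, and feeding back pumped volumes into the water-table depth, is the essence of the bottom-up approach the abstract describes.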

  6. Disentangling Vector-Borne Transmission Networks: A Universal DNA Barcoding Method to Identify Vertebrate Hosts from Arthropod Bloodmeals

    PubMed Central

    Alcaide, Miguel; Rico, Ciro; Ruiz, Santiago; Soriguer, Ramón; Muñoz, Joaquín; Figuerola, Jordi

    2009-01-01

    Emerging infectious diseases represent a challenge for global economies and public health. About one fourth of recent pandemics have originated from the spread of vector-borne pathogens. In this sense, the advent of modern molecular techniques has enhanced our capabilities to understand vector-host interactions and disease ecology. However, host identification protocols have profited little from international DNA barcoding initiatives and/or have focused exclusively on a limited array of vector species. Therefore, ascertaining the potential afforded by DNA barcoding tools in other vector-host systems of human and veterinary importance would represent a major advance in tracking pathogen life cycles and hosts. Here, we show the applicability of a novel and efficient molecular method for the identification of the vertebrate host's DNA contained in the midgut of blood-feeding arthropods. To this end, we designed a eukaryote-universal forward primer and a vertebrate-specific reverse primer to selectively amplify 758 base pairs (bp) of the vertebrate mitochondrial Cytochrome c Oxidase Subunit I (COI) gene. Our method was validated using both extensive sequence surveys from the public domain and Polymerase Chain Reaction (PCR) experiments carried out on specimens from different classes of vertebrates (Mammalia, Aves, Reptilia and Amphibia) and invertebrate ectoparasites (Arachnida and Insecta). The analysis of mosquito, culicoid, phlebotomine, sucking bug, and tick bloodmeals revealed up to 40 vertebrate hosts, including 23 avian, 16 mammalian and one reptilian species. Importantly, the inspection and analysis of direct sequencing electropherograms also assisted in resolving mixed bloodmeals. We therefore provide a universal and high-throughput diagnostic tool for the study of the ecology of haematophagous invertebrates in relation to their vertebrate hosts. 
Such information is crucial to support the efficient management of initiatives aimed at reducing epidemiologic risks of arthropod vector-borne pathogens, a priority for public health. PMID:19768113
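The final assignment step in such a barcoding pipeline, matching an amplified fragment against reference barcodes, can be sketched very simply. The sequences below are made-up toy strings, not real COI data, and real pipelines use alignment-based searches (e.g. BLAST) rather than raw mismatch counts.

```python
# Illustrative sketch: assign a bloodmeal-derived COI fragment to the closest
# reference barcode by mismatch count. Sequences are toy data, not real COI.

REFERENCES = {
    "Homo sapiens":  "ATGGCACTATTAGGAACTGC",
    "Turdus merula": "ATGGCTCTGTTAGGCACAGC",
    "Bos taurus":    "ATGACACTTCTAGGAACAGC",
}

def hamming(a, b):
    """Count mismatching positions between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def identify_host(query):
    """Return (species, mismatches) of the best-matching reference barcode."""
    return min(((sp, hamming(query, ref)) for sp, ref in REFERENCES.items()),
               key=lambda t: t[1])

query = "ATGGCACTATTAGGAACAGC"   # toy query, one mismatch from the human entry
print(identify_host(query))
```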

  7. Analysis of the unexplored features of rrs (16S rDNA) of the Genus Clostridium

    PubMed Central

    2011-01-01

    Background Bacterial taxonomy and phylogeny based on rrs (16S rDNA) sequencing is being vigorously pursued. In fact, it has been stated that novel biological findings are driven by comparison and integration of massive data sets. In spite of a large reservoir of rrs sequencing data, with 1,237,963 entries, this analysis invariably needs supplementation with other genes. The need is to divide the genetic variability within a taxon or genus at its rrs phylogenetic boundaries and to discover those fundamental features which will enable the bacteria to naturally fall within them. Within the large bacterial community, Clostridium represents a large genus of around 110 species of significant biotechnological and medical importance. Certain Clostridium strains produce some of the deadliest toxins, which cause heavy economic losses. We have targeted this genus because of its high genetic diversity, which does not allow accurate typing with the available molecular methods. Results Seven hundred and sixty-five rrs sequences (> 1200 nucleotides, nts) belonging to 110 Clostridium species were analyzed. On the basis of 404 rrs sequences belonging to 15 Clostridium species, we have developed species-specific: (i) a phylogenetic framework, (ii) signatures (30 nts) and (iii) in silico restriction enzyme (14 Type II REs) digestion patterns. These tools allowed: (i) species-level identification of 95 Clostridium spp. which are presently classified only to the genus level, (ii) identification of 84 novel Clostridium spp. and (iii) a potential reduction in the number of Clostridium species represented by small populations. Conclusions This integrated approach is quite sensitive and can be easily extended as a molecular tool for diagnostic and taxonomic identification of any microbe of importance to food industries and health services. 
Since rapid and correct identification allows quicker diagnosis and consequently treatment as well, it is likely to lead to reduction in economic losses and mortality rates. PMID:21223548
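The in silico restriction digestion mentioned above is easy to sketch: find every recognition site in a sequence and report the resulting fragment lengths. The paper's 14 Type II enzymes are not listed in the abstract, so the example uses EcoRI (recognition site GAATTC, cutting between G and AATTC) as a standard stand-in.

```python
# Illustrative sketch of in silico Type II restriction digestion: locate
# recognition sites and report fragment lengths. EcoRI (G^AATTC) shown;
# the paper's actual 14 enzymes are not specified in the abstract.

def digest(seq, site="GAATTC", cut_offset=1):
    """Return fragment lengths after cutting at every occurrence of `site`."""
    cuts, start = [], 0
    while (i := seq.find(site, start)) != -1:
        cuts.append(i + cut_offset)   # cut position within the site
        start = i + 1
    bounds = [0] + cuts + [len(seq)]
    return [bounds[k + 1] - bounds[k] for k in range(len(bounds) - 1)]

seq = "AAGAATTCTTTTGAATTCAA"          # toy 20-nt sequence with two EcoRI sites
print(digest(seq))                    # fragment lengths: [3, 10, 7]
```

Comparing such fragment-length patterns across species is what turns a digestion profile into a typing tool.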

  8. Radiation Detection Computational Benchmark Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessment of the operational performance of radiation detection systems. This can, however, result in large and complex scenarios which are time-consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL's ADVANTG) which combine the benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations, with a preference for scenarios which include experimental data or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty, to include gamma transport, neutron transport, or both, and to represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations was assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. 
The results of those ADVANTG calculations were then sent to PNNL for compilation. This report describes the details of the selected benchmarks and the results from the various transport codes.
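The benchmark idea, comparing a stochastic transport calculation against a trusted answer, can be shown on the simplest possible problem. This is not one of the report's nine scenarios; it is a toy Monte Carlo estimate of uncollided photon transmission through a slab, checked against the analytic answer exp(-μx).

```python
# Illustrative benchmark-style comparison (a toy problem, not one of the
# report's 9 scenarios): Monte Carlo estimate of uncollided transmission
# through a slab vs. the analytic exp(-mu * x).
import math
import random

def mc_transmission(mu, thickness, n=200_000, seed=1):
    """Fraction of photons whose sampled first-collision distance exceeds the slab."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n)
                 if -math.log(1.0 - rng.random()) / mu > thickness)
    return passed / n

mu, x = 0.2, 5.0                       # attenuation coefficient (1/cm), slab (cm)
analytic = math.exp(-mu * x)           # uncollided transmission, ~0.368
print(round(analytic, 3), round(mc_transmission(mu, x), 3))
```

Real benchmarks replace the analytic answer with experimental data or high-confidence reference calculations, but the comparison logic is the same.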

  9. α-MHC MitoTimer mouse: In vivo mitochondrial turnover model reveals remarkable mitochondrial heterogeneity in the heart

    PubMed Central

    Stotland, Aleksandr; Gottlieb, Roberta A.

    2016-01-01

    In order to maintain an efficient, energy-producing network in the heart, dysfunctional mitochondria are cleared through the mechanism of autophagy, which is closely linked with mitochondrial biogenesis; these, together with fusion and fission, comprise a crucial process known as mitochondrial turnover. Until recently, the lack of molecular tools and methods available to researchers has impeded in vivo investigations of turnover. To investigate the process at the level of a single mitochondrion, our laboratory has developed the MitoTimer protein. Timer is a mutant of the DsRed fluorescent protein characterized by a transition from green fluorescence to a more stable red conformation over 48 h, and its rate of maturation is stable under physiological conditions. We fused the Timer cDNA with the inner mitochondrial membrane signal sequence and placed it under the control of a cardiac-restricted promoter. This construct was used to create the alpha-MHC-MitoTimer mice. Surprisingly, initial analysis of the hearts from these mice demonstrated a high degree of heterogeneity in the ratio of red-to-green fluorescence of MitoTimer in cardiac tissue. Further, scattered solitary mitochondria within cardiomyocytes display a much higher red-to-green fluorescence (red-shifted) relative to other mitochondria in the cell, implying a block in the import of newly synthesized MitoTimer, likely due to lower membrane potential. These red-shifted mitochondria may represent older, senescent mitochondria. Concurrently, the cardiomyocytes also contain a subpopulation of mitochondria that display a lower red-to-green fluorescence (green-shifted) relative to other mitochondria, indicative of germinal mitochondria that are actively engaged in import of newly synthesized mito-targeted proteins. These mitochondria can be isolated and sorted from the heart by flow cytometry for further analysis. 
Initial studies suggest that these mice represent an elegant tool for the investigation of mitochondrial turnover in the heart. PMID:26654779
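The red-to-green classification described above amounts to gating on a fluorescence ratio. A minimal sketch follows; the cutoff values are arbitrary placeholders, not the paper's gating thresholds.

```python
# Illustrative sketch: sort mitochondria by MitoTimer red-to-green ratio into
# putatively older (red-shifted) and younger (green-shifted) pools.
# The cutoffs `low` and `high` are invented, not the paper's gating values.

def classify(red, green, low=0.5, high=2.0):
    """Bin one mitochondrion by its red/green fluorescence ratio."""
    ratio = red / green
    if ratio > high:
        return "red-shifted (older/senescent)"
    if ratio < low:
        return "green-shifted (germinal, active import)"
    return "intermediate"

measurements = [(30, 10), (5, 20), (12, 11)]   # toy (red, green) intensities
print([classify(r, g) for r, g in measurements])
```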

  10. The use of 'Omics technology to rationally improve industrial mammalian cell line performance.

    PubMed

    Lewis, Amanda M; Abu-Absi, Nicholas R; Borys, Michael C; Li, Zheng Jian

    2016-01-01

    Biologics represent an increasingly important class of therapeutics, with 7 of the 10 top selling drugs from 2013 being in this class. Furthermore, health authority approval of biologics in the immuno-oncology space is expected to transform treatment of patients with debilitating and deadly diseases. The growing importance of biologics in the healthcare field has also resulted in the recent approvals of several biosimilars. These recent developments, combined with pressure to provide treatments at lower costs to payers, are resulting in an increasing need for the industry to quickly and efficiently develop high-yielding, robust processes for the manufacture of biologics with the ability to control quality attributes within narrow distributions. Achieving this level of manufacturing efficiency and the ability to design processes capable of regulating growth, death and other cellular pathways through manipulation of media, feeding strategies, and other process parameters will undoubtedly be facilitated through systems biology tools generated in academic and public research communities. Here we discuss the intersection of systems biology, 'Omics technologies, and mammalian bioprocess sciences. Specifically, we address how these methods, in conjunction with traditional monitoring techniques, represent a unique opportunity to better characterize and understand host cell culture state, to shift from an empirical to a rational approach to process development, and to optimize bioreactor cultivation processes. We summarize the following six key areas: (i) research applied to parental, non-recombinant cell lines; (ii) systems level datasets generated with recombinant cell lines; (iii) datasets linking phenotypic traits to relevant biomarkers; (iv) data depositories and bioinformatics tools; (v) in silico model development, and (vi) examples where these approaches have been used to rationally improve cellular processes. 
We critically assess relevant and state-of-the-art research being conducted in academic, government and industrial laboratories. Furthermore, we apply our expertise in bioprocessing to define a potential model for the integration of these systems biology approaches into biologics development.

  11. Towards an Understanding of Physiological Body Mass Regulation: Seasonal Animal Models.

    PubMed

    Mercer, J G; Adam, C L; Morgan, P J

    2000-01-01

    This review is based around a number of interlinked hypotheses that can be summarised as follows: (i) mammalian body mass is regulated, (ii) the mechanisms that effect this regulation are common to all mammalian species, including humans, (iii) the neurochemical substrates involved in long-term body mass regulation and in determining the level of body mass that will be defended may not be the same as those involved in short-term energy homeostasis or body mass defence, or may be differentially engaged, and (iv) "appropriate" body mass is encoded somewhere within the mammalian brain and acts as a comparator to influence both nutritional and reproductive physiology. These issues are of direct relevance to the epidemic of obesity in the Westernised human population and the poor success rate of conventional weight loss strategies. It is our contention that seasonal rodent models, and the Siberian hamster in particular, represent extremely valuable tools for the study of the mechanistic basis of body mass regulation. The Siberian hamster model is often perceived as an unusual mammalian variant that has evolved an almost counter-intuitive strategy for surviving periods of anticipated seasonal food shortage. However, there is compelling evidence that these animals are able to adjust their body mass continually and progressively according to their photoperiodic history, i.e. a seasonally appropriate body mass. These adjustments to appropriate body mass are memorised even after the animals have been driven away from their normal body mass trajectory by imposed food restriction. Thus, photoperiod, acting through the pineal hormone melatonin, is able to reset the desired body mass for a given time in the seasonal cycle. Importantly, daylength provides a tool to manipulate the body mass control system in an entirely physiological and stress-free manner. 
While resetting of body mass by photoperiod represents a level of control apparently confined to seasonal mammals, it has the potential to reveal mechanisms of generic importance in the regulation of energy homeostasis.

  12. Insightful problem solving and creative tool modification by captive nontool-using rooks

    PubMed Central

    Bird, Christopher D.; Emery, Nathan J.

    2009-01-01

    The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and wild tool use. PMID:19478068

  13. KinMap: a web-based tool for interactive navigation through human kinome data.

    PubMed

    Eid, Sameh; Turk, Samo; Volkamer, Andrea; Rippmann, Friedrich; Fulle, Simone

    2017-01-05

    Annotation of the phylogenetic tree of the human kinome is an intuitive way to visualize compound profiling data, structural features of kinases, or functional relationships within this important class of proteins. The increasing volume and complexity of kinase-related data underlines the need for a tool that enables complex queries pertaining to kinase disease involvement and potential therapeutic uses of kinase inhibitors. Here, we present KinMap, a user-friendly online tool that facilitates the interactive navigation through kinase knowledge by linking biochemical, structural, and disease association data to the human kinome tree. To this end, preprocessed data from freely available sources, such as ChEMBL, the Protein Data Bank, and the Center for Therapeutic Target Validation platform, are integrated into KinMap and can easily be complemented by proprietary data. The value of KinMap is demonstrated through examples: uncovering new therapeutic indications of known kinase inhibitors and prioritizing kinases for drug development efforts. KinMap represents a new generation of kinome tree viewers which facilitates interactive exploration of the human kinome. KinMap enables generation of high-quality annotated images of the human kinome tree as well as exchange of kinome-related data in scientific communications. Furthermore, KinMap supports multiple input and output formats and recognizes alternative kinase names and links them to a unified naming scheme, which makes it a useful tool across different disciplines and applications. A web-service of KinMap is freely available at http://www.kinhub.org/kinmap/.

  14. Requirements Development for the NASA Advanced Engineering Environment (AEE)

    NASA Technical Reports Server (NTRS)

    Rogers, Eric; Hale, Joseph P.; Zook, Keith; Gowda, Sanjay; Salas, Andrea O.

    2003-01-01

    The requirements development process for the Advanced Engineering Environment (AEE) is presented. This environment has been developed to allow NASA to perform independent analysis and design of space transportation architectures and technologies. Given the highly collaborative and distributed nature of AEE, a variety of organizations are involved in the development, operations and management of the system; additional organizations representing external customers and stakeholders are involved as well. Thorough coordination and effective communication are essential to translate desired expectations of the system into requirements. Functional, verifiable requirements for this (and indeed any) system are necessary to fulfill several roles. Requirements serve as a contractual tool, a configuration management tool, and an engineering tool, sometimes simultaneously. The role of requirements as an engineering tool is particularly important because a stable set of requirements for a system provides a common framework of system scope and characterization among team members. Furthermore, the requirements provide the basis for checking completion of system elements and form the basis for system verification. Requirements are at the core of systems engineering. The AEE Project has undertaken a thorough process to translate the desires and expectations of external customers and stakeholders into functional system-level requirements that are captured with sufficient rigor to allow development planning, resource allocation and system-level design, development, implementation and verification. These requirements are maintained in an integrated, relational database that provides traceability to governing Program requirements and also to verification methods and subsystem-level requirements.

  15. A Coupled Multiphysics Approach for Simulating Induced Seismicity, Ground Acceleration and Structural Damage

    NASA Astrophysics Data System (ADS)

    Podgorney, Robert; Coleman, Justin; Wilkins, Andrew; Huang, Hai; Veeraraghavan, Swetha; Xia, Yidong; Permann, Cody

    2017-04-01

    Numerical modeling has played an important role in understanding the behavior of coupled subsurface thermal-hydro-mechanical (THM) processes associated with a number of energy and environmental applications since as early as the 1970s. While the ability to rigorously describe all key tightly coupled controlling physics still remains a challenge, there have been significant advances in recent decades. These advances are related primarily to the exponential growth of computational power, the development of more accurate equations of state, improvements in the ability to represent heterogeneity and reservoir geometry, and more robust nonlinear solution schemes. The work described in this paper documents the development and linkage of several fully-coupled and fully-implicit modeling tools. These tools simulate: (1) the dynamics of fluid flow, heat transport, and quasi-static rock mechanics; (2) seismic wave propagation from the sources of energy release through heterogeneous material; and (3) the soil-structural damage resulting from ground acceleration. These tools are developed in Idaho National Laboratory's parallel Multiphysics Object Oriented Simulation Environment, and are integrated together using a global implicit approach. The governing equations are presented, the numerical approach for simultaneously solving and coupling the three physics tools is discussed, and the data input and output methodology is outlined. An example is presented to demonstrate the capabilities of the coupled multiphysics approach. The example involves simulating a system conceptually similar to the geothermal development in Basel, Switzerland, and the resultant induced seismicity, ground motion and structural damage are predicted.
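The "global implicit" coupling strategy mentioned above means solving all coupled equations simultaneously in one Newton iteration, rather than iterating each physics in sequence. A minimal sketch on a toy two-equation system (unrelated to the actual THM equations) shows the idea:

```python
# Illustrative sketch of a fully-implicit ("global implicit") coupled solve:
# both residual equations are driven to zero together by Newton's method.
# Toy equations only; no relation to the paper's THM model.

def newton_coupled(x, y, tol=1e-12, max_iter=50):
    """Solve f1 = x^2 + y - 3 = 0 and f2 = x + y^2 - 5 = 0 simultaneously."""
    for _ in range(max_iter):
        f1, f2 = x * x + y - 3.0, x + y * y - 5.0
        if abs(f1) + abs(f2) < tol:
            break
        det = 4.0 * x * y - 1.0          # det of the Jacobian [[2x, 1], [1, 2y]]
        dx = (f2 - 2.0 * y * f1) / det   # solve J * [dx, dy] = -[f1, f2]
        dy = (f1 - 2.0 * x * f2) / det
        x, y = x + dx, y + dy
    return x, y

print(newton_coupled(1.2, 1.8))          # converges to the root (1, 2)
```

An operator-split alternative would update x and y alternately; the simultaneous solve keeps the tight coupling between the equations at every iteration, which is the point of the global implicit approach.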

  16. Investigating the Validity of Two Widely Used Quantitative Text Tools

    ERIC Educational Resources Information Center

    Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne

    2018-01-01

    In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…

  17. End of Asset Life Reinvestment Decision Support Tool (INFR2R11AT)

    EPA Science Inventory

    This “End of Asset Life” Reinvestment Decision-Support Tool is intended as a step by step guide for the asset management practitioner who faces the challenge of developing an investment strategy that represents the best integration of maintenance, operations, and capital investme...

  18. Tools for Measuring and Improving Performance.

    ERIC Educational Resources Information Center

    Jurow, Susan

    1993-01-01

    Explains the need for meaningful performance measures in libraries and the Total Quality Management (TQM) approach to data collection. Five tools representing different stages of a TQM inquiry are covered (i.e., the Shewhart Cycle, flowcharts, cause-and-effect diagrams, Pareto charts, and control charts), and benchmarking is addressed. (Contains…

  19. Physical Models of Schooling, the 'Ought' Question and Educational Change.

    ERIC Educational Resources Information Center

    Bauer, Norman J.

    This paper examines the methods used in designing school and classroom environments. The tools are labeled: (1) discipline-centered schooling; (2) empirical-naturalistic schooling; and (3) great works schooling. First, the outline endeavors to reveal the essential elements of the three tools that represent images, structures, or "maps" of…

  20. A tool for exploring the dynamics of innovative interventions for public health: the critical event card.

    PubMed

    Figueiro, Ana Claudia; de Araújo Oliveira, Sydia Rosana; Hartz, Zulmira; Couturier, Yves; Bernier, Jocelyne; do Socorro Machado Freire, Maria; Samico, Isabella; Medina, Maria Guadalupe; de Sa, Ronice Franco; Potvin, Louise

    2017-03-01

    Public health interventions are increasingly represented as complex systems. Research tools for capturing the dynamics of intervention processes, however, are practically non-existent. This paper describes the development and proof-of-concept process of an analytical tool, the critical event card (CEC), which supports the representation and analysis of complex interventions' evolution, based on critical events. Drawing on actor-network theory (ANT), we developed and field-tested the tool using three innovative health interventions in northeastern Brazil. The interventions aimed to promote health equity through intersectoral approaches, were engaged in participatory evaluation, and were linked to professional training programs. The CEC was developed with practitioners and researchers from these projects. Proof of concept was based on document analysis, face-to-face interviews and focus groups. The CEC's analytical categories allow critical events to be identified and described as milestones in the evolution of complex interventions. The categories are (1) event description; (2) actants (human and non-human) involved; (3) interactions between actants; (4) mediations performed; (5) actions performed; (6) inscriptions produced; and (7) consequences for the intervention. The CEC provides a tool to analyze and represent intersectoral interventions' complex and dynamic evolution.

  1. The Representation of Object-Directed Action and Function Knowledge in the Human Brain.

    PubMed

    Chen, Quanjing; Garcea, Frank E; Mahon, Bradford Z

    2016-04-01

    The appropriate use of everyday objects requires the integration of action and function knowledge. Previous research suggests that action knowledge is represented in frontoparietal areas while function knowledge is represented in temporal lobe regions. Here we used multivoxel pattern analysis to investigate the representation of object-directed action and function knowledge while participants executed pantomimes of familiar tool actions. A novel approach for decoding object knowledge was used in which classifiers were trained on one pair of objects and then tested on a distinct pair; this permitted a measurement of classification accuracy over and above object-specific information. Region of interest (ROI) analyses showed that object-directed actions could be decoded in tool-preferring regions of both parietal and temporal cortex, while no independently defined tool-preferring ROI showed successful decoding of object function. However, a whole-brain searchlight analysis revealed that while frontoparietal motor and peri-motor regions are engaged in the representation of object-directed actions, medial temporal lobe areas in the left hemisphere are involved in the representation of function knowledge. These results indicate that both action and function knowledge are represented in a topographically coherent manner that is amenable to study with multivariate approaches, and that the left medial temporal cortex represents knowledge of object function.

  2. Construct and Compare Gene Coexpression Networks with DAPfinder and DAPview.

    PubMed

    Skinner, Jeff; Kotliarov, Yuri; Varma, Sudhir; Mine, Karina L; Yambartsev, Anatoly; Simon, Richard; Huyen, Yentram; Morgun, Andrey

    2011-07-14

    DAPfinder and DAPview are novel BRB-ArrayTools plug-ins to construct gene coexpression networks and identify significant differences in pairwise gene-gene coexpression between two phenotypes. Each significant difference in gene-gene association represents a Differentially Associated Pair (DAP). Our tools include several choices of filtering methods, gene-gene association metrics, statistical testing methods and multiple comparison adjustments. Network results are easily displayed in Cytoscape. Analyses of glioma experiments and microarray simulations demonstrate the utility of these tools. DAPfinder is a new user-friendly tool for the reconstruction and comparison of biological networks.
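The core test behind a Differentially Associated Pair, comparing one gene pair's correlation between two phenotypes, can be sketched with Fisher's z-transformation. This is an illustration of one standard option; DAPfinder's own association metrics and testing choices may differ.

```python
# Illustrative sketch of flagging a Differentially Associated Pair: compare the
# Pearson correlation of a gene pair between two phenotypes with Fisher's z-test.
import math

def pearson(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def fisher_z_diff(r1, n1, r2, n2):
    """z-statistic for H0: the two correlations are equal."""
    z1, z2 = math.atanh(r1), math.atanh(r2)
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (z1 - z2) / se

# A pair strongly positive in phenotype 1 and strongly negative in phenotype 2
# yields a large |z|, i.e. a candidate DAP.
print(round(fisher_z_diff(0.9, 20, -0.9, 20), 2))
```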

  3. Auditing Albaha University Network Security using in-house Developed Penetration Tool

    NASA Astrophysics Data System (ADS)

    Alzahrani, M. E.

    2018-03-01

    Network security is a very important aspect of any enterprise or organization's computer network. If an organization's important information can be accessed by anyone, it may be used against the organization; this is where network security plays its role. One important aspect of security management is the security audit. The security performance of the Albaha University network is relatively low in terms of the total controls outlined in the ISO 27002 security control framework. This paper proposes a network security audit tool to address issues in the Albaha University network. The proposed penetration tool uses Nessus and Metasploit to find the vulnerabilities of a site. A regular self-audit using the in-house developed tool will increase the overall security and performance of the Albaha University network. Important results of the penetration test are discussed.

  4. Development of a shared decision-making tool to assist patients and clinicians with decisions on oral anticoagulant treatment for atrial fibrillation.

    PubMed

    Kaiser, Karen; Cheng, Wendy Y; Jensen, Sally; Clayman, Marla L; Thappa, Andrew; Schwiep, Frances; Chawla, Anita; Goldberger, Jeffrey J; Col, Nananda; Schein, Jeff

    2015-12-01

    Decision aids (DAs) are increasingly used to operationalize shared decision-making (SDM) but their development is not often described. Decisions about oral anticoagulants (OACs) for atrial fibrillation (AF) involve a trade-off between lowering stroke risk and increasing OAC-associated bleeding risk, and consideration of how treatment affects lifestyle. The benefits and risks of OACs hinge upon a patient's risk factors for stroke and bleeding and how they value these outcomes. We present the development of a DA about AF that estimates patients' risks for stroke and bleeding and assesses their preferences for outcomes. Based on a literature review and expert discussions, we identified stroke and major bleeding risk prediction models and embedded them into risk assessment modules. We identified the most important factors in choosing OAC treatment (warfarin used as the default reference OAC) through focus group discussions with AF patients who had used warfarin and clinician interviews. We then designed preference assessment and introductory modules accordingly. We integrated these modules into a prototype AF SDM tool and evaluated its usability through interviews. Our tool included four modules: (1) introduction to AF and OAC treatment risks and benefits; (2) stroke risk assessment; (3) bleeding risk assessment; and (4) preference assessment. Interactive risk calculators estimated patient-specific stroke and bleeding risks; graphics were developed to communicate these risks. After cognitive interviews, the content was improved. The final AF tool calculates patient-specific risks and benefits of OAC treatment and couples these estimates with patient preferences to improve clinical decision-making. The AF SDM tool may help patients choose whether OAC treatment is best for them and represents a patient-centered, integrative approach to educate patients on the benefits and risks of OAC treatment. Future research is needed to evaluate this tool in a real-world setting. 
The development process presented can be applied to similar SDM tools.
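The abstract does not name the embedded risk prediction models, so as a stand-in, the widely published CHA2DS2-VASc stroke-risk score illustrates the kind of patient-specific calculation such a decision aid's risk modules perform:

```python
# Illustrative sketch only: the CHA2DS2-VASc stroke-risk score, used here as a
# stand-in for the (unnamed) stroke-risk model embedded in the decision aid.

def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 prior_stroke_tia, vascular_disease):
    """Return the CHA2DS2-VASc point total (0-9)."""
    score = 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if female else 0
    score += 1 if chf else 0               # congestive heart failure
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if prior_stroke_tia else 0  # prior stroke / TIA / thromboembolism
    score += 1 if vascular_disease else 0
    return score

# 72-year-old woman with hypertension and no other risk factors -> 3 points
print(cha2ds2_vasc(72, True, False, True, False, False, False))
```

A decision aid then maps such a score to an annual stroke-risk estimate and pairs it with a bleeding score and the patient's elicited preferences.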

  5. The BMPix and PEAK Tools: New Methods for Automated Laminae Recognition and Counting - Application to Glacial Varves From Antarctic Marine Sediment

    NASA Astrophysics Data System (ADS)

    Weber, M. E.; Reichelt, L.; Kuhn, G.; Thurow, J. W.; Ricken, W.

    2009-12-01

    We present software-based tools for rapid and quantitative detection of sediment lamination. The BMPix tool extracts color and gray-scale curves from images at ultrahigh (pixel) resolution. The PEAK tool uses the gray-scale curve and performs, for the first time, fully automated counting of laminae based on three methods. The maximum count algorithm counts every bright peak of a couplet of two laminae (annual resolution) in a Gaussian-smoothed gray-scale curve. The zero-crossing algorithm counts every positive and negative halfway-passage of the gray-scale curve through a wide moving average; hence, the record is separated into bright and dark intervals (seasonal resolution). The same is true for the frequency truncation method, which uses Fourier transformation to decompose the gray-scale curve into its frequency components before the positive and negative passages are counted. We applied the new methods successfully to tree rings and to well-dated, manually counted marine varves from Saanich Inlet before we adapted the tools to the rather complex marine laminae of the Antarctic continental margin. In combination with AMS 14C dating, we found convincing evidence that the laminations from three Weddell Sea sites represent true varves that were deposited on sediment ridges over several millennia during the last glacial maximum (LGM). There are apparently two seasonal layers of terrigenous composition: a coarser-grained bright layer and a finer-grained dark layer. The new tools offer several advantages over previous tools. The counting procedures are based on a moving average generated from gray-scale curves instead of manual counting. Hence, results are highly objective and rely on reproducible mathematical criteria. Since PEAK associates counts with a specific depth, the thickness of each year or each season is also measured, which is an important prerequisite for later spectral analysis. 
Since all information required to conduct the analysis is displayed graphically, interactive optimization of the counting algorithms can be achieved quickly and conveniently.
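    The zero-crossing counting idea described above can be sketched in a few lines. The following is an illustrative reimplementation only, not the actual PEAK source: the function names and the window parameter are hypothetical, and real varve records would also need the Gaussian smoothing and tuning the abstract describes.

```python
def moving_average(values, window):
    """Centered moving average; edges use the available neighborhood."""
    n = len(values)
    half = window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def count_zero_crossings(gray, window=15):
    """Count passages of a gray-scale curve through a wide moving average.

    Each positive or negative halfway-passage starts a new bright or dark
    interval (seasonal resolution); a couplet of one bright and one dark
    interval would represent one varve year.
    """
    baseline = moving_average(gray, window)
    residual = [g - b for g, b in zip(gray, baseline)]
    crossings = 0
    for prev, curr in zip(residual, residual[1:]):
        if prev == 0:
            continue
        if (prev < 0) != (curr < 0):
            crossings += 1
    return crossings
```

    On a synthetic periodic brightness signal, the crossing count is roughly twice the number of cycles, matching the bright/dark interval interpretation.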

  6. Simplified spacecraft vulnerability assessments at component level in early design phase at the European Space Agency's Concurrent Design Facility

    NASA Astrophysics Data System (ADS)

    Kempf, Scott; Schäfer, Frank K.; Cardone, Tiziana; Ferreira, Ivo; Gerené, Sam; Destefanis, Roberto; Grassi, Lilith

    2016-12-01

    During recent years, the state-of-the-art risk assessment of the threat posed to spacecraft by micrometeoroids and space debris has been expanded to the analysis of failure modes of internal spacecraft components. This method can now be used to perform risk analyses for satellites to assess various failure levels - from failure of specific sub-systems to catastrophic break-up. This new assessment methodology is based on triple-wall ballistic limit equations (BLEs), specifically the Schäfer-Ryan-Lambert (SRL) BLE, which describes failure threshold levels for satellite components following a hypervelocity impact. The methodology is implemented in the form of the software tool Particle Impact Risk and vulnerability Analysis Tool (PIRAT). During a recent European Space Agency (ESA) funded study, the PIRAT functionality was expanded to provide an interface to ESA's Concurrent Design Facility (CDF). The additions include a geometry importer and an OCDT (Open Concurrent Design Tool) interface. The new interface provides both the expanded geometrical flexibility afforded by external computer-aided design (CAD) modelling and easy import of existing data without extensive preparation of the model. The reduced effort required to perform vulnerability analyses makes it feasible to apply them during the early design phase, at which point modifications to satellite design can be undertaken with relatively little extra effort. The integration of PIRAT in the CDF represents the first time that vulnerability analyses can be performed in-session in ESA's CDF and, more generally, the first time that comprehensive vulnerability studies can be applied cost-effectively in the early design phase.

  7. Changes in quality of life (WHOQOL-BREF) and addiction severity index (ASI) among participants in opioid substitution treatment (OST) in low and middle income countries: an international systematic review.

    PubMed

    Feelemyer, Jonathan P; Jarlais, Don C Des; Arasteh, Kamyar; Phillips, Benjamin W; Hagan, Holly

    2014-01-01

    Opioid substitution treatment (OST) can increase quality of life (WHOQOL-BREF) and reduce addiction severity index (ASI) scores among participants over time. OST program participants have noted that improvement in quality of life is one of the most important variables in their reduction of drug use. However, there is little systematic understanding of WHOQOL-BREF and ASI domain changes among OST participants in low- and middle-income countries (LMIC). Utilizing PRISMA guidelines, we conducted a systematic literature search to identify OST program studies documenting changes in WHOQOL-BREF or ASI domains for participants in buprenorphine or methadone programs in LMIC. Standardized mean differences for baseline and follow-up domain scores were compared along with relationships between domain scores, OST dosage, and length of follow-up. There were 13 OST program studies with 1801 participants from five countries eligible for inclusion in the review. Overall, statistically significant changes were noted in all four WHOQOL-BREF domain scores and in four of the seven ASI domain scores (drug, psychological, legal, and family) documented in the studies. Dosage of pharmacologic medication and length of follow-up did not affect changes in domain scores. WHOQOL-BREF and ASI domain scoring is a useful tool for measuring overall quality of life and levels of addiction among OST participants. Coupled with measurements of blood-borne infection, drug use, relapse, and overdose, WHOQOL-BREF and ASI represent equally important tools for evaluating the effects of OST over time and should be further developed as integrated tools in the evaluation of participants in LMIC. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
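    The review's comparison of baseline and follow-up scores rests on standardized mean differences. As a hedged illustration (the paper's exact estimator may differ), a common form is Cohen's d with a pooled standard deviation:

```python
import math

def pooled_sd(sd1, n1, sd2, n2):
    """Pooled standard deviation of two groups."""
    return math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))

def standardized_mean_difference(mean_follow, sd_follow, n_follow,
                                 mean_base, sd_base, n_base):
    """Cohen's d: (follow-up mean - baseline mean) / pooled SD.

    A positive value indicates improvement when higher scores are better
    (as for WHOQOL-BREF domains).
    """
    return (mean_follow - mean_base) / pooled_sd(sd_follow, n_follow, sd_base, n_base)
```

    For example, a domain score rising from a mean of 50 to 60 with a common SD of 10 yields an SMD of 1.0, a large effect by conventional benchmarks.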

  8. Predicting the biological condition of streams: Use of geospatial indicators of natural and anthropogenic characteristics of watersheds

    USGS Publications Warehouse

    Carlisle, D.M.; Falcone, J.; Meador, M.R.

    2009-01-01

    We developed and evaluated empirical models to predict biological condition of wadeable streams in a large portion of the eastern USA, with the ultimate goal of prediction for unsampled basins. Previous work had classified (i.e., altered vs. unaltered) the biological condition of 920 streams based on a biological assessment of macroinvertebrate assemblages. Predictor variables were limited to widely available geospatial data, which included land cover, topography, climate, soils, societal infrastructure, and potential hydrologic modification. We compared the accuracy of predictions of biological condition class based on models with continuous and binary responses. We also evaluated the relative importance of specific groups and individual predictor variables, as well as the relationships between the most important predictors and biological condition. Prediction accuracy and the relative importance of predictor variables were different for two subregions for which models were created. Predictive accuracy in the highlands region improved by including predictors that represented both natural and human activities. Riparian land cover and road-stream intersections were the most important predictors. In contrast, predictive accuracy in the lowlands region was best for models limited to predictors representing natural factors, including basin topography and soil properties. Partial dependence plots revealed complex and nonlinear relationships between specific predictors and the probability of biological alteration. We demonstrate a potential application of the model by predicting biological condition in 552 unsampled basins across an ecoregion in southeastern Wisconsin (USA). Estimates of the likelihood of biological condition of unsampled streams could be a valuable tool for screening large numbers of basins to focus targeted monitoring of potentially unaltered or altered stream segments. © Springer Science+Business Media B.V. 2008.
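    The partial dependence plots mentioned above are computed by sweeping one predictor over a grid of values while averaging the model's predictions across the data set. A minimal sketch of that computation follows; the generic `model` callable and function name are hypothetical stand-ins for the study's fitted classifier:

```python
def partial_dependence(model, X, feature_index, grid):
    """Friedman-style partial dependence: for each grid value, fix the
    chosen feature across all rows of X and average the predictions."""
    curve = []
    for value in grid:
        total = 0.0
        for row in X:
            modified = list(row)          # copy so X is untouched
            modified[feature_index] = value
            total += model(modified)
        curve.append(total / len(X))
    return curve
```

    The resulting curve shows the marginal effect of one predictor (e.g., riparian land cover) with the others averaged out, which is why such plots can reveal the nonlinear relationships the abstract describes.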

  9. Business intelligence for the radiologist: making your data work for you.

    PubMed

    Cook, Tessa S; Nagy, Paul

    2014-12-01

    Although it remains absent from most programs today, business intelligence (BI) has become an integral part of modern radiology practice management. BI facilitates the transition away from lack of understanding about a system and the data it produces toward incrementally more sophisticated comprehension of what has happened, could happen, and should happen. The individual components that make up BI are common across industries and include data extraction and transformation, process analysis and improvement, outcomes measures, performance assessment, graphical dashboarding, alerting, workflow analysis, and scenario modeling. As in other fields, these components can be directly applied in radiology to improve workflow, throughput, safety, efficacy, outcomes, and patient satisfaction. When approaching the subject of BI in radiology, it is important to know what data are available in your various electronic medical records, as well as where and how they are stored. In addition, it is critical to verify that the data actually represent what you think they do. Finally, it is essential to identify the features and limitations of the BI tools you choose to use and to plan your practice modifications on the basis of collected data. It is equally important to remember that BI plays a critical role in continuous process improvement; whichever BI tools you choose should be flexible enough to grow and evolve with your practice. Published by Elsevier Inc.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clausen, Alison, E-mail: aliclausen@protocol.com.a; Vu, Hoang Hoa, E-mail: hoanghoavu@yahoo.co; Pedrono, Miguel, E-mail: pedrono@cirad.f

    Vietnam has one of the fastest growing economies in the world and has achieved significant socio-economic development in recent years. However this growth is placing increased pressure on an already depleted natural environment. Environmental impact assessment (EIA) is recognised by the Government and international organizations as an important tool in the management of the impacts of future development on the country's natural resource base. The Government's commitment to EIA has been demonstrated through the development and adoption of the Law on Environment Protection (Revised) in 2005 which sets out the requirements for EIA and which represents a major step in the development of a robust legislative framework for EIA in Vietnam. The Law on Environment Protection (Revised) 2005 has now been operational for several years and we have undertaken an evaluation of the resulting EIA system in Vietnam. We argue that while significant improvements have been achieved in the EIA policy framework, an important gap remains between EIA theory and practice. We contend that the basis of the current EIA legislation is strong and that future developments of the EIA system in Vietnam should focus on improving capacity of EIA practitioners rather than further substantial legislative change. Such improvements would allow the Vietnamese EIA system to emerge as an effective and efficient tool for environmental management in Vietnam and as a model EIA framework for other developing countries.

  11. Murine fetal echocardiography.

    PubMed

    Kim, Gene H

    2013-02-15

    Transgenic mice displaying abnormalities in cardiac development and function represent a powerful tool for understanding the molecular mechanisms underlying both normal cardiovascular function and the pathophysiological basis of human cardiovascular disease. Fetal and perinatal death is a common feature when studying genetic alterations affecting cardiac development. In order to study the role of genetic or pharmacologic alterations in the early development of cardiac function, ultrasound imaging of the live fetus has become an important tool for early recognition of abnormalities and longitudinal follow-up. Noninvasive ultrasound imaging is an ideal method for detecting and studying congenital malformations and their impact on cardiac function prior to death. It allows early recognition of abnormalities in the living fetus, and the progression of disease can be followed in utero with longitudinal studies. Until recently, imaging of fetal mouse hearts frequently involved invasive methods: the fetus had to be sacrificed for magnetic resonance microscopy and electron microscopy or surgically delivered for transillumination microscopy. The application of high-frequency probes with conventional 2-D and pulsed-wave Doppler imaging has been shown to provide measurements of cardiac contraction and heart rates during embryonic development, with databases of normal developmental changes now available. M-mode imaging further provides important functional data, although the proper imaging planes are often difficult to obtain. High-frequency ultrasound imaging of the fetus has improved 2-D resolution and can provide excellent information on the early development of cardiac structures.

  12. Computational modeling of ion transport through nanopores.

    PubMed

    Modi, Niraj; Winterhalter, Mathias; Kleinekathöfer, Ulrich

    2012-10-21

    Nanoscale pores are ubiquitous in biological systems while artificial nanopores are being fabricated for an increasing number of applications. Biological pores are responsible for the transport of various ions and substrates between the different compartments of biological systems separated by membranes while artificial pores are aimed at emulating such transport properties. As an experimental method, electrophysiology has proven to be an important nano-analytical tool for the study of substrate transport through nanopores utilizing ion current measurements as a probe for the detection. Independent of the pore type, i.e., biological or synthetic, and objective of the study, i.e., to model cellular processes of ion transport or electrophysiological experiments, it has become increasingly important to understand the dynamics of ions in nanoscale confinements. To this end, numerical simulations have established themselves as an indispensable tool to decipher ion transport processes through biological as well as artificial nanopores. This article provides an overview of different theoretical and computational methods to study ion transport in general and to calculate ion conductance in particular. Potential new improvements in the existing methods and their applications are highlighted wherever applicable. Moreover, representative examples are given describing the ion transport through biological and synthetic nanopores as well as the high selectivity of ion channels. Special emphasis is placed on the usage of molecular dynamics simulations which already have demonstrated their potential to unravel ion transport properties at an atomic level.
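    The link the abstract draws between simulation and electrophysiology can be illustrated with the simplest conductance estimator: count how many ions cross the pore in an MD trajectory under an applied voltage, convert to a current, and divide by the voltage. This is a generic back-of-the-envelope sketch, not taken from any particular simulation package:

```python
E_CHARGE = 1.602176634e-19  # elementary charge, in coulombs

def conductance_from_crossings(n_crossings, charge_per_ion, sim_time_s, voltage_v):
    """Estimate single-pore conductance from a trajectory.

    Current I = total transported charge / simulation time; G = I / V.
    charge_per_ion is in units of the elementary charge (e.g. 1 for K+).
    """
    current = n_crossings * charge_per_ion * E_CHARGE / sim_time_s
    return current / voltage_v
```

    For example, 100 monovalent-ion crossings in 100 ns at 1 V corresponds to about 160 pS, a physiologically plausible channel conductance; real analyses must also worry about sampling statistics and the direction of each crossing.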

  13. Quantum chemical modeling of enzymatic reactions: the case of histone lysine methyltransferase.

    PubMed

    Georgieva, Polina; Himo, Fahmi

    2010-06-01

    Quantum chemical cluster models of enzyme active sites are today an important and powerful tool in the study of various aspects of enzymatic reactivity. This methodology has been applied to a wide spectrum of reactions and many important mechanistic problems have been solved. Herein, we report a systematic study of the reaction mechanism of the histone lysine methyltransferase (HKMT) SET7/9 enzyme, which catalyzes the methylation of the N-terminal histone tail of the chromatin structure. In this study, HKMT SET7/9 serves as a representative case to examine the modeling approach for the important class of methyl transfer enzymes. Active site models of different sizes are used to evaluate the methodology. In particular, the dependence of the calculated energies on the model size, the influence of the dielectric medium, and the particular choice of the dielectric constant are discussed. In addition, we examine the validity of some technical aspects, such as geometry optimization in solvent or with a large basis set, and the use of different density functional methods. Copyright 2010 Wiley Periodicals, Inc.

  14. [Histocompatibility tests in a transplantation program].

    PubMed

    de-Leo-Cervantes, Claudia

    2005-01-01

    The role of the histocompatibility laboratory in solid organ transplantation is to perform HLA typing and determine the degree of HLA matching between recipient and donor. It is a useful tool to increase graft survival and decrease chronic rejection. HLA matching has a positive effect on kidney transplants and a variable impact on other organ transplants. The crossmatch procedure is the most important test in solid organ transplantation to evaluate the presence of recipient antibodies to antigens expressed on donor white cells. This test decreases the risk of hyperacute humoral rejection or early graft loss. A positive crossmatch is a contraindication for transplantation because it indicates the existence of IgG recipient antibodies that will react against donor antigens. Antibody evaluation is important in donor-recipient selection, and the responsibility of the histocompatibility laboratory is to identify clinically relevant anti-donor HLA antibodies. This detection is useful to determine the degree of humoral alloimmunization, expressed as a percent panel reactive antibody (%PRA). This test also provides information about antibody specificity and can be used to evaluate a patient's immune status, providing useful guidance in selecting donors.

  15. Measurement of Sexual Health in the U.S.: An Inventory of Nationally Representative Surveys and Surveillance Systems

    PubMed Central

    Ivankovich, Megan B.; Leichliter, Jami S.; Douglas, John M.

    2013-01-01

    Objectives To identify opportunities within nationally representative surveys and surveillance systems to measure indicators of sexual health, we reviewed and inventoried existing data systems that include variables relevant to sexual health. Methods We searched for U.S. nationally representative surveys and surveillance systems that provided individual-level sexual health data. We assessed the methods of each data system and catalogued them by their measurement of the following domains of sexual health: knowledge, communication, attitudes, service access and utilization, sexual behaviors, relationships, and adverse health outcomes. Results We identified 18 U.S.-focused, nationally representative data systems: six assessing the general population, seven focused on special populations, and five addressing health outcomes. While these data systems provide a rich repository of information from which to assess national measures of sexual health, they present several limitations. Most importantly, apart from data on service utilization, routinely gathered, national data are currently focused primarily on negative aspects of sexual health (e.g., risk behaviors and adverse health outcomes) rather than more positive attributes (e.g., healthy communication and attitudes, and relationship quality). Conclusion Nationally representative data systems provide opportunities to measure a broad array of domains of sexual health. However, current measurement gaps indicate the need to modify existing surveys, where feasible and appropriate, and develop new tools to include additional indicators that address positive domains of sexual health of the U.S. population across the life span. Such data can inform the development of effective policy actions, services, prevention programs, and resource allocation to advance sexual health. PMID:23450886

  16. Decipipes: Helping Students to "Get the Point"

    ERIC Educational Resources Information Center

    Moody, Bruce

    2011-01-01

    Decipipes are a representational model that can be used to help students develop conceptual understanding of decimal place value. They provide a non-standard tool for representing length, which in turn can be represented using conventional decimal notation. They are conceptually identical to Linear Arithmetic Blocks. This article reviews theory…

  17. Determination of representative elementary areas for soil redoximorphic features by digital image processing

    USDA-ARS?s Scientific Manuscript database

    Photography has been a welcome tool in documenting and conveying qualitative soil information. When coupled with image analysis software, the usefulness of digital cameras can be increased to advance the field of micropedology. The determination of a Representative Elementary Area (REA) still rema...

  18. Basins 4.0 Climate Assessment Tool (Cat): Supporting Documentation and User Manual (External Review Draft)

    EPA Science Inventory

    EPA has released of the draft document solely for the purpose of pre-dissemination peer review under applicable Information Quality Guidelines (IQGs). This document has not been formally disseminated by EPA. It does not represent and should not be construed to represent any Agenc...

  19. A general engineering scenario for concurrent engineering environments

    NASA Astrophysics Data System (ADS)

    Mucino, V. H.; Pavelic, V.

    The paper describes an engineering method scenario which categorizes the various activities and tasks into blocks, seen as subjects which consume and produce data and information. These methods, tools, and associated utilities interact with other engineering tools by exchanging information in such a way that a relationship between customers and suppliers of engineering data is established clearly, while data exchange consistency is maintained throughout the design process. The events and data transactions are presented in the form of flowcharts in which data transactions represent the connections between the various blocks, which in turn represent the engineering activities developed for the particular tasks required in the concurrent engineering environment.

  20. Clinical Applications for EPs in the ICU.

    PubMed

    Koenig, Matthew A; Kaplan, Peter W

    2015-12-01

    In critically ill patients, evoked potential (EP) testing is an important tool for measuring neurologic function, signal transmission, and secondary processing of sensory information in real time. EP measures conduction along the peripheral and central sensory pathways, with longer-latency potentials representing more complex thalamocortical and intracortical processing. In critically ill patients with limited neurologic exams, EP provides a window into brain function and the potential for recovery of consciousness. The most common EP modalities in clinical use in the intensive care unit include somatosensory evoked potentials, brainstem auditory EPs, and cortical event-related potentials. The primary indications for EP in critically ill patients are prognostication in anoxic-ischemic or traumatic coma, monitoring for neurologic improvement or decline, and confirmation of brain death. Somatosensory evoked potentials have become an important prognostic tool for coma recovery, especially in comatose survivors of cardiac arrest. In this population, the bilateral absence of cortical somatosensory evoked potentials has nearly 100% specificity for death or persistent vegetative state. Historically, EP has been regarded as a negative prognostic test; that is, the absence of cortical potentials is associated with poor outcomes while the presence of cortical potentials is prognostically indeterminate. In recent studies, the presence of middle-latency and long-latency potentials, as well as the amplitude of cortical potentials, is more specific for good outcomes. Event-related potentials, particularly mismatch negativity of complex auditory patterns, are emerging as an important positive prognostic test in comatose patients. Multimodality predictive algorithms that combine somatosensory evoked potentials, event-related potentials, and clinical and radiographic factors are gaining favor for coma prognostication.

  1. Airborne Turbulence Detection System Certification Tool Set

    NASA Technical Reports Server (NTRS)

    Hamilton, David W.; Proctor, Fred H.

    2006-01-01

    A methodology and a corresponding set of simulation tools for testing and evaluating turbulence detection sensors has been presented. The tool set is available to industry and the FAA for certification of radar based airborne turbulence detection systems. The tool set consists of simulated data sets representing convectively induced turbulence, an airborne radar simulation system, hazard tables to convert the radar observable to an aircraft load, documentation, a hazard metric "truth" algorithm, and criteria for scoring the predictions. Analysis indicates that flight test data supports spatial buffers for scoring detections. Also, flight data and demonstrations with the tool set suggest the need for a magnitude buffer.
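    The spatial buffers mentioned for scoring detections can be illustrated with a toy scorer: a predicted hazard counts as a hit if any true hazard lies within the buffer distance, otherwise it is a false alarm, and unmatched truths are misses. The function name and the bookkeeping are hypothetical, not the tool set's actual scoring criteria:

```python
def score_detections(predicted, truth, buffer_km):
    """Score predicted hazard locations against truth with a spatial buffer.

    predicted, truth: lists of (x, y) positions in km.
    Returns (hits, false_alarms, misses).
    """
    hits, false_alarms = 0, 0
    matched = set()
    for px, py in predicted:
        found = None
        for i, (tx, ty) in enumerate(truth):
            if ((px - tx) ** 2 + (py - ty) ** 2) ** 0.5 <= buffer_km:
                found = i
                break
        if found is None:
            false_alarms += 1
        else:
            hits += 1
            matched.add(found)
    misses = len(truth) - len(matched)
    return hits, false_alarms, misses
```

    Widening `buffer_km` trades false alarms against misses, which is exactly why the flight-test analysis above matters: it grounds the buffer size in observed aircraft data rather than an arbitrary tolerance.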

  2. Overcoming obstacles to the exchange of information between risk tools

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.; Cornford, Steven L.; Meshkat, Leila; Voss, Luke

    2005-01-01

    Our work to date in connecting risk tools has had successes, but has also revealed significant impediments to information exchange between them. These impediments stem from the well-known phenomenon of 'semantic dissonance' - mismatch between conceptual assumptions made by the separately developed tools. This issue represents a fundamental challenge that arises regardless of the mechanism of information exchange. This paper explains the issue and illustrates it with reference to our experiences to date connecting several risk tools. We motivate this work, present and discuss the solutions we have adopted to surmount these impediments, and consider the implications this work has for future efforts to integrate risk tools.

  3. Nanostructures as promising tools for delivery of antimicrobial peptides.

    PubMed

    Brandelli, A

    2012-07-01

    Antimicrobial peptides have been extensively investigated for their potential applications as therapeutics and food biopreservatives. Their antimicrobial activity may be impaired by susceptibility to proteolytic degradation and by undesirable interactions of the antimicrobial peptide with the biological environment. Development of nanostructures for entrapment and delivery of antimicrobial peptides may represent an alternative to the direct application of these substances. Lipid nanovesicles have been developed for encapsulation of antimicrobial peptides. Phosphatidylcholine is often employed in liposome manufacture, which is mostly achieved by the thin-film hydration method. Nanofibers may allow different physical modes of drug loading, including direct adsorption on the nanofiber surface or the assembly of drug-loaded nanoparticles. Self-assembled peptides reveal attractive features as nanostructures for applications in drug delivery and are promising as antimicrobial agents for the treatment of brain infections. Magnetic nanoparticles and nanotubules are also potential structures for entrapment of antimicrobial peptides. Nanoparticles can also be chemically modified with specific cell surface ligands to enhance cell adhesion and site-specific delivery. This article reviews the most important nanostructures as promising tools for peptide delivery systems.

  4. A fast and complete GEANT4 and ROOT Object-Oriented Toolkit: GROOT

    NASA Astrophysics Data System (ADS)

    Lattuada, D.; Balabanski, D. L.; Chesnevskaya, S.; Costa, M.; Crucillà, V.; Guardo, G. L.; La Cognata, M.; Matei, C.; Pizzone, R. G.; Romano, S.; Spitaleri, C.; Tumino, A.; Xu, Y.

    2018-01-01

    Present and future gamma-beam facilities represent a great opportunity to validate and evaluate the cross-sections of many photonuclear reactions at near-threshold energies. Monte Carlo (MC) simulations are very important to evaluate the reaction rates and to maximize the detection efficiency but, unfortunately, they can be very CPU-time consuming and in some cases very hard to reproduce, especially when exploring near-threshold cross-sections. We developed software that makes use of the validated GEANT4 tracking libraries and the n-body event generator of ROOT in order to provide a fast, reliable and complete MC tool to be used for nuclear physics experiments. This tool is intended to be used for photonuclear reactions at γ-beam facilities with ELISSA (ELI Silicon Strip Array), a new detector array under development at the Extreme Light Infrastructure - Nuclear Physics (ELI-NP). We discuss the results of MC simulations performed to evaluate the effects of the electromagnetically induced background, of the straggling due to the target thickness, and of the resolution of the silicon detectors.

  5. Testing the reliability of δ13C of tree rings as climate tool in Pistacia khinjuk of Syrian desert

    NASA Astrophysics Data System (ADS)

    Caracuta, Valentina; Fiorentino, Girolamo

    2010-05-01

    High-resolution measures of past climate variations have been found to be of critical importance for understanding anthropic resilience in drought-sensitive areas. The hills (jebels) Abu-Rujmain and Abd al Aziz, with their 350 millimetres of rain and their steppe-forest spreading in the middle of the flat Syrian desert, represent a unique setting for analysing the effect of short-term climate changes on pastoral communities. Thanks to a cooperation project in the Syrian Arab Republic with the CIHEAM-Mediterranean Agronomic Institute of Bari, Italy (Rationalization of Ras El Ain Irrigation Systems), we were able to carry out dendroclimatic and carbon isotope analyses on tree rings of local Pistacia khinjuk, a long-lived woody taxon, in order to test their reliability as a tool for determining annual and seasonal rainfall/temperature variations. Comparisons between the last 25 years of rainfall and temperature values from the nearby meteorological stations and the dendro-isotope values have been carried out to determine which factors most affect the growth pattern of the trees in that particular area.

  6. Second harmonic generation microscopy of the living human cornea

    NASA Astrophysics Data System (ADS)

    Artal, Pablo; Ávila, Francisco; Bueno, Juan

    2018-02-01

    Second Harmonic Generation (SHG) microscopy provides high-resolution structural imaging of the corneal stroma without the need for labelling techniques. Until now, this powerful tool had never been applied to living human eyes. Here, we present a new compact SHG microscope specifically developed to image the structural organization of the corneal lamellae in living healthy human volunteers. The research prototype incorporates a long-working-distance dry objective that allows non-contact three-dimensional SHG imaging of the cornea. The safety and effectiveness of the system were first tested in fresh ex vivo eyes. The maximum average power of the illumination laser was 20 mW, more than 10 times below the maximum permissible exposure (according to ANSI Z136.1-2000). The instrument was successfully employed to obtain non-contact and non-invasive SHG images of the living human eye within well-established light safety limits. This represents the first recording of in vivo SHG images of the human cornea using a compact multiphoton microscope, and it might become an important tool in ophthalmology for early diagnosis and tracking of ocular pathologies.

  7. Data Assimilation and Propagation of Uncertainty in Multiscale Cardiovascular Simulation

    NASA Astrophysics Data System (ADS)

    Schiavazzi, Daniele; Marsden, Alison

    2015-11-01

    Cardiovascular modeling is the application of computational tools to predict hemodynamics. State-of-the-art techniques couple a 3D incompressible Navier-Stokes solver with a boundary circulation model and can predict local and peripheral hemodynamics, analyze the post-operative performance of surgical designs, and complement clinical data collection, minimizing invasive and risky measurement practices. The ability of these tools to make useful predictions is directly related to their accuracy in representing measured physiologies. Tuning of model parameters is therefore a topic of paramount importance and should account for clinical data uncertainty, revealing how this uncertainty will affect the predictions. We propose a fully Bayesian, multi-level approach to data assimilation of uncertain clinical data in multiscale circulation models. To reduce the computational cost, we use a stable, condensed approximation of the 3D model built by linear sparse regression of the pressure/flow rate relationship at the outlets. Finally, we consider the problem of non-intrusively propagating the uncertainty in model parameters to the resulting hemodynamics and compare Monte Carlo simulation with Stochastic Collocation approaches based on Polynomial or Multi-resolution Chaos expansions.
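    The Monte Carlo side of the uncertainty-propagation comparison can be sketched simply: sample the uncertain parameters, push each sample through the model, and summarize the output statistics. This toy version assumes independent normally distributed parameters and a generic `model` callable, which the actual study need not:

```python
import random
import statistics

def propagate_uncertainty_mc(model, param_means, param_sds, n_samples=10000, seed=0):
    """Monte Carlo uncertainty propagation.

    Samples each parameter from an independent normal distribution,
    evaluates the model on every sample, and returns the output mean
    and sample standard deviation.
    """
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        params = [rng.gauss(m, s) for m, s in zip(param_means, param_sds)]
        outputs.append(model(params))
    return statistics.mean(outputs), statistics.stdev(outputs)
```

    Stochastic collocation methods target the same output statistics with far fewer model evaluations by placing samples at quadrature nodes, which is what makes them attractive when each evaluation is an expensive 3D simulation.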

  8. Genetic tools for the investigation of Roseobacter clade bacteria

    PubMed Central

    2009-01-01

    Background The Roseobacter clade represents one of the most abundant, metabolically versatile and ecologically important bacterial groups found in marine habitats. A detailed molecular investigation of the regulatory and metabolic networks of these organisms is currently limited for many strains by the lack of suitable genetic tools. Results Conjugation and electroporation methods for the efficient and stable genetic transformation of selected Roseobacter clade bacteria including Dinoroseobacter shibae, Oceanibulbus indolifex, Phaeobacter gallaeciensis, Phaeobacter inhibens, Roseobacter denitrificans and Roseobacter litoralis were tested. For this purpose an antibiotic resistance screening was performed and suitable genetic markers were selected. Based on these transformation protocols, stably maintained plasmids were identified. A plasmid-encoded oxygen-independent fluorescent system was established using the flavin mononucleotide-based fluorescent protein FbFP. Finally, a chromosomal gene knockout strategy was successfully employed for the inactivation of the anaerobic metabolism regulatory gene dnr from D. shibae DFL12T. Conclusion A genetic toolbox for members of the Roseobacter clade was established. This provides a solid methodical basis for the detailed elucidation of the gene regulatory and metabolic networks underlying the ecological success of this group of marine bacteria. PMID:20021642

  9. PanCoreGen - Profiling, detecting, annotating protein-coding genes in microbial genomes.

    PubMed

    Paul, Sandip; Bhardwaj, Archana; Bag, Sumit K; Sokurenko, Evgeni V; Chattopadhyay, Sujay

    2015-12-01

    A large amount of genomic data, especially from multiple isolates of a single species, has opened new vistas for microbial genomics analysis. Analyzing the pan-genome (i.e., the total genetic repertoire) of a microbial species is crucial in understanding the dynamics of molecular evolution, where virulence evolution is of major interest. Here we present PanCoreGen - a standalone application for pan- and core-genomic profiling of microbial protein-coding genes. PanCoreGen overcomes key limitations of existing pan-genomic analysis tools and provides an integrated annotation structure for a species-specific pan-genomic profile. It offers important new features for annotating draft genomes/contigs and detecting unidentified genes in annotated genomes. It also generates user-defined group-specific datasets within the pan-genome. Interestingly, analyzing an example set of Salmonella genomes, we detect potential footprints of adaptive convergence of horizontally transferred genes in two human-restricted pathogenic serovars - Typhi and Paratyphi A. Overall, PanCoreGen represents a state-of-the-art tool for microbial phylogenomics and pathogenomics study. Copyright © 2015 Elsevier Inc. All rights reserved.
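
    The pan/core distinction the abstract relies on reduces to plain set operations over gene presence/absence; the isolates and gene names below are hypothetical and are not PanCoreGen output:

```python
# Minimal sketch of pan- and core-genome profiling from gene
# presence/absence data (hypothetical isolates and gene names).
genomes = {
    "isolate_A": {"gyrA", "recA", "invA", "sopE"},
    "isolate_B": {"gyrA", "recA", "invA"},
    "isolate_C": {"gyrA", "recA", "fimH"},
}

pan_genome = set().union(*genomes.values())        # genes in any isolate
core_genome = set.intersection(*genomes.values())  # genes in every isolate
accessory = pan_genome - core_genome               # variably present genes

print(sorted(pan_genome))
print(sorted(core_genome))
print(sorted(accessory))
```

    Group-specific datasets, as supported by the tool, would correspond to the same operations restricted to a user-chosen subset of the isolates.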

  10. Automated Fluid Feature Extraction from Transient Simulations

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1998-01-01

    In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were interactive and easily abstracted, so they could be presented to the investigator. These tools worked and properly conveyed the collected information, at the expense of much interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one 'snapshot' of the simulation. Automated assistance is required to point out areas of potential interest contained within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments like pV3), and methods must be developed to abstract each feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: shocks; vortex cores; regions of recirculation; boundary layers; wakes.
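
    As a hedged illustration of the kind of automated feature flagging described above (not pV3's actual algorithms), one can threshold a derived field such as vorticity to mark candidate vortical regions; the solid-body-rotation velocity field below is synthetic:

```python
import numpy as np

# Flag candidate vortical regions in a 2D velocity field by
# thresholding the out-of-plane vorticity (dv/dx - du/dy),
# computed with finite differences.
n = 64
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")

# Synthetic solid-body rotation: u = -y, v = x, so vorticity is
# exactly 2 everywhere.
u, v = -Y, X
dx = x[1] - x[0]
dudy = np.gradient(u, dx, axis=1)
dvdx = np.gradient(v, dx, axis=0)
vorticity = dvdx - dudy

mask = np.abs(vorticity) > 1.0  # boolean map of candidate features
print(mask.sum(), "of", mask.size, "cells flagged")
```

    A real extractor would of course use more discriminating criteria (e.g. eigenvalue-based vortex-core detection), but the pattern — derive a scalar field, threshold it, hand the investigator a compact region instead of the raw snapshot — is the same.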

  11. Optimising measurement of health-related characteristics of the built environment: Comparing data collected by foot-based street audits, virtual street audits and routine secondary data sources.

    PubMed

    Pliakas, Triantafyllos; Hawkesworth, Sophie; Silverwood, Richard J; Nanchahal, Kiran; Grundy, Chris; Armstrong, Ben; Casas, Juan Pablo; Morris, Richard W; Wilkinson, Paul; Lock, Karen

    2017-01-01

    The role of the neighbourhood environment in influencing health behaviours continues to be an important topic in public health research and policy. Foot-based street audits, virtual street audits and secondary data sources are widespread data collection methods used to objectively measure the built environment in environment-health association studies. We compared these three methods using data collected in a nationally representative epidemiological study in 17 British towns to inform future development of research tools. There was good agreement between foot-based and virtual audit tools. Foot-based audits were superior for fine-detail features. Secondary data sources measured very different aspects of the local environment that could be used to derive a range of environmental measures, if validated properly. Future built environment research should design studies a priori using multiple approaches and varied data sources in order to best capture features that operate on different health behaviours at varying spatial scales. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
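
    Agreement between two audit methods is typically quantified with a chance-corrected statistic such as Cohen's kappa; the binary street-feature ratings below are made-up illustrative data, not the study's measurements:

```python
# Cohen's kappa for a binary feature (present/absent) rated by a
# foot-based and a virtual audit of the same ten street segments.
foot    = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
virtual = [1, 1, 0, 0, 0, 0, 1, 1, 1, 1]

n = len(foot)
p_observed = sum(f == v for f, v in zip(foot, virtual)) / n  # raw agreement
# Chance agreement from each rater's marginal "present" rate.
p_yes = (sum(foot) / n) * (sum(virtual) / n)
p_no = (1 - sum(foot) / n) * (1 - sum(virtual) / n)
p_chance = p_yes + p_no

kappa = (p_observed - p_chance) / (1 - p_chance)
print(round(kappa, 3))
```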

  12. Wide Awake and Ready to Move: 20 Years of Non-Viral Therapeutic Genome Engineering with the Sleeping Beauty Transposon System.

    PubMed

    Hodge, Russ; Narayanavari, Suneel A; Izsvák, Zsuzsanna; Ivics, Zoltán

    2017-10-01

    Gene therapies will only become a widespread tool in the clinical treatment of human diseases with the advent of gene transfer vectors that integrate genetic information stably, safely, effectively, and economically. Two decades after the discovery of the Sleeping Beauty (SB) transposon, it has been transformed into a vector system that is fulfilling these requirements. SB may well overcome some of the limitations associated with viral gene transfer vectors and transient non-viral gene delivery approaches that are being used in the majority of ongoing clinical trials. The SB system has achieved a high level of stable gene transfer and sustained transgene expression in multiple primary human somatic cell types, representing crucial steps that may permit its clinical use in the near future. This article reviews the most important aspects of SB as a tool for gene therapy, including its vectorization and genomic integration. As an illustration, the clinical development of the SB system toward gene therapy of age-related macular degeneration and cancer immunotherapy is highlighted.

  13. DengueME: A Tool for the Modeling and Simulation of Dengue Spatiotemporal Dynamics †

    PubMed Central

    de Lima, Tiago França Melo; Lana, Raquel Martins; de Senna Carneiro, Tiago Garcia; Codeço, Cláudia Torres; Machado, Gabriel Souza; Ferreira, Lucas Saraiva; de Castro Medeiros, Líliam César; Davis Junior, Clodoveu Augusto

    2016-01-01

    The prevention and control of dengue are great public health challenges for many countries, particularly since 2015, as other arboviruses have been observed to interact significantly with dengue virus. Different approaches and methodologies have been proposed and discussed by the research community. An important and widely used tool is modeling and simulation, which helps us understand epidemic dynamics and create scenarios to support planning and decision-making processes. With this aim, we proposed and developed DengueME, a collaborative open source platform to simulate the dynamics of dengue disease and its vector. It supports compartmental and individual-based models, implemented over a GIS database, that represent Aedes aegypti population dynamics, human demography, human mobility, urban landscape and dengue transmission mediated by human and mosquito encounters. A user-friendly graphical interface was developed to facilitate model configuration and data input, and a library of models was developed to support teaching-learning activities. DengueME was applied in case studies and evaluated by specialists. Further improvements will be made in future work to enhance its extensibility and usability. PMID:27649226
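
    The compartmental class of models mentioned above can be sketched with a minimal vector-host system (SIR humans coupled to SI mosquitoes) integrated by explicit Euler; all parameter values are illustrative assumptions, not calibrated DengueME inputs:

```python
# Minimal vector-borne transmission sketch: SIR humans, SI mosquitoes.
# Rates are illustrative, not fitted to dengue data.
def simulate(days=200, dt=0.1):
    Nh, Nv = 10_000.0, 50_000.0
    Sh, Ih, Rh = Nh - 10.0, 10.0, 0.0   # seed 10 infected humans
    Sv, Iv = Nv, 0.0
    a, b, c = 0.3, 0.4, 0.4             # bite rate, transmission probs
    gamma, mu_v = 1 / 7.0, 1 / 14.0     # human recovery, mosquito mortality
    for _ in range(int(days / dt)):
        new_h = a * b * Iv / Nh * Sh    # mosquito -> human infections
        new_v = a * c * Ih / Nh * Sv    # human -> mosquito infections
        dSh = -new_h
        dIh = new_h - gamma * Ih
        dRh = gamma * Ih
        dSv = mu_v * Nv - new_v - mu_v * Sv  # births balance deaths
        dIv = new_v - mu_v * Iv
        Sh += dt * dSh; Ih += dt * dIh; Rh += dt * dRh
        Sv += dt * dSv; Iv += dt * dIv
    return Rh / Nh                      # final human attack rate

print(f"fraction ever infected: {simulate():.2f}")
```

    Individual-based models, the platform's other supported class, replace these aggregate compartments with per-person and per-mosquito state transitions over the GIS landscape.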

  14. 21st century tools to prioritize contaminants for monitoring and ...

    EPA Pesticide Factsheets

    The webinar focused on ways that ToxCast high throughput screening data and the adverse outcome pathway framework, under development in the CSS program, can be used to prioritize environmental contaminants for monitoring and management. The work presented focused on case studies conducted in Region 8, in collaboration with EPA Region 8 and NEIC, as well as other federal (USGS, US FWS) and regional partners (Northern Colorado Plateau Network). The Consortium for Research and Education on Emerging Contaminants (CREEC) is a grass-roots 501(c)(3) non-profit organization comprised of world-class scientists and stakeholders with a shared interest in the source, fate, and physiological effects of contaminants of emerging concern (www.creec.net). As such, they represent an important group of stakeholders with an interest in applying the data, approaches, and tools that are being developed by the CSS program.

  15. Applying 21st century tools to watersheds of the western US ...

    EPA Pesticide Factsheets

    The webinar focused on ways that ToxCast high throughput screening data and the adverse outcome pathway framework, under development in the CSS program, can be used to prioritize environmental contaminants for monitoring and management. The work presented focused on case studies conducted in Region 8, in collaboration with EPA Region 8 and NEIC, as well as other federal (USGS, US FWS) and regional partners (Northern Colorado Plateau Network). The Consortium for Research and Education on Emerging Contaminants (CREEC) is a grass-roots 501(c)(3) non-profit organization comprised of world-class scientists and stakeholders with a shared interest in the source, fate, and physiological effects of contaminants of emerging concern (www.creec.net). As such, they represent an important group of stakeholders with an interest in applying the data, approaches, and tools that are being developed by the CSS program.

  16. Multilevel Complex Networks and Systems

    NASA Astrophysics Data System (ADS)

    Caldarelli, Guido

    2014-03-01

    Network theory has been a powerful tool to model isolated complex systems. However, the classical approach does not take into account the interactions often present among different systems. Hence, the scientific community is now concentrating its efforts on the foundations of new mathematical tools for understanding what happens when multiple networks interact. Economic and financial networks represent a paramount example of multilevel networks. In the case of trade among countries, the different levels can be described by the different granularities of the trading relations; indeed, we now have data from the scale of individual consumers up to the country level. In the case of financial institutions, we have a variety of levels at the same scale: for example, the same bank can appear simultaneously in interbank, ownership, and CDS networks. In both cases, the systemically important vertices need to be determined by different procedures of centrality definition and community detection. In this talk I will present some specific case studies related to these topics and present the regularities found. Support from the EU FET Project ``Multiplex'' 317532 is acknowledged.
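
    The idea that the same institution occupies several layers, each with its own centrality, can be sketched with a toy two-layer network; the banks and edges below are hypothetical, and plain degree centrality stands in for the more sophisticated measures discussed in the talk:

```python
# Toy two-layer financial network: the same institutions appear in an
# interbank lending layer and an ownership layer (hypothetical data).
layers = {
    "interbank": [("BankA", "BankB"), ("BankB", "BankC"), ("BankA", "BankC")],
    "ownership": [("BankA", "BankB"), ("BankA", "FundX")],
}

def degree_centrality(edges):
    """Degree of each node divided by (n - 1), per layer."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return {n: d / (len(nodes) - 1) for n, d in deg.items()}

centrality = {name: degree_centrality(e) for name, e in layers.items()}
# A node central in *every* layer is a candidate systemically
# important institution in the multilevel sense.
print(centrality)
```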

  17. [Politics as a tool in National Health System transformation].

    PubMed

    Dávila Torres, Javier

    2012-01-01

    Politics, as an activity oriented toward decision making, seeks to achieve specific objectives and is a fundamental tool for the transformation of the National Health System (NHS). It is important to point out that different elements, interests and participants take part in the design and implementation of these policies. Therefore, the presence of the health care institutions in the development of health policies should be considered, as well as the participation of the Congress, where each political party presents and defends its proposals and negotiates the approval and allocation of the financial budget, among other matters. Nowadays, there are actors with a relevant presence in these policies and in the transformation process of the NHS, such as the media and the labor force represented by the unions. Finally, some general statements are offered to contribute to the integration process toward a stronger NHS. This process should consider the economic, demographic and social changes in the country; furthermore, it should focus on universal coverage and the provision of better health care for the Mexican population.

  18. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    NASA Astrophysics Data System (ADS)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-05-01

    Predicting, understanding, and controlling the mechanical behavior is the most important task when designing structural materials. Modern alloy systems—in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship—give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experimentally based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys through simulation studies, replacing time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior as a function of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. As an example, these steps are conducted here for a high-manganese steel.
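
    Step (1) of the workflow can be illustrated generically: extracting simple statistical descriptors from a labeled microstructure map. The tiny synthetic label array below stands in for real EBSD/DREAM.3D data and is not tied to either package's API:

```python
import numpy as np

# Synthetic 2D microstructure map: each integer labels one grain.
microstructure = np.array([
    [1, 1, 2, 2],
    [1, 1, 2, 3],
    [4, 4, 3, 3],
    [4, 4, 3, 3],
])

# Grain count and size (area in pixels) distribution — the kind of
# statistics an RVE generator is asked to reproduce.
grain_ids, sizes = np.unique(microstructure, return_counts=True)
mean_size = sizes.mean()

print(len(grain_ids), "grains, mean area", mean_size, "px")
```

    Real pipelines collect many more descriptors (orientation distributions, aspect ratios, phase fractions) before synthesizing an equivalent volume element in step (2).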

  19. XML technologies for the Omaha System: a data model, a Java tool and several case studies supporting home healthcare.

    PubMed

    Vittorini, Pierpaolo; Tarquinio, Antonietta; di Orio, Ferdinando

    2009-03-01

    The eXtensible Markup Language (XML) is a metalanguage useful for representing and exchanging data between heterogeneous systems. XML may enable healthcare practitioners to document, monitor, evaluate, and archive medical information and services in distributed computer environments. Therefore, the most recent proposals for electronic health records (EHRs) are usually based on XML documents. Since none of the existing nomenclatures were specifically developed for use in automated clinical information systems, but were adapted to such use, numerous current EHRs are organized as a sequence of events, each represented through codes taken from international classification systems. In nursing, a hierarchically organized problem-solving approach is followed, which couples poorly with the sequential organization of such EHRs. Therefore, the paper presents an XML data model for the Omaha System taxonomy, which is one of the most important international nomenclatures used in the home healthcare nursing context. This data model represents the formal definition of EHRs specifically developed for nursing practice. Furthermore, the paper describes a Java application prototype able to manage such documents, shows how such documents can be transformed into readable web pages, and reports several case studies, one currently managed by the home care service of a health center in central Italy.
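
    A hierarchical Omaha System entry might be encoded in XML along the following lines; the element and attribute names here are illustrative guesses built with Python's standard library, not the schema actually defined in the paper:

```python
import xml.etree.ElementTree as ET

# Sketch of a problem-oriented (rather than event-sequence) record:
# a problem nests its interventions and outcome ratings, mirroring
# the Omaha System's hierarchical problem-solving structure.
record = ET.Element("patientRecord", id="p001")
problem = ET.SubElement(record, "problem", domain="Physiological")
ET.SubElement(problem, "name").text = "Circulation"
intervention = ET.SubElement(problem, "intervention",
                             category="Surveillance")
ET.SubElement(intervention, "target").text = "signs/symptoms-physical"
outcome = ET.SubElement(problem, "outcome")
ET.SubElement(outcome, "knowledge").text = "3"

xml_text = ET.tostring(record, encoding="unicode")
print(xml_text)
```

    The point of such a structure is that each problem carries its own interventions and outcomes as children, rather than scattering them through a flat event sequence.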

  20. Representing nursing guideline with unified modeling language to facilitate development of a computer system: a case study.

    PubMed

    Choi, Jeeyae; Choi, Jeungok E

    2014-01-01

    To provide the best recommendations at the point of care, guidelines have been implemented in computer systems. As a prerequisite, guidelines are translated into a computer-interpretable guideline format. Since there are no specific tools to translate nursing guidelines, only a few nursing guidelines have been translated and implemented in computer systems. The Unified Modeling Language (UML) is a software modeling language known to represent end-users' perspectives well and accurately, owing to its expressive characteristics. In order to facilitate the development of computer systems for nurses' use, the UML was used to translate a paper-based nursing guideline, and its ease of use and usefulness were tested through a case study of a genetic counseling guideline. The UML was found to be a useful tool for nurse informaticians and sufficient to model a guideline in a computer program.
