Sample records for method tools specification

  1. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  2. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  3. Bombings specific triage (Bost Tool) tool and its application by healthcare professionals

    PubMed Central

    Sanjay, Jaiswal; Ankur, Verma; Tamorish, Kole

    2015-01-01

    BACKGROUND: Bombing is a unique incident which produces unique patterns of multiple and occult injuries. Death often is a result of combined blast, ballistic and thermal effect injuries. Various natures of injury, self-referrals and arrival by private transportation may lead to “wrong triage” in the emergency department. In India there has been an increase in the incidence of bombing in the last 15 years. There is no documented triage tool from the National Disaster Management Authority of India for bombings. We have tried to develop an ideal bombing specific triage tool which will guide the right patients to the right place at the right time and save more lives. METHODS: There are three methods of studying the triage tool: 1) real disaster; 2) mock drill; 3) table top exercise. In this study, a table top exercise method was selected. There were two groups, each consisting of an emergency physician, a nurse and a paramedic. RESULTS: Using the proportion test, we found that the proportion of correct triage differed significantly (P=0.005) between the two groups: group B (80%), which used the bombing specific triage tool, triaged the bomb blast victims better than group A (50%), which did not. CONCLUSION: Development of a bombing specific triage tool can reduce under-triage. PMID:26693264
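
    For illustration, the comparison reported above is a standard two-proportion z-test. The minimal Python sketch below reproduces the arithmetic; the group sizes and correct-triage counts are hypothetical, chosen only to be consistent with the quoted 80% vs. 50% and P=0.005, since the abstract reports percentages rather than raw counts.

    ```python
    # Two-sided z-test for the difference between two proportions.
    # Counts are hypothetical (40 simulated victims per group).
    from math import sqrt
    from scipy.stats import norm

    def two_proportion_z_test(x1, n1, x2, n2):
        p1, p2 = x1 / n1, x2 / n2
        p_pool = (x1 + x2) / (n1 + n2)                        # pooled proportion
        se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # pooled std. error
        z = (p1 - p2) / se
        return z, 2 * norm.sf(abs(z))                         # two-sided p-value

    z, p = two_proportion_z_test(32, 40, 20, 40)  # group B: 32/40, group A: 20/40
    print(f"z = {z:.2f}, p = {p:.4f}")            # p comes out near 0.005
    ```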

  4. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has

  5. Prediction of miRNA targets.

    PubMed

    Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis

    2015-01-01

    Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to a better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.

  6. Bombings specific triage (Bost Tool) tool and its application by healthcare professionals.

    PubMed

    Sanjay, Jaiswal; Ankur, Verma; Tamorish, Kole

    2015-01-01

    Bombing is a unique incident which produces unique patterns of multiple and occult injuries. Death often is a result of combined blast, ballistic and thermal effect injuries. Various natures of injury, self-referrals and arrival by private transportation may lead to "wrong triage" in the emergency department. In India there has been an increase in the incidence of bombing in the last 15 years. There is no documented triage tool from the National Disaster Management Authority of India for bombings. We have tried to develop an ideal bombing specific triage tool which will guide the right patients to the right place at the right time and save more lives. There are three methods of studying the triage tool: 1) real disaster; 2) mock drill; 3) table top exercise. In this study, a table top exercise method was selected. There were two groups, each consisting of an emergency physician, a nurse and a paramedic. Using the proportion test, we found that the proportion of correct triage differed significantly (P=0.005) between the two groups: group B (80%), which used the bombing specific triage tool, triaged the bomb blast victims better than group A (50%), which did not. Development of a bombing specific triage tool can reduce under-triage.

  7. In silico site-directed mutagenesis informs species-specific predictions of chemical susceptibility derived from the Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool

    EPA Science Inventory

    The Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool was developed to address needs for rapid, cost-effective methods of species extrapolation of chemical susceptibility. Specifically, the SeqAPASS tool compares the primary sequence (Level 1), functiona...

  8. Trajectories for High Specific Impulse High Specific Power Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Flight times and deliverable masses for electric and fusion propulsion systems are difficult to approximate. Numerical integration is required for these continuous thrust systems. Many scientists are not equipped with the tools and expertise to conduct interplanetary and interstellar trajectory analysis for their concepts. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well-known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. An analytical method derived in the companion paper was also evaluated. The accuracy of this method is discussed in the paper.
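
    To make Cowell's method concrete, the sketch below directly integrates the heliocentric equations of motion with a constant tangential thrust acceleration, which is the essence of full-trajectory integration for continuous-thrust systems. The spacecraft parameters and thrust level are invented for illustration; this is not the VARITOP or IPOST implementation.

    ```python
    # Cowell's method in miniature: integrate r'' = -mu*r/|r|^3 + a_thrust.
    import numpy as np
    from scipy.integrate import solve_ivp

    MU_SUN = 1.32712440018e20   # m^3/s^2, solar gravitational parameter
    AU = 1.495978707e11         # m

    def eom(t, y, thrust_accel):
        r, v = y[:3], y[3:]
        a_grav = -MU_SUN * r / np.linalg.norm(r) ** 3      # central-body gravity
        a_thrust = thrust_accel * v / np.linalg.norm(v)    # tangential thrust
        return np.concatenate([v, a_grav + a_thrust])

    # Start in a circular 1-AU orbit; hypothetical 1e-4 m/s^2 thrust acceleration.
    y0 = np.concatenate([[AU, 0.0, 0.0], [0.0, np.sqrt(MU_SUN / AU), 0.0]])
    sol = solve_ivp(eom, (0.0, 365 * 86400.0), y0, args=(1e-4,), rtol=1e-9)
    print(f"distance after one year: {np.linalg.norm(sol.y[:3, -1]) / AU:.2f} AU")
    ```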

  9. Ontology-based configuration of problem-solving methods and generation of knowledge-acquisition tools: application of PROTEGE-II to protocol-based decision support.

    PubMed

    Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A

    1995-06-01

    PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.

  10. Sentiment Analysis of Health Care Tweets: Review of the Methods Used.

    PubMed

    Gohil, Sunir; Vuik, Sabine; Darzi, Ara

    2018-04-23

    Twitter is a microblogging service where users can send and read short 140-character messages called "tweets." A large volume of unstructured, free-text tweets relating to health care is shared on Twitter, which is becoming a popular arena for health care research. Sentiment is a metric commonly used to investigate the positive or negative opinion within these messages. Exploring the methods used for sentiment analysis in Twitter health care research may allow us to better understand the options available for future research in this growing field. The first objective of this study was to understand which tools are available for sentiment analysis of Twitter health care research, by reviewing existing studies in this area and the methods they used. The second objective was to determine which method would work best in health care settings, by analyzing how the methods were used to answer specific health care questions, how they were produced, and how their accuracy was analyzed. A review of the literature was conducted pertaining to Twitter and health care research that used a quantitative method of sentiment analysis for the free-text messages (tweets). The study compared the types of tools used in each case and examined methods for tool production, tool training, and analysis of accuracy. A total of 12 papers studying the quantitative measurement of sentiment in the health care setting were found. More than half of these studies produced tools specifically for their research, 4 used open source tools available freely, and 2 used commercially available software. Moreover, 4 of the 12 tools were trained using a smaller sample of the study's final data. The sentiment method was trained against, on average, 0.45% (2816/627,024) of the total sample data. Only 1 of the 12 papers commented on the accuracy analysis of the tool used. Multiple methods are used for sentiment analysis of tweets in the health care setting. These range from self-produced basic categorizations to more complex and expensive commercial software. The open source and commercial methods were developed on product reviews and generic social media messages, and none has been extensively tested against a corpus of health care messages to check its accuracy. This study suggests that there is a need for an accurate and tested tool for sentiment analysis of tweets, trained using a health care setting-specific corpus of manually annotated tweets.
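
    As a flavor of the simplest end of that spectrum, the sketch below is a tiny lexicon-based scorer of the self-produced kind the review describes. The word lists are invented stand-ins; as the review concludes, a credible tool would need a manually annotated, health-care-specific corpus behind it.

    ```python
    # Toy lexicon-based sentiment scorer; the lexicons are illustrative only.
    import re

    POSITIVE = {"recovered", "better", "thankful", "improving", "great"}
    NEGATIVE = {"pain", "worse", "sick", "tired", "scared"}

    def tweet_sentiment(tweet: str) -> int:
        """Positive minus negative word counts; the sign gives the polarity."""
        words = re.findall(r"[a-z']+", tweet.lower())
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    print(tweet_sentiment("Feeling so much better after treatment, thankful!"))  # 2
    print(tweet_sentiment("Still in pain and scared about the results"))         # -2
    ```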

  11. Experience Using Formal Methods for Specifying a Multi-Agent System

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Rash, James; Hinchey, Michael; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    The process and results of using formal methods to specify the Lights Out Ground Operations System (LOGOS) are presented in this paper. LOGOS is a prototype multi-agent system developed to show the feasibility of providing autonomy to satellite ground operations functions at NASA Goddard Space Flight Center (GSFC). After the initial implementation of LOGOS, the development team decided to use formal methods to check for race conditions, deadlocks and omissions. The specification exercise revealed several omissions as well as race conditions. After completing the specification, the team concluded that certain tools would have made the specification process easier. This paper gives a sample specification of two of the agents in the LOGOS system and examples of omissions and race conditions found. It concludes by describing an architecture of tools that would better support the future specification of agents and other concurrent systems.

  12. Real-time fluorescence loop mediated isothermal amplification for the diagnosis of malaria.

    PubMed

    Lucchi, Naomi W; Demas, Allison; Narayanan, Jothikumar; Sumari, Deborah; Kabanywanyi, Abdunoor; Kachur, S Patrick; Barnwell, John W; Udhayakumar, Venkatachalam

    2010-10-29

    Molecular diagnostic methods can complement existing tools to improve the diagnosis of malaria. However, they require good laboratory infrastructure, thereby restricting their use to reference laboratories and research studies. Therefore, adopting molecular tools for routine use in malaria endemic countries will require simpler molecular platforms. The recently developed loop-mediated isothermal amplification (LAMP) method is relatively simple and can be improved for better use in endemic countries. In this study, we attempted to improve this method for malaria diagnosis by using a simple and portable device capable of performing both the amplification and detection (by fluorescence) of LAMP in one platform. We refer to this as the RealAmp method. Published genus-specific primers were used to test the utility of this method. DNA derived from different species of malaria parasites was used for the initial characterization. Clinical samples of P. falciparum were used to determine the sensitivity and specificity of this system compared to microscopy and a nested PCR method. Additionally, directly boiled parasite preparations were compared with a conventional DNA isolation method. The RealAmp method was found to be simple and allowed real-time detection of DNA amplification. The time to amplification varied but was generally less than 60 minutes. All human-infecting Plasmodium species were detected. The sensitivity and specificity of RealAmp in detecting P. falciparum was 96.7% and 91.7% respectively, compared to microscopy, and 98.9% and 100% respectively, compared to a standard nested PCR method. In addition, this method consistently detected P. falciparum from directly boiled blood samples. This RealAmp method has great potential as a field-usable molecular tool for diagnosis of malaria. This tool can provide an alternative to conventional PCR based diagnostic methods for field use in clinical and operational programs.
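
    The sensitivity and specificity figures quoted above follow directly from a 2x2 confusion matrix against the reference method. The sketch below shows the computation; the raw counts are hypothetical values chosen to reproduce the 96.7%/91.7% figures reported against microscopy, since the abstract does not give the underlying table.

    ```python
    # Sensitivity and specificity from hypothetical confusion-matrix counts.
    def sens_spec(tp, fn, tn, fp):
        sensitivity = tp / (tp + fn)   # true positive rate
        specificity = tn / (tn + fp)   # true negative rate
        return sensitivity, specificity

    # 89/92 positives and 33/36 negatives called correctly (invented counts).
    sens, spec = sens_spec(tp=89, fn=3, tn=33, fp=3)
    print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 96.7%, 91.7%
    ```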

  13. Direct glycan structure determination of intact N-linked glycopeptides by low-energy collision-induced dissociation tandem mass spectrometry and predicted spectral library searching.

    PubMed

    Pai, Pei-Jing; Hu, Yingwei; Lam, Henry

    2016-08-31

    Intact glycopeptide MS analysis to reveal site-specific protein glycosylation is an important frontier of proteomics. However, computational tools for analyzing MS/MS spectra of intact glycopeptides are still limited and not well-integrated into existing workflows. In this work, a new computational tool which combines the spectral library building/searching tool, SpectraST (Lam et al. Nat. Methods 2008, 5, 873-875), and the glycopeptide fragmentation prediction tool, MassAnalyzer (Zhang et al. Anal. Chem. 2010, 82, 10194-10202), for intact glycopeptide analysis has been developed. Specifically, this tool enables the determination of the glycan structure directly from low-energy collision-induced dissociation (CID) spectra of intact glycopeptides. Given a list of possible glycopeptide sequences as input, a sample-specific spectral library of MassAnalyzer-predicted spectra is built using SpectraST. Glycan identification from CID spectra is achieved by spectral library searching against this library, in which both m/z and intensity information of the possible fragmentation ions are taken into consideration for improved accuracy. We validated our method using a standard glycoprotein, human transferrin, and evaluated its potential to be used in site-specific glycosylation profiling of glycoprotein datasets from LC-MS/MS. In addition, we further applied our method to reveal, for the first time, the site-specific N-glycosylation profile of recombinant human acetylcholinesterase expressed in HEK293 cells. For maximum usability, SpectraST is developed as part of the Trans-Proteomic Pipeline (TPP), a freely available and open-source software suite for MS data analysis.
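
    The core scoring idea in spectral library searching can be sketched as a normalized dot product between binned peak lists, using both m/z and intensity as the abstract describes. The peak lists below are invented and SpectraST's actual scoring includes further terms; this is only the basic comparison.

    ```python
    # Normalized dot-product similarity between two (m/z, intensity) peak lists.
    import numpy as np

    def bin_spectrum(peaks, bin_width=1.0, max_mz=2000.0):
        """Turn (m/z, intensity) pairs into a unit-norm intensity vector."""
        vec = np.zeros(int(max_mz / bin_width))
        for mz, inten in peaks:
            idx = int(mz / bin_width)
            if idx < len(vec):
                vec[idx] += inten
        norm = np.linalg.norm(vec)
        return vec / norm if norm > 0 else vec

    def dot_score(query_peaks, library_peaks):
        return float(np.dot(bin_spectrum(query_peaks), bin_spectrum(library_peaks)))

    query   = [(204.09, 50.0), (366.14, 100.0), (528.19, 30.0)]  # invented peaks
    library = [(204.09, 55.0), (366.14, 90.0), (690.25, 10.0)]
    print(f"dot score: {dot_score(query, library):.3f}")         # 1.0 = identical
    ```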

  14. Assessment of olfactory function after traumatic brain injury: comparison of single odour tool with detailed assessment tool.

    PubMed

    Humphries, Thomas; Singh, Rajiv

    2018-01-01

    Olfactory disturbance (OD) is common after traumatic brain injury (TBI). Screening for OD can be performed by several different methods. While odour identification tools are considered more accurate, they are time consuming. The predominant method in clinical practice remains the use of a single odour. This study aimed to compare a brief single-odour identification tool (BSOIT) with a more detailed 12-odour assessment tool. One hundred seventy consecutive patients with TBI had their olfaction assessed using the BSOIT and a 12-item tool at a single time point. The sensitivity and specificity of the BSOIT were calculated. The sensitivity and specificity of the BSOIT as compared to the Burghart tool were 57.5% and 100%, respectively, for all ODs (anosmia and hyposmia). The sensitivity and specificity for anosmia only were 93.5% and 96.7%, respectively. For the two tools, Cohen's kappa coefficient showed moderate agreement when both anosmia and hyposmia were considered (k = 0.619, p < 0.001) but very strong agreement when only anosmia was considered (k = 0.844, p < 0.001). For both tools, anosmia had a significant association with TBI severity (p < 0.001). However, hyposmia showed no such association. The BSOIT is very effective at identifying anosmia but not hyposmia, producing comparable results to a more detailed test. It can be effective in clinical practice and takes considerably less time.
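
    Cohen's kappa, the agreement statistic used above, corrects observed agreement for the agreement expected by chance. A minimal computation is sketched below; the 2x2 table is hypothetical, since the abstract reports only the resulting kappa values.

    ```python
    # Cohen's kappa from a hypothetical 2x2 agreement table.
    def cohens_kappa(table):
        """table[i][j]: count where tool 1 gave category i and tool 2 gave j."""
        total = sum(sum(row) for row in table)
        p_observed = sum(table[i][i] for i in range(len(table))) / total
        rows = [sum(row) / total for row in table]
        cols = [sum(col) / total for col in zip(*table)]
        p_expected = sum(r * c for r, c in zip(rows, cols))
        return (p_observed - p_expected) / (1 - p_expected)

    # Invented anosmia calls (positive/negative) by BSOIT vs. the 12-item tool:
    table = [[29, 2],    # BSOIT positive: 29 agreements, 2 disagreements
             [3, 136]]   # BSOIT negative: 3 disagreements, 136 agreements
    print(f"kappa = {cohens_kappa(table):.3f}")
    ```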

  15. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    NASA Astrophysics Data System (ADS)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation on a system level, particularly for early mission phases during which specifications and requirements are not yet crystallised, and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step to any program budget is a representative cost estimate which usually hinges on a particular estimation approach, or methodology. Therefore appropriate selection of specific cost models, methods and tools is paramount, a difficult task given the highly variable nature, scope as well as scientific and technical requirements applicable to each program. Numerous methods, models and tools exist. However new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For vehicles as specific as reusable launchers with a manned capability, the lack of historical data means that both the classic heuristic approach, such as parametric cost estimation based on underlying cost estimating relationships (CERs), and the analogy approach are, by definition, of limited use. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted. Current approaches which strategically amalgamate various cost estimation strategies, both for formulation and validation of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for development of a non-commercial, low cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step to achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.
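
    Parametric estimation, mentioned above, typically reduces to evaluating cost estimating relationships of power-law form, cost = a * mass^b, calibrated on historical data. The sketch below shows the mechanics with invented coefficients; the calibrated CERs of the surveyed models are proprietary and are not reproduced here.

    ```python
    # Generic weight-based power-law CER with invented calibration constants.
    def cer_cost(mass_kg: float, a: float = 0.26, b: float = 0.55) -> float:
        """Estimated development cost in arbitrary cost units."""
        return a * mass_kg ** b

    for mass in (1_000, 5_000, 20_000):
        print(f"{mass:>6} kg dry mass -> {cer_cost(mass):7.1f} cost units")
    ```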

  16. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies a root cause in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
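
    For readers unfamiliar with the arithmetic behind fault tree evaluation, the sketch below computes a toy top-event probability: AND gates multiply independent basic-event probabilities, and OR gates combine them through complements. The tree structure and probabilities are invented, not taken from the paper.

    ```python
    # Toy fault-tree gate arithmetic over independent basic events.
    def and_gate(*probs):
        out = 1.0
        for p in probs:
            out *= p                  # all inputs must fail
        return out

    def or_gate(*probs):
        out = 1.0
        for p in probs:
            out *= 1.0 - p            # survives only if every input survives
        return 1.0 - out

    # Hypothetical software fault tree: failure requires the watchdog to fail
    # AND (input validation to fail OR the handler to crash).
    p_top = and_gate(1e-3, or_gate(2e-4, 5e-5))
    print(f"top event probability: {p_top:.2e}")
    ```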

  17. Robust detection of rare species using environmental DNA: The importance of primer specificity

    Treesearch

    Taylor M. Wilcox; Kevin S. McKelvey; Michael K. Young; Stephen F. Jane; Winsor H. Lowe; Andrew R. Whiteley; Michael K. Schwartz

    2013-01-01

    Environmental DNA (eDNA) is being rapidly adopted as a tool to detect rare animals. Quantitative PCR (qPCR) using probe-based chemistries may represent a particularly powerful tool because of the method's sensitivity, specificity, and potential to quantify target DNA. However, there has been little work understanding the performance of these assays in the presence...

  18. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    PubMed

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data, but PO gives a distinctive insight, revealing what people are really doing instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights. It is not commonly discussed in nursing research and therefore this study can provide insight, which cannot be seen or revealed by using other data collection methods. Therefore, this paper can produce a useful tool for those who intend to use PO and grounded theory in their nursing research.

  19. A Sensor-Based Method for Diagnostics of Machine Tool Linear Axes.

    PubMed

    Vogl, Gregory W; Weiss, Brian A; Donmez, M Alkan

    2015-01-01

    A linear axis is a vital subsystem of machine tools, which are vital systems within many manufacturing operations. When installed and operating within a manufacturing facility, a machine tool needs to stay in good condition for parts production. All machine tools degrade during operations, yet knowledge of that degradation is elusive; specifically, accurately detecting degradation of linear axes is a manual and time-consuming process. Thus, manufacturers need automated and efficient methods to diagnose the condition of their machine tool linear axes without disruptions to production. The Prognostics and Health Management for Smart Manufacturing Systems (PHM4SMS) project at the National Institute of Standards and Technology (NIST) developed a sensor-based method to quickly estimate the performance degradation of linear axes. The multi-sensor-based method uses data collected from a 'sensor box' to identify changes in linear and angular errors due to axis degradation; the sensor box contains inclinometers, accelerometers, and rate gyroscopes to capture this data. The sensors are expected to be cost-effective with respect to savings in production losses and scrapped parts for a machine tool. Numerical simulations, based on sensor bandwidth and noise specifications, show that changes in straightness and angular errors could be known with acceptable test uncertainty ratios. If a sensor box resides on a machine tool and data is collected periodically, then the degradation of the linear axes can be determined and used for diagnostics and prognostics to help optimize maintenance, production schedules, and ultimately part quality.
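
    One ingredient of such a method can be sketched simply: integrating a rate-gyroscope signal recorded during a constant-speed traverse yields an angular error profile along the axis travel. The signal shape, scale, and noise level below are invented; the NIST method fuses several sensors and error directions and is not reproduced here.

    ```python
    # Angular error profile from an (invented) rate-gyroscope traverse record.
    import numpy as np

    def angular_error_profile(gyro_rate_dps, sample_rate_hz, speed_mm_s):
        """Cumulative angular error (deg) vs. position (mm) along the axis."""
        dt = 1.0 / sample_rate_hz
        angle_deg = np.cumsum(gyro_rate_dps) * dt               # rate -> angle
        position_mm = np.arange(len(gyro_rate_dps)) * speed_mm_s * dt
        return position_mm, angle_deg

    rng = np.random.default_rng(0)
    rate = 0.002 * np.sin(np.linspace(0, np.pi, 500)) + rng.normal(0, 1e-4, 500)
    pos, ang = angular_error_profile(rate, sample_rate_hz=100, speed_mm_s=20)
    print(f"peak error: {ang.max() * 3600:.1f} arc-seconds over {pos[-1]:.0f} mm")
    ```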

  20. A Sensor-Based Method for Diagnostics of Machine Tool Linear Axes

    PubMed Central

    Vogl, Gregory W.; Weiss, Brian A.; Donmez, M. Alkan

    2017-01-01

    A linear axis is a vital subsystem of machine tools, which are vital systems within many manufacturing operations. When installed and operating within a manufacturing facility, a machine tool needs to stay in good condition for parts production. All machine tools degrade during operations, yet knowledge of that degradation is elusive; specifically, accurately detecting degradation of linear axes is a manual and time-consuming process. Thus, manufacturers need automated and efficient methods to diagnose the condition of their machine tool linear axes without disruptions to production. The Prognostics and Health Management for Smart Manufacturing Systems (PHM4SMS) project at the National Institute of Standards and Technology (NIST) developed a sensor-based method to quickly estimate the performance degradation of linear axes. The multi-sensor-based method uses data collected from a ‘sensor box’ to identify changes in linear and angular errors due to axis degradation; the sensor box contains inclinometers, accelerometers, and rate gyroscopes to capture this data. The sensors are expected to be cost-effective with respect to savings in production losses and scrapped parts for a machine tool. Numerical simulations, based on sensor bandwidth and noise specifications, show that changes in straightness and angular errors could be known with acceptable test uncertainty ratios. If a sensor box resides on a machine tool and data is collected periodically, then the degradation of the linear axes can be determined and used for diagnostics and prognostics to help optimize maintenance, production schedules, and ultimately part quality. PMID:28691039

  21. Mass Spectrometry Based Ultrasensitive DNA Methylation Profiling Using Target Fragmentation Assay.

    PubMed

    Lin, Xiang-Cheng; Zhang, Ting; Liu, Lan; Tang, Hao; Yu, Ru-Qin; Jiang, Jian-Hui

    2016-01-19

    Efficient tools for profiling DNA methylation in specific genes are essential for epigenetics and clinical diagnostics. Current DNA methylation profiling techniques have been limited by inconvenient implementation, requirements for specific reagents, and inferior accuracy in quantifying methylation degree. We develop a novel mass spectrometry method, target fragmentation assay (TFA), which enables profiling of methylation in specific sequences. This method combines selective capture of the DNA target from restricted cleavage of genomic DNA using magnetic separation with MS detection of the nonenzymatic hydrolysates of the target DNA. The method is shown to be highly sensitive, with a detection limit as low as 0.056 amol, allowing direct profiling of methylation using genomic DNA without preamplification. Moreover, this method offers a unique advantage in accurately determining the DNA methylation level. The clinical applicability was demonstrated by DNA methylation analysis of prostate tissue samples, implying the potential of this method as a useful tool for DNA methylation profiling in the early detection of related diseases.

  22. Informatics Tools and Methods to Enhance U.S. Cancer Surveillance Research, UG3/UH3 | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    The goal of this Funding Opportunity Announcement (FOA) is to advance surveillance science by supporting the development of new and innovative tools and methods for more efficient, detailed, timely, and accurate data collection by cancer registries. Specifically, the FOA seeks applications for projects to develop, adapt, apply, scale-up, and validate tools and methods to improve the collection and integration of cancer registry data and to expand the data items collected. Eligible applicants: population-based central cancer registries (a partnership must involve at least two different registries).

  23. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.
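
    The monitoring idea described above, checking an executing program against prespecified requirements constraints, can be sketched as a simple trace checker. The constraint format and event fields below are invented; SAGE and SAVAnT define their own specification language and instrumentation.

    ```python
    # Minimal runtime constraint monitor over an execution trace.
    from typing import Callable, Dict, List

    Constraint = Callable[[Dict], bool]

    CONSTRAINTS: Dict[str, Constraint] = {   # invented example constraints
        "response_under_100ms": lambda e: e.get("latency_ms", 0) <= 100,
        "queue_never_overflows": lambda e: e.get("queue_len", 0) <= 64,
    }

    def monitor(trace: List[Dict]) -> List[str]:
        """Return one violation message per event/constraint pair that fails."""
        violations = []
        for i, event in enumerate(trace):
            for name, check in CONSTRAINTS.items():
                if not check(event):
                    violations.append(f"event {i}: violated {name}")
        return violations

    trace = [{"latency_ms": 42, "queue_len": 3},
             {"latency_ms": 180, "queue_len": 70}]
    print("\n".join(monitor(trace)) or "no violations")
    ```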

  24. Tools4miRs – one place to gather all the tools for miRNA analysis

    PubMed Central

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-01-01

    Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform of its kind, currently gathering more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626

  25. Tools4miRs - one place to gather all the tools for miRNA analysis.

    PubMed

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-09-01

    MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform of its kind, currently gathering more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl. Supplementary data are available at Bioinformatics online.

  26. A Comparative Analysis of Life-Cycle Assessment Tools for ...

    EPA Pesticide Factsheets

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the waste reduction model (WARM), municipal solid waste-decision support tool (MSW-DST), solid waste optimization life-cycle framework (SWOLF), environmental assessment system for environmental technologies (EASETECH), and waste and resources assessment for the environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain they are originally intended for. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to c

  27. Near Infrared Imaging as a Diagnostic Tool for Detecting Enamel Demineralization: An in vivo Study

    NASA Astrophysics Data System (ADS)

    Lucas, Seth Adam

    Background and Objectives: For decades there has been an effort to develop alternative optical methods of imaging dental decay utilizing non-ionizing radiation. The purpose of this in vivo study was to demonstrate whether NIR can be used as a diagnostic tool to evaluate dental caries and to compare the sensitivity and specificity of this method with those of conventional methods, including bitewing x-rays and visual inspection. Materials and Methods: 31 test subjects (n=31) from the UCSF orthodontic clinic undergoing orthodontic treatment with planned premolar extractions were recruited. Calibrated examiners performed caries detection examinations using conventional methods: bitewing radiographs and visual inspection. These findings were compared with the results from NIR examinations: transillumination and reflectance. To confirm the results found with the two different detection methods, a gold standard was used. After teeth were extracted, polarized light microscopy and transverse microradiography were performed. Results: A total of 87 premolars were used in the study. NIR identified the occlusal lesions with a sensitivity of 71% and a specificity of 77%, whereas the visual examination had a sensitivity of only 40% and a specificity of 39%. For interproximal lesions halfway to the dentin-enamel junction (DEJ), specificity remained constant, but sensitivity improved to 100% for NIR and 75% for x-rays. Conclusions: The results of this preliminary study demonstrate that NIR is just as effective at detecting enamel interproximal lesions as standard dental x-rays. NIR was more effective at detecting occlusal lesions than visual examination alone. NIR shows promise as an alternative diagnostic tool to the conventional methods of x-rays and visual examination and provides a non-ionizing radiation technique.

  28. Commercial Molecular Tests for Fungal Diagnosis from a Practical Point of View.

    PubMed

    Lackner, Michaela; Lass-Flörl, Cornelia

    2017-01-01

    The increasing interest in molecular diagnostics is a result of tremendously improved knowledge on fungal infections in the past 20 years and the rapid development of new methods, in particular polymerase chain reaction. High expectations have been placed on molecular diagnostics, and the number of laboratories now using the relevant technology is rapidly increasing, resulting in an obvious need for standardization and definition of laboratory organization. In the past 10 years, multiple new molecular tools were marketed for the detection of DNA, antibodies, cell wall components, or other antigens. In contrast to classical culture methods, molecular methods do not detect a viable organism, but only molecules which indicate its presence; these can be nucleic acids, cell components (antigens), or antibodies (Fig. 1). In this chapter, an overview is provided on commercially available detection tools, their strengths and how to use them. A main focus is laid on providing tips and tricks that make daily life easier. We try to focus on and mention methodical details which are not highlighted in the manufacturer's instructions of these test kits, but are based on our personal experience in the laboratory. Important to keep in mind is that molecular tools cannot replace culture, microscopy, or a critical view on patients' clinical history, signs, and symptoms, but provide a valuable add-on tool. Diagnosis should not be based solely on a molecular test, but molecular tools might deliver an important piece of information that helps match the diagnostic puzzle to a diagnosis, in particular as few tests are in vitro diagnostic tests (IVD) or only part of the whole test carries the IVD certificate (e.g., DNA extraction is often not included). Please be aware that the authors do not claim to provide a complete overview of all commercially available diagnostic assays currently being marketed for fungal detection, as those are subject to constant change. A main focus is put on commonly used panfungal assays and pathogen-specific assays, including Aspergillus-specific, Candida-specific, Cryptococcus-specific, Histoplasma-specific, and Pneumocystis-specific assays. Assays are categorized according to their underlying principle as either antigen-detecting, antibody-detecting or DNA-detecting (Fig. 1). Other non-DNA-detecting nucleic acid methods such as FISH and PNA FISH are not summarized in this chapter; an overview of test performance, common false positives, and the clinical evaluation of commercial tests in studies is provided in a previous book series by Javier Yugueros Marcos and David H. Pincus (Marcos and Pincus, Methods Mol Biol 968:25-54, 2013).

  29. Validation of a Nutrition Screening Tool for Pediatric Patients with Cystic Fibrosis.

    PubMed

    Souza Dos Santos Simon, Miriam Isabel; Forte, Gabriele Carra; da Silva Pereira, Juliane; da Fonseca Andrade Procianoy, Elenara; Drehmer, Michele

    2016-05-01

    In cystic fibrosis (CF), nutrition diagnosis is of critical relevance because the early identification of nutrition-related compromise enables early, adequate intervention and, consequently, influences patient prognosis. Up to now, there has not been a validated nutrition screening tool that takes into consideration clinical variables. To validate a specific nutritional risk screening tool for patients with CF based on clinical variables, anthropometric parameters, and dietary intake. Cross-sectional study. The nutrition screening tool was compared with a risk screening tool proposed by McDonald and the Cystic Fibrosis Foundation criteria. Patients aged 6 to 18 years, with a diagnosis of CF confirmed by two determinations of elevated chloride level in sweat (sweat test) and/or by identification of two CF-associated genetic mutations who were receiving follow-up care through the outpatient clinic of a Cystic Fibrosis Treatment Center. Earlier identification of nutritional risk in CF patients aged 6 to 18 years when a new screening tool was applied. Agreement among the tested methods was assessed by means of the kappa coefficient for categorical variables. Sensitivity, specificity, and accuracy values were calculated. The significance level was set at 5% (P<0.05). Statistical analyses were carried out in PASW Statistics for Windows version 18.0 (2009, SPSS Inc). Eighty-two patients (49% men, aged 6 to 18 years) were enrolled in the study. The agreement between the proposed screening tool and the tool for screening nutritional risk for CF by the McDonald method was good (κ=0.804; P<0.001) and the sensitivity and specificity were 85% and 95%, respectively. Agreement with the Cystic Fibrosis Foundation criteria was lower (κ=0.418; P<0.001), and the sensitivity and specificity were both 72%. The proposed screening tool with defined clinical variables promotes earlier identification of nutritional risk in pediatric patients with CF.

  30. Multicenter Validation of a Customizable Scoring Tool for Selection of Trainees for a Residency or Fellowship Program. The EAST-IST Study.

    PubMed

    Bosslet, Gabriel T; Carlos, W Graham; Tybor, David J; McCallister, Jennifer; Huebert, Candace; Henderson, Ashley; Miles, Matthew C; Twigg, Homer; Sears, Catherine R; Brown, Cynthia; Farber, Mark O; Lahm, Tim; Buckley, John D

    2017-04-01

    Few data have been published regarding scoring tools for selection of postgraduate medical trainee candidates that have wide applicability. The authors present a novel scoring tool developed to assist postgraduate programs in generating an institution-specific rank list derived from selected elements of the U.S. Electronic Residency Application System (ERAS) application. The authors developed and validated an ERAS and interview day scoring tool at five pulmonary and critical care fellowship programs: the ERAS Application Scoring Tool-Interview Scoring Tool. This scoring tool was then tested for intrarater correlation versus subjective rankings of ERAS applications. The development process was repeated at four other institutions, and the tool was applied alongside and compared with the "traditional" ranking methods at the five programs and compared with the submitted National Resident Matching Program rank list. The ERAS Application Scoring Tool correlated highly with subjective faculty rankings at the primary institution (average Spearman's r = 0.77). The ERAS Application Scoring Tool-Interview Scoring Tool method correlated well with traditional ranking methodology at all five institutions (Spearman's r = 0.54, 0.65, 0.72, 0.77, and 0.84). This study validates a process for selecting and weighting components of the ERAS application and interview day to create a customizable, institution-specific tool for ranking candidates to postgraduate medical education programs. This scoring system can be used in future studies to compare the outcomes of fellowship training.
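
    The statistic reported above is Spearman's rank correlation between the tool-generated and traditional rank lists. The sketch below computes it with scipy for two invented ten-candidate rankings.

    ```python
    # Spearman rank correlation between two (invented) candidate rankings.
    from scipy.stats import spearmanr

    tool_rank      = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    committee_rank = [2, 1, 3, 5, 4, 7, 6, 10, 8, 9]
    rho, p = spearmanr(tool_rank, committee_rank)
    print(f"Spearman r = {rho:.2f} (p = {p:.4f})")
    ```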

  31. Decision Support Methods and Tools

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde R.; Sorokach, Michael R.; Burg, Cecile M.

    2006-01-01

    This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at the NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.

  32. Development of a PubMed Based Search Tool for Identifying Sex and Gender Specific Health Literature

    PubMed Central

    Song, Michael M.; Simonsen, Cheryl K.; Wilson, Joanna D.

    2016-01-01

    Background: An effective literature search strategy is critical to achieving the aims of Sex and Gender Specific Health (SGSH): to understand sex and gender differences through research and to effectively incorporate the new knowledge into the clinical decision making process to benefit both male and female patients. The goal of this project was to develop and validate an SGSH literature search tool that is readily and freely available to clinical researchers and practitioners. Methods: PubMed, a freely available search engine for the Medline database, was selected as the platform to build the SGSH literature search tool. Combinations of Medical Subject Heading terms, text words, and title words were evaluated for optimal specificity and sensitivity. The search tool was then validated against reference bases compiled for two disease states, diabetes and stroke. Results: Key sex and gender terms and limits were bundled to create a search tool to facilitate PubMed SGSH literature searches. During validation, the search tool retrieved 50 of 94 (53.2%) stroke and 62 of 95 (65.3%) diabetes reference articles selected for validation. A general keyword search of stroke or diabetes combined with sex difference retrieved 33 of 94 (35.1%) stroke and 22 of 95 (23.2%) diabetes reference base articles, with lower sensitivity and specificity for SGSH content. Conclusions: The Texas Tech University Health Sciences Center SGSH PubMed Search Tool provides higher sensitivity and specificity to sex and gender specific health literature. The tool will facilitate research, clinical decision-making, and guideline development relevant to SGSH. PMID:26555409
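
    The general mechanism, bundling validated terms into a PubMed query, can be sketched against the NCBI E-utilities esearch endpoint. The term bundle below is a simplified stand-in assembled for illustration; it is not the validated Texas Tech tool.

    ```python
    # Count PubMed records matching a topic AND a simplified sex/gender bundle.
    import json
    import urllib.parse
    import urllib.request

    SGSH_BUNDLE = ('("sex characteristics"[MeSH Terms] OR "sex factors"[MeSH Terms]'
                   ' OR "sex difference"[Title/Abstract])')   # illustrative bundle

    def pubmed_count(topic: str) -> int:
        term = f"({topic}) AND {SGSH_BUNDLE}"
        url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
               + urllib.parse.urlencode({"db": "pubmed", "term": term,
                                         "retmode": "json"}))
        with urllib.request.urlopen(url) as resp:
            return int(json.load(resp)["esearchresult"]["count"])

    print(pubmed_count("stroke"))   # number of matching records
    ```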

  33. Use of next generation sequencing data to develop a qPCR method for specific detection of EU-unauthorized genetically modified Bacillus subtilis overproducing riboflavin.

    PubMed

    Barbau-Piednoir, Elodie; De Keersmaecker, Sigrid C J; Delvoye, Maud; Gau, Céline; Philipp, Patrick; Roosens, Nancy H

    2015-11-11

    Recently, the presence of an unauthorized genetically modified (GM) Bacillus subtilis bacterium overproducing vitamin B2 in a feed additive was notified by the Rapid Alert System for Food and Feed (RASFF). This has demonstrated that contamination by a GM micro-organism (GMM) may occur in feed additives and has confronted the enforcement laboratories for the first time with this type of RASFF notification. As no sequence information for this GMM nor any specific detection or identification method was available, Next Generation Sequencing (NGS) was used to generate sequence information. However, NGS data analysis often requires appropriate tools and bioinformatics expertise that is not always present in the average enforcement laboratory. This hampers the use of this technology to rapidly obtain critical sequence information in order to be able to develop a specific qPCR detection method. Data generated by NGS were exploited using a simple BLAST approach. A TaqMan® qPCR method was developed and tested on isolated bacterial strains and on the feed additive directly. In this study, a very simple strategy based on the common BLAST tools, which can be used by any enforcement lab without profound bioinformatics expertise, was successfully used to analyse the B. subtilis data generated by NGS. The results were used to design and assess a new TaqMan® qPCR method, specifically detecting this GM vitamin B2-overproducing bacterium. The method complies with EU critical performance parameters for specificity, sensitivity, PCR efficiency and repeatability. The VitB2-UGM method also could detect the B. subtilis strain in genomic DNA extracted from the feed additive, without a prior culturing step. The proposed method provides a crucial tool for specifically and rapidly identifying this unauthorized GM bacterium in food and feed additives by enforcement laboratories. Moreover, this work can be seen as a case study to substantiate how the use of NGS data can offer an added value to easily gain access to sequence information needed to develop qPCR methods to detect unknown and unauthorized GMOs in food and feed.

  34. Species and tissues specific differentiation of processed animal proteins in aquafeeds using proteomics tools.

    PubMed

    Rasinger, J D; Marbaix, H; Dieu, M; Fumière, O; Mauro, S; Palmblad, M; Raes, M; Berntssen, M H G

    2016-09-16

    The rapidly growing aquaculture industry drives the search for sustainable protein sources in fish feed. In the European Union (EU) since 2013 non-ruminant processed animal proteins (PAP) are again permitted to be used in aquafeeds. To ensure that commercial fish feeds do not contain PAP from prohibited species, EU reference methods were established. However, due to the heterogeneous and complex nature of PAP, complementary methods are required to guarantee the safe use of this fish feed ingredient. In addition, there is a need for tissue-specific PAP detection to identify the sources (i.e. bovine carcass, blood, or meat) of illegal PAP use. In the present study, we investigated and compared different protein extraction, solubilisation and digestion protocols on different proteomics platforms for the detection and differentiation of prohibited PAP. In addition, we assessed if tissue-specific PAP detection was feasible using proteomics tools. All work was performed independently in two different laboratories. We found that irrespective of sample preparation, gel-based proteomics tools were inappropriate when working with PAP. Gel-free shotgun proteomics approaches in combination with direct spectral comparison were able to provide quality species- and tissue-specific data to complement and refine current methods of PAP detection and identification. To guarantee the safe use of processed animal protein (PAP) in aquafeeds, efficient PAP detection and monitoring tools are required. The present study investigated and compared various proteomics workflows and shows that the application of shotgun proteomics in combination with direct comparison of spectral libraries provides for the desired species- and tissue-specific classification of this heat-sterilized and pressure-treated (≥133°C at 3 bar for 20 min) protein feed ingredient.

  35. Dcode.org anthology of comparative genomic tools.

    PubMed

    Loots, Gabriela G; Ovcharenko, Ivan

    2005-07-01

    Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the non-coding encryption of gene regulation across genomes. To facilitate the practical application of comparative sequence analysis to genetics and genomics, we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools, zPicture and Mulan; a phylogenetic shadowing tool, eShadow, for identifying lineage- and species-specific functional elements; two evolutionarily conserved transcription factor analysis tools, rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, Creme 2.0; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here, we briefly describe each one of these tools and provide specific examples of their practical applications. All the tools are publicly available at the http://www.dcode.org/ website.

  16. Method for detecting the signature of noise-induced structures in spatiotemporal data sets: an application to excitable media

    NASA Astrophysics Data System (ADS)

    Huett, Marc-Thorsten

    2003-05-01

    We formulate mathematical tools for analyzing spatiotemporal data sets. The tools are based on nearest-neighbor considerations similar to cellular automata. One of the analysis tools allows for reconstructing the noise intensity in a data set and is an appropriate method for detecting a variety of noise-induced phenomena in spatiotemporal data. The functioning of these methods is illustrated on sample data generated with the forest fire model and with networks of nonlinear oscillators. It is seen that these methods allow the characterization of spatiotemporal stochastic resonance (STSR) in experimental data. Application of these tools to biological spatiotemporal patterns is discussed. For one specific example, the slime mold Dictyostelium discoideum, it is seen how transitions between different patterns are clearly marked by changes in the spatiotemporal observables.
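
    One simple nearest-neighbour statistic in this spirit (not the authors' exact estimator) compares each lattice site with the mean of its von Neumann neighbours; the spatial average of this deviation grows with the noise level and can therefore serve as a crude fluctuation measure per snapshot.

      # Nearest-neighbour deviation of a 2D snapshot (periodic boundaries).
      import numpy as np

      def neighbour_deviation(frame):
          nbr_mean = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0) +
                      np.roll(frame, 1, 1) + np.roll(frame, -1, 1)) / 4.0
          return float(np.mean(np.abs(frame - nbr_mean)))

      x = np.linspace(0.0, 1.0, 64)
      smooth = np.add.outer(x, x)                  # coherent toy "pattern"
      rng = np.random.default_rng(0)
      noisy = smooth + 0.3 * rng.standard_normal(smooth.shape)
      print(neighbour_deviation(smooth), neighbour_deviation(noisy))  # noisy scores higher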

  17. A Method for Making Cross-Comparable Estimates of the Benefits of Decision Support Technologies for Air Traffic Management

    NASA Technical Reports Server (NTRS)

    Lee, David; Long, Dou; Etheridge, Mel; Plugge, Joana; Johnson, Jesse; Kostiuk, Peter

    1998-01-01

    We present a general method for making cross-comparable estimates of the benefits of NASA-developed decision support technologies for air traffic management, and we apply a specific implementation of the method to estimate benefits of three decision support tools (DSTs) under development in NASA's Advanced Air Transportation Technologies Program: the Active Final Approach Spacing Tool (A-FAST), Expedite Departure Path (EDP), and the Conflict Probe and Trial Planning Tool (CPTP). The report also reviews data about the present operation of the national airspace system (NAS) to identify opportunities for DSTs to reduce delays and inefficiencies.

  18. MusiteDeep: a deep-learning framework for general and kinase-specific phosphorylation site prediction.

    PubMed

    Wang, Duolin; Zeng, Shuai; Xu, Chunhui; Qiu, Wangren; Liang, Yanchun; Joshi, Trupti; Xu, Dong

    2017-12-15

    Computational methods for phosphorylation site prediction play important roles in protein function studies and experimental design. Most existing methods are based on feature extraction, which may result in incomplete or biased features. Deep learning as the cutting-edge machine learning method has the ability to automatically discover complex representations of phosphorylation patterns from the raw sequences, and hence it provides a powerful tool for improvement of phosphorylation site prediction. We present MusiteDeep, the first deep-learning framework for predicting general and kinase-specific phosphorylation sites. MusiteDeep takes raw sequence data as input and uses convolutional neural networks with a novel two-dimensional attention mechanism. It achieves over a 50% relative improvement in the area under the precision-recall curve in general phosphorylation site prediction and obtains competitive results in kinase-specific prediction compared to other well-known tools on the benchmark data. MusiteDeep is provided as an open-source tool available at https://github.com/duolinwang/MusiteDeep. xudong@missouri.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
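
    The general idea of sequence-window classification can be sketched as below: a small convolutional network over one-hot-encoded windows centred on candidate S/T/Y residues. This is only a toy stand-in; MusiteDeep itself uses a deeper architecture with a two-dimensional attention mechanism, and the window and layer sizes here are arbitrary. Requires TensorFlow/Keras.

      # Toy CNN classifier for phosphorylation-site windows (illustrative only).
      from tensorflow.keras import layers, models

      WINDOW, ALPHABET = 33, 21   # 16 residues each side of the site; 20 aa + padding

      model = models.Sequential([
          layers.Conv1D(64, 7, activation="relu", input_shape=(WINDOW, ALPHABET)),
          layers.MaxPooling1D(2),
          layers.Conv1D(32, 5, activation="relu"),
          layers.GlobalMaxPooling1D(),
          layers.Dense(64, activation="relu"),
          layers.Dropout(0.5),
          layers.Dense(1, activation="sigmoid"),   # P(site is phosphorylated)
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy",
                    metrics=["accuracy"])
      model.summary()
      # model.fit(X, y, ...) with X of shape (n_windows, WINDOW, ALPHABET)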

  19. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    PubMed

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited, and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and the four input resources required (time, money, knowledge and data). The characterisation is presented in matrix form to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce, let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular, it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.
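
    The matrix-style comparison lends itself to a very small data-structure sketch: each method is scored against application areas and required input resources, and a filter returns the methods compatible with a given problem and resource budget. The method names are taken from common practice; the scores are invented placeholders, not the paper's matrices.

      # Toy method-selection matrix: filter methods by area and resource budget.
      methods = {
          "discrete event simulation": {"areas": {"operational"}, "time": 3, "data": 3},
          "system dynamics":           {"areas": {"strategic"},   "time": 2, "data": 2},
          "soft systems methodology":  {"areas": {"problem structuring"}, "time": 2, "data": 1},
      }

      def select(area, max_time, max_data):
          return [name for name, row in methods.items()
                  if area in row["areas"]
                  and row["time"] <= max_time and row["data"] <= max_data]

      print(select("strategic", max_time=2, max_data=2))  # -> ['system dynamics']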

  20. Determination of high-strength materials diamond grinding rational modes

    NASA Astrophysics Data System (ADS)

    Arkhipov, P. V.; Lobanov, D. V.; Rychkov, D. A.; Yanyushkin, A. S.

    2018-03-01

    The analysis of methods for the abrasive processing of high-strength materials is carried out, making it possible to determine the necessary directions and prospects for developing combined shaping methods. To improve processing efficiency and reduce the labour intensity of operations, the need to use metal-bonded diamond abrasive tools in combination with a different kind of energy is noted. A complex of experimental studies was performed to reveal the influence of the mechanical and electrical components of the cutting regime on the cutting ability of diamond tools, and to reduce the specific consumption of the abrasive wheel, one of the important economic indicators of the process. It is established that combined diamond grinding with simultaneous continuous correction (dressing) of the abrasive wheel increases the cutting ability of metal-bonded diamond abrasive tools when processing high-strength materials by an average of 30% compared to conventional diamond grinding. Particular recommendations on the choice of technological factors are developed depending on specific production problems.

  1. Fan Noise Prediction with Applications to Aircraft System Noise Assessment

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Envia, Edmane; Burley, Casey L.

    2009-01-01

    This paper describes an assessment of current fan noise prediction tools by comparing measured and predicted sideline acoustic levels from a benchmark fan noise wind tunnel test. Specifically, an empirical method and a newly developed coupled computational approach are utilized to predict aft fan noise for a benchmark test configuration. Comparisons with sideline noise measurements are performed to assess the relative merits of the two approaches. The study identifies issues entailed in coupling the source and propagation codes, and provides insight into the capabilities of the tools in predicting the fan noise source and its subsequent propagation and radiation. In contrast to the empirical method, the new coupled computational approach provides the ability to investigate acoustic near-field effects. The potential benefits/costs of these new methods are also compared with the existing capabilities of a current aircraft noise system prediction tool. The knowledge gained in this work provides a basis for improved fan source specification in overall aircraft system noise studies.

  2. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
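
    The deconvolution-based analysis can be illustrated with a standard truncated-SVD deconvolution of a tissue concentration curve with an arterial input function (AIF); this is the textbook DSC approach, not necessarily ANTONIA's exact implementation, and all curves below are synthetic.

      # Truncated-SVD deconvolution: tissue = A @ (CBF * R), A built from the AIF.
      import numpy as np

      def svd_deconvolve(aif, tissue, dt, rel_threshold=0.2):
          n = len(aif)
          A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                             for i in range(n)])   # lower-triangular convolution
          U, s, Vt = np.linalg.svd(A)
          s_inv = np.where(s > rel_threshold * s[0], 1.0 / s, 0.0)  # regularize
          k = Vt.T @ (s_inv * (U.T @ tissue))      # k(t) = CBF * R(t)
          return k.max()                           # CBF estimate, since R(0) = 1

      t = np.arange(0.0, 60.0, 1.0)
      aif = np.exp(-(t - 10.0) ** 2 / 8.0)         # synthetic bolus
      resid = np.exp(-t / 4.0)                     # synthetic residue function
      tissue = 0.6 * np.convolve(aif, resid)[:len(t)]   # true CBF = 0.6, dt = 1
      print("estimated CBF:", round(svd_deconvolve(aif, tissue, dt=1.0), 2))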

  3. DRS: Derivational Reasoning System

    NASA Technical Reports Server (NTRS)

    Bose, Bhaskar

    1995-01-01

    The high reliability requirements for airborne systems demand fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phase of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.

  4. 40 CFR 146.95 - Class VI injection depth waiver requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... detection tools), unless the Director determines, based on site-specific geology, that such methods are not... geology, that such methods are not appropriate; (5) Any additional requirements requested by the Director...

  5. DCODE.ORG Anthology of Comparative Genomic Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loots, G G; Ovcharenko, I

    2005-01-11

    Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the noncoding encryption of gene regulation across genomes. To facilitate the use of comparative genomics in practical applications in genetics and genomics we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools: zPicture and Mulan; a phylogenetic shadowing tool: eShadow for identifying lineage- and species-specific functional elements; two evolutionary conserved transcription factor analysis tools: rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, CREME; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here we briefly describe each one of these tools and provide specific examples on their practical applications. All the tools are publicly available at the http://www.dcode.org/ web site.

  6. A fractured rock geophysical toolbox method selection tool

    USGS Publications Warehouse

    Day-Lewis, F. D.; Johnson, C.D.; Slater, L.D.; Robinson, J.L.; Williams, J.H.; Boyden, C.L.; Werkema, D.D.; Lane, J.W.

    2016-01-01

    Geophysical technologies have the potential to improve site characterization and monitoring in fractured rock, but the appropriate and effective application of geophysics at a particular site strongly depends on project goals (e.g., identifying discrete fractures) and site characteristics (e.g., lithology). No method works at every site or for every goal. New approaches are needed to identify a set of geophysical methods appropriate to specific project goals and site conditions while considering budget constraints. To this end, we present the Excel-based Fractured-Rock Geophysical Toolbox Method Selection Tool (FRGT-MST). We envision the FRGT-MST (1) equipping remediation professionals with a tool to understand what is likely to be realistic and cost-effective when contracting geophysical services, and (2) reducing applications of geophysics with unrealistic objectives or where methods are likely to fail.

  7. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique used in most laboratories for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  8. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    PubMed

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

    This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool combines existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool can accommodate a wide range of decision makers' preferences. The tool's user-friendly interface guides the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to improve the trade-off between ease-of-use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key features which benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to the selection process for the best locations of the MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multicriteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
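
    The core of the scoring chain (AHP weights fed into a Weighted Linear Combination) fits in a few lines; the pairwise comparisons and site scores below are invented toy numbers, whereas the real tool operates on standardized GIS raster layers inside ArcGIS.

      # AHP weights from a pairwise comparison matrix, then WLC suitability.
      import numpy as np

      # Saaty-scale pairwise comparisons for 3 criteria (illustrative values).
      P = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      eigvals, eigvecs = np.linalg.eig(P)
      w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      w = w / w.sum()                      # weights = normalized principal eigenvector

      sites = np.array([[0.8, 0.6, 0.9],   # standardized (0-1) scores, site A
                        [0.5, 0.9, 0.4]])  # site B
      suitability = sites @ w              # WLC: weighted sum per site
      print("AHP weights:", w.round(3), "suitability:", suitability.round(3))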

  9. Specification and Design Methodologies for High-Speed Fault-Tolerant Array Algorithms and Structures for VLSI.

    DTIC Science & Technology

    1987-06-01

    ... evaluation and chip layout planning for VLSI digital systems. A high-level applicative (functional) language, implemented at UCLA, allows combining of ... The complexity of VLSI requires the application of CAD tools at all levels of the design process. In order to be effective, these tools must be adaptive to the specific design. In this project we studied a design method based on the use of applicative languages.

  10. Methods, Tools and Current Perspectives in Proteogenomics *

    PubMed Central

    Ruggles, Kelly V.; Krug, Karsten; Wang, Xiaojing; Clauser, Karl R.; Wang, Jing; Payne, Samuel H.; Fenyö, David; Zhang, Bing; Mani, D. R.

    2017-01-01

    With combined technological advancements in high-throughput next-generation sequencing and deep mass spectrometry-based proteomics, proteogenomics, i.e. the integrative analysis of proteomic and genomic data, has emerged as a new research field. Early efforts in the field were focused on improving protein identification using sample-specific genomic and transcriptomic sequencing data. More recently, integrative analysis of quantitative measurements from genomic and proteomic studies has yielded novel insights into gene expression regulation, cell signaling, and disease. Many methods and tools have been developed or adapted to enable an array of integrative proteogenomic approaches, and in this article we systematically classify published methods and tools into four major categories: (1) sequence-centric proteogenomics; (2) analysis of proteogenomic relationships; (3) integrative modeling of proteogenomic data; and (4) data sharing and visualization. We provide a comprehensive review of methods and available tools in each category and highlight their typical applications. PMID:28456751

  11. Scalable collaborative risk management technology for complex critical systems

    NASA Technical Reports Server (NTRS)

    Campbell, Scott; Torgerson, Leigh; Burleigh, Scott; Feather, Martin S.; Kiper, James D.

    2004-01-01

    We describe here our project and plans to develop methods, software tools, and infrastructure tools to address challenges relating to geographically distributed software development. Specifically, this work is creating an infrastructure that supports applications working over distributed geographical and organizational domains and is using this infrastructure to develop a tool that supports project development using risk management and analysis techniques where the participants are not collocated.

  12. Enhanced semantic interoperability by profiling health informatics standards.

    PubMed

    López, Diego M; Blobel, Bernd

    2009-01-01

    Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered as such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.

  13. Sensing site-specific structural characteristics and chirality using vibrational circular dichroism of isotope labeled peptides.

    PubMed

    Keiderling, Timothy A

    2017-12-01

    Isotope labeling has a long history in chemistry as a tool for probing structure, offering enhanced sensitivity, or enabling site selection with a wide range of spectroscopic tools. Chirality sensitive methods such as electronic circular dichroism are global structural tools and have intrinsically low resolution. Consequently, they are generally insensitive to modifications to enhance site selectivity. The use of isotope labeling to modify vibrational spectra with unique resolvable frequency shifts can provide useful site-specific sensitivity, and these methods have been recently more widely expanded in biopolymer studies. While the spectral shifts resulting from changes in isotopic mass can provide resolution of modes from specific parts of the molecule and can allow detection of local change in structure with perturbation, these shifts alone do not directly indicate structure or chirality. With vibrational circular dichroism (VCD), the shifted bands and their resultant sign patterns can be used to indicate local conformations in labeled biopolymers, particularly if multiple labels are used and if their coupling is theoretically modeled. This mini-review discusses selected examples of the use of labeling specific amides in peptides to develop local structural insight with VCD spectra. © 2017 Wiley Periodicals, Inc.

  14. [Development of HIV infection risk assessment tool for men who have sex with men based on Delphi method].

    PubMed

    Li, L L; Jiang, Z; Song, W L; Ding, Y Y; Xu, J; He, N

    2017-10-10

    Objective: To develop an HIV infection risk assessment tool for men who have sex with men (MSM) based on the Delphi method. Methods: After an exhaustive literature review, we used the Delphi method to determine the specific items and relative risk scores of the assessment tool through two rounds of specialist consultation, with overall consideration of the opinions and suggestions of 17 specialists. Results: The positivity coefficient for the first and second rounds of specialist consultation was 100.0% and 94.1%, respectively. The mean of the authority coefficients (Cr) was 0.86. Kendall's W coefficient was 0.55 for the first round of consultation (χ²=84.426, P<0.001) and 0.46 for the second round (χ²=65.734, P<0.001), suggesting that the specialists had similar opinions. The final HIV infection risk assessment tool for MSM has 8 items. Conclusions: The HIV infection risk assessment tool for MSM, developed using the Delphi method, can be used in the evaluation of HIV infection risk in MSM and in individualized prevention and intervention. However, the reliability and validity of this risk assessment tool need to be further evaluated.
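
    Kendall's W and its chi-square test, as reported above, are straightforward to compute from a rank table (no tie correction in this sketch; the ratings are invented placeholders, not the study's data):

      # Kendall's coefficient of concordance W with chi2 = m*(n-1)*W, df = n-1.
      import numpy as np
      from scipy.stats import chi2 as chi2_dist

      ranks = np.array([      # rows = specialists (m), columns = items (n)
          [1, 2, 3, 4, 5],
          [2, 1, 3, 5, 4],
          [1, 3, 2, 4, 5],
      ])
      m, n = ranks.shape
      R = ranks.sum(axis=0)                    # rank sum per item
      S = ((R - R.mean()) ** 2).sum()
      W = 12.0 * S / (m ** 2 * (n ** 3 - n))
      chi2 = m * (n - 1) * W
      p = chi2_dist.sf(chi2, df=n - 1)
      print(f"W = {W:.2f}, chi2 = {chi2:.2f}, p = {p:.4f}")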

  15. CRISPR/Cas9-loxP-Mediated Gene Editing as a Novel Site-Specific Genetic Manipulation Tool.

    PubMed

    Yang, Fayu; Liu, Changbao; Chen, Ding; Tu, Mengjun; Xie, Haihua; Sun, Huihui; Ge, Xianglian; Tang, Lianchao; Li, Jin; Zheng, Jiayong; Song, Zongming; Qu, Jia; Gu, Feng

    2017-06-16

    Cre-loxP, as one of the site-specific genetic manipulation tools, offers a method to study the spatial and temporal regulation of gene expression/inactivation in order to decipher gene function. CRISPR/Cas9-mediated targeted genome engineering technologies are sparking a new revolution in biological research. Whether the traditional site-specific genetic manipulation tool and CRISPR/Cas9 could be combined to create a novel genetic tool for highly specific gene editing was not clear. Here, we successfully generated a CRISPR/Cas9-loxP system to perform gene editing in human cells, providing the first proof of principle that these two technologies can be used together. We also showed that CRISPR/Cas9-mediated gene editing produces distinct non-homologous end-joining (NHEJ) patterns depending on whether the target sequence is located on plasmids (episomal) or chromosomes. Specifically, the NHEJ pattern in the nuclear genome favors deletions (64%-68% at the human AAVS1 locus versus 4%-28% for plasmid DNA). CRISPR/Cas9-loxP, a novel site-specific genetic manipulation tool, offers a platform for the dissection of gene function and molecular insights into DNA-repair pathways. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  16. Methods for assessing reliability and validity for a measurement tool: a case study and critique using the WHO haemoglobin colour scale.

    PubMed

    White, Sarah A; van den Broek, Nynke R

    2004-05-30

    Before introducing a new measurement tool it is necessary to evaluate its performance. Several statistical methods have been developed, or used, to evaluate the reliability and validity of a new assessment method in such circumstances. In this paper we review some commonly used methods. Data from a study that was conducted to evaluate the usefulness of a specific measurement tool (the WHO Colour Scale) are then used to illustrate the application of these methods. The WHO Colour Scale was developed under the auspices of the WHO to provide a simple, portable and reliable method of detecting anaemia. This Colour Scale is a discrete interval scale, whereas the actual haemoglobin values it is used to estimate are on a continuous interval scale and can be measured accurately using electrical laboratory equipment. The methods we consider are: linear regression, correlation coefficients, paired t-tests, plotting differences against mean values and deriving limits of agreement; kappa and weighted kappa statistics, sensitivity and specificity, an intraclass correlation coefficient and the repeatability coefficient. We note that although the definition and properties of each of these methods are well established, inappropriate methods continue to be used in the medical literature for assessing reliability and validity, as evidenced in the context of the evaluation of the WHO Colour Scale. Copyright 2004 John Wiley & Sons, Ltd.
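
    Of the methods listed, "plotting differences against mean values and deriving limits of agreement" (the Bland-Altman approach) is easily sketched; the paired measurements below are invented placeholders, not the study's data:

      # Bland-Altman bias and 95% limits of agreement for paired measurements.
      import numpy as np

      scale = np.array([8.0, 10.0, 12.0, 10.0, 14.0, 6.0, 12.0, 8.0])  # Colour Scale (g/dL)
      lab = np.array([8.4, 9.1, 11.6, 10.9, 13.2, 6.8, 12.5, 7.7])     # laboratory values

      diff = scale - lab
      bias = diff.mean()
      half_width = 1.96 * diff.std(ddof=1)   # limits of agreement: bias +/- 1.96 SD
      print(f"bias = {bias:.2f} g/dL, limits of agreement = "
            f"({bias - half_width:.2f}, {bias + half_width:.2f}) g/dL")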

  17. 14 CFR 65.101 - Eligibility requirements: General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... months of practical experience in the procedures, practices, inspection methods, materials, tools, machine tools, and equipment generally used in the maintenance duties of the specific job for which the... employed; and (6) Be able to read, write, speak, and understand the English language, or, in the case of an...

  18. Tool-specific performance of vibration-reducing gloves for attenuating fingers-transmitted vibration

    PubMed Central

    Welcome, Daniel E.; Dong, Ren G.; Xu, Xueyan S.; Warren, Christopher; McDowell, Thomas W.

    2016-01-01

    BACKGROUND Fingers-transmitted vibration can cause vibration-induced white finger. The effectiveness of vibration-reducing (VR) gloves for reducing hand-transmitted vibration to the fingers has not been sufficiently examined. OBJECTIVE The objective of this study is to examine tool-specific performance of VR gloves for reducing finger-transmitted vibrations in three orthogonal directions (3D) from powered hand tools. METHODS A transfer function method was used to estimate the tool-specific effectiveness of four typical VR gloves. The transfer functions of the VR glove fingers in three directions were either measured in this study or during a previous study using a 3D laser vibrometer. More than seventy vibration spectra of various tools or machines were used in the estimations. RESULTS When assessed based on frequency-weighted acceleration, the gloves provided little vibration reduction. In some cases, the gloves amplified the vibration by more than 10%, especially the neoprene glove. However, the neoprene glove performed best when the assessment was based on unweighted acceleration. The neoprene glove was able to reduce the vibration by 10% or more of the unweighted vibration for 27 out of the 79 tools. If the dominant vibration of a tool handle or workpiece was in the shear direction relative to the fingers, as observed in the operation of needle scalers, hammer chisels, and bucking bars, the gloves did not reduce the vibration but increased it. CONCLUSIONS This study confirmed that the effectiveness for reducing vibration varied with the gloves, and the vibration reduction of each glove depended on the tool, the vibration direction to the fingers, and the finger location. VR gloves, including certified anti-vibration gloves, do not provide much vibration reduction when judged based on frequency-weighted acceleration. However, some of the VR gloves can provide more than 10% reduction of the unweighted vibration for some tools or workpieces. Tools and gloves can be matched for better effectiveness in protecting the fingers. PMID:27867313
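
    The contrast between weighted and unweighted assessment can be sketched from third-octave band spectra using a_w = sqrt(sum_i (W_i a_i)^2), the single-axis form used in ISO 5349-1 style evaluation. The band accelerations, weighting factors and glove transfer function below are invented placeholders, not the standard's table or the study's measurements.

      # Frequency-weighted vs unweighted glove vibration reduction (toy bands).
      import numpy as np

      a_tool = np.array([3.0, 6.0, 9.0, 7.0, 4.0])     # tool handle bands (m/s^2)
      Wh = np.array([0.9, 0.6, 0.3, 0.16, 0.08])       # placeholder hand-arm weights
      glove_T = np.array([1.0, 0.95, 0.85, 0.7, 0.5])  # glove transmissibility per band

      def weighted(a):   return float(np.sqrt(np.sum((Wh * a) ** 2)))
      def unweighted(a): return float(np.sqrt(np.sum(a ** 2)))

      a_glove = glove_T * a_tool                       # vibration felt through the glove
      for name, f in (("weighted", weighted), ("unweighted", unweighted)):
          print(f"{name} reduction: {100.0 * (1.0 - f(a_glove) / f(a_tool)):.0f}%")

    Because the weighting emphasizes low frequencies, where gloves attenuate little, the weighted reduction comes out much smaller than the unweighted one, mirroring the findings above.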

  19. Friendly Extensible Transfer Tool Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.

    2016-04-15

    Often data transfer software is designed to meet specific requirements or apply to specific environments. Frequently, adding functionality requires source code integration. An extensible data transfer framework is needed to more easily incorporate new capabilities in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without need of source code) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).

  20. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools is discussed.

  1. The study of non-fouling and non-specific cellular binding on functionalized surface for mammalian cell identification and manipulation

    NASA Astrophysics Data System (ADS)

    Zainudin, Nor Syuhada; Hambali, Nor Azura Malini Ahmad; Wahid, Mohamad Halim Abd; Retnasamy, Vithyacharan; Shahimin, Mukhzeer Mohamad

    2017-04-01

    Surface functionalization has emerged as a powerful tool for mapping limitless surface-cell membrane interactions in diverse biomolecular applications. Inhibition of non-specific biomolecular and cellular adhesion to solid surfaces is critical for improving the performance of some biomedical devices, particularly for in vitro bioassays. Particular attention must be paid to several factors in determining the right surface modification: the type of surface, the methods and chemical solutions used during experimentation, and the tools for analyzing the results. Improved surface functionalization technologies that provide better non-fouling performance in conjunction with specific attachment chemistries are sought for these applications. Hence, this paper serves as a review of multiple surface treatment methods, including PEG grafting, adsorptive chemistries, self-assembled monolayers (SAMs) and plasma treatments.

  2. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods. Though a given approach does not always fall exactly into any specific class, this paper provides a classification of very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  3. Understanding new “exploratory” biomarker data: a first look at observed concentrations and associated detection limits

    EPA Science Inventory

    This editorial is the first of a series that each explains one practical aspect of statistics specifically tailored for biomarker data. Each editorial is focused on a very specific concept and gives the rationale, specific method, and a real-world example of a useful tool for da...

  4. Simulation of Trajectories for High Specific Impulse Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Difficulties in approximating flight times and deliverable masses for continuous thrust propulsion systems have complicated comparison and evaluation of proposed propulsion concepts. These continuous thrust propulsion systems are of interest to many groups, not the least of which are the electric propulsion and fusion communities. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well-known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. The analytical method derived in the companion paper was also used to simulate the trajectory. The accuracy of this method is discussed in the paper.
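
    Cowell's method amounts to integrating the full equations of motion directly, with the thrust acceleration added to gravity; the planar, canonical-unit sketch below (constant tangential thrust, mass flow from constant specific impulse) is illustrative, with all numbers chosen arbitrarily rather than taken from the report.

      # Cowell's method: direct integration of gravity + continuous tangential thrust.
      import numpy as np
      from scipy.integrate import solve_ivp

      MU, T, ISP_G0 = 1.0, 0.01, 2.0    # gravity parameter, thrust, Isp*g0 (canonical)

      def rhs(t, y):
          rx, ry, vx, vy, m = y
          r = np.hypot(rx, ry)
          v = np.hypot(vx, vy)
          ax = -MU * rx / r**3 + (T / m) * vx / v   # gravity + tangential thrust
          ay = -MU * ry / r**3 + (T / m) * vy / v
          return [vx, vy, ax, ay, -T / ISP_G0]      # mdot = -T / (Isp * g0)

      y0 = [1.0, 0.0, 0.0, 1.0, 1.0]                # circular orbit, unit mass
      sol = solve_ivp(rhs, (0.0, 20.0), y0, rtol=1e-9, atol=1e-9)
      print(f"final radius = {np.hypot(sol.y[0, -1], sol.y[1, -1]):.3f}, "
            f"final mass = {sol.y[4, -1]:.3f}")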

  5. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for the software specification language and the database verifier are presented.

  6. Homotopy method for optimization of variable-specific-impulse low-thrust trajectories

    NASA Astrophysics Data System (ADS)

    Chi, Zhemin; Yang, Hongwei; Chen, Shiyu; Li, Junfeng

    2017-11-01

    The homotopy method has been used as a useful tool in solving fuel-optimal trajectories with constant-specific-impulse low thrust. However, the specific impulse is often variable for many practical solar electric power-limited thrusters. This paper investigates the application of the homotopy method for optimization of variable-specific-impulse low-thrust trajectories. Difficulties arise when the two commonly-used homotopy functions are employed for trajectory optimization. The optimal power throttle level and the optimal specific impulse are coupled with the commonly-used quadratic and logarithmic homotopy functions. To overcome these difficulties, a modified logarithmic homotopy function is proposed to serve as a gateway for trajectory optimization, leading to decoupled expressions of both the optimal power throttle level and the optimal specific impulse. The homotopy method based on this homotopy function is proposed. Numerical simulations validate the feasibility and high efficiency of the proposed method.
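
    The continuation loop itself is method-agnostic: solve the easy (smoothed) problem first and march the homotopy parameter toward the target problem, warm-starting each solve from the previous solution. In trajectory work the residual would be the shooting equations of the smoothed optimal-control problem; the toy scalar residual below merely stands in for them.

      # Generic homotopy continuation: H(x, 1) easy, H(x, 0) the hard target.
      import numpy as np
      from scipy.optimize import fsolve

      def H(x, eps):
          return eps * (x - 2.0) + (1.0 - eps) * (np.tanh(5.0 * x) + x**3 - 3.0)

      x = 2.0                                    # solution of the easy problem
      for eps in np.linspace(1.0, 0.0, 21):      # march along the homotopy path
          x = fsolve(lambda z: H(z, eps), x)[0]  # warm start from previous eps
      print(f"target root: x = {x:.6f}, residual = {H(x, 0.0):.2e}")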

  7. Equation-free analysis of agent-based models and systematic parameter determination

    NASA Astrophysics Data System (ADS)

    Thomas, Spencer A.; Lloyd, David J. B.; Skeldon, Anne C.

    2016-12-01

    Agent-based models (ABMs) are increasingly used in social science, economics, mathematics, biology and computer science to describe time-dependent systems in circumstances where a description in terms of equations is difficult. Yet few tools are currently available for the systematic analysis of ABM behaviour. Numerical continuation and bifurcation analysis is a well-established tool for the study of deterministic systems. Recently, equation-free (EF) methods have been developed to extend numerical continuation techniques to systems where the dynamics are described at a microscopic scale and continuation of a macroscopic property of the system is considered. To date, the practical use of EF methods has been limited by: (1) the overhead of application-specific implementation; (2) the laborious configuration of problem-specific parameters; and (3) large ensemble sizes (potentially) leading to computationally restrictive run-times. In this paper we address these issues with our tool for the EF continuation of stochastic systems, which includes algorithms to systematically configure problem-specific parameters and enhance robustness to noise. Our tool is generic, can be applied to any 'black-box' simulator, and determines the essential EF parameters prior to EF analysis. Robustness is significantly improved using our convergence-constraint with corrector-repeat (C3R) method. This algorithm automatically detects outliers based on the dynamics of the underlying system, enabling both an order-of-magnitude reduction in ensemble size and continuation of systems at much higher levels of noise than classical approaches. We demonstrate our method with application to several ABM models, revealing parameter dependence, bifurcation and stability analysis of these complex systems, giving a deep understanding of the dynamical behaviour of the models in a way that is not otherwise easily obtainable. In each case we demonstrate our systematic parameter determination stage for configuring the system-specific EF parameters.
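
    The lift-run-restrict construction at the heart of EF analysis can be sketched on a toy stochastic micro-simulator; EF continuation then wraps exactly this coarse time-stepper inside Newton or arclength continuation. Everything below (the noisy logistic micro-dynamics, ensemble size, horizons) is an invented illustration, not the paper's tool.

      # Equation-free coarse time-stepper: lift -> run micro-simulator -> restrict.
      import numpy as np

      rng = np.random.default_rng(1)
      N, T, R = 400, 0.05, 2.5            # ensemble size, micro horizon, growth rate

      def lift(u):                        # macro state -> ensemble of micro states
          return np.clip(u + 0.01 * rng.standard_normal(N), 0.0, 1.0)

      def run_micro(x, steps=50):         # noisy logistic micro-dynamics
          dt = T / steps
          for _ in range(steps):
              noise = 0.02 * np.sqrt(dt) * rng.standard_normal(N)
              x = np.clip(x + dt * R * x * (1.0 - x) + noise, 0.0, 1.0)
          return x

      def coarse_step(u):                 # Phi_T(u): one coarse time-step
          return float(run_micro(lift(u)).mean())   # restrict = ensemble mean

      u = 0.2
      for _ in range(200):                # damped iteration for Phi_T(u*) = u*
          u += 0.5 * (coarse_step(u) - u)
      print(f"coarse steady state ~ {u:.3f}")       # logistic fixed point is 1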

  8. Development, Evaluation, and Validation of Environmental Assessment Tools to Evaluate the College Nutrition Environment

    ERIC Educational Resources Information Center

    Freedman, Marjorie R.

    2010-01-01

    Objective: To develop, evaluate, and validate 2 nutrition environment assessment tools (surveys), for specific use in combating overweight on college/university campuses. Participants and Methods: Invitations to complete surveys were e-mailed to food service and health center directors at 47 universities, Winter 2008. Overall response rate was…

  9. Development and methods for an open-sourced data visualization tool

    USDA-ARS?s Scientific Manuscript database

    This paper presents an open-source, on-demand web tool specifically addressed to scientists and researchers who are not experts in converting time series data into a time surface visualization. Similar to a GIS environment, the time surface shows time on two axes: time of day vs. day of year...

  10. Detecting Surgical Tools by Modelling Local Appearance and Global Shape.

    PubMed

    Bouget, David; Benenson, Rodrigo; Omran, Mohamed; Riffaud, Laurent; Schiele, Bernt; Jannin, Pierre

    2015-12-01

    Detecting tools in surgical videos is an important ingredient for context-aware computer-assisted surgical systems. To this end, we present a new surgical tool detection dataset and a method for joint tool detection and pose estimation in 2D images. Our two-stage pipeline is data-driven and relaxes strong assumptions made by previous works regarding the geometry, number, and position of tools in the image. The first stage classifies each pixel based on local appearance only, while the second stage evaluates a tool-specific shape template to enforce global shape. Both local appearance and global shape are learned from training data. Our method is validated on a new surgical tool dataset of 2476 images from neurosurgical microscopes, which is made freely available. It improves over existing datasets in size, diversity and detail of annotation. We show that our method significantly improves over competitive baselines from the computer vision field. We achieve a 15% detection miss-rate at 10⁻¹ false positives per image (for the suction tube) on our surgical tool dataset. Results indicate that performing semantic labelling as an intermediate task is key for high-quality detection.

  11. Drug Target Validation Methods in Malaria - Protein Interference Assay (PIA) as a Tool for Highly Specific Drug Target Validation.

    PubMed

    Meissner, Kamila A; Lunev, Sergey; Wang, Yuan-Ze; Linzke, Marleen; de Assis Batista, Fernando; Wrenger, Carsten; Groves, Matthew R

    2017-01-01

    The validation of drug targets in malaria and other human diseases remains a highly difficult and laborious process. In the vast majority of cases, highly specific small-molecule tools to inhibit a protein's function in vivo are simply not available. Additionally, the use of genetic tools in the analysis of malarial pathways is challenging. These issues result in difficulties in specifically modulating a hypothetical drug target's function in vivo. The current "toolbox" of various methods and techniques to identify a protein's function in vivo remains very limited and there is a pressing need for expansion. New approaches are urgently required to support target validation in the drug discovery process. Oligomerisation is the natural assembly of multiple copies of a single protein into one object, and this self-assembly is present in more than half of all protein structures. Thus, oligomerisation plays a central role in the generation of functional biomolecules. A key feature of oligomerisation is that the oligomeric interfaces between the individual parts of the final assembly are highly specific. However, these interfaces have not yet been systematically explored or exploited to dissect biochemical pathways in vivo. This mini-review describes the current state of the antimalarial toolset as well as the potentially druggable malarial pathways. A specific focus is drawn to the initial efforts to exploit oligomerisation surfaces in drug target validation. As an alternative to the conventional methods, the Protein Interference Assay (PIA) can be used for specific distortion of the target protein function and pathway assessment in vivo. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  12. Deep learning with word embeddings improves biomedical named entity recognition.

    PubMed

    Habibi, Maryam; Weber, Leon; Neves, Mariana; Wiegandt, David Luis; Leser, Ulf

    2017-07-15

    Text mining has become an important tool for biomedical research. The most fundamental text-mining task is the recognition of biomedical named entities (NER), such as genes, chemicals and diseases. Current NER methods rely on pre-defined features which try to capture the specific surface properties of entity types, properties of the typical local context, background knowledge, and linguistic information. State-of-the-art tools are entity-specific, as dictionaries and empirically optimal feature sets differ between entity types, which makes their development costly. Furthermore, features are often optimized for a specific gold standard corpus, which makes extrapolation of quality measures difficult. We show that a completely generic method based on deep learning and statistical word embeddings [called long short-term memory network-conditional random field (LSTM-CRF)] outperforms state-of-the-art entity-specific NER tools, and often by a large margin. To this end, we compared the performance of LSTM-CRF on 33 data sets covering five different entity classes with that of best-of-class NER tools and an entity-agnostic CRF implementation. On average, F1-score of LSTM-CRF is 5% above that of the baselines, mostly due to a sharp increase in recall. The source code for LSTM-CRF is available at https://github.com/glample/tagger and the links to the corpora are available at https://corposaurus.github.io/corpora/ . habibima@informatik.hu-berlin.de. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  13. Deep learning with word embeddings improves biomedical named entity recognition

    PubMed Central

    Habibi, Maryam; Weber, Leon; Neves, Mariana; Wiegandt, David Luis; Leser, Ulf

    2017-01-01

    Abstract Motivation: Text mining has become an important tool for biomedical research. The most fundamental text-mining task is the recognition of biomedical named entities (NER), such as genes, chemicals and diseases. Current NER methods rely on pre-defined features which try to capture the specific surface properties of entity types, properties of the typical local context, background knowledge, and linguistic information. State-of-the-art tools are entity-specific, as dictionaries and empirically optimal feature sets differ between entity types, which makes their development costly. Furthermore, features are often optimized for a specific gold standard corpus, which makes extrapolation of quality measures difficult. Results: We show that a completely generic method based on deep learning and statistical word embeddings [called long short-term memory network-conditional random field (LSTM-CRF)] outperforms state-of-the-art entity-specific NER tools, and often by a large margin. To this end, we compared the performance of LSTM-CRF on 33 data sets covering five different entity classes with that of best-of-class NER tools and an entity-agnostic CRF implementation. On average, F1-score of LSTM-CRF is 5% above that of the baselines, mostly due to a sharp increase in recall. Availability and implementation: The source code for LSTM-CRF is available at https://github.com/glample/tagger and the links to the corpora are available at https://corposaurus.github.io/corpora/. Contact: habibima@informatik.hu-berlin.de PMID:28881963

  14. Multiple testing corrections in quantitative proteomics: A useful but blunt tool.

    PubMed

    Pascovici, Dana; Handler, David C L; Wu, Jemma X; Haynes, Paul A

    2016-09-01

    Multiple testing corrections are a useful tool for restricting the FDR, but can be blunt in the context of low power, as we demonstrate by a series of simple simulations. Unfortunately, low power is common in proteomics experiments, driven by proteomics-specific issues like small effects due to ratio compression, and few replicates due to high reagent cost, limited instrument time availability and other issues; in such situations, most multiple testing correction methods, if used with conventional thresholds, will fail to detect any true positives even when many exist. In this low-power, medium-scale situation, other methods such as effect size considerations or peptide-level calculations may be a more effective option, even if they do not offer the same theoretical guarantee of a low FDR. Thus, we aim to highlight in this article that proteomics presents some specific challenges to the standard multiple testing correction methods, which should be employed as a useful tool but not be regarded as a required rubber stamp. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
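
    The "series of simple simulations" argument is easy to reproduce: with three replicates and modest true effects, Benjamini-Hochberg at a conventional 5% FDR typically declares nothing significant even with hundreds of true positives present. The effect size and counts below are illustrative choices.

      # Low-power simulation: small-n t-tests, then Benjamini-Hochberg FDR control.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      n_true, n_null, reps, effect = 500, 4500, 3, 0.5

      null_p = stats.ttest_ind(rng.standard_normal((n_null, reps)),
                               rng.standard_normal((n_null, reps)), axis=1).pvalue
      true_p = stats.ttest_ind(rng.standard_normal((n_true, reps)) + effect,
                               rng.standard_normal((n_true, reps)), axis=1).pvalue
      p = np.concatenate([null_p, true_p])

      def benjamini_hochberg(p, q=0.05):
          order = np.argsort(p)
          thresh = q * np.arange(1, len(p) + 1) / len(p)
          below = p[order] <= thresh
          k = below.nonzero()[0].max() + 1 if below.any() else 0
          return order[:k]              # indices declared significant

      print("discoveries at 5% FDR:", len(benjamini_hochberg(p)),
            "of", n_true, "true effects")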

  15. Nontechnical skill training and the use of scenarios in modern surgical education.

    PubMed

    Brunckhorst, Oliver; Khan, Muhammad S; Dasgupta, Prokar; Ahmed, Kamran

    2017-07-01

    Nontechnical skills are increasingly recognized as a core cause of surgical errors. Combined with the changing nature of surgical training, this has led to an increase in nontechnical skill research in the literature. This review therefore aims to: define nontechnical skillsets, assess current training methods, explore assessment modalities and suggest future research aims. The literature demonstrates an increasing understanding of the components of nontechnical skills within surgery. This has led to a greater availability of validated training methods, including the use of didactic teaching, e-learning and simulation-based scenarios. In addition, there are now various extensively validated assessment tools for nontechnical skills, including NOTSS, the Oxford NOTECHS and OTAS. Finally, there is now more focus on the development of tools that target individual nontechnical skill components and an attempt to understand which of these play a greater role in specific procedures such as laparoscopic or robotic surgery. Current evidence demonstrates various training methods and tools for the training of nontechnical skills. Future research is likely to focus increasingly on individual nontechnical skill components and procedure-specific skills.

  16. Chemometrics Methods for Specificity, Authenticity and Traceability Analysis of Olive Oils: Principles, Classifications and Applications.

    PubMed

    Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil

    2016-11-17

    Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification, resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blend preparations, leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. These quality control and management aims require the use of several multivariate statistical tools: highlighting specificity requires ordination methods; checking authenticity calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. This chapter presents a review of different chemometrics methods applied for the control of OO variability from measured metabolic and physical-chemical characteristics. The different chemometrics methods are illustrated by case studies on monovarietal and blended OOs originating from different countries. Chemometrics tools offer multiple ways for quantitative evaluation and qualitative control of the complex chemical variability of OO in relation to several intrinsic and extrinsic factors.

  17. Calysto: Risk Management for Commercial Manned Spaceflight

    NASA Technical Reports Server (NTRS)

    Dillaman, Gary

    2012-01-01

    The Calysto: Risk Management for Commercial Manned Spaceflight study analyzes risk management in large enterprises and how to effectively communicate risks across organizations. The Calysto Risk Management tool developed by NASA Kennedy Space Center's SharePoint team is used and referenced throughout the study. Calysto is a web-based tool built on Microsoft's SharePoint platform. The risk management process at NASA is examined and incorporated in the study. Using risk management standards from industry and specific organizations at the Kennedy Space Center, three methods of communicating and elevating risk are examined. Each method is described in terms of its effectiveness and the plausibility of using it in the Calysto Risk Management Tool. At the end of the study, suggestions are made for future renditions of Calysto.

  18. Computational and mathematical methods in brain atlasing.

    PubMed

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  19. Dereplication, Aggregation and Scoring Tool (DAS Tool) v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SIEBER, CHRISTIAN

    Communities of uncultivated microbes are critical to ecosystem function and microorganism health, and a key objective of metagenomic studies is to analyze organism-specific metabolic pathways and reconstruct community interaction networks. This requires accurate assignment of genes to genomes, yet existing binning methods often fail to predict a reasonable number of genomes and report many bins of low quality and completeness. Furthermore, the performance of existing algorithms varies between samples and biotypes. Here, we present a dereplication, aggregation and scoring strategy, DAS Tool, that combines the strengths of a flexible set of established binning algorithms. DAS Tool applied to a constructed community generated more accurate bins than any automated method. Further, when applied to samples of different complexity, including soil, natural oil seeps, and the human gut, DAS Tool recovered substantially more near-complete genomes than any single binning method alone. Included were three genomes from a novel lineage. The ability to reconstruct many near-complete genomes from metagenomics data will greatly advance genome-centric analyses of ecosystems.

  20. Using component technologies for web based wavelet enhanced mammographic image visualization.

    PubMed

    Sakellaropoulos, P; Costaridou, L; Panayiotakis, G

    2000-01-01

    The poor contrast detectability of mammography can be addressed by domain-specific software visualization tools. Remote desktop client access and the time performance limitations of a previously reported visualization tool are addressed here, aiming at more efficient visualization of mammographic image resources residing on web or PACS image servers. This effort is also motivated by the fact that, at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access, the tool was redesigned using component technologies, enabling the integration of stand-alone, domain-specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is part of the Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced with a fast wavelet transform implementation, which allows for real-time wavelet-based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving the visualization of diagnostic mammographic features. Web adaptation and real-time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.

  1. PSSMSearch: a server for modeling, visualization, proteome-wide discovery and annotation of protein motif specificity determinants.

    PubMed

    Krystkowiak, Izabella; Manguy, Jean; Davey, Norman E

    2018-06-05

    There is a pressing need for in silico tools that can aid in the identification of the complete repertoire of protein binding (SLiMs, MoRFs, miniMotifs) and modification (moiety attachment/removal, isomerization, cleavage) motifs. We have created PSSMSearch, an interactive web-based tool for rapid statistical modeling, visualization, discovery and annotation of protein motif specificity determinants to discover novel motifs in a proteome-wide manner. PSSMSearch analyses proteomes for regions with significant similarity to a motif specificity determinant model built from a set of aligned motif-containing peptides. Multiple scoring methods are available to build a position-specific scoring matrix (PSSM) describing the motif specificity determinant model. This model can then be modified by a user to add prior knowledge of specificity determinants through an interactive PSSM heatmap. PSSMSearch includes a statistical framework to calculate the significance of specificity determinant model matches against a proteome of interest. PSSMSearch also includes the SLiMSearch framework's annotation, motif functional analysis and filtering tools to highlight relevant discriminatory information. Additional tools to annotate statistically significant shared keywords and GO terms, or experimental evidence of interaction with a motif-recognizing protein have been added. Finally, PSSM-based conservation metrics have been created for taxonomic range analyses. The PSSMSearch web server is available at http://slim.ucd.ie/pssmsearch/.
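
    As a rough, minimal illustration of the core idea behind a PSSM (not PSSMSearch's own scoring code), the sketch below builds a log-odds matrix from a toy set of aligned peptides, assuming a uniform amino acid background; the peptide set and pseudocount are hypothetical.

        # Minimal sketch: build a log-odds position-specific scoring matrix (PSSM)
        # from a set of aligned, equal-length motif-containing peptides.
        # Assumes a uniform amino acid background; PSSMSearch itself offers
        # several more sophisticated scoring methods.
        import math

        AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

        def build_pssm(peptides, pseudocount=1.0):
            length = len(peptides[0])
            background = 1.0 / len(AMINO_ACIDS)
            pssm = []
            for pos in range(length):
                column = [p[pos] for p in peptides]
                total = len(column) + pseudocount * len(AMINO_ACIDS)
                scores = {}
                for aa in AMINO_ACIDS:
                    freq = (column.count(aa) + pseudocount) / total
                    scores[aa] = math.log2(freq / background)  # log-odds score
                pssm.append(scores)
            return pssm

        def score_window(pssm, window):
            # Score a proteome window of the same length as the model.
            return sum(col[aa] for col, aa in zip(pssm, window))

        peptides = ["RRASVA", "KRASIP", "RRPSLQ"]  # hypothetical kinase substrates
        pssm = build_pssm(peptides)
        print(round(score_window(pssm, "RRASVA"), 2))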

  2. A method for interactive specification of multiple-block topologies

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.; Mccann, Karen M.

    1991-01-01

    A method is presented for dealing with the vast amount of topological and other data which must be specified to generate a multiple-block computational grid. Specific uses of the graphical capabilities of a powerful scientific workstation are described which reduce the burden on the user of collecting and formatting such large amounts of data. A program to implement this method, 3DPREP, is described. A plotting transformation algorithm, some useful software tools, notes on programming, and a database organization are also presented. Example grids developed using the method are shown.

  3. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  4. [Clinical applications of molecular imaging methods for patients with ischemic stroke].

    PubMed

    Yamauchi, Hiroshi; Fukuyama, Hidenao

    2007-02-01

    Several molecular imaging methods have been developed to visualize pathophysiology of cerebral ischemia in humans in vivo. PET and SPECT with specific ligands have been mainly used as diagnostic tools for the clinical usage of molecular imaging in patients with ischemic stroke. Recently, cellular MR imaging with specific contrast agents has been developed to visualize targeted cells in human stroke patients. This article reviews the current status in the clinical applications of those molecular imaging methods for patients with ischemic stroke.

  5. Traumatic Brain Injury Detection Using Electrophysiological Methods

    PubMed Central

    Rapp, Paul E.; Keyser, David O.; Albano, Alfonso; Hernandez, Rene; Gibson, Douglas B.; Zambon, Robert A.; Hairston, W. David; Hughes, John D.; Krystal, Andrew; Nichols, Andrew S.

    2015-01-01

    Measuring neuronal activity with electrophysiological methods may be useful in detecting neurological dysfunctions, such as mild traumatic brain injury (mTBI). This approach may be particularly valuable for rapid detection in at-risk populations including military service members and athletes. Electrophysiological methods, such as quantitative electroencephalography (qEEG) and recording event-related potentials (ERPs) may be promising; however, the field is nascent and significant controversy exists on the efficacy and accuracy of the approaches as diagnostic tools. For example, the specific measures derived from an electroencephalogram (EEG) that are most suitable as markers of dysfunction have not been clearly established. A study was conducted to summarize and evaluate the statistical rigor of evidence on the overall utility of qEEG as an mTBI detection tool. The analysis evaluated qEEG measures/parameters that may be most suitable as fieldable diagnostic tools, identified other types of EEG measures and analysis methods of promise, recommended specific measures and analysis methods for further development as mTBI detection tools, identified research gaps in the field, and recommended future research and development thrust areas. The qEEG study group formed the following conclusions: (1) Individual qEEG measures provide limited diagnostic utility for mTBI. However, many measures can be important features of qEEG discriminant functions, which do show significant promise as mTBI detection tools. (2) ERPs offer utility in mTBI detection. In fact, evidence indicates that ERPs can identify abnormalities in cases where EEGs alone are non-disclosing. (3) The standard mathematical procedures used in the characterization of mTBI EEGs should be expanded to incorporate newer methods of analysis including non-linear dynamical analysis, complexity measures, analysis of causal interactions, graph theory, and information dynamics. (4) Reports of high specificity in qEEG evaluations of TBI must be interpreted with care. High specificities have been reported in carefully constructed clinical studies in which healthy controls were compared against a carefully selected TBI population. The published literature indicates, however, that similar abnormalities in qEEG measures are observed in other neuropsychiatric disorders. While it may be possible to distinguish a clinical patient from a healthy control participant with this technology, these measures are unlikely to discriminate between, for example, major depressive disorder, bipolar disorder, or TBI. The specificities observed in these clinical studies may well be lost in real world clinical practice. (5) The absence of specificity does not preclude clinical utility. The possibility of use as a longitudinal measure of treatment response remains. However, efficacy as a longitudinal clinical measure does require acceptable test–retest reliability. To date, very few test–retest reliability studies have been published with qEEG data obtained from TBI patients or from healthy controls. This is a particular concern because high variability is a known characteristic of the injured central nervous system. PMID:25698950

  6. Traumatic brain injury detection using electrophysiological methods.

    PubMed

    Rapp, Paul E; Keyser, David O; Albano, Alfonso; Hernandez, Rene; Gibson, Douglas B; Zambon, Robert A; Hairston, W David; Hughes, John D; Krystal, Andrew; Nichols, Andrew S

    2015-01-01

    Measuring neuronal activity with electrophysiological methods may be useful in detecting neurological dysfunctions, such as mild traumatic brain injury (mTBI). This approach may be particularly valuable for rapid detection in at-risk populations including military service members and athletes. Electrophysiological methods, such as quantitative electroencephalography (qEEG) and recording event-related potentials (ERPs) may be promising; however, the field is nascent and significant controversy exists on the efficacy and accuracy of the approaches as diagnostic tools. For example, the specific measures derived from an electroencephalogram (EEG) that are most suitable as markers of dysfunction have not been clearly established. A study was conducted to summarize and evaluate the statistical rigor of evidence on the overall utility of qEEG as an mTBI detection tool. The analysis evaluated qEEG measures/parameters that may be most suitable as fieldable diagnostic tools, identified other types of EEG measures and analysis methods of promise, recommended specific measures and analysis methods for further development as mTBI detection tools, identified research gaps in the field, and recommended future research and development thrust areas. The qEEG study group formed the following conclusions: (1) Individual qEEG measures provide limited diagnostic utility for mTBI. However, many measures can be important features of qEEG discriminant functions, which do show significant promise as mTBI detection tools. (2) ERPs offer utility in mTBI detection. In fact, evidence indicates that ERPs can identify abnormalities in cases where EEGs alone are non-disclosing. (3) The standard mathematical procedures used in the characterization of mTBI EEGs should be expanded to incorporate newer methods of analysis including non-linear dynamical analysis, complexity measures, analysis of causal interactions, graph theory, and information dynamics. (4) Reports of high specificity in qEEG evaluations of TBI must be interpreted with care. High specificities have been reported in carefully constructed clinical studies in which healthy controls were compared against a carefully selected TBI population. The published literature indicates, however, that similar abnormalities in qEEG measures are observed in other neuropsychiatric disorders. While it may be possible to distinguish a clinical patient from a healthy control participant with this technology, these measures are unlikely to discriminate between, for example, major depressive disorder, bipolar disorder, or TBI. The specificities observed in these clinical studies may well be lost in real world clinical practice. (5) The absence of specificity does not preclude clinical utility. The possibility of use as a longitudinal measure of treatment response remains. However, efficacy as a longitudinal clinical measure does require acceptable test-retest reliability. To date, very few test-retest reliability studies have been published with qEEG data obtained from TBI patients or from healthy controls. This is a particular concern because high variability is a known characteristic of the injured central nervous system.
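
    The "qEEG discriminant function" conclusion above lends itself to a small illustration. The sketch below is a generic linear discriminant over synthetic stand-in features, not any published qEEG feature set: individually weak measures, pooled into one discriminant, can still separate groups.

        # Hedged sketch of the "qEEG discriminant function" idea: combine many
        # individually weak qEEG measures into a linear discriminant that
        # separates mTBI from control recordings. Features here are synthetic
        # stand-ins for band powers, coherences, etc.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(7)
        n_per_group, n_features = 50, 12
        controls = rng.normal(0.0, 1.0, (n_per_group, n_features))
        mtbi = rng.normal(0.3, 1.0, (n_per_group, n_features))  # small per-feature shift
        X = np.vstack([controls, mtbi])
        y = np.array([0] * n_per_group + [1] * n_per_group)

        scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.2f}")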

  7. Chemometrics Methods for Specificity, Authenticity and Traceability Analysis of Olive Oils: Principles, Classifications and Applications

    PubMed Central

    Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil

    2016-01-01

    Background. Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blends' preparation, leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. Methods. These quality control and management aims require the use of several multivariate statistical tools: specificity highlighting requires ordination methods; authentication checking calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. Results. This chapter reviews the different chemometrics methods applied to the control of OO variability on the basis of measured metabolic and physical-chemical characteristics. The different chemometrics methods are illustrated by study cases on monovarietal and blended OOs originating from different countries. Conclusion. Chemometrics tools offer multiple ways for quantitative evaluation and qualitative control of the complex chemical variability of OOs in relation to several intrinsic and extrinsic factors. PMID:28231172

  8. Machine Learning-Based App for Self-Evaluation of Teacher-Specific Instructional Style and Tools

    ERIC Educational Resources Information Center

    Duzhin, Fedor; Gustafsson, Anders

    2018-01-01

    Course instructors need to assess the efficacy of their teaching methods, but experiments in education are seldom politically, administratively, or ethically feasible. Quasi-experimental tools, on the other hand, are often problematic, as they are typically too complicated to be of widespread use to educators and may suffer from selection bias…

  9. Asking Questions in the Classroom: An Exploration of Tools and Techniques Used in the Library Instruction Classroom

    ERIC Educational Resources Information Center

    Whitver, Sara Maurice; Lo, Leo S.

    2017-01-01

    This study explores the tools and techniques used within the library instruction classroom to facilitate a conversation about teaching practices. Researchers focused on the questioning methods employed by librarians, specifically the number of questions asked by librarians and students. This study comprised classroom observations of a team…

  10. Freely available compound data sets and software tools for chemoinformatics and computational medicinal chemistry applications

    PubMed Central

    Bajorath, Jurgen

    2012-01-01

    We have generated a number of compound data sets and programs for different types of applications in pharmaceutical research. These data sets and programs were originally designed for our research projects and are made publicly available. Without consulting original literature sources, it is difficult to understand specific features of data sets and software tools, basic ideas underlying their design, and applicability domains. Currently, 30 different entries are available for download from our website. In this data article, we provide an overview of the data and tools we make available and designate the areas of research for which they should be useful. For selected data sets and methods/programs, detailed descriptions are given. This article should help interested readers to select data and tools for specific computational investigations. PMID:24358818

  11. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to build quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destroying the sample. However, to successfully adapt PAT tools to a pharmaceutical or biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT instruments. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods are discussed, along with their advantages and working principles. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
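
    As a hedged illustration of a typical chemometric calibration of the kind described (not any specific PAT deployment), the sketch below fits a partial least squares model mapping synthetic spectra to a quality attribute; all spectra and concentrations are simulated.

        # Hedged sketch of a typical chemometric PAT calibration: partial least
        # squares (PLS) regression mapping spectra to a critical quality
        # attribute (e.g., analyte content). Spectra and concentrations are
        # synthetic; a real calibration uses validated reference measurements.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        n_samples, n_wavelengths = 40, 200
        concentration = rng.uniform(0.5, 2.0, n_samples)
        base = rng.normal(0, 0.02, (n_samples, n_wavelengths))
        peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 120) / 8.0) ** 2)
        spectra = base + np.outer(concentration, peak)  # Beer-Lambert-like signal

        model = PLSRegression(n_components=3).fit(spectra, concentration)
        predicted = model.predict(spectra).ravel()
        rmse = np.sqrt(np.mean((predicted - concentration) ** 2))
        print(f"calibration RMSE: {rmse:.4f}")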

  12. Methods of epigenome editing for probing the function of genomic imprinting.

    PubMed

    Rienecker, Kira DA; Hill, Matthew J; Isles, Anthony R

    2016-10-01

    The curious patterns of imprinted gene expression draw interest from several scientific disciplines to the functional consequences of genomic imprinting. Methods of probing the function of imprinting itself have largely been indirect and correlational, relying heavily on conventional transgenics. Recently, the burgeoning field of epigenome editing has provided new tools and suggested strategies for asking causal questions with site specificity. This perspective article aims to outline how these new methods may be applied to questions of functional imprinting and, with this aim in mind, to suggest new dimensions for the expansion of these epigenome-editing tools.

  13. A new optimization tool path planning for 3-axis end milling of free-form surfaces based on efficient machining intervals

    NASA Astrophysics Data System (ADS)

    Vu, Duy-Duc; Monies, Frédéric; Rubio, Walter

    2018-05-01

    A large number of studies on 3-axis end milling of free-form surfaces seek to optimize tool path planning. These approaches try to reduce machining time by shortening the total tool path length while respecting the maximum scallop height criterion. Theoretically, the tool path trajectories that remove the most material follow the directions in which the machined width is largest. The free-form surface is often treated as a single machining area, so optimization over the entire surface is limited: it is difficult to define tool trajectories with optimal feed directions that generate the largest machined widths. Another factor limiting previous approaches' ability to reduce machining time is an inadequate choice of tool: researchers generally use a spherical tool over the entire surface, and the gains achieved by these methods with such tools amount to relatively small time savings. This study therefore proposes a new method, using toroidal milling tools, for generating tool paths in different regions of the machined surface. The surface is divided into several regions based on machining intervals. These intervals ensure that the effective radius of the tool, at each cutter-contact point on the surface, is always greater than the radius of the tool in an optimized feed direction. A parallel-plane strategy is then used on the sub-surfaces with an optimal specific feed direction for each sub-surface. This allows the entire surface to be milled more efficiently than with a spherical tool. The method is implemented in Maple to find the optimal regions and feed directions in each region, and is tested on a free-form surface. A comparison with a spherical cutter shows the significant gains obtained with a toroidal milling cutter; comparisons with CAM software and experimental validations confirm the efficiency of the method.
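
    The paper's machining-interval computation is not reproduced in the abstract; the sketch below only illustrates the scallop-height constraint that drives such planning, using the common local-circle approximation h ≈ s²/(8·R_eff), with illustrative radii.

        # Hedged sketch: the classic scallop-height approximation used in
        # parallel-plane tool path planning. For a locally flat region and an
        # effective cutting radius R_eff, a stepover s leaves a scallop of
        # height h ~ s^2 / (8 * R_eff), so the largest admissible stepover is
        # s = sqrt(8 * R_eff * h_max). The paper's contribution is choosing
        # regions and feed directions so that the toroidal cutter's R_eff
        # stays large; that interval computation is not reproduced here.
        import math

        def max_stepover(r_eff_mm, h_max_mm):
            return math.sqrt(8.0 * r_eff_mm * h_max_mm)

        # A toroidal cutter's effective radius exceeds its corner radius on
        # shallow slopes, allowing wider stepovers than a ball-end tool of the
        # same nominal radius (values below are illustrative only).
        for r_eff in (5.0, 12.0, 30.0):
            print(f"R_eff={r_eff:5.1f} mm -> stepover {max_stepover(r_eff, 0.01):.3f} mm")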

  14. Accurate in silico prediction of species-specific methylation sites based on information gain feature optimization.

    PubMed

    Wen, Ping-Ping; Shi, Shao-Ping; Xu, Hao-Dong; Wang, Li-Na; Qiu, Jian-Ding

    2016-10-15

    As one of the most important reversible types of post-translational modification, protein methylation catalyzed by methyltransferases carries many pivotal biological functions and underlies many essential biological processes. Identification of methylation sites is a prerequisite for decoding methylation regulatory networks in living cells and understanding their physiological roles. Experimental methods are labor-intensive and time-consuming, while in silico approaches offer a cost-effective, high-throughput way to predict potential methylation sites; however, previous predictors provide only a mixed model, and their prediction performance is not yet fully satisfactory. Recently, with the increasing availability of quantitative methylation datasets in diverse species (especially in eukaryotes), there is a growing need to develop species-specific predictors. Here, we designed a tool named PSSMe, based on an information gain (IG) feature optimization method, for species-specific methylation site prediction. The IG method was adopted to analyze the importance and contribution of each feature, then select the most valuable feature dimensions to reconstitute a new, ordered feature vector, which was used to build the final prediction model. Our method improves prediction accuracy by about 15% compared with single features. Furthermore, our species-specific model significantly improves predictive performance compared with other general methylation prediction tools. Hence, our prediction results serve as useful resources to elucidate the mechanism of arginine or lysine methylation and to facilitate hypothesis-driven experimental design and validation. The online service is implemented in C# and freely available at http://bioinfo.ncu.edu.cn/PSSMe.aspx (contact: jdqiu@ncu.edu.cn). Supplementary data are available at Bioinformatics online.
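
    As a minimal sketch of the information-gain ranking the abstract describes (PSSMe's actual feature encoding is not public here), assuming discretized feature values and binary methylated/non-methylated labels:

        # Minimal sketch of information-gain (IG) feature ranking:
        # IG(feature) = H(labels) - H(labels | feature). Features with the
        # highest IG are kept to reconstitute a reduced feature vector.
        # Toy data only; PSSMe's exact encoding is not reproduced here.
        import math
        from collections import Counter

        def entropy(labels):
            n = len(labels)
            return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

        def information_gain(feature_column, labels):
            h_before = entropy(labels)
            n = len(labels)
            h_after = 0.0
            for value in set(feature_column):
                subset = [l for f, l in zip(feature_column, labels) if f == value]
                h_after += len(subset) / n * entropy(subset)
            return h_before - h_after

        labels  = [1, 1, 0, 0, 1, 0]              # methylated vs. not (toy)
        feature = ["K", "K", "R", "R", "K", "K"]  # a discretized sequence feature
        print(round(information_gain(feature, labels), 3))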

  15. Variable context Markov chains for HIV protease cleavage site prediction.

    PubMed

    Oğul, Hasan

    2009-06-01

    Deciphering HIV protease specificity and developing computational tools to detect its cleavage sites in polypeptide chains are highly desirable for designing efficient and specific chemical inhibitors to prevent acquired immunodeficiency syndrome. In this study, we developed a generative model based on a generalization of variable-order Markov chains (VOMC) for peptide sequences and adapted the model to predict their cleavability by certain proteases. The new method, called variable context Markov chains (VCMC), attempts to identify context equivalence based on evolutionary similarities between individual amino acids. Applied to the HIV-1 protease cleavage site prediction problem, it was shown to outperform existing methods in terms of prediction accuracy on a common dataset. In general, the method is a promising tool for predicting the cleavage sites of all proteases, and its use is encouraged for any kind of peptide classification problem.
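
    VCMC's evolutionary context-equivalence step is specific to the paper; the sketch below implements only the plain variable-order Markov chain it generalizes, with back-off to shorter contexts, trained on hypothetical toy fragments. A classifier would compare such probabilities under models trained on cleaved versus non-cleaved peptides.

        # Minimal sketch of a variable-order Markov chain (VOMC) over peptide
        # sequences: count contexts up to max_order during training, then score
        # a symbol by backing off to the longest observed context suffix.
        # VCMC's context-equivalence step is not reproduced here.
        from collections import defaultdict, Counter

        class VOMC:
            def __init__(self, max_order=3):
                self.max_order = max_order
                self.counts = defaultdict(Counter)  # context -> next-symbol counts

            def train(self, sequences):
                for seq in sequences:
                    for i, symbol in enumerate(seq):
                        for k in range(self.max_order + 1):
                            if i - k >= 0:
                                self.counts[seq[i - k:i]][symbol] += 1

            def prob(self, context, symbol):
                # Back off from the longest suffix of the context that was seen.
                for k in range(min(self.max_order, len(context)), -1, -1):
                    ctx = context[len(context) - k:]
                    if ctx in self.counts and self.counts[ctx][symbol] > 0:
                        return self.counts[ctx][symbol] / sum(self.counts[ctx].values())
                return 1e-6  # unseen symbol: small floor probability

        m = VOMC(max_order=2)
        m.train(["SQNYPIV", "SQNYAIV"])  # toy cleavage-site fragments
        print(m.prob("SQNY", "P"))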

  16. Exploring the single-cell RNA-seq analysis landscape with the scRNA-tools database.

    PubMed

    Zappia, Luke; Phipson, Belinda; Oshlack, Alicia

    2018-06-25

    As single-cell RNA-sequencing (scRNA-seq) datasets have become more widespread, the number of tools designed to analyse these data has dramatically increased. Navigating the vast sea of tools now available is becoming increasingly challenging for researchers. In order to better facilitate selection of appropriate analysis tools, we have created the scRNA-tools database (www.scRNA-tools.org) to catalogue and curate analysis tools as they become available. Our database collects a range of information on each scRNA-seq analysis tool and categorises them according to the analysis tasks they perform. Exploration of this database gives insights into the areas of rapid development of analysis methods for scRNA-seq data. We see that many tools perform tasks specific to scRNA-seq analysis, particularly clustering and ordering of cells. We also find that the scRNA-seq community embraces an open-source and open-science approach, with most tools available under open-source licenses and preprints being extensively used as a means to describe methods. The scRNA-tools database provides a valuable resource for researchers embarking on scRNA-seq analysis and records the growth of the field over time.

  17. A novel bioinformatics method for efficient knowledge discovery by BLSOM from big genomic sequence data.

    PubMed

    Bai, Yu; Iwasaki, Yuki; Kanaya, Shigehiko; Zhao, Yue; Ikemura, Toshimichi

    2014-01-01

    With remarkable increase of genomic sequence data of a wide range of species, novel tools are needed for comprehensive analyses of the big sequence data. Self-Organizing Map (SOM) is an effective tool for clustering and visualizing high-dimensional data such as oligonucleotide composition on one map. By modifying the conventional SOM, we have previously developed Batch-Learning SOM (BLSOM), which allows classification of sequence fragments according to species, solely depending on the oligonucleotide composition. In the present study, we introduce the oligonucleotide BLSOM used for characterization of vertebrate genome sequences. We first analyzed pentanucleotide compositions in 100 kb sequences derived from a wide range of vertebrate genomes and then the compositions in the human and mouse genomes in order to investigate an efficient method for detecting differences between the closely related genomes. BLSOM can recognize the species-specific key combination of oligonucleotide frequencies in each genome, which is called a "genome signature," and the specific regions specifically enriched in transcription-factor-binding sequences. Because the classification and visualization power is very high, BLSOM is an efficient powerful tool for extracting a wide range of information from massive amounts of genomic sequences (i.e., big sequence data).
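
    As a minimal sketch of the input such analyses consume (the batch-learning SOM updates themselves are not reproduced here), the code below computes a pentanucleotide composition vector, the "genome signature", for a sequence fragment; the fragment is a toy stand-in.

        # Minimal sketch: compute the pentanucleotide composition vector that
        # BLSOM-style analyses use as input (the "genome signature"). The SOM
        # training itself (batch-learning updates) is not reproduced here.
        from collections import Counter
        from itertools import product

        def kmer_frequencies(sequence, k=5):
            counts = Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))
            total = sum(counts.values())
            # Fixed ordering over all 4^k possible oligonucleotides.
            return [counts[''.join(p)] / total for p in product("ACGT", repeat=k)]

        fragment = "ACGT" * 300  # short toy stand-in for a genomic fragment
        vector = kmer_frequencies(fragment, k=5)
        print(len(vector), max(vector))  # 1024-dimensional composition vector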

  18. Bio-AIMS Collection of Chemoinformatics Web Tools based on Molecular Graph Information and Artificial Intelligence Models.

    PubMed

    Munteanu, Cristian R; Gonzalez-Diaz, Humberto; Garcia, Rafael; Loza, Mabel; Pazos, Alejandro

    2015-01-01

    Encoding molecular information into molecular descriptors is the first step in in silico chemoinformatics methods in drug design. Machine learning methods offer a complex solution for finding prediction models for specific biological properties of molecules. These models connect molecular structure information, such as atom connectivity (molecular graphs) or physical-chemical properties of an atom or group of atoms, to molecular activity (Quantitative Structure-Activity Relationship, QSAR). Due to the complexity of proteins, predicting their activity is a complicated task and interpreting the models is more difficult. The current review presents a series of 11 prediction models for proteins, implemented as free web tools on an Artificial Intelligence Model Server in Biosciences, Bio-AIMS (http://bio-aims.udc.es/TargetPred.php). Six tools predict protein activity, two models evaluate drug-protein target interactions, and the other three calculate protein-protein interactions. The input information is based on the protein 3D structure for nine models, the 1D peptide amino acid sequence for three tools, and drug SMILES formulas for two servers. The molecular graph descriptor-based machine learning models could be useful tools for in silico screening of new peptides/proteins as future drug targets for specific treatments.

  19. The impact of leadership and team behavior on standard of care delivered during human patient simulation: a pilot study for undergraduate medical students.

    PubMed

    Carlson, Jim; Min, Elana; Bridges, Diane

    2009-01-01

    Methodology to train team behavior during simulation has received increased attention, but standard performance measures are lacking, especially at the undergraduate level. Our purposes were to develop a reliable team behavior measurement tool and explore the relationship between team behavior and the delivery of an appropriate standard of care specific to the simulated case. The authors developed a unique team measurement tool based on previous work. Trainees participated in a simulated event involving the presentation of acute dyspnea. Performance was rated by separate raters using the team behavior measurement tool, and interrater reliability was assessed. The relationship between team behavior and the standard of care delivered was explored. The instrument proved to be reliable for this case and group of raters. Team behaviors had a positive relationship with the standard of medical care delivered specific to the simulated case. The approach used here offers a possible method for training and assessing team performance during simulation.

  20. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
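
    As a toy, executable analogue of checking an implementation against an abstract functional specification (the paper itself uses mathematical logic and proof, not enumeration), the sketch below verifies a gate-level ripple-carry adder against the arithmetic specification "output = a + b" by exhaustive testing over a small width.

        # Toy analogue of specification vs. implementation checking: verify a
        # gate-level ripple-carry adder against the abstract specification
        # "output = a + b" by exhaustive enumeration. Real formal verification
        # (as in the paper) proves this in logic rather than enumerating cases.
        def full_adder(a, b, cin):
            s = a ^ b ^ cin
            cout = (a & b) | (cin & (a ^ b))
            return s, cout

        def ripple_add(x, y, width):
            carry, result = 0, 0
            for i in range(width):
                s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
                result |= s << i
            return result | (carry << width)

        WIDTH = 4
        for x in range(2 ** WIDTH):
            for y in range(2 ** WIDTH):
                assert ripple_add(x, y, WIDTH) == x + y, (x, y)
        print("implementation meets the arithmetic specification")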

  1. Using leaf optical properties to detect ozone effects on foliar biochemistry

    USDA-ARS?s Scientific Manuscript database

    The lack of efficient methods for accurate and meaningful high-throughput plant phenotyping limits the development and breeding of stress-tolerant crops. A number of emerging techniques, specifically remote sensing methods, have been identified as promising tools for plant phenotyping. These remote-sensin...

  2. Pest measurement and management

    USDA-ARS?s Scientific Manuscript database

    Pest scouting, whether it is done only with ground scouting methods or using remote sensing with some ground-truthing, is an important tool to aid site-specific crop management. Different pests may be monitored at different times and using different methods. Remote sensing has the potential to provi...

  3. Improving Cognitive Abilities and e-Inclusion in Children with Cerebral Palsy

    NASA Astrophysics Data System (ADS)

    Martinengo, Chiara; Curatelli, Francesco

    Besides overcoming the motor barriers to accessing computers and the Internet, ICT tools can provide very useful, and often necessary, support for the cognitive development of motor-impaired children with cerebral palsy. In fact, software tools for computation and communication allow teachers to put into effect, in a more complete and efficient way, the learning methods and educational plans designed for the child. In the present article, after a brief analysis of the general objectives to be pursued in favouring the learning of children with cerebral palsy, we consider some specific difficulties in the logical-linguistic and logical-mathematical fields, and we show how they can be overcome using general ICT tools and specifically implemented software programs.

  4. Molecular Tools for Diagnosis of Visceral Leishmaniasis: Systematic Review and Meta-Analysis of Diagnostic Test Accuracy

    PubMed Central

    de Ruiter, C. M.; van der Veer, C.; Leeflang, M. M. G.; Deborggraeve, S.; Lucas, C.

    2014-01-01

    Molecular methods have been proposed as highly sensitive tools for the detection of Leishmania parasites in visceral leishmaniasis (VL) patients. Here, we evaluate the diagnostic accuracy of these tools in a meta-analysis of the published literature. The selection criteria were original studies that evaluate the sensitivities and specificities of molecular tests for diagnosis of VL, adequate classification of study participants, and the absolute numbers of true positives and negatives derivable from the data presented. Forty studies met the selection criteria, including PCR, real-time PCR, nucleic acid sequence-based amplification (NASBA), and loop-mediated isothermal amplification (LAMP). The sensitivities of the individual studies ranged from 29 to 100%, and the specificities ranged from 25 to 100%. The pooled sensitivity of PCR in whole blood was 93.1% (95% confidence interval [CI], 90.0 to 95.2), and the specificity was 95.6% (95% CI, 87.0 to 98.6). The specificity was significantly lower in consecutive studies, at 63.3% (95% CI, 53.9 to 71.8), due either to true-positive patients not being identified by parasitological methods or to the number of asymptomatic carriers in areas of endemicity. PCR for patients with HIV-VL coinfection showed high diagnostic accuracy in buffy coat and bone marrow, ranging from 93.1 to 96.9%. Molecular tools are highly sensitive assays for Leishmania detection and may contribute as an additional test in the algorithm, together with a clear clinical case definition. We observed wide variety in reference standards and study designs and now recommend consecutively designed studies. PMID:24829226
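
    As a minimal sketch of the per-study quantities such a meta-analysis pools (the review's actual pooling uses bivariate random-effects models, not the simple lists shown), with hypothetical study counts:

        # Minimal sketch of the per-study quantities a diagnostic meta-analysis
        # pools: sensitivity = TP / (TP + FN), specificity = TN / (TN + FP).
        # Study counts below are illustrative only; real pooled estimates come
        # from bivariate random-effects models, not simple averaging.
        studies = [
            # (TP, FP, FN, TN) per hypothetical study
            (90, 5, 7, 95),
            (45, 10, 3, 80),
            (60, 2, 12, 110),
        ]

        sens = [tp / (tp + fn) for tp, fp, fn, tn in studies]
        spec = [tn / (tn + fp) for tp, fp, fn, tn in studies]
        print("sensitivities:", [round(s, 3) for s in sens])
        print("specificities:", [round(s, 3) for s in spec])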

  5. ReportingTools: an automated result processing and presentation toolkit for high-throughput genomic analyses.

    PubMed

    Huntley, Melanie A; Larson, Jessica L; Chaivorapol, Christina; Becker, Gabriel; Lawrence, Michael; Hackney, Jason A; Kaminker, Joshua S

    2013-12-15

    It is common for computational analyses to generate large amounts of complex data that are difficult to process and share with collaborators. Standard methods are needed to transform such data into a more useful and intuitive format. We present ReportingTools, a Bioconductor package that automatically recognizes and transforms the output of many common Bioconductor packages into rich, interactive, HTML-based reports. Reports are not generic, but have been individually designed to reflect content specific to the result type detected. Tabular output included in reports is sortable, filterable and searchable and contains context-relevant hyperlinks to external databases. Additionally, in-line graphics have been developed for specific analysis types and are embedded by default within table rows, providing a useful visual summary of underlying raw data. ReportingTools is highly flexible and reports can be easily customized for specific applications using the well-defined API. The ReportingTools package is implemented in R and available from Bioconductor (version ≥ 2.11) at the URL: http://bioconductor.org/packages/release/bioc/html/ReportingTools.html. Installation instructions and usage documentation can also be found at the above URL.

  6. Tools for Practical Psychotherapy: A Transtheoretical Collection (or Interventions Which Have, At Least, Worked for Us).

    PubMed

    Yager, Joel; Feinstein, Robert E

    2017-01-01

    Regardless of their historical and theoretical roots, strategies, tactics, and techniques used in everyday psychotherapy across diverse theoretical schools contain common factors and methods from other specific psychotherapeutic modalities that contribute substantially to psychotherapy outcomes. Common factors include alliance, empathy, goal consensus/collaboration, positive regard/affirmation, and congruence/genuineness, among others. All therapies also recognize that factors specific to therapists impact treatment. Starting with these common factors, we add psychotherapeutic methods from many theoretical orientations to create a collection of clinical tools. We then provide concrete suggestions for enacting psychotherapy interventions, which constitute a transtheoretical collection. We begin with observations made by earlier scholars, our combined clinical and teaching experiences, and oral traditions and clinical pearls passed down from our own supervisors and mentors. We have compiled a list of tools for students with foundational knowledge in the basic forms of psychotherapy, which may expand their use of additional interventions for practicing effective psychotherapy. Our toolbox is organized into 4 categories: Relating; Exploring; Explaining; and Intervening. We note how these tools correspond to items previously published in a list of core psychotherapy competencies. In our view, the toolbox can be used most judiciously by students and practitioners schooled and grounded in frameworks for conducting established psychotherapies. Although they are still a work in progress, these tools can authorize and guide trainees and practitioners to enact specific approaches to psychotherapy utilizing other frameworks. We believe that psychotherapy education and training might benefit from explicitly focusing on the application of such interventions.

  7. Creation and Delphi-method refinement of pediatric disaster triage simulations.

    PubMed

    Cicero, Mark X; Brown, Linda; Overly, Frank; Yarzebski, Jorge; Meckler, Garth; Fuchs, Susan; Tomassoni, Anthony; Aghababian, Richard; Chung, Sarita; Garrett, Andrew; Fagbuyi, Daniel; Adelgais, Kathleen; Goldman, Ran; Parker, James; Auerbach, Marc; Riera, Antonio; Cone, David; Baum, Carl R

    2014-01-01

    There is a need for rigorously designed pediatric disaster triage (PDT) training simulations for paramedics. First, we sought to design three multiple patient incidents for EMS provider training simulations. Our second objective was to determine the appropriate interventions and triage level for each victim in each of the simulations and develop evaluation instruments for each simulation. The final objective was to ensure that each simulation and evaluation tool was free of bias toward any specific PDT strategy. We created mixed-methods disaster simulation scenarios with pediatric victims: a school shooting, a school bus crash, and a multiple-victim house fire. Standardized patients, high-fidelity manikins, and low-fidelity manikins were used to portray the victims. Each simulation had similar acuity of injuries and 10 victims. Examples include children with special health-care needs, gunshot wounds, and smoke inhalation. Checklist-based evaluation tools and behaviorally anchored global assessments of function were created for each simulation. Eight physicians and paramedics from areas with differing PDT strategies were recruited as Subject Matter Experts (SMEs) for a modified Delphi iterative critique of the simulations and evaluation tools. The modified Delphi was managed with an online survey tool. The SMEs provided an expected triage category for each patient. The target for modified Delphi consensus was ≥85%. Using Likert scales and free text, the SMEs assessed the validity of the simulations, including instances of bias toward a specific PDT strategy, clarity of learning objectives, and the correlation of the evaluation tools to the learning objectives and scenarios. After two rounds of the modified Delphi, consensus for expected triage level was >85% for 28 of 30 victims, with the remaining two achieving >85% consensus after three Delphi iterations. To achieve consensus, we amended 11 instances of bias toward a specific PDT strategy and corrected 10 instances of noncorrelation between evaluations and simulation. The modified Delphi process, used to derive novel PDT simulation and evaluation tools, yielded a high degree of consensus among the SMEs, and eliminated biases toward specific PDT strategies in the evaluations. The simulations and evaluation tools may now be tested for reliability and validity as part of a prehospital PDT curriculum.
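
    As a minimal sketch of the ≥85% consensus check described, assuming one triage category per victim from each SME (the ratings below are toy data):

        # Minimal sketch of the modified-Delphi consensus check: for each
        # victim, compute agreement on the most common triage category across
        # SMEs and flag victims below the 85% threshold for another round.
        from collections import Counter

        THRESHOLD = 0.85
        ratings = {  # victim -> triage category from each of 8 SMEs (toy data)
            "victim_01": ["red"] * 8,
            "victim_02": ["yellow"] * 7 + ["red"],
            "victim_03": ["green"] * 5 + ["yellow"] * 3,
        }

        for victim, cats in ratings.items():
            category, count = Counter(cats).most_common(1)[0]
            agreement = count / len(cats)
            status = "consensus" if agreement >= THRESHOLD else "re-rate"
            print(f"{victim}: {category} ({agreement:.0%}) -> {status}")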

  8. A Culture-Specific Nutrient Intake Assessment Instrument in Patients with Pulmonary Tuberculosis

    PubMed Central

    Frediani, Jennifer K.; Tukvadze, Nestani; Sanikidze, Ekaterina; Kipiani, Maia; Hebbar, Gautam; Easley, Kirk A.; Shenvi, Neeta; Ramakrishnan, Usha; Tangpricha, Vin; Blumberg, Henry M.; Ziegler, Thomas R.

    2013-01-01

    Background and Aim: To develop and evaluate a culture-specific nutrient intake assessment tool for use in adults with pulmonary tuberculosis (TB) in Tbilisi, Georgia. Methods: We developed an instrument to measure food intake over 3 consecutive days using a questionnaire format. The tool was then compared to 24-hour food recalls. Food intake data from 31 subjects with TB were analyzed using the Nutrient Database System for Research (NDS-R) dietary analysis program. Paired t-tests, Pearson correlations and intraclass correlation coefficients (ICC) were used to assess the agreement between the two methods of dietary intake assessment for calculated nutrient intakes. Results: The Pearson correlation coefficient for mean daily caloric intake between the 2 methods was 0.37 (P = 0.04), with a mean difference of 171 kcals/day (p = 0.34). The ICC was 0.38 (95% CI: 0.03 to 0.64), suggesting that within-patient variability may be larger than between-patient variability. Results for mean daily intake of total fat, total carbohydrate, total protein, retinol, vitamins D and E, thiamine, calcium, sodium, iron, selenium, copper, and zinc between the two assessment methods were also similar. Conclusions: This novel nutrient intake assessment tool provided quantitative nutrient intake data from TB patients. These pilot data can inform larger studies in similar populations. PMID:23541173
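
    As a hedged sketch of the agreement statistics reported, the code below computes a Pearson correlation and a one-way random-effects ICC(1,1) from mean squares on toy paired intakes (not the study's data):

        # Hedged sketch of the agreement statistics named above: Pearson
        # correlation between the two dietary methods and a one-way
        # random-effects ICC(1,1) from ANOVA mean squares. Toy intakes only.
        import numpy as np
        from scipy import stats

        tool_kcal   = np.array([1800, 2100, 1650, 2400, 1950, 2200], float)
        recall_kcal = np.array([1700, 2250, 1600, 2300, 2100, 2050], float)

        r, p = stats.pearsonr(tool_kcal, recall_kcal)

        # ICC(1,1) from a one-way ANOVA on the paired measurements.
        data = np.stack([tool_kcal, recall_kcal], axis=1)  # subjects x methods
        n, k = data.shape
        grand = data.mean()
        ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
        print(f"Pearson r={r:.2f} (p={p:.3f}), ICC(1,1)={icc:.2f}")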

  9. A Principled Approach to the Specification of System Architectures for Space Missions

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.

  10. Rapid Magnetic Nanobiosensor for the detection of Serratia marcescens

    NASA Astrophysics Data System (ADS)

    Aljabali, Alaa A. A.; Hussein, Emad; Aljumaili, Omar; Zoubi, Mazhar Al; Altrad, Bahaa; Albatayneh, Khaled; Al-razaq, Mutaz A. Abd

    2018-02-01

    The development of rapid, sensitive, accurate and reliable bacterial detection methods is of keen interest for ensuring food safety and hospital security; fast, specific, low-cost and trusted methods are therefore in high demand. Magnetic nanoparticles, with their unique material properties, have been utilized as a tool for pathogen detection. Here, we present novel iron oxide nanoparticles labeled with specific targeting antibodies to improve specificity and extend the use of nanoparticles as nanosensors. The results indicated that the antibody-labeled iron oxide platform binds specifically to Serratia marcescens in a straightforward, highly specific and sensitive assay. The system is capable of rapid and specific detection of various clinically relevant bacterial species, with sensitivity down to a single bacterium. The generic platform could be used to rapidly identify pathogens for a variety of applications.

  11. A statistical approach to detection of copy number variations in PCR-enriched targeted sequencing data.

    PubMed

    Demidov, German; Simakova, Tamara; Vnuchkova, Julia; Bragin, Anton

    2016-10-22

    Multiplex polymerase chain reaction (PCR) is a common enrichment technique for targeted massive parallel sequencing (MPS) protocols. MPS is widely used in biomedical research and clinical diagnostics as a fast and accurate tool for the detection of short genetic variations. However, identification of larger variations, such as structural variants and copy number variations (CNVs), remains a challenge for targeted MPS. Some approaches and tools for structural variant detection have been proposed, but they have limitations and often require datasets of a certain type and size, with an expected number of amplicons affected by CNVs. In this paper, we describe a novel algorithm for high-resolution germline CNV detection in PCR-enriched targeted sequencing data and present the accompanying tool. We have developed a machine learning algorithm for the detection of large duplications and deletions in targeted sequencing data generated with a PCR-based enrichment step, performed verification studies to establish the algorithm's sensitivity and specificity, and compared the developed tool with other available methods applicable to such data, revealing its higher performance. We show that our method has high specificity and sensitivity for high-resolution copy number detection in targeted sequencing data, using a large cohort of samples.
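
    The abstract does not detail the machine-learning model itself, so the sketch below only illustrates the coverage-normalization idea common to amplicon CNV callers, on toy read counts: normalize per library, compare to the cohort median per amplicon, and flag outlying ratios.

        # Hedged sketch of coverage normalization for amplicon CNV calling
        # (the paper's actual machine-learning model is not reproduced here):
        # ratios near 0.5 suggest a heterozygous deletion, near 1.5 a
        # duplication; thresholds below are illustrative.
        import numpy as np

        counts = np.array([  # samples x amplicons (toy read counts)
            [200, 180, 210, 190],
            [195, 175, 205, 400],   # last amplicon duplicated in sample 1
            [210, 190,  90, 185],   # third amplicon deleted in sample 2
        ], float)

        norm = counts / counts.sum(axis=1, keepdims=True)      # library-size normalization
        ratio = norm / np.median(norm, axis=0, keepdims=True)  # vs. cohort median

        for s, a in zip(*np.where((ratio < 0.65) | (ratio > 1.35))):
            call = "deletion" if ratio[s, a] < 1 else "duplication"
            print(f"sample {s}, amplicon {a}: ratio {ratio[s, a]:.2f} -> {call}")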

  12. Evaluating the effectiveness of gloves in reducing the hazards of hand-transmitted vibration.

    PubMed

    Griffin, M J

    1998-05-01

    A method of evaluating the effectiveness of gloves in reducing the hazards of hand-transmitted vibration is proposed. The glove isolation effectiveness was calculated from: (a) the measured transmissibility of a glove, (b) the vibration spectrum on the handle of a specific tool (or class of tools), and (c) the frequency weighting indicating the degree to which different frequencies of vibration cause injury. With previously reported tool vibration spectra and glove transmissibilities (from 10-1000 Hz), the method was used to test 10 gloves with 20 different powered tools. The frequency weighting for hand-transmitted vibration advocated in British standard 6842 (1987) and international standard 5349 (1986) greatly influences the apparent isolation effectiveness of gloves. With the frequency weighting, the gloves had little effect on the transmission of vibration to the hand from most of the tools. Only for two or three tools (those dominated by high frequency vibration) did any glove provide useful attenuation. Without the frequency weighting, some gloves showed useful attenuation of the vibration on most powered tools. In view of the uncertain effect of the vibration frequency in the causation of disorders from hand-transmitted vibration, it is provisionally suggested that the wearing of a glove by the user of a particular vibratory tool could be encouraged if the glove reduces the transmission of vibration when it is evaluated without the frequency weighting and does not increase the vibration when it is evaluated with the frequency weighting. A current international standard for the measurement and evaluation of the vibration transmitted by gloves can classify a glove as an antivibration glove when it provides no useful attenuation of vibration, whereas a glove providing useful attenuation of vibration on a specific tool can fail the test.
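
    As a sketch of the evaluation logic described (the weighting factors below are illustrative, not the BS 6842 / ISO 5349 curve): per frequency band, the vibration reaching the hand is the tool acceleration times the glove transmissibility, and the overall value is the rms of the optionally weighted bands. With a weighting that emphasizes low frequencies, a glove's high-frequency attenuation barely changes the overall value, matching the paper's observation.

        # Sketch of glove evaluation with and without frequency weighting.
        # All band values are toy numbers; the weighting is illustrative only.
        import math

        bands_hz     = [16, 31.5, 63, 125, 250, 500, 1000]
        tool_accel   = [3.0, 5.0, 8.0, 6.0, 4.0, 2.5, 1.5]   # m/s^2 per band (toy tool)
        glove_transm = [1.0, 1.0, 0.95, 0.9, 0.7, 0.4, 0.2]  # transmissibility per band
        weighting    = [1.0, 0.8, 0.5, 0.25, 0.125, 0.06, 0.03]  # illustrative only

        def rms(accels, weights=None):
            w = weights or [1.0] * len(accels)
            return math.sqrt(sum((wi * a) ** 2 for wi, a in zip(w, accels)))

        gloved = [a * t for a, t in zip(tool_accel, glove_transm)]
        for label, weights in (("unweighted", None), ("weighted", weighting)):
            ratio = rms(gloved, weights) / rms(tool_accel, weights)
            print(f"{label}: transmitted/bare ratio = {ratio:.2f}")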

  13. Pediatric intensive care unit admission tool: a colorful approach.

    PubMed

    Biddle, Amy

    2007-12-01

    This article discusses the development, implementation, and utilization of our institution's Pediatric Intensive Care Unit (PICU) Color-Coded Admission Status Tool. Rather than the historical method of identifying a maximum number of staffed beds, a tool was developed to color code the PICU's admission status. Previous methods had been ineffective and led to confusion between the PICU leadership team and the administration. The tool includes the previously missing components of staffing and acuity, which are essential in determining admission capability. The PICU tool has three colored levels: green indicates open for admissions; yellow, admission alert resulting from available beds or because staffing is not equal to the projected patient numbers or required acuity; and red, admissions on hold because only one trauma or arrest bed is available or staffing is not equal to the projected acuity. Yellow and red designations require specific actions and the medical director's approval. The tool has been highly successful and significantly impacted nursing with the inclusion of the essential component of nurse staffing necessary in determining bed availability.

  14. Outcome Measures in Spinal Cord Injury

    PubMed Central

    Alexander, Marcalee S.; Anderson, Kim; Biering-Sorensen, Fin; Blight, Andrew R.; Brannon, Ruth; Bryce, Thomas; Creasey, Graham; Catz, Amiram; Curt, Armin; Donovan, William; Ditunno, John; Ellaway, Peter; Finnerup, Nanna B.; Graves, Daniel E.; Haynes, Beth Ann; Heinemann, Allen W.; Jackson, Amie B.; Johnston, Mark; Kalpakjian, Claire Z.; Kleitman, Naomi; Krassioukov, Andrei; Krogh, Klaus; Lammertse, Daniel; Magasi, Susan; Mulcahey, MJ; Schurch, Brigitte; Sherwood, Arthur; Steeves, John D.; Stiens, Steven; Tulsky, David S.; van Hedel, Hubertus J.A.; Whiteneck, Gale

    2009-01-01

    Study Design: Review by the Spinal Cord Outcomes Partnership Endeavor (SCOPE), a broad-based international consortium of scientists and clinical researchers representing academic institutions, industry, government agencies, not-for-profit organizations and foundations. Objectives: Assessment of current and evolving tools for evaluating human spinal cord injury (SCI) outcomes for both clinical diagnosis and clinical research studies. Methods: A framework for the appraisal of evidence of metric properties was used to examine outcome tools or tests for accuracy, sensitivity, reliability and validity for human SCI. Results: Imaging, neurological, functional, autonomic, sexual health, bladder/bowel, pain, and psycho-social tools were evaluated. Several specific tools for human SCI studies have been or are being developed to allow more accurate determination of whether a clinically meaningful benefit (improvement in functional outcome or quality of life) has been achieved as a result of a therapeutic intervention. Conclusion: Significant progress has been made, but further validation studies are required to identify the most appropriate tools for specific targets in a human SCI study or clinical trial. PMID:19381157

  15. 77 FR 12002 - Mount Baker-Snoqualmie National Forest Site-Specific Invasive Plant Treatment Project and Forest...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-28

    ... these tools, including additional herbicides and application methods to increase treatment effectiveness... organisms than previously approved herbicides and higher effectiveness on particular invasive plants. Thus... examples demonstrate why additional herbicides, methods, and protocols are needed to improve treatment...

  16. The ratio method: A new tool to study one-neutron halo nuclei

    DOE PAGES

    Capel, Pierre; Johnson, R. C.; Nunes, F. M.

    2013-10-02

    Recently a new observable to study halo nuclei was introduced, based on the ratio between breakup and elastic angular cross sections. This new observable is shown by the analysis of specific reactions to be independent of the reaction mechanism and to provide nuclear-structure information of the projectile. Here we explore the details of this ratio method, including the sensitivity to binding energy and angular momentum of the projectile. We also study the reliability of the method with breakup energy. Lastly, we provide guidelines and specific examples for experimentalists who wish to apply this method.

  17. Tools, information sources, and methods used in deciding on drug availability in HMOs.

    PubMed

    Barner, J C; Thomas, J

    1998-01-01

    The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.

  18. The development of advanced manufacturing systems

    NASA Astrophysics Data System (ADS)

    Doumeingts, Guy; Vallespir, Bruno; Darricau, Didier; Roboam, Michel

    Various methods for the design of advanced manufacturing systems (AMSs) are reviewed. The specifications for AMSs and the problems inherent in their development are first discussed. Three models, the Computer Aided Manufacturing-International model, the National Bureau of Standards model, and the GRAI model, are considered in detail. Hierarchical modeling tools such as structured analysis and design techniques, Petri nets, and the ICAM definition method are used in the development of integrated manufacturing models. Finally, the GRAI method is demonstrated in the design of specifications for the production management system of the Snecma AMS.

  19. A grid matrix-based Raman spectroscopic method to characterize different cell milieu in biopsied axillary sentinel lymph nodes of breast cancer patients.

    PubMed

    Som, Dipasree; Tak, Megha; Setia, Mohit; Patil, Asawari; Sengupta, Amit; Chilakapati, C Murali Krishna; Srivastava, Anurag; Parmar, Vani; Nair, Nita; Sarin, Rajiv; Badwe, R

    2016-01-01

    Raman spectroscopy, which is based upon inelastic scattering of photons, has the potential to emerge as a noninvasive bedside in vivo or ex vivo molecular diagnostic tool, but there is a need to improve its sensitivity and predictability. We developed a grid matrix-based tissue mapping protocol to acquire cell-specific spectra, which also involved digital microscopy for localizing malignant and lymphocytic cells in sentinel lymph node biopsy samples. Biosignals acquired from specific cellular milieus were subjected to advanced supervised analytical methods, i.e., cross-correlation and peak-to-peak ratio, in addition to principal component analysis (PCA) and principal component-linear discriminant analysis (PC-LDA). We observed decreased spectral intensity as well as shifts in the spectral peaks of the amide and lipid bands in completely metastatic (cancer cell) lymph nodes with high cellular density. A spectral library of normal lymphocytes and metastatic cancer cells created using this cell-specific mapping technique can be utilized to develop an automated smart diagnostic tool for bench-side screening of sampled lymph nodes, supported by ongoing global research in developing better technology and signal and big-data processing algorithms.
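
    The two supervised measures named above can be illustrated with a short sketch; the spectra are random placeholders, and the amide I (~1655 cm-1) and CH2 (~1445 cm-1) band positions are commonly cited Raman assignments, not values taken from this study:

      import numpy as np

      # Normalized cross-correlation against a library spectrum, plus a
      # peak-to-peak band intensity ratio (all spectra here are synthetic).
      wavenumber = np.linspace(600, 1800, 1201)          # cm^-1, assumed range
      sample = np.random.default_rng(0).random(1201)     # placeholder spectrum
      reference = np.random.default_rng(1).random(1201)  # placeholder library entry

      def ncc(a, b):
          # 1.0 means identical spectral shape after mean/variance normalization
          a = (a - a.mean()) / a.std()
          b = (b - b.mean()) / b.std()
          return float(np.mean(a * b))

      def band_intensity(spec, center, width=10):
          mask = np.abs(wavenumber - center) <= width
          return float(spec[mask].max())

      score = ncc(sample, reference)
      amide_lipid_ratio = band_intensity(sample, 1655) / band_intensity(sample, 1445)
      print(score, amide_lipid_ratio)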

  20. PACS project management utilizing web-based tools

    NASA Astrophysics Data System (ADS)

    Patel, Sunil; Levin, Brad; Gac, Robert J., Jr.; Harding, Douglas, Jr.; Chacko, Anna K.; Radvany, Martin; Romlein, John R.

    2000-05-01

    As Picture Archiving and Communications Systems (PACS) implementations become more widespread, the management of deploying large, multi-facility PACS will become a more frequent occurrence. The tools and usability of the World Wide Web for disseminating project management information remove time, distance, participant availability, and data format constraints, allowing for the effective collection and dissemination of PACS planning and implementation information for a potentially limitless number of concurrent PACS sites. This paper will speak to tools such as (1) a topic-specific discussion board and (2) a 'restricted' Intranet within a 'project' Intranet. We will also discuss project-specific methods currently in use in a leading-edge, regional PACS implementation concerning the sharing of project schedules, physical drawings, images of implementations, site-specific data, point-of-contact lists, project milestones, and a general project overview. The individual benefits realized for the end user from each tool will also be covered. These details will be presented, balanced with a spotlight on communication as a critical component of any project management undertaking. Using today's technology, the web arguably provides the most cost- and resource-effective vehicle to facilitate the broad-based, interactive sharing of project information.

  1. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, no consensus about best practices has yet emerged. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments with known quantitative differences for specific proteins used as standards and "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and is straightforward to apply.
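
    The benchmarking logic, if not the authors' actual pipeline, can be sketched briefly: when the truly differential (spiked-in) proteins are known, any method's ranking can be scored, for example by the area under the ROC curve. All data below are synthetic:

      import numpy as np

      rng = np.random.default_rng(42)
      n_proteins = 1000
      truth = np.zeros(n_proteins, dtype=bool)
      truth[:50] = True                          # assume 50 truly differential proteins

      # Placeholder scores from some statistical method (higher = more differential).
      scores = rng.normal(0.0, 1.0, n_proteins) + 2.0 * truth

      def roc_auc(y, s):
          # Mann-Whitney form: probability a true positive outscores a true negative
          ranks = np.empty(len(s))
          ranks[np.argsort(s)] = np.arange(1, len(s) + 1)
          n_pos, n_neg = y.sum(), (~y).sum()
          return float((ranks[y].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg))

      print(f"AUC = {roc_auc(truth, scores):.3f}")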

  2. 16th IHIW: Global analysis of registry HLA haplotypes from 20 Million individuals: Report from the IHIW Registry Diversity Group

    PubMed Central

    Maiers, M; Gragert, L; Madbouly, A; Steiner, D; Marsh, S G E; Gourraud, P-A; Oudshoorn, M; Zanden, H; Schmidt, A H; Pingel, J; Hofmann, J; Müller, C; Eberhard, H-P

    2013-01-01

    The goal of this project is to validate bioinformatics methods and tools for HLA haplotype frequency analysis, specifically addressing unique issues of haematopoietic stem cell registry data sets. In addition to generating new methods and tools for the analysis of registry data sets, the intent is to produce a comprehensive analysis of HLA data from 20 million donors in the Bone Marrow Donors Worldwide (BMDW) database. This report summarizes the activity on this project as of the 16th IHIW meeting in Liverpool. PMID:23280139
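
    Haplotype frequencies from unphased genotype data are classically estimated with an expectation-maximization (EM) scheme; the two-locus toy below only sketches that idea with invented genotypes (registry-scale HLA analysis must handle many more loci, allele-level typing ambiguity, and sampling issues):

      from collections import defaultdict

      # Each individual: (unordered alleles at locus 1, unordered alleles at locus 2).
      genotypes = [(("A1", "A2"), ("B1", "B1")),
                   (("A1", "A2"), ("B1", "B2")),   # double heterozygote: phase ambiguous
                   (("A1", "A1"), ("B2", "B2"))]

      freq = defaultdict(lambda: 1.0)              # uniform starting frequencies
      for _ in range(50):                          # EM iterations
          counts = defaultdict(float)
          for (a1, a2), (b1, b2) in genotypes:
              pairings = [((a1, b1), (a2, b2)), ((a1, b2), (a2, b1))]
              weights = [freq[h1] * freq[h2] for h1, h2 in pairings]
              total = sum(weights)
              for (h1, h2), w in zip(pairings, weights):
                  counts[h1] += w / total          # E-step: expected haplotype counts
                  counts[h2] += w / total
          n = sum(counts.values())
          freq = defaultdict(float, {h: c / n for h, c in counts.items()})  # M-step

      print(dict(freq))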

  3. Priming methods in semantics and pragmatics.

    PubMed

    Maldonado, Mora; Spector, Benjamin; Chemla, Emmanuel

    2017-01-01

    Structural priming is a powerful method to inform linguistic theories. We argue that this method extends nicely beyond syntax to theories of meaning. Priming, however, should still be seen as only one of the tools available for linguistic data collection. Specifically, because priming can occur at different, potentially conflicting levels, it cannot detect every aspect of linguistic representations.

  4. Measuring attitudes towards the dying process: A systematic review of tools.

    PubMed

    Groebe, Bernadette; Strupp, Julia; Eisenmann, Yvonne; Schmidt, Holger; Schlomann, Anna; Rietz, Christian; Voltz, Raymond

    2018-04-01

    At the end of life, anxious attitudes concerning the dying process are common in patients in Palliative Care. Measurement tools can identify vulnerabilities, resources, and the need for subsequent treatment to relieve suffering and support well-being. The aim was to systematically review available tools measuring attitudes towards dying, their operationalization, the method of measurement, and their methodological quality, including generalizability to different contexts. Systematic review according to the PRISMA Statement; methodological quality of tools assessed by standardized review criteria. MEDLINE, PsycINFO, PsyndexTests, and the Health and Psychosocial Instruments were searched from their inception to April 2017. A total of 94 identified studies reported the development and/or validation of 44 tools. Of these, 37 were questionnaires and 7 were alternative measurement methods (e.g. projective measures). In 34 of the 37 questionnaires, the emotional evaluation (e.g. anxiety) towards dying is measured. Dying is operationalized in general items (n = 20), in several specific aspects of dying (n = 34), and as the dying of others (n = 14). Methodological quality of tools was reported inconsistently. Nine tools reported good internal consistency. Of the 37 tools, only 4 were validated in a clinical sample (e.g. terminal cancer; Huntington disease), indicating questionable generalizability to clinical contexts for most tools. Many tools exist to measure attitudes towards the dying process using different endpoints. This overview can serve as a decision framework on which tool to apply in which context. For clinical application, only a few tools are available. Further validation of existing tools and potential alternative methods in various populations is needed.

  5. Sensitivity and specificity of the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) and the Intensive Care Delirium Screening Checklist (ICDSC) for detecting post-cardiac surgery delirium: A single-center study in Japan.

    PubMed

    Nishimura, Katsuji; Yokoyama, Kanako; Yamauchi, Noriko; Koizumi, Masako; Harasawa, Nozomi; Yasuda, Taeko; Mimura, Chizuru; Igita, Hazuki; Suzuki, Eriko; Uchiide, Yoko; Seino, Yusuke; Nomura, Minoru; Yamazaki, Kenji; Ishigooka, Jun

    2016-01-01

    To compare the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) and the Intensive Care Delirium Screening Checklist (ICDSC) for detecting post-cardiac surgery delirium. These tools have not been tested in a specialized cardio-surgical ICU. Sensitivities and specificities of each tool were assessed in a cardio-surgical ICU in Japan by two trained nurses independently. Results were compared with delirium diagnosed by psychiatrists using the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition, Text Revision. There were 110 daily, paired assessments in 31 patients. The CAM-ICU showed 38% sensitivity and 100% specificity for both nurses. All 20 false-negative cases resulted from high scores in the auditory attention screening in CAM-ICU. The ICDSC showed 97% and 94% sensitivity, and 97% and 91% specificity for the two nurses (cutoff ≥4). In a Japanese cardio-surgical ICU, the ICDSC had a higher sensitivity than the CAM-ICU. Copyright © 2016 Elsevier Inc. All rights reserved.
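
    The arithmetic behind the reported figures is a standard 2x2 comparison against the reference diagnosis. The cell counts below are a reconstruction consistent with the reported 38% sensitivity, 100% specificity, 20 false negatives, and 110 paired assessments; they are not the study's published table:

      # Sensitivity/specificity from a 2x2 table against the reference standard.
      def sens_spec(tp, fn, tn, fp):
          return tp / (tp + fn), tn / (tn + fp)

      # Reconstructed, illustrative counts (tp + fn + tn + fp = 110).
      sens, spec = sens_spec(tp=12, fn=20, tn=78, fp=0)
      print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")  # 0.38, 1.00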

  6. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
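
    The core benchmark metrics can be paraphrased in a few lines (LFQbench itself is an R package; this is an independent sketch on synthetic ratios): in a hybrid proteome sample the expected between-species abundance ratio is known a priori, so accuracy is the deviation of measured log-ratios from that expectation and precision is their spread across proteins:

      import numpy as np

      rng = np.random.default_rng(0)
      expected_log2 = 1.0                          # e.g., an assumed 2:1 spike ratio
      measured_log2 = rng.normal(0.9, 0.3, 500)    # placeholder per-protein log2 ratios

      bias = np.median(measured_log2) - expected_log2   # accuracy: systematic deviation
      spread = measured_log2.std()                      # precision: scatter across proteins
      print(f"bias = {bias:.3f} log2 units, spread = {spread:.3f}")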

  7. Contamination-Free Manufacturing: Tool Component Qualification, Verification and Correlation with Wafers

    NASA Astrophysics Data System (ADS)

    Tan, Samantha H.; Chen, Ning; Liu, Shi; Wang, Kefei

    2003-09-01

    As part of the semiconductor industry "contamination-free manufacturing" effort, significant emphasis has been placed on reducing potential sources of contamination from process equipment and process equipment components. Process tools contain process chambers and components that are exposed to the process environment or process chemistry and in some cases are in direct contact with production wafers. Any contamination from these sources must be controlled or eliminated in order to maintain high process yields, device performance, and device reliability. This paper discusses new nondestructive analytical methods for quantitative measurement of the cleanliness of metal, quartz, polysilicon and ceramic components that are used in process equipment tools. The goal of these new procedures is to measure the effectiveness of cleaning procedures and to verify whether a tool component part is sufficiently clean for installation and subsequent routine use in the manufacturing line. These procedures provide a reliable "qualification method" for tool component certification and also provide a routine quality control method for reliable operation of cleaning facilities. Cost advantages to wafer manufacturing include higher yields due to improved process cleanliness and elimination of yield loss and downtime resulting from the installation of "bad" components in process tools. We also discuss a representative example of wafer contamination having been linked to a specific process tool component.

  8. Patient-specific finite element modeling of bones.

    PubMed

    Poelert, Sander; Valstar, Edward; Weinans, Harrie; Zadpoor, Amir A

    2013-04-01

    Finite element modeling is an engineering tool for structural analysis that has been used for many years to assess the relationship between load transfer and bone morphology and to optimize the design and fixation of orthopedic implants. Due to recent developments in finite element model generation, for example, improved computed tomography imaging quality, improved segmentation algorithms, and faster computers, the accuracy of finite element modeling has increased vastly and finite element models simulating the anatomy and properties of an individual patient can be constructed. Such so-called patient-specific finite element models are potentially valuable tools for orthopedic surgeons in fracture risk assessment or pre- and intraoperative planning of implant placement. The aim of this article is to provide a critical overview of current themes in patient-specific finite element modeling of bones. In addition, the state-of-the-art in patient-specific modeling of bones is compared with the requirements for a clinically applicable patient-specific finite element method, and judgment is passed on the feasibility of application of patient-specific finite element modeling as a part of clinical orthopedic routine. It is concluded that further development of certain aspects of patient-specific finite element modeling is needed before finite element modeling can be used as a routine clinical tool.

  9. USING CFD TO ANALYZE NUCLEAR SYSTEMS BEHAVIOR: DEFINING THE VALIDATION REQUIREMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Schultz

    2012-09-01

    A recommended protocol to formulate numeric tool specifications and validation needs in concert with practices accepted by regulatory agencies for advanced reactors is described. The protocol is based on the plant type and perceived transient and accident envelopes that translate to boundary conditions for a process that gives: (a) the key phenomena and figures-of-merit which must be analyzed to ensure that the advanced plant can be licensed, (b) the specification of the numeric tool capabilities necessary to perform the required analyses, including bounding calculational uncertainties, and (c) the specification of the validation matrices and experiments, including the desired validation data. The result of applying the process enables a complete program to be defined, including costs, for creating and benchmarking transient and accident analysis methods for advanced reactors. By following a process that is in concert with regulatory agency licensing requirements from start to finish, based on historical acceptance of past licensing submittals, the methods derived and validated have a high probability of regulatory agency acceptance.

  10. Measures of fine motor skills in people with tremor disorders: appraisal and interpretation.

    PubMed

    Norman, Kathleen E; Héroux, Martin E

    2013-01-01

    People with Parkinson's disease, essential tremor, or other movement disorders involving tremor have changes in fine motor skills that are among the hallmarks of these diseases. Numerous measurement tools have been created and other methods devised to measure such changes in fine motor skills. Measurement tools may focus on specific features - e.g., motor skills or dexterity, slowness in movement execution associated with parkinsonian bradykinesia, or magnitude of tremor. Less obviously, some tools may be better suited than others for specific goals such as detecting subtle dysfunction early in disease, revealing aspects of brain function affected by disease, or tracking changes expected from treatment or disease progression. The purpose of this review is to describe and appraise selected measurement tools of fine motor skills appropriate for people with tremor disorders. In this context, we consider the tools' content - i.e., what movement features they focus on. In addition, we consider how measurement tools of fine motor skills relate to measures of a person's disease state or a person's function. These considerations affect how one should select and interpret the results of these tools in laboratory and clinical contexts.

  11. Design and Testing of an Air Force Services Mystery Shopping Program.

    DTIC Science & Technology

    1998-11-01

    Base level Air Force Services' lodging and foodservice activities use limited service quality measurement tools to determine customer perceptions of... service quality. These tools, specifically management observation and customer comment cards, do not provide a complete picture of service quality. Other... service quality measurement methods such as mystery shopping are rarely used. Bases do not consider using mystery shopping programs because of the...

  12. Cutting force measurement of electrical jigsaw by strain gauges

    NASA Astrophysics Data System (ADS)

    Kazup, L.; Varadine Szarka, A.

    2016-11-01

    This paper describes a strain-gauge-based measuring method for the accurate specification of an electric jigsaw's cutting force. The goal of the measurement is to provide an overall picture of the forces generated in a jigsaw's gearbox during a cutting period, since these forces primarily determine the lifetime of the tool. This analysis is part of a research and development project aiming to develop a special linear magnetic brake for automatic lifetime testing of electric jigsaws and similar handheld tools. Accurate specification of the cutting force makes it possible to define realistic test cycles during the automatic lifetime test. The accuracy and precision afforded by a well-described cutting-force characteristic, together with the possibility of automation, provide a new dimension for lifetime testing of handheld tools with alternating movement.
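
    As a flavor of the underlying measurement chain, here is the standard quarter-bridge strain-gauge arithmetic; every parameter value below is an assumption for illustration, not taken from the paper's setup:

      # Quarter Wheatstone bridge converts gauge resistance change to voltage;
      # the loaded member's elasticity converts strain to force.
      GAUGE_FACTOR = 2.0        # typical metallic foil gauge (assumed)
      V_EXCITATION = 5.0        # bridge excitation voltage (V), assumed
      E_STEEL = 210e9           # Young's modulus of steel (Pa)
      AREA = 20e-6              # loaded cross section (m^2), hypothetical

      def force_from_bridge(v_out):
          strain = 4.0 * v_out / (V_EXCITATION * GAUGE_FACTOR)  # small-strain quarter bridge
          return E_STEEL * AREA * strain                         # axial force in newtons

      print(f"{force_from_bridge(25e-6):.1f} N")  # 25 uV of bridge output -> 42.0 N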

  13. The development and validation of a meta-tool for quality appraisal of public health evidence: Meta Quality Appraisal Tool (MetaQAT).

    PubMed

    Rosella, L; Bowman, C; Pach, B; Morgan, S; Fitzpatrick, T; Goel, V

    2016-07-01

    Most quality appraisal tools were developed for clinical medicine and tend to be study-specific with a strong emphasis on risk of bias. In order to be more relevant to public health, an appropriate quality appraisal tool needs to be less reliant on the evidence hierarchy and consider practice applicability. Given the broad range of study designs used in public health, the objective of this study was to develop and validate a meta-tool that combines public health-focused principles of appraisal with a set of design-specific companion tools. Several design methods were used to develop and validate the tool, including literature review, synthesis, and validation against a reference standard. A search of critical appraisal tools relevant to public health was conducted; core concepts were collated. The resulting framework was piloted during three feedback sessions with public health practitioners. Following subsequent revisions, the final meta-tool, the Meta Quality Appraisal Tool (MetaQAT), was then validated through a content analysis of appraisals conducted by two groups of experienced public health researchers (MetaQAT vs a generic appraisal form). The MetaQAT framework consists of four domains: relevancy, reliability, validity, and applicability. In addition, a companion tool was assembled from existing critical appraisal tools to provide study design-specific guidance on validity appraisal. Content analysis showed similar methodological and generalizability concerns were raised by both groups; however, the MetaQAT appraisers commented more extensively on applicability to public health practice. Critical appraisal tools designed for clinical medicine have limitations for use in the context of public health. The meta-tool structure of the MetaQAT allows for rigorous appraisal, while allowing users to simultaneously appraise the multitude of study designs relevant to public health research and assess non-standard domains, such as applicability. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Improving Patient Experience and Primary Care Quality for Patients With Complex Chronic Disease Using the Electronic Patient-Reported Outcomes Tool: Adopting Qualitative Methods Into a User-Centered Design Approach.

    PubMed

    Steele Gray, Carolyn; Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl

    2016-02-18

    Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs), who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD, focusing specifically on the process of incorporating qualitative research methods into user-centered design approaches. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14), along with a PICK analysis (Possible, Implementable, to be Challenged, to be Killed), guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Interpretive descriptive methods allow for an understanding of the user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers.

  15. Java PathExplorer: A Runtime Verification Tool

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We describe recent work on designing an environment called Java PathExplorer for monitoring the execution of Java programs. This environment facilitates the testing of execution traces against high level specifications, including temporal logic formulae. In addition, it contains algorithms for detecting classical error patterns in concurrent programs, such as deadlocks and data races. An initial prototype of the tool has been applied to the executive module of the planetary Rover K9, developed at NASA Ames. In this paper we describe the background and motivation for the development of this tool, including comments on how it relates to formal methods tools as well as to traditional testing, and we then present the tool itself.
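
    As a flavor of trace-based runtime verification (JPaX itself is a Java framework; the monitor below is a hypothetical Python stand-in), a finite execution trace can be checked against a simple response property, "every request is eventually acknowledged":

      # Toy monitor for one temporal property over a finite event trace.
      def check_response(trace, trigger="request", response="ack"):
          pending = 0
          for event in trace:
              if event == trigger:
                  pending += 1
              elif event == response and pending > 0:
                  pending -= 1
          return pending == 0          # False = some request was never acknowledged

      print(check_response(["request", "work", "ack", "request"]))  # False: violation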

  16. Nuclease Target Site Selection for Maximizing On-target Activity and Minimizing Off-target Effects in Genome Editing

    PubMed Central

    Lee, Ciaran M; Cradick, Thomas J; Fine, Eli J; Bao, Gang

    2016-01-01

    The rapid advancement in targeted genome editing using engineered nucleases such as ZFNs, TALENs, and CRISPR/Cas9 systems has resulted in a suite of powerful methods that allows researchers to target any genomic locus of interest. A complementary set of design tools has been developed to aid researchers with nuclease design, target site selection, and experimental validation. Here, we review the various tools available for target selection in designing engineered nucleases, and for quantifying nuclease activity and specificity, including web-based search tools and experimental methods. We also elucidate challenges in target selection, especially in predicting off-target effects, and discuss future directions in precision genome editing and its applications. PMID:26750397
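
    One ingredient of such target-selection tools can be sketched simply: scanning a sequence for sites within a given number of mismatches of the intended target. Real tools add PAM constraints, bulges, and empirically weighted mismatch positions; the sequences below are illustrative:

      # Naive off-target scan: report every window within k mismatches of the target.
      def offtarget_scan(genome, target, max_mismatches=2):
          n = len(target)
          hits = []
          for i in range(len(genome) - n + 1):
              window = genome[i:i + n]
              mm = sum(a != b for a, b in zip(window, target))
              if mm <= max_mismatches:
                  hits.append((i, window, mm))
          return hits

      print(offtarget_scan("ACGTACGGACGTTCGG", "ACGTACGG", max_mismatches=1))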

  17. Randomized DNA libraries construction tool: a new 3-bp 'frequent cutter' TthHB27I/sinefungin endonuclease with chemically-induced specificity.

    PubMed

    Krefft, Daria; Papkov, Aliaksei; Prusinowski, Maciej; Zylicz-Stachula, Agnieszka; Skowron, Piotr M

    2018-05-11

    Acoustic or hydrodynamic shearing, sonication, and enzymatic digestion are used to fragment DNA. However, these methods have several disadvantages, such as DNA damage, difficulties in fragmentation control, irreproducibility, and under-representation of some DNA segments. An ideal DNA fragmentation tool would be a gentle enzymatic method offering a cleavage frequency high enough to eliminate bias in the distribution of DNA fragments and to allow easy control of partial digests. Only three such frequently cleaving natural restriction endonucleases (REases) have been discovered: CviJI, SetI, and FaiI. Therefore, we have previously developed two artificial enzymatic specificities, cleaving DNA approximately every 3 bp: TspGWI/sinefungin (SIN) and TaqII/SIN. In this paper we present the third developed specificity, TthHB27I/SIN(SAM), a new genomic tool based on the Type IIS/IIC/IIG Thermus-family REases-methyltransferases (MTases). In the presence of dimethyl sulfoxide (DMSO) and S-adenosyl-L-methionine (SAM) or its analogue SIN, the 6-bp cognate TthHB27I recognition sequence 5'-CAARCA-3' is converted into a combined 3.2-3.0-bp 'site' or its statistical equivalent, while the cleavage distance of 11/9 nt is retained. Protocols for various modes of limited DNA digestion were developed. In the presence of DMSO and SAM or SIN, TthHB27I is thus transformed from a rare 6-bp cutter to a very frequent one, cutting approximately every 3 bp. TthHB27I/SIN(SAM) therefore adds to the very sparsely populated segment of such prototype REase specificities. Moreover, this modified TthHB27I enzyme is uniquely suited for controlled DNA fragmentation, owing to the partial DNA cleavage that is an inherent feature of the Thermus-family enzymes. Such a tool can be used for the generation of quasi-random libraries as well as for other DNA manipulations requiring high-frequency cleavage and a uniform distribution of cuts along the DNA.

  18. CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.

    PubMed

    Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro

    2017-03-30

    Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving the user full control over the initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, the performance of the best combinations, and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community, freely available at http://CombiROC.eu.
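
    The combinatorial core of this kind of analysis can be sketched as follows, with synthetic data, naive per-marker cutoffs, and a simple majority-vote calling rule; none of these choices are claimed to match CombiROC's internals:

      import itertools
      import numpy as np

      rng = np.random.default_rng(1)
      n_subjects, n_markers = 200, 5
      X = rng.normal(0.0, 1.0, (n_subjects, n_markers))
      y = rng.random(n_subjects) < 0.3             # synthetic disease labels
      X[y] += 1.0                                  # diseased subjects shifted upward
      cutoffs = X.mean(axis=0)                     # naive per-marker thresholds

      best = None
      for k in range(1, n_markers + 1):
          for combo in itertools.combinations(range(n_markers), k):
              cols = list(combo)
              # Call positive when a majority of the panel's markers exceed cutoff.
              calls = (X[:, cols] > cutoffs[cols]).sum(axis=1) > k / 2
              sens = (calls & y).sum() / y.sum()
              spec = (~calls & ~y).sum() / (~y).sum()
              if best is None or sens + spec > best[0]:
                  best = (sens + spec, combo, sens, spec)

      print(best)   # (score, marker indices, sensitivity, specificity)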

  19. Teaching meta-analysis using MetaLight.

    PubMed

    Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark

    2012-10-18

    Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
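
    The kind of computation such a teaching tool demonstrates is compact enough to show directly; here is a fixed-effect (inverse-variance) pooling of illustrative effect sizes:

      import math

      studies = [(0.42, 0.15), (0.30, 0.20), (0.55, 0.10)]   # (effect size, standard error), invented

      weights = [1.0 / se**2 for _, se in studies]            # inverse-variance weights
      pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
      se_pooled = math.sqrt(1.0 / sum(weights))
      print(f"pooled effect = {pooled:.3f} "
            f"(95% CI {pooled - 1.96 * se_pooled:.3f} to {pooled + 1.96 * se_pooled:.3f})")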

  20. Molecular tools for diagnosis of visceral leishmaniasis: systematic review and meta-analysis of diagnostic test accuracy.

    PubMed

    de Ruiter, C M; van der Veer, C; Leeflang, M M G; Deborggraeve, S; Lucas, C; Adams, E R

    2014-09-01

    Molecular methods have been proposed as highly sensitive tools for the detection of Leishmania parasites in visceral leishmaniasis (VL) patients. Here, we evaluate the diagnostic accuracy of these tools in a meta-analysis of the published literature. The selection criteria were original studies that evaluate the sensitivities and specificities of molecular tests for diagnosis of VL, adequate classification of study participants, and the absolute numbers of true positives and negatives derivable from the data presented. Forty studies met the selection criteria, including PCR, real-time PCR, nucleic acid sequence-based amplification (NASBA), and loop-mediated isothermal amplification (LAMP). The sensitivities of the individual studies ranged from 29 to 100%, and the specificities ranged from 25 to 100%. The pooled sensitivity of PCR in whole blood was 93.1% (95% confidence interval [CI], 90.0 to 95.2), and the specificity was 95.6% (95% CI, 87.0 to 98.6). The specificity was significantly lower in consecutive studies, at 63.3% (95% CI, 53.9 to 71.8), due either to true-positive patients not being identified by parasitological methods or to the number of asymptomatic carriers in areas of endemicity. PCR for patients with HIV-VL coinfection showed high diagnostic accuracy in buffy coat and bone marrow, ranging from 93.1 to 96.9%. Molecular tools are highly sensitive assays for Leishmania detection and may contribute as an additional test in the algorithm, together with a clear clinical case definition. We observed wide variety in reference standards and study designs and now recommend consecutively designed studies. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  1. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario

    NASA Astrophysics Data System (ADS)

    Ghanate, A. D.; Kothiwale, S.; Singh, S. P.; Bertrand, Dominique; Krishna, C. Murali

    2011-02-01

    Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods, such as factorial discriminant analysis and partial least squares discriminant analysis, is on par with more complex nonlinear methods such as decision trees, but these methods provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.
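
    A cross-validated dimension-reduction-plus-discriminant pipeline of the kind compared here can be sketched with scikit-learn; the "spectra" below are synthetic stand-ins for measured Raman data, and the injected class signal is artificial:

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(0.0, 1.0, (150, 300))         # 150 spectra x 300 wavenumber bins
      y = np.repeat(np.arange(5), 30)              # 5 tissue classes, hypothetical labels
      X += y[:, None] * 0.1                        # inject a weak class-dependent signal

      model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
      scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validated accuracy
      print(f"mean CV accuracy: {scores.mean():.2f}")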

  2. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    PubMed

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

    Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic, or toxicological properties based on structural and physicochemical properties derived from their structures. Increasing attention has been directed at these methods because of their capability to predict compounds of diverse structures and complex structure-activity relationships without requiring knowledge of the target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.

  3. Simultaneous neuron- and astrocyte-specific fluorescent marking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulze, Wiebke; Hayata-Takano, Atsuko; Kamo, Toshihiko

    2015-03-27

    Systematic and simultaneous analysis of multiple cell types in the brain is becoming important, but such tools have not yet been adequately developed. Here, we aimed to generate a method for the specific fluorescent labeling of neurons and astrocytes, two major cell types in the brain, and we have developed lentiviral vectors to express the red fluorescent protein tdTomato in neurons and the enhanced green fluorescent protein (EGFP) in astrocytes. Importantly, both fluorescent proteins are fused to histone 2B protein (H2B) to confer nuclear localization to distinguish between single cells. We also constructed several expression constructs, including a tandem alignment of the neuron- and astrocyte-expression cassettes for simultaneous labeling. Introducing these vectors and constructs in vitro and in vivo resulted in cell type-specific and nuclear-localized fluorescence signals enabling easy detection and distinguishability of neurons and astrocytes. This tool is expected to be utilized for the simultaneous analysis of changes in neurons and astrocytes in healthy and diseased brains. - Highlights: • We develop a method for the specific fluorescent labeling of neurons and astrocytes. • Neuron-specific labeling is achieved using Scg10 and synapsin promoters. • Astrocyte-specific labeling is generated using the minimal GFAP promoter. • Nuclear localization of fluorescent proteins is achieved with histone 2B protein.

  4. DNA-binding specificity prediction with FoldX.

    PubMed

    Nadra, Alejandro D; Serrano, Luis; Alibés, Andreu

    2011-01-01

    With the advent of Synthetic Biology, a field between basic science and applied engineering, new computational tools are needed to help scientists reach their design goals while optimizing resources. In this chapter, we present a simple and powerful method to either determine the DNA-binding specificity of a wild-type protein or to design new specificities using the protein design algorithm FoldX. The only basic requirement is a good-resolution structure of the complex. Protein-DNA interaction design may aid the development of new parts designed to be orthogonal, decoupled, and precise in their targets. Further, it could help to fine-tune systems in terms of specificity, discrimination, and binding constants. In the age of newly developed devices and invented systems, computer-aided engineering promises to be an invaluable tool. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Human Factors Engineering as a System in the Vision for Exploration

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Smith, Danielle; Holden, Kritina

    2006-01-01

    In order to accomplish NASA's Vision for Exploration, while assuring crew safety and productivity, human performance issues must be well integrated into system design from mission conception. To that end, a two-year Technology Development Project (TDP) was funded by NASA Headquarters to develop a systematic method for including the human as a system in NASA's Vision for Exploration. The specific goals of this project are to review current Human Systems Integration (HSI) standards (i.e., industry, military, NASA) and tailor them to selected NASA Exploration activities. Once the methods are proven in the selected domains, a plan will be developed to expand the effort to a wider scope of Exploration activities. The methods will be documented for inclusion in NASA-specific documents (such as the Human Systems Integration Standards, NASA-STD-3000) to be used in future space systems. The current project builds on a previous TDP dealing with Human Factors Engineering processes. That project identified the key phases of the current NASA design lifecycle, and outlined the recommended HFE activities that should be incorporated at each phase. The project also resulted in a prototype of a web-based HFE process tool that could be used to support an ideal HFE development process at NASA. This will help to augment the limited human factors resources available by providing a web-based tool that explains the importance of human factors, teaches a recommended process, and then provides the instructions, templates and examples to carry out the process steps. The HFE activities identified by the previous TDP are being tested in situ for the current effort through support to a specific NASA Exploration activity. Currently, HFE personnel are working with systems engineering personnel to identify HSI impacts for lunar exploration by facilitating the generation of system-level Concepts of Operations (ConOps). For example, medical operations scenarios have been generated for lunar habitation in order to identify HSI requirements for the lunar communications architecture. Throughout these ConOps exercises, HFE personnel are testing various tools and methodologies that have been identified in the literature. A key part of the effort is the identification of optimal processes, methods, and tools for these early development phase activities, such as ConOps, requirements development, and early conceptual design. An overview of the activities completed thus far, as well as the tools and methods investigated will be presented.

  6. Mixed methods evaluation of a quality improvement and audit tool for nurse-to-nurse bedside clinical handover in ward settings.

    PubMed

    Redley, Bernice; Waugh, Rachael

    2018-04-01

    Nurse bedside handover quality is influenced by complex interactions related to the content, processes used and the work environment. Audit tools are seldom tested in 'real' settings. Examine the reliability, validity and usability of a quality improvement tool for audit of nurse bedside handover. Naturalistic, descriptive, mixed-methods. Six inpatient wards at a single large not-for-profit private health service in Victoria, Australia. Five nurse experts and 104 nurses involved in 199 change-of-shift bedside handovers. A focus group with experts and pilot test were used to examine content and face validity, and usability of the handover audit tool. The tool was examined for inter-rater reliability and usability using observation audits of handovers across six wards. Data were collected in 2013-2014. Two independent observers for 72 audits demonstrated acceptable inter-observer agreement for 27 (77%) items. Reliability was weak for items examining the handover environment. Seventeen items were not observed reflecting gaps in practices. Across 199 observation audits, gaps in nurse bedside handover practice most often related to process and environment, rather than content items. Usability was impacted by high observer burden, familiarity and non-specific illustrative behaviours. The reliability and validity of most items to audit handover content was acceptable. Gaps in practices for process and environment items were identified. Context specific exemplars and reducing the items used at each handover audit can enhance usability. Further research is needed to develop context specific exemplars and undertake additional reliability testing using a wide range of handover settings. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. In-situ determination of residual specific activity in activated concrete walls of a PET-cyclotron room

    NASA Astrophysics Data System (ADS)

    Matsumura, H.; Toyoda, A.; Masumoto, K.; Yoshida, G.; Yagishita, T.; Nakabayashi, T.; Sasaki, H.; Matsumura, K.; Yamaya, Y.; Miyazaki, Y.

    2018-06-01

    In the decommissioning work for concrete walls of PET-cyclotron rooms, in-situ measurement is expected to be useful for obtaining a contour map of the specific activity on the walls without destroying the structure. In this study, specific activities of γ-ray-emitting radionuclides in concrete walls were determined using an in-situ measurement method employing a portable Ge semiconductor detector and compared with the specific activities obtained using the sampling measurement method, at the Medical and Pharmacological Research Center Foundation in Hakui, Ishikawa, Japan. The specific activity could indeed be determined by the in-situ method. Moreover, since there is a clear correlation between the total specific activity of γ-ray-emitting radionuclides and the contact dose rate, both the total specific activity and the specific activity of each individual radionuclide can be estimated approximately from contact dose-rate measurements with a NaI scintillation survey meter. The in-situ measurement method is a powerful tool for the decommissioning of PET-cyclotron rooms.

  8. Optimization in Cardiovascular Modeling

    NASA Astrophysics Data System (ADS)

    Marsden, Alison L.

    2014-01-01

    Fluid mechanics plays a key role in the development, progression, and treatment of cardiovascular disease. Advances in imaging methods and patient-specific modeling now reveal increasingly detailed information about blood flow patterns in health and disease. Building on these tools, there is now an opportunity to couple blood flow simulation with optimization algorithms to improve the design of surgeries and devices, incorporating more information about the flow physics in the design process to augment current medical knowledge. In doing so, a major challenge is the need for efficient optimization tools that are appropriate for unsteady fluid mechanics problems, particularly for the optimization of complex patient-specific models in the presence of uncertainty. This article reviews the state of the art in optimization tools for virtual surgery, device design, and model parameter identification in cardiovascular flow and mechanobiology applications. In particular, it reviews trade-offs between traditional gradient-based methods and derivative-free approaches, as well as the need to incorporate uncertainties. Key future challenges are outlined, which extend to the incorporation of biological response and the customization of surgeries and devices for individual patients.

  9. State-of-the-Art Fusion-Finder Algorithms Sensitivity and Specificity

    PubMed Central

    Carrara, Matteo; Beccuti, Marco; Lazzarato, Fulvio; Cavallo, Federica; Cordero, Francesca; Donatelli, Susanna; Calogero, Raffaele A.

    2013-01-01

    Background. Gene fusions arising from chromosomal translocations have been implicated in cancer. RNA-seq has the potential to discover such rearrangements generating functional proteins (chimera/fusion). Recently, many methods for chimera detection have been published. However, the specificity and sensitivity of those tools were not extensively investigated in a comparative way. Results. We tested eight fusion-detection tools (FusionHunter, FusionMap, FusionFinder, MapSplice, deFuse, Bellerophontes, ChimeraScan, and TopHat-fusion) to detect fusion events using synthetic and real datasets encompassing chimeras. A comparison analysis run only on synthetic data could generate misleading results, since we found no counterpart in the real dataset. Furthermore, most tools report a very high number of false positive chimeras. In particular, the most sensitive tool, ChimeraScan, reports a large number of false positives that we were able to significantly reduce by devising and applying two filters to remove fusions not supported by fusion junction-spanning reads or encompassing large intronic regions. Conclusions. The discordant results obtained using synthetic and real datasets suggest that synthetic datasets encompassing fusion events may not fully capture the complexity of a real RNA-seq experiment. Moreover, fusion detection tools are still limited in sensitivity or specificity; thus, there is space for further improvement in fusion-finder algorithms. PMID:23555082
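
    The two post-filters described can be paraphrased in a few lines; the field names and cutoff values below are hypothetical, not those used in the paper:

      # Drop candidate fusions with too few junction-spanning reads or with
      # breakpoints spanning an overly large intronic distance.
      candidates = [
          {"name": "GENE1-GENE2", "spanning_reads": 12, "intron_span_bp": 5_000},
          {"name": "GENE3-GENE4", "spanning_reads": 1,  "intron_span_bp": 8_000},
          {"name": "GENE5-GENE6", "spanning_reads": 9,  "intron_span_bp": 900_000},
      ]

      def keep(c, min_reads=3, max_span=500_000):
          return c["spanning_reads"] >= min_reads and c["intron_span_bp"] <= max_span

      print([c["name"] for c in candidates if keep(c)])   # only GENE1-GENE2 survives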

  10. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    NASA Technical Reports Server (NTRS)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling based techniques. For contrast, some MPP based approaches are also examined.
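
    A toy version of the sampling-based workflow illustrates the idea; the closed-form "load" response and all parameter ranges below are placeholders for the paper's integrated launch vehicle model:

      import numpy as np

      # Sample the random inputs, evaluate the response, rank inputs by
      # Spearman rank correlation with the response (a common sampling-based
      # sensitivity measure).
      rng = np.random.default_rng(7)
      n = 10_000
      mass = rng.uniform(800, 1200, n)             # payload mass (kg), assumed range
      cg = rng.uniform(0.0, 0.3, n)                # cg offset magnitude (m), assumed
      stiffness = rng.uniform(1e6, 5e6, n)         # adapter stiffness (N/m), assumed

      load = 9.81 * mass * (1 + 2 * cg) / np.sqrt(stiffness / 1e6)  # placeholder response

      def rank_corr(x, y):
          rx, ry = np.argsort(np.argsort(x)), np.argsort(np.argsort(y))
          return float(np.corrcoef(rx, ry)[0, 1])

      for name, x in [("mass", mass), ("cg", cg), ("stiffness", stiffness)]:
          print(f"{name:9s} Spearman rho vs load: {rank_corr(x, load):+.2f}")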

  11. Developing and validating a perinatal depression screening tool in Kenya blending Western criteria with local idioms: A mixed methods study.

    PubMed

    Green, Eric P; Tuli, Hawa; Kwobah, Edith; Menya, D; Chesire, Irene; Schmidt, Christina

    2018-03-01

    Routine screening for perinatal depression is not common in most primary health care settings. The U.S. Preventive Services Task Force only recently updated their recommendation on depression screening to specifically recommend screening during the pre- and postpartum periods. While practitioners in high-income countries can respond to this new recommendation by implementing one of several existing depression screening tools developed in Western contexts, such as the Edinburgh Postnatal Depression Scale (EPDS) or the Patient Health Questionnaire-9 (PHQ-9), these tools lack strong evidence of cross-cultural equivalence, validity for case finding, and precision in measuring response to treatment in developing countries. Thus, there is a critical need to develop and validate new screening tools for perinatal depression that can be used by lay health workers, primary health care personnel, and patients. Working in rural Kenya, we used free listing, card sorting, and item analysis methods to develop a locally-relevant screening tool that blended Western psychiatric concepts with local idioms of distress. We conducted a validation study with a random sample of 193 pregnant women and new mothers to test the diagnostic accuracy of this scale along with the EPDS and PHQ-9. The sensitivity/specificity of the EPDS and PHQ-9 was estimated to be 0.70/0.72 and 0.70/0.73, respectively. This compared to sensitivity/specificity of 0.90/0.90 for a new 9-item locally-developed tool called the Perinatal Depression Screening (PDEPS). Across these three tools, internal consistency reliability ranged from 0.77 to 0.81 and test-retest reliability ranged from 0.57 to 0.67. The prevalence of depression ranges from 5.2% to 6.2% depending on the clinical reference standard. The EPDS and PHQ-9 are valid and reliable screening tools for perinatal depression in rural Western Kenya; however, the PDEPS may be a more useful alternative. At less than 10%, the prevalence of depression in this region appears to be lower than other published estimates for African and other low-income countries. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. hSAGEing: an improved SAGE-based software for identification of human tissue-specific or common tumor markers and suppressors.

    PubMed

    Yang, Cheng-Hong; Chuang, Li-Yeh; Shih, Tsung-Mu; Chang, Hsueh-Wei

    2010-12-17

    SAGE (serial analysis of gene expression) is a powerful method of analyzing gene expression for the entire transcriptome. There are currently many well-developed SAGE tools. However, the cross-comparison of different tissues is seldom addressed, thus limiting the identification of common and tissue-specific tumor markers. To improve the SAGE mining methods, we propose a novel function for cross-tissue comparison of SAGE data by combining mathematical set theory and logic with a unique "multi-pool method" that analyzes multiple pools of pair-wise case controls individually. When all the settings are in "inclusion", the common SAGE tag sequences are mined. When one tissue type is in "inclusion" and the other types of tissues are not, the selected tissue-specific SAGE tag sequences are generated. Results are displayed as tags-per-million (TPM) and fold values, as well as visually in four kinds of scales in a color-gradient pattern. In the fold visualization display, the top scores of the SAGE tag sequences are provided, along with cluster plots. A user-defined matrix file is designed for cross-tissue comparison by selecting libraries from publicly available databases or user-defined libraries. The hSAGEing tool provides, for the first time, a user-friendly interface for cross-tissue analysis and comparison of SAGE libraries. Some up- or down-regulated genes with tissue-specific or common tumor markers and suppressors were identified computationally. The tool is useful and convenient for in silico cancer transcriptomic studies and is freely available at http://bio.kuas.edu.tw/hSAGEing.
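
    The TPM, fold, and set-logic arithmetic described above is simple to sketch; the tag sequences and counts below are invented for illustration:

      # Normalize tag counts to tags-per-million, then use set intersection for
      # "common" tags and set difference for tissue-specific tags.
      libraries = {
          "tumor":  {"TAGAAA": 150, "CCGGTT": 30, "TTTAAA": 5},
          "normal": {"TAGAAA": 20,  "CCGGTT": 28},
      }

      def tpm(lib):
          total = sum(lib.values())
          return {tag: 1e6 * c / total for tag, c in lib.items()}

      tumor, normal = tpm(libraries["tumor"]), tpm(libraries["normal"])
      common = set(tumor) & set(normal)
      tumor_only = set(tumor) - set(normal)

      for tag in common:
          fold = tumor[tag] / normal[tag]
          print(f"{tag}: tumor {tumor[tag]:.0f} TPM, normal {normal[tag]:.0f} TPM, fold {fold:.1f}")
      print("tumor-specific:", tumor_only)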

  13. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, with colors running from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Dependency between removal characteristics and defined measurement categories of pellets

    NASA Astrophysics Data System (ADS)

    Vogt, C.; Rohrbacher, M.; Rascher, R.; Sinzinger, S.

    2015-09-01

    Optical surfaces are usually machined by grinding and polishing. To achieve short polishing times it is necessary to grind with the best possible form accuracy and low subsurface damage, which is possible by using very fine-grained grinding tools for the finishing process. These, however, often show time-dependent cutting ability in conjunction with tool wear. Fine grinding tools in optics are often pellet tools. For a successful grinding process the tools must show a constant self-sharpening performance; a constant, or at least predictable, wear and cutting behavior is crucial for deterministic machining. This work describes a method to determine the characteristics of pellet grinding tools by tests conducted with a single pellet. We investigate the determination of the effective material removal rate and the derivation of the G-ratio, and describe in particular the change from the newly dressed state via the quasi-stationary state to the worn state of the tool. By recording the roughness achieved with the single pellet, it is possible to predict the roughness expected from a production tool made of pellets with the same specification. From the results of these tests, the usability of a pellet grinding tool for a specific grinding task can be determined without testing a comparably expensive serial tool. The results are verified by a production test with a serial tool under series conditions. The collected data can be stored in an appropriate database of tool characteristics and combined with useful applications.
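
    The G-ratio bookkeeping for a single-pellet test reduces to a volume ratio; all numbers below are hypothetical:

      # G-ratio = workpiece volume removed / abrasive tool volume worn away,
      # measured over the same test interval.
      removed_mm3 = 420.0        # glass volume removed during the test (assumed)
      pellet_wear_mm3 = 6.5      # pellet volume lost in the same interval (assumed)

      g_ratio = removed_mm3 / pellet_wear_mm3
      mrr = removed_mm3 / 30.0   # effective material removal rate over an assumed 30 min test
      print(f"G-ratio = {g_ratio:.1f}, MRR = {mrr:.1f} mm^3/min")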

  15. Quality of the parent-child interaction in young children with type 1 diabetes mellitus: study protocol

    PubMed Central

    2011-01-01

    Background In young children with type 1 diabetes mellitus (T1DM), parents have full responsibility for the diabetes management of their child (e.g., blood glucose monitoring and administering insulin). Behavioral tasks in childhood, such as developing autonomy and oppositional behavior (e.g., refusing food), may interfere with the diabetes management needed to achieve optimal blood glucose control. Furthermore, higher blood glucose levels are related to more behavioral problems, so parents might need to negotiate with their child over the diabetes management to avoid this direct negative effect. This interference, the negotiations, and the parents' responsibility for diabetes may negatively affect the quality of parent-child interaction. Nevertheless, there is little knowledge about the quality of interaction between parents and young children with T1DM, and the possible impact this may have on glycemic control and psychosocial functioning of the child. While widely used global parent-child interaction observational methods are available, there is a need for an observational tool specifically tailored to the interaction patterns of parents and children with T1DM. The main aim of this study is to construct a disease-specific observational method to assess diabetes-specific parent-child interaction. An additional aim is to explore whether the quality of parent-child interactions is associated with glycemic control and psychosocial functioning (resilience, behavioral problems, and quality of life). Methods/Design First, we will examine which situations are most suitable for observing diabetes-specific interactions. Then, these situations will be video-taped in a pilot study (N = 15). Observed behaviors are described in rating scales, each scale describing characteristics of parent-child interactional behaviors. Next, we apply the observational tool on a larger scale for further evaluation of the instrument (N = 120). The parents are asked twice (with two years in between) to fill out questionnaires about the psychosocial functioning of their child with T1DM. Furthermore, glycemic control (HbA1c) will be obtained from medical records. Discussion A disease-specific observational tool will enable detailed assessment of the quality of diabetes-specific parent-child interactions. The availability of such a tool will facilitate future (intervention) studies that will yield more knowledge about the impact of parent-child interactions on psychosocial functioning and glycemic control of children with T1DM. PMID:21492413

  16. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  17. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  18. WILDLIFE RISK ASSESSMENT: DEVELOPMENT OF METHODS TO ASSESS THE EFFECTS OF MERCURY AND HABITAT ALTERATION ON POPULATIONS OF AQUATIC-DEPENDENT WILDLIFE

    EPA Science Inventory

    NHEERL is conducting a demonstration project to develop tools and approaches for assessing the risks of multiple stressors to populations of piscivorous wildlife, leading to the development of risk-based criteria. Specifically, we are developing methods and approaches to assess...

  19. Restrictions in Means for Suicide: An Effective Tool in Preventing Suicide: The Danish Experience

    ERIC Educational Resources Information Center

    Nordentoft, Merete; Qin, Ping; Helweg-Larsen, Karin

    2007-01-01

    Restriction of means for suicide is an important part of suicide preventive strategies in different countries. The effect on method-specific suicide rate and overall suicide rate of restrictions on availability of carbon monoxide, barbiturates, and dextropropoxyphene was examined. From 1970 to 2000, overall suicide mortality and method-specific…

  20. A mixed methods approach to exploring the relationship between Norway rat (Rattus norvegicus) abundance and features of the urban environment in an inner-city neighborhood of Vancouver, Canada.

    PubMed

    Himsworth, Chelsea G; Parsons, Kirbee L; Feng, Alice Y T; Kerr, Thomas; Jardine, Claire M; Patrick, David M

    2014-01-01

    Urban rats (Rattus spp.) are among the most ubiquitous pest species in the world. Previous research has shown that rat abundance is largely determined by features of the environment; however, the specific urban environmental factors that influence rat population density within cities have yet to be clearly identified. Additionally, there are no well described tools or methodologies for conducting an in-depth evaluation of the relationship between urban rat abundance and the environment. In this study, we developed a systematic environmental observation tool using methods borrowed from the field of systematic social observation. This tool, which employed a combination of quantitative and qualitative methodologies, was then used to identify environmental factors associated with the relative abundance of Norway rats (Rattus norvegicus) in an inner-city neighborhood of Vancouver, Canada. Using a multivariate zero-inflated negative binomial model, we found that a variety of factors, including specific land use, building condition, and amount of refuse, were related to rat presence and abundance. Qualitative data largely supported and further clarified observed statistical relationships, but also identified conflicting and unique situations not easily captured through quantitative methods. Overall, the tool helped us to better understand the relationship between features of the urban environment and relative rat abundance within our study area and may be useful for studying environmental determinants of zoonotic disease prevalence/distribution among urban rat populations in the future.

  1. A Mixed Methods Approach to Exploring the Relationship between Norway Rat (Rattus norvegicus) Abundance and Features of the Urban Environment in an Inner-City Neighborhood of Vancouver, Canada

    PubMed Central

    Himsworth, Chelsea G.; Parsons, Kirbee L.; Feng, Alice Y. T.; Kerr, Thomas; Jardine, Claire M.; Patrick, David M.

    2014-01-01

    Urban rats (Rattus spp.) are among the most ubiquitous pest species in the world. Previous research has shown that rat abundance is largely determined by features of the environment; however, the specific urban environmental factors that influence rat population density within cities have yet to be clearly identified. Additionally, there are no well described tools or methodologies for conducting an in-depth evaluation of the relationship between urban rat abundance and the environment. In this study, we developed a systematic environmental observation tool using methods borrowed from the field of systematic social observation. This tool, which employed a combination of quantitative and qualitative methodologies, was then used to identify environmental factors associated with the relative abundance of Norway rats (Rattus norvegicus) in an inner-city neighborhood of Vancouver, Canada. Using a multivariate zero-inflated negative binomial model, we found that a variety of factors, including specific land use, building condition, and amount of refuse, were related to rat presence and abundance. Qualitative data largely supported and further clarified observed statistical relationships, but also identified conflicting and unique situations not easily captured through quantitative methods. Overall, the tool helped us to better understand the relationship between features of the urban environment and relative rat abundance within our study area and may be useful for studying environmental determinants of zoonotic disease prevalence/distribution among urban rat populations in the future. PMID:24830847
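
    A minimal sketch of the zero-inflated negative binomial modeling step, using statsmodels on synthetic data; the study's actual covariates included land use, building condition, and amount of refuse, whereas here a single hypothetical "refuse" score stands in.

      # Zero-inflated negative binomial regression on synthetic rat counts.
      import numpy as np
      import statsmodels.api as sm
      from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

      rng = np.random.default_rng(0)
      n = 200
      refuse = rng.uniform(0, 10, n)            # hypothetical "amount of refuse" score
      X = sm.add_constant(refuse)

      # Simulate counts with excess zeros (structural zeros on ~40% of blocks).
      lam = np.exp(0.2 + 0.25 * refuse)
      counts = rng.poisson(lam) * (rng.uniform(size=n) > 0.4)

      model = ZeroInflatedNegativeBinomialP(counts, X, exog_infl=np.ones((n, 1)))
      result = model.fit(method="bfgs", maxiter=500, disp=False)
      print(result.summary())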

  2. Automating testbed documentation and database access using World Wide Web (WWW) tools

    NASA Technical Reports Server (NTRS)

    Ames, Charles; Auernheimer, Brent; Lee, Young H.

    1994-01-01

    A method for providing uniform transparent access to disparate distributed information systems was demonstrated. A prototype testing interface was developed to access documentation and information using publicly available hypermedia tools. The prototype gives testers a uniform, platform-independent user interface to on-line documentation, user manuals, and mission-specific test and operations data. Mosaic was the common user interface, and HTML (Hypertext Markup Language) provided hypertext capability.

  3. Mixed Methods Design Study Investigating the Use of a Music Authentic Performance Assessment Tool by High School Band Directors to Measure Student Musical Growth

    ERIC Educational Resources Information Center

    Beason, Christine F.

    2017-01-01

    This research project was designed to determine if the Model Cornerstone Assessment for Performance, Proficient level, published by the National Association for Music Education would be an appropriate tool to use to demonstrate student growth as one element of teacher evaluations, specifically the T-TESS. This study focused on four main research…

  4. Process Guide for Deburring Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frey, David L.

    This report is an updated and consolidated view of the current deburring processes at the Kansas City Plant (KCP). It includes specific examples of current burr problems and the methods used for their detection. Also included is a pictorial review of the large variety of available deburr tools, along with a complete numerical listing of existing tools and their descriptions. The process for deburring all the major part feature categories is discussed.

  5. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    DTIC Science & Technology

    2016-02-01

    proof in mathematics. For example, consider the proof of the Pythagorean Theorem illustrated at http://www.cut-the-knot.org/pythagoras/, where 112... methods and tools have made significant progress in their ability to model software designs and prove correctness theorems about the systems modeled... Using “assumption criticality” or “theorem root set size”, SITAPS detects potentially brittle verification cases. SITAPS provides tools and techniques that...

  6. From design to manufacturing of asymmetric teeth gears using computer application

    NASA Astrophysics Data System (ADS)

    Suciu, F.; Dascalescu, A.; Ungureanu, M.

    2017-05-01

    Asymmetric cylindrical gears, with involute tooth profiles having different base circle diameters, are nonstandard gears used to obtain better functional parameters for the active profile. One would expect that manufacturing such gears becomes possible only after designing and producing specific tools. This paper presents how computer-aided design and applications developed in MATLAB to obtain the geometrical parameters, together with the calculation of functional parameters such as stress and displacements, transmission error, and gear efficiency, and the 2D models generated with AUTOLISP applications, are used for computer-aided manufacturing of asymmetric gears with standard tools. The specific tools, considered one of the disadvantages of these gears, are therefore not necessary, and the expected supplementary costs are implicitly avoided. The calculation algorithm established for the asymmetric gear design application uses the "direct design" of spur gears. This method first determines the parameters of the gears, followed by determination of the asymmetric gear rack's parameters based on those of the gears. Using the original design method and computer applications, the geometrical parameters and the 2D and 3D models of the asymmetric gears have been determined, and on the basis of these models asymmetric gears have been manufactured on a CNC machine tool.
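
    For illustration, the involute flank coordinates follow directly from the base-circle radius, and an asymmetric tooth simply uses different base radii for its two flanks. A small sketch (values illustrative, in Python rather than the MATLAB used by the authors):

      # Parametric involute of a circle; an asymmetric tooth pairs two flanks
      # generated from different base-circle radii (values are illustrative).
      import numpy as np

      def involute(r_base, t):
          """Involute of a circle of radius r_base, parameter t in radians."""
          x = r_base * (np.cos(t) + t * np.sin(t))
          y = r_base * (np.sin(t) - t * np.cos(t))
          return x, y

      t = np.linspace(0.0, 0.8, 50)
      drive_flank = involute(30.0, t)   # base radius for the drive side, mm
      coast_flank = involute(27.0, t)   # smaller base radius for the coast side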

  7. Image manipulation as research misconduct.

    PubMed

    Parrish, Debra; Noonan, Bridget

    2009-06-01

    A growing number of research misconduct cases handled by the Office of Research Integrity involve image manipulations. Manipulations may include simple image enhancements, misrepresenting an image as something different from what it is, and altering specific features of an image. Through a study of specific cases, the misconduct findings associated with image manipulation, the detection methods, and those likely to identify such manipulations are discussed. This article explores the sanctions imposed against guilty researchers and the factors that resulted in no misconduct finding even though the relevant images were clearly flawed. Although new detection tools are available for universities and journals to detect questionable images, this article also explores why these tools have not been embraced.

  8. Wear and breakage monitoring of cutting tools by an optical method: theory

    NASA Astrophysics Data System (ADS)

    Li, Jianfeng; Zhang, Yongqing; Chen, Fangrong; Tian, Zhiren; Wang, Yao

    1996-10-01

    An essential capability of a machining system in an unmanned flexible manufacturing system is the ability to automatically change out tools that are worn or damaged. An optoelectronic method for in situ monitoring of the flank wear and breakage of cutting tools is presented. A flank wear estimation system is implemented in a laboratory environment, and its performance is evaluated through turning experiments. The flank wear model parameters that need to be known a priori are determined through several preliminary experiments, or from data available in the literature. The resulting cutting conditions are typical of those used in finishing cutting operations. Through time- and amplitude-domain analysis of the cutting tool wear and breakage states, it is found that the variance σ²x of the original signal and the autocorrelation coefficient ρ(m) can reflect the regularity of change in cutting tool wear and breakage, but that these alone are not sufficient, given the complexity of the wear and breakage processes of cutting tools. Time series analysis and frequency spectrum analysis will be carried out and described in later papers.
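
    The two time-domain statistics named above can be sketched as follows on a synthetic signal; real monitoring would of course use the measured cutting signal rather than this stand-in.

      # Signal variance and autocorrelation coefficient rho(m) at lag m,
      # computed on a synthetic cutting-force-like signal.
      import numpy as np

      rng = np.random.default_rng(1)
      signal = np.sin(np.linspace(0, 40 * np.pi, 2000)) + 0.3 * rng.standard_normal(2000)

      variance = signal.var()

      def autocorr(x, m):
          """Autocorrelation coefficient of x at lag m (biased estimate)."""
          x = x - x.mean()
          return np.dot(x[:-m], x[m:]) / np.dot(x, x)

      rho_5 = autocorr(signal, 5)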

  9. Improving Patient Experience and Primary Care Quality for Patients With Complex Chronic Disease Using the Electronic Patient-Reported Outcomes Tool: Adopting Qualitative Methods Into a User-Centered Design Approach

    PubMed Central

    Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl

    2016-01-01

    Background Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. Objective This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Methods Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Results Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14) along with a PICK analysis—Possible, Implementable, (to be) Challenged, (to be) Killed—guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Conclusions Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers. PMID:26892952

  10. Methods for Optimizing CRISPR-Cas9 Genome Editing Specificity

    PubMed Central

    Tycko, Josh; Myer, Vic E.; Hsu, Patrick D.

    2016-01-01

    Summary Advances in the development of delivery, repair, and specificity strategies for the CRISPR-Cas9 genome engineering toolbox are helping researchers understand gene function with unprecedented precision and sensitivity. CRISPR-Cas9 also holds enormous therapeutic potential for the treatment of genetic disorders by directly correcting disease-causing mutations. Although the Cas9 protein has been shown to bind and cleave DNA at off-target sites, the field of Cas9 specificity is rapidly progressing with marked improvements in guide RNA selection, protein and guide engineering, novel enzymes, and off-target detection methods. We review important challenges and breakthroughs in the field as a comprehensive practical guide to interested users of genome editing technologies, highlighting key tools and strategies for optimizing specificity. The genome editing community should now strive to standardize such methods for measuring and reporting off-target activity, while keeping in mind that the goal for specificity should be continued improvement and vigilance. PMID:27494557

  11. Evaluation of the acceptability of a CD-Rom as a health promotion tool for Inuit in Ottawa.

    PubMed

    McShane, Kelly E; Smylie, Janet K; Hastings, Paul D; Prince, Conrad; Siedule, Connie

    2013-01-01

    There are few health promotion tools for urban Inuit, and there is a specific dearth of evaluations on such tools. The current study used a community-specific approach in the evaluation of a health promotion tool, based on an urban Inuit community's preferences of health knowledge sources and distribution strategies. In partnership with the Tungasuvvingat Inuit Family Health Team in Ottawa, a CD-Rom was developed featuring an Inuk Elder presenting prenatal health messages in both Inuktitut and English. Also, relevant evaluation materials were developed. Using a mixed methods approach, 40 participants completed interviews prior to viewing the CD-Rom and participated in a focus group at follow-up. Questionnaires were also completed pre- and post-viewing to assess changes between expectations and reactions in order to document acceptability. Significant increases were found on satisfaction, acceptability of medium and relevance of content ratings. Qualitative findings also included (a) interest, uncertainty and conditional interest prior to viewing; and (b) positive evaluations of the CD-Rom. This suggests that CD-Rom technology has the potential for health promotion for urban Inuit, and the community-specific evaluation approach yielded useful information.

  12. Evaluation of the acceptability of a CD-Rom as a health promotion tool for Inuit in Ottawa

    PubMed Central

    McShane, Kelly E.; Smylie, Janet K.; Hastings, Paul D.; Prince, Conrad; Siedule, Connie

    2013-01-01

    Background There are few health promotion tools for urban Inuit, and there is a specific dearth of evaluations on such tools. Objective The current study used a community-specific approach in the evaluation of a health promotion tool, based on an urban Inuit community's preferences of health knowledge sources and distribution strategies. In partnership with the Tungasuvvingat Inuit Family Health Team in Ottawa, a CD-Rom was developed featuring an Inuk Elder presenting prenatal health messages in both Inuktitut and English. Also, relevant evaluation materials were developed. Design Using a mixed methods approach, 40 participants completed interviews prior to viewing the CD-Rom and participated in a focus group at follow-up. Questionnaires were also completed pre- and post-viewing to assess changes between expectations and reactions in order to document acceptability. Results Significant increases were found on satisfaction, acceptability of medium and relevance of content ratings. Qualitative findings also included (a) interest, uncertainty and conditional interest prior to viewing; and (b) positive evaluations of the CD-Rom. Conclusions This suggests that CD-Rom technology has the potential for health promotion for urban Inuit, and the community-specific evaluation approach yielded useful information. PMID:23717816

  13. Tools for the functional interpretation of metabolomic experiments.

    PubMed

    Chagoyen, Monica; Pazos, Florencio

    2013-11-01

    The so-called 'omics' approaches used in modern biology aim at massively characterizing the molecular repertoires of living systems at different levels. Metabolomics is one of the latest additions to the 'omics' family, and it deals with the characterization of the set of metabolites in a given biological system. As metabolomic techniques become more massive and allow larger sets of metabolites to be characterized, automatic methods for analyzing these sets to obtain meaningful biological information are required. Only recently have the first tools specifically designed for this task in metabolomics appeared. They are based on approaches previously used in transcriptomics and other 'omics', such as annotation enrichment analysis. These, together with generic tools for metabolic analysis and visualization not specifically designed for metabolomics, will certainly be in the toolbox of researchers performing metabolomic experiments in the near future.
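
    Annotation enrichment analysis of the kind mentioned above typically reduces to a hypergeometric over-representation test; a minimal sketch with hypothetical counts:

      # One-sided hypergeometric test for over-representation of a pathway's
      # metabolites in an experimental hit list (all counts are hypothetical).
      from scipy.stats import hypergeom

      N = 2000   # metabolites in the background annotation
      K = 40     # background metabolites annotated to the pathway
      n = 100    # metabolites detected as altered in the experiment
      k = 9      # altered metabolites that fall in the pathway

      # P(X >= k): survival function evaluated at k - 1
      p_value = hypergeom.sf(k - 1, N, K, n)
      print(f"enrichment p-value: {p_value:.3g}")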

  14. Development of a screening tool to predict malnutrition among children under two years old in Zambia

    PubMed Central

    Hasegawa, Junko; Ito, Yoichi M; Yamauchi, Taro

    2017-01-01

    ABSTRACT Background: Maternal and child undernutrition is an important issue, particularly in low- and middle-income countries. Children at high risk of malnutrition should be prioritized to receive the necessary interventions to minimize that risk. Several risk factors have been proposed; however, until now there has been no appropriate evaluation method to identify these children. In sub-Saharan Africa, children commonly receive regular check-ups from community health workers, so a simple and easy nutrition assessment method is needed for use by semi-professional health workers. Objectives: The aim of this study was to develop and test a practical screening tool for community use in predicting growth stunting in children under two years in rural Zambia. Methods: Field research was conducted from July to August 2014 in Southern Province, Zambia. Two hundred and sixty-four mother-child pairs participated in the study. Anthropometric measurements were performed on all children and mothers, and all mothers were interviewed. Risk factors for the screening test were estimated using least absolute shrinkage and selection operator (LASSO) analysis. After re-evaluating all participants using the new screening tool, a receiver operating characteristic curve was drawn to set the cut-off value. Sensitivity and specificity were also calculated. Results: The screening tool included age, weight-for-age Z-score status, birth weight, feeding status, history of sibling death, multiple birth, and maternal education level. The total score ranged from 0 to 22, and the cut-off value was eight. Sensitivity and specificity were 0.963 and 0.697, respectively. Conclusions: A screening tool was developed to predict which children living in Zambia are at high risk of malnutrition. Further longitudinal studies are needed to confirm the test's validity in detecting future stunting and to investigate the effectiveness of malnutrition treatment. PMID:28730929
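
    A minimal sketch of the two statistical steps reported (LASSO-based predictor selection and ROC-based cut-off choice), on synthetic data; the real predictors were age, weight-for-age Z-score, birth weight, feeding status, and so on.

      # L1-penalized (LASSO-style) selection plus ROC-based cut-off choice.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(2)
      X = rng.standard_normal((264, 7))                 # 7 candidate risk factors
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(264) > 0.5).astype(int)

      lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
      selected = np.flatnonzero(lasso.coef_[0])         # retained risk factors

      score = X @ lasso.coef_[0]                        # a simple composite score
      fpr, tpr, thresholds = roc_curve(y, score)
      cutoff = thresholds[np.argmax(tpr - fpr)]         # maximum of Youden's J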

  15. Formal functional test designs with a test representation language

    NASA Technical Reports Server (NTRS)

    Hops, J. M.

    1993-01-01

    The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
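
    The category-partition idea can be sketched compactly: enumerate all combinations of category choices, then prune the combinations with constraints. The categories and the constraint below are illustrative, not taken from the cited work.

      # Category-partition test-frame generation with constraint pruning.
      from itertools import product

      categories = {
          "input_size": ["empty", "single", "many"],
          "connection": ["up", "down"],
          "user_role":  ["admin", "guest"],
      }

      def valid(frame):
          # Example constraint: role behavior is untestable with the link down.
          return not (frame["connection"] == "down" and frame["user_role"] == "admin")

      frames = [dict(zip(categories, combo)) for combo in product(*categories.values())]
      test_cases = [f for f in frames if valid(f)]
      print(len(frames), "raw frames ->", len(test_cases), "test cases")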

  16. SMOQ: a tool for predicting the absolute residue-specific quality of a single protein model with support vector machines

    PubMed Central

    2014-01-01

    Background It is important to predict the quality of a protein structural model before its native structure is known. Methods that can predict the absolute local quality of individual residues in a single protein model are rare, yet particularly needed for using, ranking and refining protein models. Results We developed a machine learning tool (SMOQ) that can predict the distance deviation of each residue in a single protein model. SMOQ uses support vector machines (SVM) with protein sequence and structural features (i.e., a basic feature set) including amino acid sequence, secondary structures, solvent accessibilities, and residue-residue contacts to make predictions. We also trained an SVM model with two additional features (profiles and SOV scores) on 20 CASP8 targets and found that including them improves performance only when the real deviations between native and model are higher than 5 Å. The SMOQ tool finally released uses the basic feature set trained on 85 CASP8 targets. Moreover, SMOQ implements a way to convert predicted local quality scores into a global quality score. SMOQ was tested on the 84 CASP9 single-domain targets. The average difference between the residue-specific distance deviation predicted by our method and the actual distance deviation on the test data is 2.637 Å. The global quality prediction accuracy of the tool is comparable to other good tools on the same benchmark. Conclusion SMOQ is a useful tool for protein single-model quality assessment. Its source code and executable are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/. PMID:24776231

  17. SMOQ: a tool for predicting the absolute residue-specific quality of a single protein model with support vector machines.

    PubMed

    Cao, Renzhi; Wang, Zheng; Wang, Yiheng; Cheng, Jianlin

    2014-04-28

    It is important to predict the quality of a protein structural model before its native structure is known. Methods that can predict the absolute local quality of individual residues in a single protein model are rare, yet particularly needed for using, ranking and refining protein models. We developed a machine learning tool (SMOQ) that can predict the distance deviation of each residue in a single protein model. SMOQ uses support vector machines (SVM) with protein sequence and structural features (i.e., a basic feature set) including amino acid sequence, secondary structures, solvent accessibilities, and residue-residue contacts to make predictions. We also trained an SVM model with two additional features (profiles and SOV scores) on 20 CASP8 targets and found that including them improves performance only when the real deviations between native and model are higher than 5 Å. The SMOQ tool finally released uses the basic feature set trained on 85 CASP8 targets. Moreover, SMOQ implements a way to convert predicted local quality scores into a global quality score. SMOQ was tested on the 84 CASP9 single-domain targets. The average difference between the residue-specific distance deviation predicted by our method and the actual distance deviation on the test data is 2.637 Å. The global quality prediction accuracy of the tool is comparable to other good tools on the same benchmark. SMOQ is a useful tool for protein single-model quality assessment. Its source code and executable are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/.
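
    The core regression step can be sketched with scikit-learn's SVR on synthetic stand-in features; SMOQ's real features are the encoded sequence, secondary structure, solvent accessibility, and contacts, not the random values used here.

      # SVM regression from per-residue features to distance deviation (Å).
      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(3)
      n_residues = 500
      features = rng.standard_normal((n_residues, 20))   # stand-in feature windows
      deviation = np.abs(features[:, 0] * 2 + rng.standard_normal(n_residues))

      svm = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(features, deviation)
      predicted = svm.predict(features[:5])              # per-residue deviations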

  18. Antibody specific epitope prediction-emergence of a new paradigm.

    PubMed

    Sela-Culang, Inbal; Ofran, Yanay; Peters, Bjoern

    2015-04-01

    The development of accurate tools for predicting B-cell epitopes is important but difficult. Traditional methods have examined which regions in an antigen are likely binding sites of an antibody. However, it is becoming increasingly clear that most antigen surface residues will be able to bind one or more of the myriad of possible antibodies. In recent years, new approaches have emerged for predicting an epitope for a specific antibody, utilizing information encoded in antibody sequence or structure. Applying such antibody-specific predictions to groups of antibodies in combination with easily obtainable experimental data improves the performance of epitope predictions. We expect that further advances of such tools will be possible with the integration of immunoglobulin repertoire sequencing data. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Chapter 22: Compressed Air Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W; Benton, Nathanael; Burns, Patrick

    Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: a high-efficiency/variable speed drive (VSD) compressor replacing a modulating, load/unload, or constant-speed compressor; and a compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.
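
    The savings-verification arithmetic that such a protocol standardizes can be roughed out as follows; all numbers here are illustrative assumptions, not values from the protocol.

      # Rough annual savings estimate for repaired compressed-air leaks
      # (all inputs are illustrative assumptions).
      leak_flow_cfm  = 25.0    # measured leakage, cubic feet per minute
      specific_power = 18.0    # assumed kW per 100 cfm for the compressor
      annual_hours   = 6000.0  # compressed-air system operating hours per year

      kw_saved  = leak_flow_cfm / 100.0 * specific_power
      kwh_saved = kw_saved * annual_hours
      print(f"{kwh_saved:,.0f} kWh/yr from leak repairs")  # 27,000 kWh/yr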

  20. Advancing data management and analysis in different scientific disciplines

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.

    2017-10-01

    Over the past several years, rapid growth of data has affected many fields of science. This has often resulted in the need for overhauling or exchanging the tools and approaches in the disciplines’ data life cycles. However, this allows the application of new data analysis methods and facilitates improved data sharing. The project Large-Scale Data Management and Analysis (LSDMA) of the German Helmholtz Association has been addressing both specific and generic requirements in its data life cycle successfully since 2012. Its data scientists work together with researchers from the fields such as climatology, energy and neuroscience to improve the community-specific data life cycles, in several cases even all stages of the data life cycle, i.e. from data acquisition to data archival. LSDMA scientists also study methods and tools that are of importance to many communities, e.g. data repositories and authentication and authorization infrastructure.

  1. [Imaging of breast tissues changes--early detection, screening and problem solving].

    PubMed

    Wruk, Daniela

    2008-04-01

    In the industrialised countries breast cancer is the cancer with the highest prevalence and causes the highest rate of cancer deaths among women. In Switzerland alone, about 5000 newly diagnosed cases occur per year. Our three main diagnostic tools for imaging diseases of the breast in the setting of screening, early detection or problem solving are mammography, ultrasound and MRI with intravenous contrast application. The most important imaging technique is mammography, which so far is the only method with evidence of suitability for screening. The major complementary imaging tool is sonography, which is the method of first choice for examining the breasts of women under 30 years of age. MRI can provide additional information about the perfusion of tissue changes within the breast; because of its low specificity, however, it should be applied cautiously and only for specific questions.

  2. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  3. Sensitivity and Specificity of Long Wave Infrared Imaging for Attention-Deficit/Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Coben, Robert; Myers, Thomas E.

    2009-01-01

    Objective: This study was the first to investigate the efficacy of long wave infrared (LWIR) imaging as a diagnostic tool for ADHD. Method: with ADHD and a high level of specificity (94%) in discriminating those with ADHD from those with other diagnoses. The overall classification rate was 73.16%. This was indicative of a high level of…

  4. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  5. Novel Peptide Sequence (“IQ-tag”) with High Affinity for NIR Fluorochromes Allows Protein and Cell Specific Labeling for In Vivo Imaging

    PubMed Central

    McCarthy, Jason R.; Weissleder, Ralph

    2007-01-01

    Background Probes that allow site-specific protein labeling have become critical tools for visualizing biological processes. Methods Here we used phage display to identify a novel peptide sequence with nanomolar affinity for near infrared (NIR) (benz)indolium fluorochromes. The developed peptide sequence (“IQ-tag”) allows detection of NIR dyes in a wide range of assays including ELISA, flow cytometry, high throughput screens, microscopy, and optical in vivo imaging. Significance The described method is expected to have broad utility in numerous applications, namely site-specific protein imaging, target identification, cell tracking, and drug development. PMID:17653285

  6. The utility of clinical decision tools for diagnosing osteoporosis in postmenopausal women with rheumatoid arthritis

    PubMed Central

    Brand, Caroline; Lowe, Adrian; Hall, Stephen

    2008-01-01

    Background Patients with rheumatoid arthritis have a higher risk of low bone mineral density than normal age-matched populations. There is limited evidence to support the cost effectiveness of population screening in rheumatoid arthritis, and case-finding strategies have been proposed as a means to increase the cost effectiveness of diagnostic screening for osteoporosis. This study aimed to assess the performance attributes of generic and rheumatoid arthritis-specific clinical decision tools for diagnosing osteoporosis in a postmenopausal population with rheumatoid arthritis attending ambulatory specialist rheumatology clinics. Methods A cross-sectional study of 127 ambulatory post-menopausal women with rheumatoid arthritis was performed. Patients currently receiving or who had previously received bone-active therapy were excluded. Eligible women underwent clinical assessment and dual-energy X-ray absorptiometry (DXA) bone mineral density assessment. Clinical decision tools, including those specific for rheumatoid arthritis, were compared to seven generic post-menopausal tools to predict osteoporosis (defined as T score < -2.5). Sensitivity, specificity, positive and negative predictive values, and area under the curve were assessed. The diagnostic attributes of the clinical decision tools were compared by examination of the area under the receiver operating characteristic (ROC) curve. Results One hundred and twenty-seven women participated. The median age was 62 (IQR 56–71) years. Median disease duration was 108 (60–168) months. Seventy-two (57%) women had no record of a previous DXA examination. Eighty (63%) women had T scores at the femoral neck or lumbar spine less than -1. The area under the ROC curve for clinical decision tool prediction of T score < -2.5 varied between 0.63 and 0.76. The rheumatoid arthritis-specific decision tools did not perform better than generic tools; however, the National Osteoporosis Foundation score could potentially reduce the number of unnecessary DXA tests by approximately 45% in this population. Conclusion There was limited utility of clinical decision tools for predicting osteoporosis in this patient population. Fracture prediction tools that include risk factors independent of BMD are needed. PMID:18230132

  7. Tissue enrichment analysis for C. elegans genomics.

    PubMed

    Angeles-Albores, David; N Lee, Raymond Y; Chan, Juancarlos; Sternberg, Paul W

    2016-09-13

    Over the last ten years, there has been explosive development in methods for measuring gene expression. These methods can identify thousands of genes altered between conditions, but understanding these datasets and forming hypotheses based on them remains challenging. One way to analyze these datasets is to associate ontologies (hierarchical, descriptive vocabularies with controlled relations between terms) with genes and to look for enrichment of specific terms. Although the Gene Ontology (GO) is available for Caenorhabditis elegans, it does not include anatomical information. We have developed a tool for identifying enrichment of C. elegans tissues among gene sets and generated a website GUI where users can access this tool. Since a common drawback of ontology enrichment analyses is their verbosity, we developed a very simple filtering algorithm to reduce the ontology size by an order of magnitude. We adjusted these filters and validated our tool using a set of 30 gold standards from Expression Cluster data in WormBase. We show that our tool can discriminate between embryonic and larval tissues and can identify tissues down to the single-cell level. We used our tool to identify multiple neuronal tissues that are down-regulated due to pathogen infection in C. elegans. Our Tissue Enrichment Analysis (TEA) can be found within WormBase and can be downloaded using Python's standard pip installer. It tests a slimmed-down C. elegans tissue ontology for enrichment of specific terms and provides users with text and graphic representations of the results.

  8. [Short interspersed repetitive sequences (SINEs) and their use as a phylogenetic tool].

    PubMed

    Kramerov, D A; Vasetskiĭ, N S

    2009-01-01

    The data on one of the most common repetitive elements of eukaryotic genomes, short interspersed elements (SINEs), are reviewed. Their structure, origin, and functioning in the genome are discussed. The variation and abundance of these neutral genomic markers make them a convenient and reliable tool for phylogenetic analysis. The main methods of such analysis are presented, and the potential and limitations of this approach are discussed using specific examples.

  9. Validation of Caregiver-Centered Delirium Detection Tools: A Systematic Review.

    PubMed

    Rosgen, Brianna; Krewulak, Karla; Demiantschuk, Danielle; Ely, E Wesley; Davidson, Judy E; Stelfox, Henry T; Fiest, Kirsten M

    2018-04-18

    To summarize the validity of caregiver-centered delirium detection tools in hospitalized adults and assess associated patient and caregiver outcomes. Systematic review. We searched MEDLINE, EMBASE, PsycINFO, CINAHL, and Scopus from inception to May 15, 2017. Hospitalized adults. Caregiver-centered delirium detection tools. We drafted a protocol from the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Two reviewers independently completed abstract and full-text review, data extraction, and quality assessment. We summarized findings using descriptive statistics including mean, median, standard deviation, range, frequencies (percentages), and Cohen's kappa. Studies that reported on the validity of caregiver-centered delirium detection tools or associated patient and caregiver outcomes and were cohort or cross-sectional in design were included. We reviewed 6,056 titles and abstracts, included 6 articles, and identified 6 caregiver-centered tools. All tools were designed to be administered in several minutes or less and had 11 items or fewer. Three tools were caregiver administered (completed independently by caregivers): Family Confusion Assessment Method (FAM-CAM), Informant Assessment of Geriatric Delirium (I-AGeD), and Sour Seven. Three tools were caregiver informed (administered by a healthcare professional using caregiver input): Single Question in Delirium (SQiD), Single Screening Question Delirium (SSQ-Delirium), and Stressful Caregiving Response to Experiences of Dying. Caregiver-administered tools had better psychometric properties (FAM-CAM sensitivity 75%, 95% confidence interval (CI)=35-95%, specificity 91%, 95% CI=74-97%; Sour Seven positive predictive value 89.5%, negative predictive value 90%) than caregiver-informed tools (SQiD: sensitivity 80%, 95% CI=28.4-99.5%; specificity 71%, 95% CI=41.9-91.6%; SSQ-Delirium sensitivity 79.6%, specificity 56.1%). Delirium detection is essential for appropriate delirium management. Caregiver-centered delirium detection tools show promise in improving delirium detection and associated patient and caregiver outcomes. Comparative studies using larger sample sizes and multiple centers are required to determine validity and reliability characteristics. © 2018, Copyright the Authors Journal compilation © 2018, The American Geriatrics Society.

  10. Environmental impact assessment for alternative-energy power plants in México.

    PubMed

    González-Avila, María E; Beltrán-Morales, Luis Felipe; Braker, Elizabeth; Ortega-Rubio, Alfredo

    2006-07-01

    Ten Environmental Impact Assessment Reports (EIAR) for projects involving alternative-energy power plants in Mexico, developed during the last twelve years, were reviewed. Our analysis focused on the methods used to assess the impacts produced by hydroelectric and geothermal power projects. The methods used to assess impacts in EIARs ranged from simple descriptive criteria to quantitative models. These methods are not concordant with the level of the EIAR required by the environmental authority, or even with the kind of project developed. It is concluded that there is no correlation between the tools used to assess impacts and the assigned type of the EIAR. Because the methods used to assess the impacts produced by these power projects have scarcely changed over this period, we propose a quantitative method, based on ecological criteria and tools, to assess the impacts produced by hydroelectric and geothermal plants according to the specific characteristics of the project. The proposed method is supported by environmental norms and can assist environmental authorities in assigning the correct level and tools to be applied to hydroelectric and geothermal projects. The proposed method can be adapted to other production activities in Mexico and to other countries.

  11. A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.

    ERIC Educational Resources Information Center

    Kim, Jin Eun

    A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…

  12. A Guide to Analyzing Message-Response Sequences and Group Interaction Patterns in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Jeong, Allan

    2005-01-01

    This paper proposes a set of methods and a framework for evaluating, modeling, and predicting group interactions in computer-mediated communication. The method of sequential analysis is described along with specific software tools and techniques to facilitate the analysis of message-response sequences. In addition, the Dialogic Theory and its…

  13. Methodology for Planning Technical Education: With a Case Study of Polytechnics in Bangladesh.

    ERIC Educational Resources Information Center

    Ritzen, Jozef M.; Balderston, Judith B.

    A product of research first begun by one of the authors in Bangladesh, this book develops a comprehensive set of methods for planning technical education. Wherever possible, the authors draw on existing tools, fitting them to the specific context of technical education. When faced with planning problems for which existing methods are ill suited…

  14. A systematic review and synthesis of the strengths and limitations of measuring malaria mortality through verbal autopsy.

    PubMed

    Herrera, Samantha; Enuameh, Yeetey; Adjei, George; Ae-Ngibise, Kenneth Ayuurebobi; Asante, Kwaku Poku; Sankoh, Osman; Owusu-Agyei, Seth; Yé, Yazoume

    2017-10-23

    Lack of valid and reliable data on malaria deaths continues to be a problem that plagues the global health community. To address this gap, the verbal autopsy (VA) method was developed to ascertain cause of death at the population level. Despite the adoption and wide use of VA, there are many recognized limitations of VA tools and methods, especially for measuring malaria mortality. This study synthesizes the strengths and limitations of existing VA tools and methods for measuring malaria mortality (MM) in low- and middle-income countries through a systematic literature review. The authors searched PubMed, Cochrane Library, Popline, WHOLIS, Google Scholar, and INDEPTH Network Health and Demographic Surveillance System sites' websites from 1 January 1990 to 15 January 2016 for articles and reports on MM measurement through VA. Articles were included if they presented results from a VA study in which malaria was a cause of death, or if they discussed limitations/challenges related to measurement of MM through VA. Two authors independently searched the databases and websites and conducted a synthesis of articles using a standard matrix. The authors identified 828 publications; 88 were included in the final review. Most publications were VA studies; others were systematic reviews discussing VA tools or methods, editorials or commentaries, and studies using VA data to develop MM estimates. The main limitations were the low sensitivity and specificity of VA tools for measuring MM. Other limitations included the lack of standardized VA tools and methods, and the lack of a 'true' gold standard against which to assess the accuracy of VA-based malaria mortality. Existing VA tools and methods for measuring MM have limitations. Given the need for data to measure progress toward the World Health Organization's Global Technical Strategy for Malaria 2016-2030 goals, the malaria community should define strategies for improving MM estimates, including exploring whether VA tools and methods could be further improved. Longer-term strategies should focus on improving countries' vital registration systems for more robust and timely cause-of-death data.

  15. Cosine Kuramoto Based Distribution of a Convoy with Limit-Cycle Obstacle Avoidance Through the Use of Simulated Agents

    NASA Astrophysics Data System (ADS)

    Howerton, William

    This thesis presents a method for the integration of complex network control algorithms with localized agent-specific algorithms for maneuvering and obstacle avoidance. This method allows for successful implementation of both group and agent-specific behaviors. It has proven to be robust and will work for a variety of vehicle platforms. Initially, a review and implementation of two specific algorithms is detailed. The first, a modified Kuramoto model developed by Xu [1], utilizes tools from graph theory to efficiently perform the task of distributing agents. The second, developed by Kim [2], is an effective limit-cycle navigation method by which wheeled robots avoid local obstacles. The results of implementing these methods on a test-bed of wheeled robots are presented. Control issues related to outside disturbances not anticipated in the original theory are then discussed. A novel method of using simulated agents to separate the task of distributing agents from agent-specific velocity and heading commands has been developed and implemented to address these issues. This new method can be used to combine various behaviors and is not limited to a specific control algorithm.
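
    A graph-coupled Kuramoto update of the kind referenced above can be sketched as follows; the adjacency matrix, the gain, and the use of repulsive coupling to spread agents apart rather than synchronize them are illustrative assumptions, not the cited authors' exact formulation.

      # Forward-Euler integration of a graph-coupled Kuramoto model.
      import numpy as np

      rng = np.random.default_rng(4)
      N = 6
      theta = rng.uniform(0, 2 * np.pi, N)   # agent phases (positions on a circle)
      omega = np.zeros(N)                    # identical natural frequencies
      A = np.ones((N, N)) - np.eye(N)        # complete interaction graph
      K, dt = -1.0, 0.05                     # K < 0: repulsive coupling spreads agents

      for _ in range(2000):
          # coupling[i] = sum_j A[i, j] * sin(theta[j] - theta[i])
          coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
          theta += dt * (omega + (K / N) * coupling)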

  16. Enhanced Imaging of Specific Cell-Surface Glycosylation Based on Multi-FRET.

    PubMed

    Yuan, Baoyin; Chen, Yuanyuan; Sun, Yuqiong; Guo, Qiuping; Huang, Jin; Liu, Jianbo; Meng, Xiangxian; Yang, Xiaohai; Wen, Xiaohong; Li, Zenghui; Li, Lie; Wang, Kemin

    2018-05-15

    Cell-surface glycosylation contains abundant biological information that reflects the cell's physiological state, and it is of great value to image cell-surface glycosylation to elucidate its functions. Here we present a hybridization chain reaction (HCR)-based multifluorescence resonance energy transfer (multi-FRET) method for specific imaging of cell-surface glycosylation. By installing donors through metabolic glycan labeling and acceptors through aptamer-tethered nanoassemblies on the same glycoconjugate, intramolecular multi-FRET occurs owing to the short donor-acceptor distance. Benefiting from the amplification effect and spatial flexibility of the HCR nanoassemblies, enhanced multi-FRET imaging of specific cell-surface glycosylation can be obtained. With this HCR-based multi-FRET method, we achieved clear contrast in imaging of protein-specific GalNAcylation on 7211 cell surfaces. In addition, we demonstrated the general applicability of this method by visualizing protein-specific sialylation on CEM cell surfaces. Furthermore, the expression changes of CEM cell-surface protein-specific sialylation under drug treatment were accurately monitored. This imaging method may provide a powerful tool for researching glycosylation functions, discovering biomarkers, and screening drugs.

  17. An accuracy evaluation of clinical, arthrometric, and stress-sonographic acute ankle instability examinations.

    PubMed

    Wiebking, Ulrich; Pacha, Tarek Omar; Jagodzinski, Michael

    2015-03-01

    Ankle sprain injuries, often due to lateral ligamentous injury, are among the most common conditions in sports traumatology. Correct diagnosis requires assessment tools with a high degree of diagnostic accuracy. There is still no clear consensus or standard method to differentiate between a ligament tear and an ankle sprain. In addition to clinical assessment, stress sonography, arthrometry, and other methods are often performed simultaneously; these methods are often costly, however, and their accuracy is controversial. The aim of this study was to investigate the diagnostic accuracy of three different measurement tools that can be used after a lateral ligament lesion of the ankle with injury of the anterior talofibular ligament. Thirty patients were recruited for this study. The mean patient age was 35±14 years. There were 15 patients with a ligamentous rupture and 15 patients with an ankle sprain. We calculated sensitivity and specificity for two devices and one clinical assessment: stress sonography according to Hoffmann, an arthrometer used to investigate the 100 N talar drawer and maximum manual testing, and the clinical anterior drawer test. High-resolution sonography was used as the gold standard. The ultrasound-assisted device according to Hoffmann, with a 3 mm cut-off value, displayed a sensitivity of 0.27 and a specificity of 0.87. Using a 3.95 mm cut-off value, the arthrometer displayed a sensitivity of 0.8 and a specificity of 0.4. The clinical investigation's sensitivity and specificity were 0.93 and 0.67, respectively. Different assessment methods for diagnosing ankle ruptures are suggested in the literature; however, these methods lack reliable data from which to set investigation standards. Clinical examination under adequate analgesia seems to remain the most reliable tool for investigating ligamentous ankle lesions. Further clinical studies with higher case numbers are necessary, however, to evaluate these findings and to measure reliability. Copyright © 2014 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.

  18. SITE-SPECIFIC DIAGNOSTIC TOOLS

    EPA Science Inventory

    US EPA's Office of Water is proposing Combined Assessment and Listing Methods (CALM) to
    meet reporting requirements under both Sections 305(b) and 303(d) for chemical and nonchemical
    stressors in the nation's waterbodies. Current Environmental Monitoring and Assessment
    Prog...

  19. Sensitivity and specificity of the Eating Assessment Tool and the Volume-Viscosity Swallow Test for clinical evaluation of oropharyngeal dysphagia

    PubMed Central

    Rofes, L; Arreola, V; Mukherjee, R; Clavé, P

    2014-01-01

    Background Oropharyngeal dysphagia (OD) is an underdiagnosed digestive disorder that causes severe nutritional and respiratory complications. Our aim was to determine the accuracy of the Eating Assessment Tool (EAT-10) and the Volume-Viscosity Swallow Test (V-VST) for clinical evaluation of OD. Methods We studied 120 patients with swallowing difficulties and 14 healthy subjects. OD was evaluated by the 10-item screening questionnaire EAT-10 and the bedside method V-VST, videofluoroscopy (VFS) being the reference standard. The V-VST is an effort test that uses boluses of different volumes and viscosities to identify clinical signs of impaired efficacy (impaired labial seal, piecemeal deglutition, and residue) and impaired safety of swallow (cough, voice changes, and oxygen desaturation ≥3%). Discriminating ability was assessed by the AUC of the ROC curve and sensitivity and specificity values. Key Results According to VFS, prevalence of OD was 87%, 75.6% with impaired efficacy and 80.9% with impaired safety of swallow including 17.6% aspirations. The EAT-10 showed a ROC AUC of 0.89 for OD with an optimal cut-off at 2 (0.89 sensitivity and 0.82 specificity). The V-VST showed 0.94 sensitivity and 0.88 specificity for OD, 0.79 sensitivity and 0.75 specificity for impaired efficacy, 0.87 sensitivity and 0.81 specificity for impaired safety, and 0.91 sensitivity and 0.28 specificity for aspirations. Conclusions & Inferences Clinical methods for screening (EAT-10) and assessment (V-VST) of OD offer excellent psychometric properties that allow adequate management of OD. Their universal application among at-risk populations will improve the identification of patients with OD at risk for malnutrition and aspiration pneumonia. PMID:24909661
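
    The ROC analysis underlying a screening cut-off like the EAT-10's can be sketched as follows. The data are simulated stand-ins, and Youden's J is used to pick the threshold, a common convention rather than necessarily the authors' exact procedure:

    ```python
    # ROC curve, AUC, and an optimal cut-off from (simulated) screening scores.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 200)                      # 1 = dysphagia on VFS
    scores = y * rng.normal(8, 4, 200) + (1 - y) * rng.normal(1, 2, 200)

    fpr, tpr, thresholds = roc_curve(y, scores)
    auc = roc_auc_score(y, scores)
    best = np.argmax(tpr - fpr)                      # Youden's J statistic
    print(f"AUC={auc:.2f}, cut-off={thresholds[best]:.1f}, "
          f"sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f}")
    ```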

  20. Improving CRISPR-Cas specificity with chemical modifications in single-guide RNAs.

    PubMed

    Ryan, Daniel E; Taussig, David; Steinfeld, Israel; Phadnis, Smruti M; Lunstad, Benjamin D; Singh, Madhurima; Vuong, Xuan; Okochi, Kenji D; McCaffrey, Ryan; Olesiak, Magdalena; Roy, Subhadeep; Yung, Chong Wing; Curry, Bo; Sampson, Jeffrey R; Bruhn, Laurakay; Dellinger, Douglas J

    2018-01-25

    CRISPR systems have emerged as transformative tools for altering genomes in living cells with unprecedented ease, inspiring keen interest in increasing their specificity for perfectly matched targets. We have developed a novel approach for improving specificity by incorporating chemical modifications in guide RNAs (gRNAs) at specific sites in their DNA recognition sequence ('guide sequence') and systematically evaluating their on-target and off-target activities in biochemical DNA cleavage assays and cell-based assays. Our results show that a chemical modification (2'-O-methyl-3'-phosphonoacetate, or 'MP') incorporated at select sites in the ribose-phosphate backbone of gRNAs can dramatically reduce off-target cleavage activities while maintaining high on-target performance, as demonstrated in clinically relevant genes. These findings reveal a unique method for enhancing specificity by chemically modifying the guide sequence in gRNAs. Our approach introduces a versatile tool for augmenting the performance of CRISPR systems for research, industrial and therapeutic applications. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Improving CRISPR–Cas specificity with chemical modifications in single-guide RNAs

    PubMed Central

    Ryan, Daniel E; Taussig, David; Steinfeld, Israel; Phadnis, Smruti M; Lunstad, Benjamin D; Singh, Madhurima; Vuong, Xuan; Okochi, Kenji D; McCaffrey, Ryan; Olesiak, Magdalena; Roy, Subhadeep; Yung, Chong Wing; Curry, Bo; Sampson, Jeffrey R; Dellinger, Douglas J

    2018-01-01

    Abstract CRISPR systems have emerged as transformative tools for altering genomes in living cells with unprecedented ease, inspiring keen interest in increasing their specificity for perfectly matched targets. We have developed a novel approach for improving specificity by incorporating chemical modifications in guide RNAs (gRNAs) at specific sites in their DNA recognition sequence (‘guide sequence’) and systematically evaluating their on-target and off-target activities in biochemical DNA cleavage assays and cell-based assays. Our results show that a chemical modification (2′-O-methyl-3′-phosphonoacetate, or ‘MP’) incorporated at select sites in the ribose-phosphate backbone of gRNAs can dramatically reduce off-target cleavage activities while maintaining high on-target performance, as demonstrated in clinically relevant genes. These findings reveal a unique method for enhancing specificity by chemically modifying the guide sequence in gRNAs. Our approach introduces a versatile tool for augmenting the performance of CRISPR systems for research, industrial and therapeutic applications. PMID:29216382

  2. GIANT API: an application programming interface for functional genomics

    PubMed Central

    Roberts, Andrew M.; Wong, Aaron K.; Fisk, Ian; Troyanskaya, Olga G.

    2016-01-01

    GIANT API provides biomedical researchers programmatic access to tissue-specific and global networks in humans and model organisms, and to associated tools, which include functional re-prioritization of existing genome-wide association study (GWAS) data. Using tissue-specific interaction networks, researchers are able to predict relationships between genes specific to a tissue or cell lineage, identify the changing roles of genes across tissues and uncover disease-gene associations. Additionally, GIANT API enables computational tools like NetWAS, which leverages tissue-specific networks for re-prioritization of GWAS results. The web services covered by the API include 144 tissue-specific functional gene networks in human, global functional networks for human and six common model organisms and the NetWAS method. GIANT API conforms to the REST architecture, which makes it stateless, cacheable and highly scalable. It can be used by a diverse range of clients including web browsers, command terminals, programming languages and standalone apps for data analysis and visualization. The API is freely available for use at http://giant-api.princeton.edu. PMID:27098035
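
    Because the abstract specifies a stateless REST interface, access from a script reduces to ordinary HTTP GET requests. The base URL is taken from the abstract; the route and parameters below are hypothetical placeholders, so consult the service's documentation for the real endpoints:

    ```python
    # Sketch of REST access to a GIANT-style web service; route is hypothetical.
    import requests

    BASE = "http://giant-api.princeton.edu"

    def get_network_edges(tissue, genes):
        # Hypothetical endpoint; REST + JSON is the pattern the abstract describes.
        resp = requests.get(f"{BASE}/networks/{tissue}/edges",
                            params={"genes": ",".join(genes)}, timeout=30)
        resp.raise_for_status()
        return resp.json()   # stateless, cacheable responses per REST

    # edges = get_network_edges("blood", ["BRCA1", "TP53"])
    ```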

  3. What are the most effective methods for assessment of nutritional status in outpatients with gastric and colorectal cancer?

    PubMed

    Abe Vicente, Mariana; Barão, Katia; Silva, Tiago Donizetti; Forones, Nora Manoukian

    2013-01-01

    To evaluate methods for the identification of nutrition risk and nutritional status in outpatients with colorectal (CRC) and gastric cancer (GC), and to compare the results to those obtained for patients already treated for these cancers. A cross-sectional study was conducted on 137 patients: group 1 (n = 75) consisting of patients with GC or CRC, and group 2 (n = 62) consisting of patients under follow-up after treatment of GC or CRC, who had been tumor free for longer than 3 months. Nutritional status was assessed in these patients using objective methods [body mass index (BMI), phase angle, serum albumin]; nutritional screening tools [Malnutrition Universal Screening Tool (MUST), Malnutrition Screening Tool (MST), Nutritional Risk Index (NRI)]; and subjective assessment [Patient-Generated Subjective Global Assessment (PG-SGA)]. The sensitivity and specificity of each method were calculated in relation to the PG-SGA used as the gold standard. Stage IV cancer patients were more common in group 1. There was no difference in BMI between groups (p = 0.67). Analysis of the association between methods of assessing nutritional status and PG-SGA showed that the nutritional screening tools provided more significant results (p < 0.05) than the objective methods in the two groups. PG-SGA detected the highest proportion of undernourished patients in group 1. The nutritional screening tools MUST, NRI and MST were more sensitive than the objective methods. Phase angle measurement was the most sensitive objective method in group 1. The nutritional screening tools showed the best association with PG-SGA and were also more sensitive than the objective methods. The results suggest the combination of MUST and PG-SGA for patients with cancer before and after treatment. Copyright © AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.

  4. CardioClassifier: disease- and gene-specific computational decision support for clinical genome interpretation.

    PubMed

    Whiffin, Nicola; Walsh, Roddy; Govind, Risha; Edwards, Matthew; Ahmad, Mian; Zhang, Xiaolei; Tayal, Upasana; Buchan, Rachel; Midwinter, William; Wilk, Alicja E; Najgebauer, Hanna; Francis, Catherine; Wilkinson, Sam; Monk, Thomas; Brett, Laura; O'Regan, Declan P; Prasad, Sanjay K; Morris-Rosendahl, Deborah J; Barton, Paul J R; Edwards, Elizabeth; Ware, James S; Cook, Stuart A

    2018-01-25

    Purpose: Internationally adopted variant interpretation guidelines from the American College of Medical Genetics and Genomics (ACMG) are generic and require disease-specific refinement. Here we developed CardioClassifier (http://www.cardioclassifier.org), a semiautomated decision-support tool for inherited cardiac conditions (ICCs). Methods: CardioClassifier integrates data retrieved from multiple sources with user-input case-specific information, through an interactive interface, to support variant interpretation. Combining disease- and gene-specific knowledge with variant observations in large cohorts of cases and controls, we refined 14 computational ACMG criteria and created three ICC-specific rules. Results: We benchmarked CardioClassifier on 57 expertly curated variants and show full retrieval of all computational data, concordantly activating 87.3% of rules. A generic annotation tool identified fewer than half as many clinically actionable variants (64/219 vs. 156/219, Fisher's P = 1.1 × 10⁻¹⁸), with important false positives, illustrating the critical importance of disease- and gene-specific annotations. CardioClassifier identified putatively disease-causing variants in 33.7% of 327 cardiomyopathy cases, comparable with leading ICC laboratories. Through addition of manually curated data, variants found in over 40% of cardiomyopathy cases are fully annotated, without requiring additional user-input data. Conclusion: CardioClassifier is an ICC-specific decision-support tool that integrates expertly curated computational annotations with case-specific data to generate fast, reproducible, and interactive variant pathogenicity reports, according to best practice guidelines. GENETICS in MEDICINE advance online publication, 25 January 2018; doi:10.1038/gim.2017.258.

  5. Aquatic ecosystem protection and restoration: Advances in methods for assessment and evaluation

    USGS Publications Warehouse

    Bain, M.B.; Harig, A.L.; Loucks, D.P.; Goforth, R.R.; Mills, K.E.

    2000-01-01

    Many methods and criteria are available to assess aquatic ecosystems, and this review focuses on a set that demonstrates advancements from community analyses to methods spanning large spatial and temporal scales. Basic methods have been extended by incorporating taxa sensitivity to different forms of stress, adding measures linked to system function, synthesizing multiple faunal groups, integrating biological and physical attributes, spanning large spatial scales, and enabling simulations through time. These tools can be customized to meet the needs of a particular assessment and ecosystem. Two case studies are presented to show how new methods were applied at the ecosystem scale to achieve practical management goals. One case used an assessment of biotic structure to demonstrate how enhanced river flows can improve habitat conditions and restore a diverse fish fauna reflective of a healthy riverine ecosystem. In the second case, multitaxonomic integrity indicators were successful in distinguishing lake ecosystems that were disturbed, healthy, and in the process of restoration. Most methods strive to address the concept of biological integrity, and assessment effectiveness can often be impeded by the lack of more specific ecosystem management objectives. Scientific and policy explorations are needed to define new ways of designating a healthy system so as to allow specification of precise quality criteria that will promote further development of ecosystem analysis tools.

  6. Note: A phase synchronization photography method for AC discharge.

    PubMed

    Wu, Zhicheng; Zhang, Qiaogen; Ma, Jingtan; Pang, Lei

    2018-05-01

    To research discharge physics under AC voltage, a phase synchronization photography method is presented. By using a permanent-magnet synchronous motor to drive a photography mask synchronized with a discharge power supply, discharge images in a specific phase window can be recorded. Some examples of discharges photographed by this method, including the corona discharge in SF6 and the corona discharge along the air/epoxy surface, demonstrate the feasibility of this method. Therefore, this method provides an effective tool for discharge physics researchers.

  7. Note: A phase synchronization photography method for AC discharge

    NASA Astrophysics Data System (ADS)

    Wu, Zhicheng; Zhang, Qiaogen; Ma, Jingtan; Pang, Lei

    2018-05-01

    To research discharge physics under AC voltage, a phase synchronization photography method is presented. By using a permanent-magnet synchronous motor to drive a photography mask synchronized with a discharge power supply, discharge images in a specific phase window can be recorded. Some examples of discharges photographed by this method, including the corona discharge in SF6 and the corona discharge along the air/epoxy surface, demonstrate the feasibility of this method. Therefore, this method provides an effective tool for discharge physics researchers.

  8. Presenting an Evaluation Model for the Cancer Registry Software.

    PubMed

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence continues to grow, cancer registries are of great importance as the core of cancer control programs, and many different software packages have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential for evaluating and comparing a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documentation and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen as the evaluation method. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while aiming for comprehensive coverage of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  9. Automatic 3D segmentation of multiphoton images: a key step for the quantification of human skin.

    PubMed

    Decencière, Etienne; Tancrède-Bohin, Emmanuelle; Dokládal, Petr; Koudoro, Serge; Pena, Ana-Maria; Baldeweck, Thérèse

    2013-05-01

    Multiphoton microscopy has emerged in the past decade as a useful noninvasive imaging technique for in vivo human skin characterization. However, it has not been used until now in evaluation clinical trials, mainly because of the lack of specific image processing tools that would allow the investigator to extract pertinent quantitative three-dimensional (3D) information from the different skin components. We propose a 3D automatic segmentation method for multiphoton images, which is a key step for epidermis and dermis quantification. This method, based on the morphological watershed and graph cuts algorithms, takes into account the real shape of the skin surface and of the dermal-epidermal junction, and allows the epidermis and the superficial dermis to be separated in 3D. The automatic segmentation method and the associated quantitative measurements have been developed and validated on a clinical database designed for aging characterization. The segmentation achieves its goals for epidermis-dermis separation and allows quantitative measurements inside the different skin compartments with sufficient relevance. This study shows that multiphoton microscopy associated with specific image processing tools provides access to new quantitative measurements on the various skin components. The proposed 3D automatic segmentation method will contribute to building a powerful tool for characterizing human skin condition. To our knowledge, this is the first 3D approach to the segmentation and quantification of these original images. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
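
    As a rough illustration of the marker-based watershed idea at the heart of such a pipeline (the published method adds graph cuts and surface-shape handling that this sketch omits), one can segment a 3D stack from intensity-derived seeds. The volume, seed thresholds, and gradient choice below are all assumptions:

    ```python
    # Minimal 3D marker-based watershed on a synthetic volume (z, y, x).
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.segmentation import watershed

    vol = np.random.rand(32, 128, 128)        # placeholder for a real stack

    # Gradient magnitude: high at boundaries like the dermal-epidermal junction
    gradient = ndi.generic_gradient_magnitude(vol, ndi.sobel)

    markers = np.zeros_like(vol, dtype=np.int32)
    markers[vol < 0.2] = 1                    # seed: dermis/background proxy
    markers[vol > 0.8] = 2                    # seed: epidermis proxy
    labels = watershed(gradient, markers)     # 3D region growing from seeds
    print(np.unique(labels))                  # -> [1 2]
    ```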

  10. Applying anthropology to eliminate tobacco-related health disparities.

    PubMed

    Goldade, Kate; Burgess, Diana; Olayinka, Abimbola; Whembolua, Guy Lucien S; Okuyemi, Kolawole S

    2012-06-01

    Disparities in tobacco's harm persist. Declines in smoking among the general population have not been experienced to the same extent by vulnerable populations. Innovative strategies are required to diminish disparities in tobacco's harm. As novel tools, anthropological concepts and methods may be applied to improve the design and outcomes of tobacco cessation interventions. We reviewed over 60 articles published in peer-reviewed journals since 1995 for content on anthropology and smoking cessation. The specific questions framing the review were: (a) "How can lessons learned from anthropological studies of smoking improve the design and effectiveness of smoking cessation interventions?" (b) How can anthropology be applied to diminish disparities in smoking cessation? and (c) How can qualitative methods be used most effectively in smoking cessation intervention research? Three specific disciplinary tools were identified and examined: (a) culture, (b) reflexivity, and (c) qualitative methods. Examining culture as a dynamic influence and understanding the utilities of smoking in a particular group is a precursor to promoting cessation. Reflexivity enables a deeper understanding of how smokers perceive quitting and smoking beyond addiction and individual health consequences. Qualitative methods may be used to elicit in-depth perspectives on quitting, insights to inform existing community-based strategies for making behavior changes, and detailed preferences for cessation treatment or programs. Anthropological tools can be used to improve the effectiveness of intervention research studies targeting individuals from vulnerable groups. Synthesized applications of anthropological concepts can be used to facilitate translation of findings into clinical practice for providers addressing tobacco cessation in vulnerable populations.

  11. Computational challenges, tools and resources for analyzing co- and post-transcriptional events in high throughput

    PubMed Central

    Bahrami-Samani, Emad; Vo, Dat T.; de Araujo, Patricia Rosa; Vogel, Christine; Smith, Andrew D.; Penalva, Luiz O. F.; Uren, Philip J.

    2014-01-01

    Co- and post-transcriptional regulation of gene expression is complex and multi-faceted, spanning the complete RNA lifecycle from genesis to decay. High-throughput profiling of the constituent events and processes is achieved through a range of technologies that continue to expand and evolve. Fully leveraging the resulting data is non-trivial, and requires the use of computational methods and tools carefully crafted for specific data sources and often intended to probe particular biological processes. Drawing upon databases of information pre-compiled by other researchers can further elevate analyses. Within this review, we describe the major co- and post-transcriptional events in the RNA lifecycle that are amenable to high-throughput profiling. We place specific emphasis on the analysis of the resulting data, in particular the computational tools and resources available, as well as looking towards future challenges that remain to be addressed. PMID:25515586

  12. Applying Quality Function Deployment Model in Burn Unit Service Improvement.

    PubMed

    Keshtkaran, Ali; Hashemi, Neda; Kharazmi, Erfan; Abbasi, Mehdi

    2016-01-01

    Quality function deployment (QFD) is one of the most effective quality design tools. This study applies the QFD technique to improve the quality of burn unit services at Ghotbedin Hospital in Shiraz, Iran. First, the patients' expectations of burn unit services and their priorities were determined through the Delphi method. Thereafter, burn unit service specifications were determined, also through the Delphi method. Further, the relationships between the patients' expectations and service specifications, as well as the relationships among service specifications, were determined through an expert group's opinion. Last, the final importance scores of the service specifications were calculated through the simple additive weighting method. The findings show that burn unit patients have 40 expectations in six different areas. These expectations are in 16 priority levels. Burn units also have 45 service specifications in six different areas. There are four-level relationships between the patients' expectations and service specifications and four-level relationships between service specifications. The most important burn unit service specifications have been identified in this study. The QFD model developed in the study can be a general guideline for QFD planners and executives.
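
    The final scoring step, simple additive weighting, is just a weighted sum of relationship strengths. A minimal sketch with invented numbers (the study's actual Delphi weights and 40 x 45 relationship matrix are not reproduced here):

    ```python
    # Simple additive weighting over a QFD-style relationship matrix.
    import numpy as np

    expectation_weights = np.array([0.4, 0.35, 0.25])   # patient priorities
    # rel[i, j]: strength of link between expectation i and service spec j
    rel = np.array([[9, 3, 0, 1],
                    [3, 9, 1, 0],
                    [0, 1, 9, 3]], dtype=float)

    scores = expectation_weights @ rel        # importance of each service spec
    ranking = np.argsort(scores)[::-1]        # most important spec first
    print(scores, ranking)
    ```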

  13. The Objective Identification and Quantification of Interstitial Lung Abnormalities in Smokers.

    PubMed

    Ash, Samuel Y; Harmouche, Rola; Ross, James C; Diaz, Alejandro A; Hunninghake, Gary M; Putman, Rachel K; Onieva, Jorge; Martinez, Fernando J; Choi, Augustine M; Lynch, David A; Hatabu, Hiroto; Rosas, Ivan O; Estepar, Raul San Jose; Washko, George R

    2017-08-01

    Previous investigation suggests that visually detected interstitial changes in the lung parenchyma of smokers are highly clinically relevant and predict outcomes, including death. Visual subjective analysis to detect these changes is time-consuming, insensitive to subtle changes, and requires training to enhance reproducibility. Objective detection of such changes could provide a method of disease identification without these limitations. The goal of this study was to develop and test a fully automated image processing tool to objectively identify radiographic features associated with interstitial abnormalities in the computed tomography scans of a large cohort of smokers. An automated tool that uses local histogram analysis combined with distance from the pleural surface was used to detect radiographic features consistent with interstitial lung abnormalities in computed tomography scans from 2257 individuals from the Genetic Epidemiology of COPD study, a longitudinal observational study of smokers. The sensitivity and specificity of this tool were determined based on its ability to detect the visually identified presence of these abnormalities. The tool had a sensitivity of 87.8% and a specificity of 57.5% for the detection of interstitial lung abnormalities, with a c-statistic of 0.82, and was 100% sensitive and 56.7% specific for the detection of the visual subtype of interstitial abnormalities called fibrotic parenchymal abnormalities, with a c-statistic of 0.89. In smokers, a fully automated image processing tool is able to identify those individuals who have interstitial lung abnormalities with moderate sensitivity and specificity. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
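
    A toy version of the two feature families the tool combines, local intensity histograms and distance from the pleural surface, is sketched below on a 2D slice. The patch size, histogram bins, and placeholder mask are arbitrary assumptions; the real tool operates on full 3D CT volumes:

    ```python
    # Local-histogram + pleural-distance features on a synthetic CT slice.
    import numpy as np
    from scipy import ndimage as ndi

    ct = np.random.normal(-700, 150, (256, 256))   # synthetic HU values
    lung = np.zeros((256, 256), dtype=bool)
    lung[32:224, 32:224] = True                    # placeholder lung mask
    dist = ndi.distance_transform_edt(lung)        # voxel distance to pleura

    def patch_features(img, cy, cx, half=8, bins=np.arange(-1000, 200, 100)):
        patch = img[cy - half:cy + half, cx - half:cx + half]
        hist, _ = np.histogram(patch, bins=bins, density=True)
        return np.append(hist, dist[cy, cx])       # histogram + pleural distance

    print(patch_features(ct, 128, 128).shape)      # feature vector per location
    ```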

  14. Recombinase-Mediated Cassette Exchange Using Adenoviral Vectors.

    PubMed

    Kolb, Andreas F; Knowles, Christopher; Pultinevicius, Patrikas; Harbottle, Jennifer A; Petrie, Linda; Robinson, Claire; Sorrell, David A

    2017-01-01

    Site-specific recombinases are important tools for the modification of mammalian genomes. In conjunction with viral vectors, they can be utilized to mediate site-specific gene insertions in animals and in cell lines which are difficult to transfect. Here we describe a method for the generation and analysis of an adenovirus vector supporting a recombinase-mediated cassette exchange reaction and discuss the advantages and limitations of this approach.

  15. Environmental DNA as a Tool for Inventory and Monitoring of Aquatic Vertebrates

    DTIC Science & Technology

    2017-07-01

    geomorphic calculations and description of each reach. Methods: Channel Surveys. We initially selected reaches based on access and visual indicators... Environmental DNA lab protocol: designing species-specific qPCR assays. Species-specific surveys should use quantitative polymerase... to traditional field sampling with respect to sensitivity, detection probabilities, and cost efficiency. Compared to field surveys, eDNA sampling...

  16. Applying Anthropology to Eliminate Tobacco-Related Health Disparities

    PubMed Central

    Burgess, Diana; Olayinka, Abimbola; Whembolua, Guy Lucien S.; Okuyemi, Kolawole S.

    2012-01-01

    Introduction: Disparities in tobacco’s harm persist. Declines in smoking among the general population have not been experienced to the same extent by vulnerable populations. Innovative strategies are required to diminish disparities in tobacco’s harm. As novel tools, anthropological concepts and methods may be applied to improve the design and outcomes of tobacco cessation interventions. Methods: We reviewed over 60 articles published in peer-reviewed journals since 1995 for content on anthropology and smoking cessation. The specific questions framing the review were: (a) “How can lessons learned from anthropological studies of smoking improve the design and effectiveness of smoking cessation interventions?” (b) How can anthropology be applied to diminish disparities in smoking cessation? and (c) How can qualitative methods be used most effectively in smoking cessation intervention research? Results: Three specific disciplinary tools were identified and examined: (a) culture, (b) reflexivity, and (c) qualitative methods. Examining culture as a dynamic influence and understanding the utilities of smoking in a particular group is a precursor to promoting cessation. Reflexivity enables a deeper understanding of how smokers perceive quitting and smoking beyond addiction and individual health consequences. Qualitative methods may be used to elicit in-depth perspectives on quitting, insights to inform existing community-based strategies for making behavior changes, and detailed preferences for cessation treatment or programs. Conclusions: Anthropological tools can be used to improve the effectiveness of intervention research studies targeting individuals from vulnerable groups. Synthesized applications of anthropological concepts can be used to facilitate translation of findings into clinical practice for providers addressing tobacco cessation in vulnerable populations. PMID:22271609

  17. Young children's tool innovation across culture: Affordance visibility matters.

    PubMed

    Neldner, Karri; Mushin, Ilana; Nielsen, Mark

    2017-11-01

    Young children typically demonstrate low rates of tool innovation. However, previous studies have limited children's performance by presenting tools with opaque affordances. In an attempt to scaffold children's understanding of what constitutes an appropriate tool within an innovation task, we compared tools in which the focal affordance was visible to those in which it was opaque. To evaluate possible cultural specificity, data collection was undertaken in a Western urban population and a remote Indigenous community. As expected, affordance visibility altered innovation rates: young children were more likely to innovate on a tool that had visible affordances than on one with concealed affordances. Furthermore, innovation rates were higher than those reported in previous innovation studies. Cultural background did not affect children's rates of tool innovation. It is suggested that new methods for testing tool innovation in children must be developed in order to broaden our knowledge of young children's tool innovation capabilities. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Perceived Utility of Pharmacy Licensure Examination Preparation Tools

    PubMed Central

    Peak, Amy Sutton; Sheehan, Amy Heck; Arnett, Stephanie

    2006-01-01

    Objectives To identify board examination preparation tools most commonly used by recent pharmacy graduates and determine which tools are perceived as most valuable and representative of the actual content of licensure examinations. Methods An electronic survey was sent to all 2004 graduates of colleges of pharmacy in Indiana. Participants identified which specific preparation tools were used and rated tools based on usefulness, representativeness of licensure examination, and monetary value, and provided overall recommendations to future graduates. Results The most commonly used preparation tools were the Pharmacy Law Review Session offered by Dr. Thomas Wilson at Purdue University, the Complete Review for Pharmacy, Pre-NAPLEX, PharmPrep, and the Kaplan NAPLEX Review. Tools receiving high ratings in all categories included Dr. Wilson's Pharmacy Law Review Session, Pre-NAPLEX, Comprehensive Pharmacy Review, Kaplan NAPLEX Review, and Review of Pharmacy. Conclusions Although no preparation tool was associated with a higher examination pass rate, certain tools were clearly rated higher than others by test takers. PMID:17149406

  19. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, used for target KA tools, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  20. Design and ergonomics. Methods for integrating ergonomics at hand tool design stage.

    PubMed

    Marsot, Jacques; Claudon, Laurent

    2004-01-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries, and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute (INRS) launched a research project in 1999 on integrating ergonomics into hand tool design, and more particularly into the design of a boning knife. After a brief recall of the difficulties of integrating ergonomics at the design stage, the present paper shows how three design methodological tools--Functional Analysis, Quality Function Deployment and TRIZ--have been applied to the design of a boning knife. Implementation of these tools enabled us to demonstrate the extent to which they are capable of responding to the difficulties of integrating ergonomics into product design.

  1. Advanced genetic tools for plant biotechnology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  2. Advanced genetic tools for plant biotechnology.

    PubMed

    Liu, Wusheng; Yuan, Joshua S; Stewart, C Neal

    2013-11-01

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  3. Unsupervised automated high throughput phenotyping of RNAi time-lapse movies.

    PubMed

    Failmezger, Henrik; Fröhlich, Holger; Tresch, Achim

    2013-10-04

    Gene perturbation experiments in combination with fluorescence time-lapse cell imaging are a powerful tool in reverse genetics. High content applications require tools for the automated processing of the large amounts of data. In general, these tools comprise several image processing steps, the extraction of morphological descriptors, and the grouping of cells into phenotype classes according to their descriptors. This phenotyping can be applied in a supervised or an unsupervised manner. Unsupervised methods are suitable for the discovery of formerly unknown phenotypes, which are expected to occur in high-throughput RNAi time-lapse screens. We developed an unsupervised phenotyping approach based on Hidden Markov Models (HMMs) with multivariate Gaussian emissions for the detection of knockdown-specific phenotypes in RNAi time-lapse movies. The automated detection of abnormal cell morphologies allows us to assign a phenotypic fingerprint to each gene knockdown. By applying our method to the Mitocheck database, we show that a phenotypic fingerprint is indicative of a gene's function. Our fully unsupervised HMM-based phenotyping is able to automatically identify cell morphologies that are specific for a certain knockdown. Beyond the identification of genes whose knockdown affects cell morphology, phenotypic fingerprints can be used to find modules of functionally related genes.
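
    The core model class, an HMM with multivariate Gaussian emissions fit without labels, can be sketched with the hmmlearn package. The descriptor vectors here are synthetic stand-ins for the morphological features the pipeline extracts from movies:

    ```python
    # Unsupervised Gaussian-emission HMM phenotyping sketch (requires hmmlearn).
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(1)
    # One movie: per-frame morphological descriptor vectors (e.g. area, shape)
    X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(3, 1, (60, 4))])

    model = GaussianHMM(n_components=3, covariance_type="full", n_iter=50)
    model.fit(X)                       # Baum-Welch, fully unsupervised
    states = model.predict(X)          # per-frame morphology class
    fingerprint = np.bincount(states, minlength=3) / len(states)
    print(fingerprint)                 # knockdown-specific phenotype signature
    ```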

  4. Assessing the Pathogenicity of Insertion and Deletion Variants with the Variant Effect Scoring Tool (VEST‐Indel)

    PubMed Central

    Douville, Christopher; Masica, David L.; Stenson, Peter D.; Cooper, David N.; Gygax, Derek M.; Kim, Rick; Ryan, Michael

    2015-01-01

    ABSTRACT Insertion/deletion variants (indels) alter protein sequence and length, yet are highly prevalent in healthy populations, presenting a challenge to bioinformatics classifiers. Commonly used features—DNA and protein sequence conservation, indel length, and occurrence in repeat regions—are useful for inference of protein damage. However, these features can cause false positives when predicting the impact of indels on disease. Existing methods for indel classification suffer from low specificities, severely limiting clinical utility. Here, we further develop our variant effect scoring tool (VEST) to include the classification of in‐frame and frameshift indels (VEST‐indel) as pathogenic or benign. We apply 24 features, including a new “PubMed” feature, to estimate a gene's importance in human disease. When compared with four existing indel classifiers, our method achieves a drastically reduced false‐positive rate, improving specificity by as much as 90%. This approach of estimating gene importance might be generally applicable to missense and other bioinformatics pathogenicity predictors, which often fail to achieve high specificity. Finally, we tested all possible meta‐predictors that can be obtained from combining the four different indel classifiers using Boolean conjunctions and disjunctions, and derived a meta‐predictor with improved performance over any individual method. PMID:26442818

  5. Assessing the Pathogenicity of Insertion and Deletion Variants with the Variant Effect Scoring Tool (VEST-Indel).

    PubMed

    Douville, Christopher; Masica, David L; Stenson, Peter D; Cooper, David N; Gygax, Derek M; Kim, Rick; Ryan, Michael; Karchin, Rachel

    2016-01-01

    Insertion/deletion variants (indels) alter protein sequence and length, yet are highly prevalent in healthy populations, presenting a challenge to bioinformatics classifiers. Commonly used features--DNA and protein sequence conservation, indel length, and occurrence in repeat regions--are useful for inference of protein damage. However, these features can cause false positives when predicting the impact of indels on disease. Existing methods for indel classification suffer from low specificities, severely limiting clinical utility. Here, we further develop our variant effect scoring tool (VEST) to include the classification of in-frame and frameshift indels (VEST-indel) as pathogenic or benign. We apply 24 features, including a new "PubMed" feature, to estimate a gene's importance in human disease. When compared with four existing indel classifiers, our method achieves a drastically reduced false-positive rate, improving specificity by as much as 90%. This approach of estimating gene importance might be generally applicable to missense and other bioinformatics pathogenicity predictors, which often fail to achieve high specificity. Finally, we tested all possible meta-predictors that can be obtained from combining the four different indel classifiers using Boolean conjunctions and disjunctions, and derived a meta-predictor with improved performance over any individual method. © 2015 The Authors. Human Mutation published by Wiley Periodicals, Inc.
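
    The meta-predictor search described in the closing sentence is small enough to enumerate directly: with four classifiers there are only a handful of pure conjunctions and disjunctions. A sketch on simulated labels, scoring by specificity alone for brevity, whereas the paper weighs overall performance:

    ```python
    # Enumerate Boolean AND/OR meta-predictors over four binary classifiers.
    import itertools
    import numpy as np

    rng = np.random.default_rng(2)
    y = rng.integers(0, 2, 500).astype(bool)        # truth: pathogenic or not
    clfs = {f"clf{i}": y ^ (rng.random(500) < 0.2)  # four noisy predictors
            for i in range(4)}

    def specificity(y_true, y_pred):
        neg = ~y_true                               # benign variants
        return np.mean(~y_pred[neg])                # fraction correctly cleared

    results = []
    for op_name, op in (("AND", np.logical_and), ("OR", np.logical_or)):
        for r in (2, 3, 4):
            for combo in itertools.combinations(clfs, r):
                meta = op.reduce([clfs[c] for c in combo])
                results.append((specificity(y, meta), op_name, combo))
    print(max(results))    # most specific Boolean meta-predictor
    ```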

  6. The Shock and Vibration Digest. Volume 15, Number 7

    DTIC Science & Technology

    1983-07-01

    ...an important analytical tool, the statistical energy analysis method, has been the subject... systems noise -- for example, from a specific metal, chain driven con-... "Experimental Determination of Vibration Parameters Required in the Statistical Energy Analysis Method"... 31. Dubowsky, S. and Morris, T.L., "An... "Coupling Loss Factors for Statistical Energy Analysis of Sound Transmission"... 55. Upton, R., "Sound Intensity - A Powerful New Measurement Tool," S/V, Sound...

  7. Advanced Neutronics Tools for BWR Design Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santamarina, A.; Hfaiedh, N.; Letellier, R.

    2006-07-01

    This paper summarizes the developments implemented in the new APOLLO2.8 neutronics tool to meet the required target accuracy in LWR applications, particularly void effects and pin-by-pin power maps in BWRs. The Method Of Characteristics was developed to allow efficient LWR assembly calculations in 2D-exact heterogeneous geometry; resonant reaction calculation was improved by the optimized SHEM-281 group mesh, which avoids the resonance self-shielding approximation below 23 eV, and by the new space-dependent method for resonant mixtures that accounts for resonance overlapping. Furthermore, a new library, CEA2005, processed from JEFF3.1 evaluations involving feedback from Critical Experiments and LWR P.I.E., is used. The specific '2005-2007 BWR Plan' established to demonstrate the validation/qualification of this neutronics tool is described. Some results from the validation process are presented: the comparison of APOLLO2.8 results to reference Monte Carlo TRIPOLI4 results on specific BWR benchmarks emphasizes the ability of the deterministic tool to calculate the BWR assembly multiplication factor within 200 pcm accuracy for void fractions varying from 0 to 100%. The qualification process against the BASALA mock-up experiment stresses APOLLO2.8/CEA2005 performance: pin-by-pin power is always predicted within 2% accuracy, and the reactivity worth of B4C or Hf cruciform control blades, as well as Gd pins, is predicted within 1.2% accuracy. (authors)

  8. Exposure assessment in health assessments for hand-arm vibration syndrome.

    PubMed

    Mason, H J; Poole, K; Young, C

    2011-08-01

    Assessing past cumulative vibration exposure is part of assessing the risk of hand-arm vibration syndrome (HAVS) in workers exposed to hand-arm vibration and invariably forms part of a medical assessment of such workers. To investigate the strength of relationships between the presence and severity of HAVS and different cumulative exposure metrics obtained from a self-reporting questionnaire. Cumulative exposure metrics were constructed from a tool-based questionnaire applied in a group of HAVS referrals and workplace field studies. These metrics included simple years of vibration exposure, cumulative total hours of all tool use, and differing combinations of acceleration magnitudes for specific tools and their daily use, including the current frequency-weighting method contained in ISO 5349-1:2001. Use of simple years of exposure is a weak predictor of HAVS or its increasing severity. The calculation of cumulative hours across all vibrating tools used is a more powerful predictor. More complex calculations involving likely acceleration data for specific classes of tools, whether frequency weighted or not, did not offer a clear further advantage in this dataset. This may be due to the uncertainty associated with workers' recall of their past tool usage or the variability between tools in the magnitude of their vibration emission. Assessing years of exposure or 'latency' in a worker should be replaced by cumulative hours of tool use. This can be readily obtained using a tool-pictogram-based self-reporting questionnaire and a simple spreadsheet calculation.
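
    The competing exposure metrics are simple to compute once a tool-use history has been elicited. A sketch under invented magnitudes and hours; the last lines show the ISO 5349-1:2001 daily exposure normalization A(8) = a_hv * sqrt(T / 8 h) for context:

    ```python
    # Three exposure metrics from a self-reported tool-use history (toy values).
    import math

    # (tool, frequency-weighted magnitude a_hv in m/s^2, total hours recalled)
    history = [("grinder", 4.0, 1200.0), ("impact wrench", 6.0, 300.0)]

    years_exposed = 10.0                                  # simple "latency"
    cumulative_hours = sum(h for _, _, h in history)      # stronger predictor
    energy_dose = sum(a**2 * h for _, a, h in history)    # a_hv^2-weighted dose

    a_hv, t_daily = 4.0, 2.0                              # one day on a grinder
    A8 = a_hv * math.sqrt(t_daily / 8.0)                  # ISO 5349-1 daily A(8)
    print(years_exposed, cumulative_hours, energy_dose, round(A8, 2))
    ```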

  9. A visualization method for teaching the geometric design of highways

    DOT National Transportation Integrated Search

    2000-04-11

    In this project the authors employed state-of-the-art technology for developing visualization tools for teaching highway design. Specifically, the authors used photolog images as the basis for developing dynamic 3-D models of selected geometric eleme...

  10. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508
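
    For a feel of the first (linear) category, the simplest real-time-capable interaction models reduce the tissue response at the tool tip to a spring-damper (Kelvin-Voigt) law. This is a pedagogical stand-in with made-up stiffness and damping values, not any particular simulator's actual model:

    ```python
    # Kelvin-Voigt tool-tip reaction force: f = k*x + b*v (illustrative values).
    def tool_tissue_force(depth_m, velocity_m_s, k=800.0, b=15.0):
        """Reaction force (N) for a given indentation depth and velocity."""
        if depth_m <= 0.0:          # tool not in contact with tissue
            return 0.0
        return k * depth_m + b * velocity_m_s

    # e.g. 5 mm indentation while advancing at 10 mm/s
    print(tool_tissue_force(0.005, 0.010))   # ~4.15 N
    ```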

  11. Patient preference to use a questionnaire varies according to attributes.

    PubMed

    Kim, Na Yae; Richardson, Lyndsay; He, Weilin; Jones, Glenn

    2011-08-01

    Health care professionals may assume questionnaires are burdensome to patients, which limits their use in clinical settings and promotes simplification. However, patient adherence may improve by optimizing questionnaire attributes and contexts. This cross-sectional survey used Contingent Valuation methods to directly elicit patient preference for conventional monitoring of symptoms versus adding a tool to monitoring. Under explicit consideration was the 10-question Edmonton Symptom Assessment System (ESAS). In the questionnaire, attributes of ESAS were sequentially altered to try to force preference reversal. A separate group of participants completed both questionnaire and interviews to explore questionnaire reliability and extend validity. Overall, 24 of 43 participants preferred using ESAS. The attributes most important to preference were frequency, specificity, and complexity. Where preference is initially against ESAS, it may reverse by simplifying the tool and its administrative processes. Interviews with 10 additional participants supported the reproducibility and validity of the questionnaire method. Preference for using tools increases when tools are made relevant and used more appropriately. Questionnaires completed by patients as screening tools or aids to communication may be under-utilized. Optimization of ESAS and similar tools may be guided by empirical findings, including those obtained from Contingent Valuation methodologies. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. An Object-Based Approach to Evaluation of Climate Variability Projections and Predictions

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.

    2017-12-01

    Evaluations of the performance of earth system model predictions and projections are of critical importance to enhance usefulness of these products. Such evaluations need to address specific concerns depending on the system and decisions of interest; hence, evaluation tools must be tailored to inform about specific issues. Traditional approaches that summarize grid-based comparisons of analyses and models, or between current and future climate, often do not reveal important information about the models' performance (e.g., spatial or temporal displacements; the reason behind a poor score) and are unable to accommodate these specific information needs. For example, summary statistics such as the correlation coefficient or the mean-squared error provide minimal information to developers, users, and decision makers regarding what is "right" and "wrong" with a model. New spatial and temporal-spatial object-based tools from the field of weather forecast verification (where comparisons typically focus on much finer temporal and spatial scales) have been adapted to more completely answer some of the important earth system model evaluation questions. In particular, the Method for Object-based Diagnostic Evaluation (MODE) tool and its temporal (three-dimensional) extension (MODE-TD) have been adapted for these evaluations. More specifically, these tools can be used to address spatial and temporal displacements in projections of El Nino-related precipitation and/or temperature anomalies, ITCZ-associated precipitation areas, atmospheric rivers, seasonal sea-ice extent, and other features of interest. Examples of several applications of these tools in a climate context will be presented, using output of the CESM large ensemble. In general, these tools provide diagnostic information about model performance - accounting for spatial, temporal, and intensity differences - that cannot be achieved using traditional (scalar) model comparison approaches. Thus, they can provide more meaningful information that can be used in decision-making and planning. Future extensions and applications of these tools in a climate context will be considered.
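
    The object-based idea can be illustrated compactly (this is a conceptual sketch, not the MODE code itself): threshold a field into coherent objects, then compare object attributes such as centroids between observation and model to expose the spatial displacement that a grid-point score would hide:

    ```python
    # Object-based comparison of two anomaly fields via labeled-object centroids.
    import numpy as np
    from scipy import ndimage as ndi

    def object_centroids(field, thresh=1.0):
        labeled, n = ndi.label(field > thresh)
        return ndi.center_of_mass(field, labeled, range(1, n + 1))

    obs = np.zeros((90, 180))
    obs[40:50, 60:80] = 3.0            # one coherent anomaly "object"
    mod = np.roll(obs, 12, axis=1)     # model places it 12 cells east

    for (yo, xo), (ym, xm) in zip(object_centroids(obs), object_centroids(mod)):
        print(f"object displacement: dy={ym - yo:+.1f}, dx={xm - xo:+.1f} cells")
    ```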

  13. On the unsupervised analysis of domain-specific Chinese texts

    PubMed Central

    Deng, Ke; Bol, Peter K.; Li, Kate J.; Liu, Jun S.

    2016-01-01

    With the growing availability of digitized text data both publicly and privately, there is a great need for effective computational tools to automatically extract information from texts. Because the Chinese language differs most significantly from alphabet-based languages in not specifying word boundaries, most existing Chinese text-mining methods require a prespecified vocabulary and/or a large relevant training corpus, which may not be available in some applications. We introduce an unsupervised method, top-down word discovery and segmentation (TopWORDS), for simultaneously discovering and segmenting words and phrases from large volumes of unstructured Chinese texts, and propose ways to order discovered words and conduct higher-level context analyses. TopWORDS is particularly useful for mining online and domain-specific texts where the underlying vocabulary is unknown or the texts of interest differ significantly from available training corpora. When outputs from TopWORDS are fed into context analysis tools such as topic modeling, word embedding, and association pattern finding, the results are as good as or better than that from using outputs of a supervised segmentation method. PMID:27185919

  14. Investigation on Effect of Material Hardness in High Speed CNC End Milling Process.

    PubMed

    Dhandapani, N V; Thangarasu, V S; Sureshkannan, G

    2015-01-01

    This research paper analyzes the effects of material properties on surface roughness, material removal rate, and tool wear in high speed CNC end milling of various ferrous and nonferrous materials. It addresses the challenge of making material-specific decisions on the process parameters of spindle speed, feed rate, depth of cut, coolant flow rate, cutting tool material, and type of cutting tool coating for the required quality and quantity of production. Generally, decisions made by the operator on the floor are based on the tool manufacturer's suggested values or on trial and error. This paper describes the effect of various parameters on the surface roughness characteristics of the precision-machined part. The suggested prediction method is based on experimental analyses of parameters under different combinations of input conditions, which would benefit industry by standardizing high speed CNC end milling processes. The results provide a basis for selecting parameters to obtain better surface roughness values, as predicted by the case study results.

  15. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
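
    LFQbench itself is an R package; an illustrative Python analogue of its core metrics is sketched below. In a hybrid-proteome benchmark each species has a known expected log2 ratio between samples, so accuracy can be summarized as the systematic deviation from that value and precision as the spread (values simulated):

    ```python
    # Accuracy/precision metrics for label-free quantification of a hybrid sample.
    import numpy as np

    expected_log2 = {"human": 0.0, "yeast": 1.0, "ecoli": -2.0}  # ground truth
    rng = np.random.default_rng(4)
    for species, mu in expected_log2.items():
        observed = rng.normal(mu, 0.3, 400)       # per-protein log2(A/B)
        accuracy = np.median(observed) - mu       # systematic deviation (bias)
        precision = np.std(observed)              # spread around the estimate
        print(f"{species:6s} bias={accuracy:+.3f} sd={precision:.3f}")
    ```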

  16. Investigation on Effect of Material Hardness in High Speed CNC End Milling Process

    PubMed Central

    Dhandapani, N. V.; Thangarasu, V. S.; Sureshkannan, G.

    2015-01-01

    This research paper analyzes the effects of material properties on surface roughness, material removal rate, and tool wear on high speed CNC end milling process with various ferrous and nonferrous materials. The challenge of material specific decision on the process parameters of spindle speed, feed rate, depth of cut, coolant flow rate, cutting tool material, and type of coating for the cutting tool for required quality and quantity of production is addressed. Generally, decision made by the operator on floor is based on suggested values of the tool manufacturer or by trial and error method. This paper describes effect of various parameters on the surface roughness characteristics of the precision machining part. The prediction method suggested is based on various experimental analysis of parameters in different compositions of input conditions which would benefit the industry on standardization of high speed CNC end milling processes. The results show a basis for selection of parameters to get better results of surface roughness values as predicted by the case study results. PMID:26881267

  17. AirLab: a cloud-based platform to manage and share antibody-based single-cell research.

    PubMed

    Catena, Raúl; Özcan, Alaz; Jacobs, Andrea; Chevrier, Stephane; Bodenmiller, Bernd

    2016-06-29

    Single-cell analysis technologies are essential tools in research and clinical diagnostics. These methods include flow cytometry, mass cytometry, and other microfluidics-based technologies. Most laboratories that employ these methods maintain large repositories of antibodies. These ever-growing collections of antibodies, their multiple conjugates, and the large amounts of data generated in assays using specific antibodies and conditions make a dedicated software solution necessary. We have developed AirLab, a cloud-based tool with web and mobile interfaces, for the organization of these data. AirLab streamlines the processes of antibody purchase, organization, and storage, antibody panel creation, results logging, and antibody validation data sharing and distribution. Furthermore, AirLab enables inventory of other laboratory stocks, such as primers or clinical samples, through user-controlled customization. Thus, AirLab is a mobile-powered and flexible tool that harnesses the capabilities of mobile tools and cloud-based technology to facilitate inventory and sharing of antibody and sample collections and associated validation data.

  18. PepMapper: a collaborative web tool for mapping epitopes from affinity-selected peptides.

    PubMed

    Chen, Wenhan; Guo, William W; Huang, Yanxin; Ma, Zhiqiang

    2012-01-01

    Epitope mapping from affinity-selected peptides has become popular in epitope prediction, and correspondingly many Web-based tools have been developed in recent years. However, the performance of these tools varies in different circumstances. To address this problem, we employed an ensemble approach that incorporates two popular Web tools, MimoPro and Pep-3D-Search, to take advantage of both methods and give users more options for their specific purposes of epitope-peptide mapping. The combined Union operation finds as many associated peptides as possible from both methods, which increases sensitivity in finding potential epitopic regions on a given antigen surface. The combined Intersection operation achieves a degree of mutual verification between the two methods and hence increases the likelihood of locating the genuine epitopic region on a given antigen in relation to the interacting peptides. The Consistency between Intersection and Union is an indirect sufficient condition for assessing the likelihood of successful peptide-epitope mapping. On average over 27 tests, the combined operations of PepMapper outperformed either MimoPro or Pep-3D-Search alone. Therefore, PepMapper is another multipurpose mapping tool for epitope prediction from affinity-selected peptides. The Web server can be freely accessed at: http://informatics.nenu.edu.cn/PepMapper/
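
    The Union and Intersection operations reduce to simple set algebra over the peptide lists returned by the two tools; a minimal sketch with made-up peptide sequences follows.

    ```python
    # Minimal sketch of the ensemble idea: Union maximizes sensitivity,
    # Intersection favors mutual verification. The peptide sets below are
    # hypothetical stand-ins for the outputs of MimoPro and Pep-3D-Search.
    mimopro_hits = {"EPTIA", "KLSDW", "QRFNV"}
    pep3d_hits = {"KLSDW", "QRFNV", "GHWTM"}

    union_hits = mimopro_hits | pep3d_hits          # as many candidates as possible
    intersection_hits = mimopro_hits & pep3d_hits   # verified by both methods

    # Consistency: the fraction of the Union covered by the Intersection,
    # used here as a rough indicator of agreement between the two methods.
    consistency = len(intersection_hits) / len(union_hits)
    print(union_hits, intersection_hits, round(consistency, 2))
    ```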

  19. Small Launch Vehicle Trade Space Definition: Development of a Zero Level Mass Estimation Tool with Trajectory Validation

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.

    2013-01-01

    Recent high-level interest in the capability of small launch vehicles has placed significant demand on determining the trade space these vehicles occupy. This has led to the development of a zero level analysis tool that can quickly determine the minimum expected vehicle gross liftoff weight (GLOW) in terms of vehicle stage specific impulse (Isp) and propellant mass fraction (pmf) for any given payload value. Drawing on extensive Earth-to-orbit trajectory experience, the total delta-v the vehicle must achieve, including relevant loss terms, can be estimated. This foresight into expected losses allows for more specific assumptions in the initial estimates of thrust-to-weight values for each stage. The tool was further validated against a trajectory model, in this case the Program to Optimize Simulated Trajectories (POST), to determine whether the initial sizing delta-v was adequate to meet payload expectations. Presented here is a description of how the tool is set up and the approach the analyst must take when using it. Expected outputs, which depend on the type of small launch vehicle being sized, are also presented. The method of validation is discussed, as well as where the sizing tool fits into the vehicle design process.
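
    A zero-level sizing of this kind typically rests on the rocket equation; the sketch below, with illustrative numbers rather than the tool's actual assumptions, shows how GLOW follows from payload, per-stage delta-v, Isp, and pmf.

    ```python
    # Zero-level sizing sketch: the rocket equation gives each stage's mass
    # ratio, and the propellant mass fraction (pmf = propellant / (propellant
    # + structure)) ties propellant to structural mass. Numbers are illustrative.
    import math

    G0 = 9.80665  # standard gravity, m/s^2

    def stage_glow(payload_kg, dv_ms, isp_s, pmf):
        """Initial mass of one stage carrying `payload_kg` through `dv_ms`."""
        mass_ratio = math.exp(dv_ms / (G0 * isp_s))
        # m_final = payload + structure, structure = propellant*(1-pmf)/pmf,
        # propellant = (mass_ratio - 1) * m_final; solve for m_final.
        denom = 1.0 - (mass_ratio - 1.0) * (1.0 - pmf) / pmf
        if denom <= 0:
            raise ValueError("stage cannot close: dv too high for this Isp/pmf")
        m_final = payload_kg / denom
        return mass_ratio * m_final  # stage initial mass including payload

    # Two-stage vehicle, sized from the top down; the delta-v split is assumed.
    payload = 150.0                                   # kg
    upper = stage_glow(payload, 4500.0, 340.0, 0.88)  # upper stage + payload
    glow = stage_glow(upper, 4800.0, 290.0, 0.90)     # full vehicle GLOW
    print(f"GLOW ~ {glow:.0f} kg")
    ```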

  20. Geochemical Reaction Mechanism Discovery from Molecular Simulation

    DOE PAGES

    Stack, Andrew G.; Kent, Paul R. C.

    2014-11-10

    Methods to explore reactions using computer simulation are becoming increasingly quantitative, versatile, and robust. In this review, a rationale for how molecular simulation can help build better geochemical kinetics models is first given. We summarize some common methods that geochemists use to simulate reaction mechanisms, specifically classical molecular dynamics and quantum chemical methods, and discuss their strengths and weaknesses. Useful tools that enable one to explore reactions, such as umbrella sampling and metadynamics, are discussed. Several case studies wherein geochemists have used these tools to understand reaction mechanisms are presented, including water exchange and sorption on aqueous species and mineral surfaces, surface charging, crystal growth and dissolution, and electron transfer. The impact that molecular simulation has had on our understanding of geochemical reactivity is highlighted in each case. In the future, it is anticipated that molecular simulation of geochemical reaction mechanisms will become more commonplace as a tool to validate and interpret experimental data, and provide a check on the plausibility of geochemical kinetic models.
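
    As a toy illustration of one of the sampling tools mentioned (not taken from the review), a one-dimensional metadynamics run deposits Gaussian bias along a walker's trajectory to push it over free-energy barriers. The potential, parameters, and units are all invented.

    ```python
    # Toy 1D metadynamics: an overdamped Langevin walker on a double-well
    # potential U(x) = (x^2 - 1)^2, with Gaussian hills deposited where the
    # walker has been so that it eventually crosses the barrier at x = 0.
    import numpy as np

    rng = np.random.default_rng(0)

    def dUdx(x):                              # gradient of (x^2 - 1)^2
        return 4.0 * x * (x**2 - 1.0)

    hills = []                                # centers of deposited Gaussians
    W, SIGMA = 0.05, 0.2                      # hill height and width

    def bias_force(x):
        """Force from the accumulated Gaussian bias, -d/dx sum of hills."""
        if not hills:
            return 0.0
        c = np.array(hills)
        return np.sum(W * (x - c) / SIGMA**2 * np.exp(-(x - c)**2 / (2 * SIGMA**2)))

    x, dt, kT = -1.0, 1e-3, 0.1
    for step in range(20000):
        force = -dUdx(x) + bias_force(x)
        x += force * dt + np.sqrt(2 * kT * dt) * rng.normal()
        if step % 500 == 0:                   # deposit a hill periodically
            hills.append(x)
    print(f"final position {x:.2f}, {len(hills)} hills deposited")
    ```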

  1. GEsture: an online hand-drawing tool for gene expression pattern search.

    PubMed

    Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning

    2018-01-01

    Gene expression profiling data provide useful information for the investigation of biological function and processes. However, identifying a specific expression pattern in extensive time series gene expression data is not an easy task. Clustering, a popular method, is often used to group genes with similar expression; however, genes with a 'desirable' or 'user-defined' pattern cannot be efficiently detected by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw or graph a curve with a mouse instead of inputting abstract parameters of clustering methods. Taking a gene expression curve as input, GEsture explores genes showing similar, opposite, and time-delayed expression patterns in time series datasets. We present three examples that illustrate the capacity of GEsture to hunt for genes following users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps, and correlation networks) to display the search results. The outputs may provide useful information for researchers to understand the targets, function, and biological processes of the genes involved.
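
    The underlying search can be pictured as correlating the drawn curve against each gene's time series, with a sign flip indicating opposite patterns and a shift capturing delays; a minimal sketch with invented data follows.

    ```python
    # Sketch of pattern search against a user-drawn curve: Pearson correlation
    # for similar (positive) and opposite (negative) patterns, plus a one-step
    # shift for time-delayed patterns. Expression values are hypothetical.
    import numpy as np

    drawn = np.array([0.1, 0.5, 1.0, 0.8, 0.3, 0.1])   # user-drawn curve
    genes = {
        "geneA": np.array([0.2, 0.6, 1.1, 0.9, 0.4, 0.2]),   # similar
        "geneB": np.array([1.0, 0.6, 0.1, 0.3, 0.8, 1.0]),   # opposite
        "geneC": np.array([0.1, 0.1, 0.5, 1.0, 0.8, 0.3]),   # delayed by one
    }

    def corr(a, b):
        return float(np.corrcoef(a, b)[0, 1])

    for name, expr in genes.items():
        similar = corr(drawn, expr)
        delayed = corr(drawn[:-1], expr[1:])   # one-step time delay
        print(f"{name}: similar={similar:+.2f}, delayed={delayed:+.2f}")
    ```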

  2. BUSCA: an integrative web server to predict subcellular localization of proteins.

    PubMed

    Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Profiti, Giuseppe; Casadio, Rita

    2018-04-30

    Here, we present BUSCA (http://busca.biocomp.unibo.it), a novel web server that integrates different computational tools for predicting protein subcellular localization. BUSCA combines methods for identifying signal and transit peptides (DeepSig and TPpred3), GPI-anchors (PredGPI) and transmembrane domains (ENSEMBLE3.0 and BetAware) with tools for discriminating subcellular localization of both globular and membrane proteins (BaCelLo, MemLoci and SChloro). Outcomes from the different tools are processed and integrated to annotate the subcellular localization of both eukaryotic and bacterial protein sequences. We benchmark BUSCA against protein targets derived from recent CAFA experiments and other specific data sets, reporting state-of-the-art performance. BUSCA scores better than all other evaluated methods on 2732 targets from CAFA2, with an F1 value of 0.49, and is among the best methods when predicting targets from CAFA3. We propose BUSCA as an integrated and accurate resource for the annotation of protein subcellular localization.

  3. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    The coming deluge of genome data presents significant challenges in storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and enabling efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic, and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  4. A computational platform to maintain and migrate manual functional annotations for BioCyc databases.

    PubMed

    Walsh, Jesse R; Sen, Taner Z; Dickerson, Julie A

    2014-10-12

    BioCyc databases are an important resource for information on biological pathways and genomic data. Such databases represent the accumulation of biological data, some of which has been manually curated from the literature. An essential feature of these databases is continuing data integration as new knowledge is discovered. As functional annotations are improved, scalable methods are needed for curators to manage annotations without detailed knowledge of the specific design of the BioCyc database. We have developed CycTools, a software tool that allows curators to maintain functional annotations in a model organism database. This tool builds on existing software to simplify and improve imports of user-provided annotation data into BioCyc databases. Additionally, CycTools automatically resolves synonyms and alternate identifiers contained within the database into the appropriate internal identifiers. Automating steps in the manual data entry process can improve curation efforts for major biological databases. The functionality of CycTools is demonstrated by transferring GO term annotations from MaizeCyc to matching proteins in CornCyc, both maize metabolic pathway databases available at MaizeGDB, and by creating strain-specific databases for metabolic engineering.

  5. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The coming deluge of genome data presents significant challenges in storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and enabling efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic, and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Assessment Tools for Evaluation of Oral Feeding in Infants Less than Six Months Old

    PubMed Central

    Pados, Britt F.; Park, Jinhee; Estrem, Hayley; Awotwi, Araba

    2015-01-01

    Background Feeding difficulty is common in infants less than six months old. Identification of infants in need of specialized treatment is critical to ensure appropriate nutrition and feeding skill development. Valid and reliable assessment tools help clinicians objectively evaluate feeding. Purpose To identify and evaluate assessment tools available for clinical assessment of bottle- and breast-feeding in infants less than six months old. Methods/Search Strategy CINAHL, HaPI, PubMed, and Web of Science were searched for “infant feeding” and “assessment tool.” The literature (n=237) was reviewed for relevant assessment tools. A secondary search was conducted in CINAHL and PubMed for additional literature on identified tools. Findings/Results Eighteen assessment tools met inclusion criteria. Of these, seven were excluded because of limited available literature or because they were intended for use with a specific diagnosis or in research only. There are 11 assessment tools available for clinical practice. Only two of these were intended for bottle-feeding. All 11 indicated they were appropriate for use with breast-feeding. None of the available tools have adequate psychometric development and testing. Implications for Practice All of the tools should be used with caution. The Early Feeding Skills Assessment and Bristol Breastfeeding Assessment Tool had the most supportive psychometric development and testing. Implications for Research Feeding assessment tools need to be developed and tested to guide optimal clinical care of infants from birth through six months. A tool that assesses both bottle- and breast-feeding would allow for consistent assessment across feeding methods. PMID:26945280

  7. Vibration manual

    NASA Technical Reports Server (NTRS)

    Green, C.

    1971-01-01

    Guidelines on the methods and applications used in vibration technology at MSFC are presented. The purpose of the guidelines is to provide a practical tool for coordination and understanding between industry and government groups concerned with the vibration of systems and equipment. Topics covered include measuring, reducing, and analyzing vibration, methods for obtaining simulated environments, and formulating vibration specifications. Methods for vibration and shock testing, theoretical aspects of data processing, vibration response analysis, and techniques of designing for vibration are also presented.

  8. Assessment of SOAP note evaluation tools in colleges and schools of pharmacy.

    PubMed

    Sando, Karen R; Skoy, Elizabeth; Bradley, Courtney; Frenzel, Jeanne; Kirwin, Jennifer; Urteaga, Elizabeth

    2017-07-01

    To describe current methods used to assess SOAP notes in colleges and schools of pharmacy. Members of the American Association of Colleges of Pharmacy Laboratory Instructors Special Interest Group were invited to share assessment tools for SOAP notes. The content of submissions was evaluated to characterize overall qualities and how the tools assessed subjective, objective, assessment, and plan information. Thirty-nine assessment tools from 25 schools were evaluated. Twenty-nine (74%) of the tools were rubrics and ten (26%) were checklists. All rubrics included analytic scoring elements, while two (7%) mixed holistic and analytic scoring elements. The most common rating scale among the rubrics was a four-item scale (35%). Substantial variability existed in how tools evaluated the subjective and objective sections. All tools included problem identification in the assessment section. Other assessment items included goals (82%) and rationale (69%). Seventy-seven percent assessed drug therapy; however, only 33% assessed non-drug therapy. Other plan items included education (59%) and follow-up (90%). There is a great deal of variation in the specific elements used to evaluate SOAP notes in colleges and schools of pharmacy. Improved consistency in assessment methods to evaluate SOAP notes may better prepare students to produce standardized documentation when entering practice. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Evaluation of whole genome sequencing and software tools for drug susceptibility testing of Mycobacterium tuberculosis.

    PubMed

    van Beek, J; Haanperä, M; Smit, P W; Mentula, S; Soini, H

    2018-04-11

    Culture-based assays are currently the reference standard for drug susceptibility testing for Mycobacterium tuberculosis. They provide good sensitivity and specificity but are time-consuming. The objective of this study was to evaluate whether whole genome sequencing (WGS), combined with software tools for data analysis, can replace routine culture-based assays for drug susceptibility testing of M. tuberculosis. M. tuberculosis cultures sent to the Finnish mycobacterial reference laboratory in 2014 (n = 211) were phenotypically tested by Mycobacteria Growth Indicator Tube (MGIT) for first-line drug susceptibilities. WGS was performed for all isolates using the Illumina MiSeq system, and data were analysed using five software tools (PhyResSE, Mykrobe Predictor, TB Profiler, TGS-TB and KvarQ). Diagnostic time and reagent costs were estimated for both methods. The sensitivity of the five software tools to predict any resistance among strains was almost identical, ranging from 74% to 80%, and specificity was more than 95% for all software tools except for TGS-TB. The sensitivity and specificity to predict resistance to individual drugs varied considerably among the software tools. Reagent costs for MGIT and WGS were €26 and €143 per isolate respectively. Turnaround time for MGIT was 19 days (range 10-50 days) for first-line drugs, and turnaround time for WGS was estimated to be 5 days (range 3-7 days). WGS could be used as a prescreening assay for drug susceptibility testing with confirmation of resistant strains by MGIT. The functionality and ease of use of the software tools need to be improved. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  10. Study of heat generation and cutting force according to minimization of grain size (500 nm to 180 nm) of WC ball endmill using FEM

    NASA Astrophysics Data System (ADS)

    Byeon, J. H.; Ahmed, F.; Ko, T. J.; Lee, D. K.; Kim, J. S.

    2018-03-01

    As industry develops, miniaturization and refinement of products are important issues. Precise machining is required for cutting, which is a typical method of machining a product. A key factor determining the workability of the cutting process is the tool material. Tool materials include carbon tool steel, alloy tool steel, high-speed steel, cemented carbide, and ceramics. In the case of a carbide material, the smaller the particle size, the better the mechanical properties, with higher hardness, strength, and toughness. The specific heat, density, and thermal diffusivity also change as the particle size of the material becomes finer. In this study, finite element analysis was performed to investigate the change in heat generation and cutting power depending on the physical properties (specific heat, density, thermal diffusivity) of the tool material. The thermal conductivity coefficient was obtained by measuring the thermal diffusivity, specific heat, and density of the refined-particle material (180 nm) and of the conventional-size particle material (0.5 μm). The thermal conductivity coefficient was calculated as 61.33 for the 180 nm class material and 46.13 for the 0.5 μm class material. Finite element analysis using these values gave an average tool temperature of 532.75 °C for the refined-particle material (180 nm) and 572.75 °C for the conventional material (0.5 μm). Cutting power was also compared, but the difference was not significant. Therefore, if the thermal conductivity is increased through particle refinement, surface quality can be improved and tool life prolonged by lowering the temperature generated in the tool during machining, without greatly affecting the cutting power.
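
    The conversion implied above is the standard relation k = α·ρ·c_p (thermal conductivity from diffusivity, density, and specific heat); a small helper, with placeholder inputs rather than the paper's measurements, illustrates it.

    ```python
    # Thermal conductivity from measured diffusivity, density and specific
    # heat: k = alpha * rho * c_p. Input values are placeholders chosen to be
    # WC-like; they are not the paper's measured data.
    def thermal_conductivity(alpha_m2_s: float, rho_kg_m3: float,
                             cp_j_kgk: float) -> float:
        """k [W/(m K)] from diffusivity [m^2/s], density [kg/m^3], cp [J/(kg K)]."""
        return alpha_m2_s * rho_kg_m3 * cp_j_kgk

    k = thermal_conductivity(alpha_m2_s=2.0e-5, rho_kg_m3=15000.0, cp_j_kgk=200.0)
    print(f"k = {k:.1f} W/(m K)")   # 60.0 with these placeholder inputs
    ```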

  11. A simple web-based tool to compare freshwater fish data collected using AFS standard methods

    USGS Publications Warehouse

    Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill

    2016-01-01

    The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.

  12. Deriving protection thresholds for threatened and endangered species potentially exposed to pesticides

    EPA Science Inventory

    The Endangered Species Act requires specific and stringent protection to threatened and endangered species and their critical habitat. Therefore, protective methods for risk assessment for such species are needed. Species sensitivity distributions (SSDs) are a common tool used fo...

  13. SYRCLE’s risk of bias tool for animal studies

    PubMed Central

    2014-01-01

    Background Systematic Reviews (SRs) of experimental animal studies are not yet common practice, but awareness of the merits of conducting such SRs is steadily increasing. As animal intervention studies differ from randomized clinical trials (RCT) in many aspects, the methodology for SRs of clinical trials needs to be adapted and optimized for animal intervention studies. The Cochrane Collaboration developed a Risk of Bias (RoB) tool to establish consistency and avoid discrepancies in assessing the methodological quality of RCTs. A similar initiative is warranted in the field of animal experimentation. Methods We provide an RoB tool for animal intervention studies (SYRCLE’s RoB tool). This tool is based on the Cochrane RoB tool and has been adjusted for aspects of bias that play a specific role in animal intervention studies. To enhance transparency and applicability, we formulated signalling questions to facilitate judgment. Results The resulting RoB tool for animal studies contains 10 entries. These entries are related to selection bias, performance bias, detection bias, attrition bias, reporting bias and other biases. Half these items are in agreement with the items in the Cochrane RoB tool. Most of the variations between the two tools are due to differences in design between RCTs and animal studies. Shortcomings in, or unfamiliarity with, specific aspects of experimental design of animal studies compared to clinical studies also play a role. Conclusions SYRCLE’s RoB tool is an adapted version of the Cochrane RoB tool. Widespread adoption and implementation of this tool will facilitate and improve critical appraisal of evidence from animal studies. This may subsequently enhance the efficiency of translating animal research into clinical practice and increase awareness of the necessity of improving the methodological quality of animal studies. PMID:24667063

  14. Computer Simulation of Replaceable Many Sider Plates (RMSP) with Enhanced Chip-Breaking Characteristics

    NASA Astrophysics Data System (ADS)

    Korchuganova, M.; Syrbakov, A.; Chernysheva, T.; Ivanov, G.; Gnedasch, E.

    2016-08-01

    Of all common chip curling methods, a special tool face form has become the most widespread; it is produced either by grinding or by profile pressing during the production of RMSP. Currently, over 15 large tool manufacturers produce tools using instrument materials of over 500 brands. To this we must add a large variety of tool face geometries intended to control the form and dimensions of the chip. Taking into account the many processed materials, the specific tasks of the process planner, and the requirements on the quality of manufactured products, all this makes it significantly harder to choose a proper tool that can perform the processing in the most effective way. Over recent years, the nomenclature of RMSP for lathe tools with mechanical mounting has been considerably broadened through diversification of their faces.

  15. Using Petri Net Tools to Study Properties and Dynamics of Biological Systems

    PubMed Central

    Peleg, Mor; Rubin, Daniel; Altman, Russ B.

    2005-01-01

    Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
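
    For readers unfamiliar with the formalism, a Petri net is just places holding tokens and transitions that consume and produce them; the toy sketch below (not taken from the paper) models an enzyme converting substrate to product.

    ```python
    # Minimal Petri net sketch: a marking (tokens per place) and transitions
    # defined by the tokens they consume and produce. This toy net models
    # enzyme + substrate -> enzyme + product; it is purely illustrative.
    marking = {"enzyme": 1, "substrate": 3, "product": 0}
    transitions = {
        # name: (tokens consumed, tokens produced)
        "catalyze": ({"enzyme": 1, "substrate": 1}, {"enzyme": 1, "product": 1}),
    }

    def enabled(name):
        """A transition is enabled when every input place has enough tokens."""
        pre, _ = transitions[name]
        return all(marking[p] >= n for p, n in pre.items())

    def fire(name):
        """Firing removes tokens from input places, adds to output places."""
        pre, post = transitions[name]
        for p, n in pre.items():
            marking[p] -= n
        for p, n in post.items():
            marking[p] += n

    while enabled("catalyze"):
        fire("catalyze")
    print(marking)   # all substrate converted to product, enzyme conserved
    ```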

  16. Engaging Patients as Partners in Developing Patient-Reported Outcome Measures in Cancer-A Review of the Literature.

    PubMed

    Camuso, Natasha; Bajaj, Prerna; Dudgeon, Deborah; Mitera, Gunita

    2016-08-01

    Tools to collect patient-reported outcome measures (PROMs) are frequently used in the healthcare setting to collect information that is most meaningful to patients. Because patients and healthcare providers rank symptoms differently in terms of what is most meaningful to the patient, engaging patients in the development of PROMs is extremely important. This review aimed to identify studies that described how patients are involved in the item generation stage of cancer-specific PROM tools developed for cancer patients. A literature search was conducted using keywords relevant to PROMs, cancer, and patient engagement. A manual search of relevant reference lists was also conducted. Inclusion criteria stipulated that publications must describe patient engagement in the item generation stage of development of cancer-specific PROM tools. Results were excluded if they were duplicates or not in English. The initial search yielded 230 publications. After removal of duplicates and review of the publications, 6 were deemed relevant. Fourteen additional publications were retrieved through a manual search of references from relevant publications. A total of 13 unique PROM tools that included patient input in item generation were identified. The most common method of patient engagement was qualitative interviews or focus groups. Despite recommendations from international groups and the emphasized importance of incorporating patient feedback in all stages of PROM development, few unique tools have incorporated patient input in the item generation of cancer-specific tools. Moving forward, a framework of best practices for engaging patients in developing PROMs is warranted to support high-quality patient-centered care.

  17. Molecular testing for clinical diagnosis and epidemiological investigations of intestinal parasitic infections.

    PubMed

    Verweij, Jaco J; Stensvold, C Rune

    2014-04-01

    Over the past few decades, nucleic acid-based methods have been developed for the diagnosis of intestinal parasitic infections. Advantages of nucleic acid-based methods are numerous; typically, these include increased sensitivity and specificity and simpler standardization of diagnostic procedures. DNA samples can also be stored and used for genetic characterization and molecular typing, providing a valuable tool for surveys and surveillance studies. A variety of technologies have been applied, and some specific and general pitfalls and limitations have been identified. This review provides an overview of the multitude of methods that have been reported for the detection of intestinal parasites and offers some guidance in applying these methods in the clinical laboratory and in epidemiological studies.

  18. Molecular Testing for Clinical Diagnosis and Epidemiological Investigations of Intestinal Parasitic Infections

    PubMed Central

    Stensvold, C. Rune

    2014-01-01

    SUMMARY Over the past few decades, nucleic acid-based methods have been developed for the diagnosis of intestinal parasitic infections. Advantages of nucleic acid-based methods are numerous; typically, these include increased sensitivity and specificity and simpler standardization of diagnostic procedures. DNA samples can also be stored and used for genetic characterization and molecular typing, providing a valuable tool for surveys and surveillance studies. A variety of technologies have been applied, and some specific and general pitfalls and limitations have been identified. This review provides an overview of the multitude of methods that have been reported for the detection of intestinal parasites and offers some guidance in applying these methods in the clinical laboratory and in epidemiological studies. PMID:24696439

  19. Why are Formal Methods Not Used More Widely?

    NASA Technical Reports Server (NTRS)

    Knight, John C.; DeJong, Colleen L.; Gibble, Matthew S.; Nakano, Luis G.

    1997-01-01

    Despite extensive development over many years and significant demonstrated benefits, formal methods remain poorly accepted by industrial practitioners. Many reasons have been suggested for this situation, such as claims that they extend the development cycle, that they require difficult mathematics, that inadequate tools exist, and that they are incompatible with other software packages. There is little empirical evidence that any of these reasons is valid. The research presented here addresses the question of why formal methods are not used more widely. The approach used was to develop a formal specification for a safety-critical application using several specification notations and to assess the results in a comprehensive evaluation framework. The results of the experiment suggest that there remain many impediments to the routine use of formal methods.

  20. An evaluation of copy number variation detection tools for cancer using whole exome sequencing data.

    PubMed

    Zare, Fatima; Dow, Michelle; Monteleone, Nicholas; Hosny, Abdelrahman; Nabavi, Sheida

    2017-05-31

    Recently, copy number variation (CNV) has gained considerable interest as a type of genomic/genetic variation that plays an important role in disease susceptibility. Advances in sequencing technology have created an opportunity for detecting CNVs more accurately. Whole exome sequencing (WES) has become a primary strategy for sequencing patient samples and studying their genomic aberrations. However, compared to whole genome sequencing, WES introduces more biases and noise that make CNV detection very challenging. Additionally, the complexity of tumors makes the detection of cancer-specific CNVs even more difficult. Although many CNV detection tools have been developed since the introduction of NGS data, there are few tools for somatic CNV detection from WES data in cancer. In this study, we evaluated the performance of the most recent and commonly used CNV detection tools for WES data in cancer to address their limitations and provide guidelines for developing new ones. We focused on tools that have been designed, or have the ability, to detect cancer somatic aberrations. We compared the performance of the tools in terms of sensitivity and false discovery rate (FDR) using real and simulated data. Comparative analysis of the results showed low consensus among the tools in calling CNVs. Using real data, the tools showed moderate sensitivity (~50% - ~80%), fair specificity (~70% - ~94%) and poor FDRs (~27% - ~60%). Using simulated data, we observed that increasing the coverage beyond 10× in exonic regions did not significantly improve the detection power of the tools. The limited performance of the current CNV detection tools for WES data in cancer indicates the need for more efficient and precise CNV detection methods. Due to the complexity of tumors and the high level of noise and biases in WES data, advanced segmentation, normalization, and de-noising techniques designed specifically for cancer data are necessary. CNV detection development also suffers from the lack of a gold standard for performance evaluation. Finally, developing tools with user-friendly interfaces and visualization features would make CNV studies accessible to a broader range of users.
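
    The two headline metrics reduce to simple counts once calls are matched to a truth set; the sketch below assumes exact region matching for brevity, whereas real evaluations typically match by reciprocal overlap.

    ```python
    # Sketch of sensitivity and FDR for CNV calls, assuming calls and truth
    # are reduced to sets of labeled regions that can be matched exactly.
    # Regions below are invented; real tools match by reciprocal overlap.
    def sensitivity_and_fdr(called: set, truth: set):
        tp = len(called & truth)          # calls matching a true CNV
        fp = len(called - truth)          # calls with no true counterpart
        fn = len(truth - called)          # true CNVs that were missed
        sensitivity = tp / (tp + fn) if truth else 0.0
        fdr = fp / (tp + fp) if called else 0.0
        return sensitivity, fdr

    truth = {("chr1", 1000, 5000, "DEL"), ("chr2", 200, 900, "DUP"),
             ("chr7", 100, 400, "DEL")}
    calls = {("chr1", 1000, 5000, "DEL"), ("chr9", 50, 80, "DUP")}
    print(sensitivity_and_fdr(calls, truth))   # (0.33..., 0.5)
    ```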

  1. Resilient Software Systems

    DTIC Science & Technology

    2015-06-01

    and tools, called model-integrated computing (MIC) [3], relies on the use of domain-specific modeling languages for creating models of the system to be... hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling... are produced one-off and not for the mass market, the scope for price reduction based on market demands is non-existent. Processes to create

  2. PyHLA: tests for the association between HLA alleles and diseases.

    PubMed

    Fan, Yanhui; Song, You-Qiang

    2017-02-06

    Recently, several tools have been designed for human leukocyte antigen (HLA) typing using single nucleotide polymorphism (SNP) array and next-generation sequencing (NGS) data. These tools provide high-throughput and cost-effective approaches for identifying HLA types. Therefore, tools for downstream association analysis are highly desirable. Although several tools have been designed for multi-allelic marker association analysis, they were designed only for microsatellite markers and do not scale well with increasing data volumes, or they were designed for large-scale data but provide a limited number of tests. We have developed a Python package called PyHLA, which implements several methods for HLA association analysis, to fill the gap. PyHLA is a tailor-made, easy to use, and flexible tool designed specifically for the association analysis of HLA types imputed from genome-wide genotyping and NGS data. PyHLA provides functions for association analysis, zygosity tests, and interaction tests between HLA alleles and diseases. Monte Carlo permutation and several methods for multiple testing correction have also been implemented. PyHLA provides a convenient and powerful tool for HLA analysis: existing methods have been integrated and desired methods have been added. Furthermore, PyHLA is applicable to small and large sample sizes and can finish the analysis in a timely manner on a personal computer across different platforms. PyHLA is implemented in Python and is free, open source software distributed under the GPLv2 license. The source code, tutorial, and examples are available at https://github.com/felixfan/PyHLA.
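
    A basic allele-disease association test of the kind such packages automate can be sketched as a Fisher's exact test on a 2x2 carrier table; the counts below are hypothetical and this is not PyHLA's own code.

    ```python
    # Sketch of an HLA allele association test: a 2x2 contingency table of
    # allele carriers vs non-carriers in cases and controls, tested with
    # Fisher's exact test. All counts are hypothetical.
    from scipy.stats import fisher_exact

    carriers_cases, noncarriers_cases = 40, 60
    carriers_controls, noncarriers_controls = 20, 80

    table = [[carriers_cases, noncarriers_cases],
             [carriers_controls, noncarriers_controls]]
    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
    ```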

  3. Helping coaches apply the principles of representative learning design: validation of a tennis specific practice assessment tool.

    PubMed

    Krause, Lyndon; Farrow, Damian; Reid, Machar; Buszard, Tim; Pinder, Ross

    2018-06-01

    Representative Learning Design (RLD) is a framework for assessing the degree to which experimental or practice tasks simulate key aspects of specific performance environments (i.e. competition). The key premise is that when practice replicates the performance environment, skills are more likely to transfer. In applied situations, however, there is currently no simple or quick method for coaches to assess the key concepts of RLD (e.g. during on-court tasks). The aim of this study was to develop a tool for coaches to efficiently assess practice task design in tennis. A consensus-based tool was developed using a 4-round Delphi process with 10 academic and 13 tennis-coaching experts. Expert consensus was reached for the inclusion of seven items, each consisting of two sub-questions related to (i) the task goal and (ii) the relevance of the task to competition performance. The Representative Practice Assessment Tool (RPAT) is proposed for use in assessing and enhancing practice task designs in tennis to increase the functional coupling between information and movement, and to maximise the potential for skill transfer to competition contexts.

  4. Concordance between actual and pharmacogenetic predicted desvenlafaxine dose needed to achieve remission in major depressive disorder: a 10-week open-label study

    PubMed Central

    Müller, Daniel J.; Ng, Chee H.; Byron, Keith; Berk, Michael; Singh, Ajeet B.

    2017-01-01

    Background Pharmacogenetic-based dosing support tools have been developed to personalize antidepressant-prescribing practice. However, the clinical validity of these tools has not been adequately tested, particularly for specific antidepressants. Objective To examine the concordance between the actual dose and a polygene pharmacogenetic predicted dose of desvenlafaxine needed to achieve symptom remission. Materials and methods A 10-week, open-label, prospective trial of desvenlafaxine among Caucasian adults with major depressive disorder (n=119) was conducted. Dose was clinically adjusted and at the completion of the trial, the clinical dose needed to achieve remission was compared with the predicted dose needed to achieve remission. Results Among remitters (n=95), there was a strong concordance (Kendall’s τ-b=0.84, P=0.0001; Cohen’s κ=0.82, P=0.0001) between the actual and the predicted dose needed to achieve symptom remission, showing high sensitivity (≥85%), specificity (≥86%), and accuracy (≥89%) of the tool. Conclusion Findings provide initial evidence for the clinical validity of a polygene pharmacogenetic-based tool for desvenlafaxine dosing. PMID:27779571

  5. Early craniometric tools as a predecessor to neurosurgical stereotaxis.

    PubMed

    Serletis, Demitre; Pait, T Glenn

    2016-06-01

    In this paper the authors trace the history of early craniometry, referring to the technique of obtaining cranial measurements for the accurate correlation of external skull landmarks to specific brain regions. Largely drawing on methods from the newly emerging fields of physical anthropology and phrenology in the late 19th and early 20th centuries, basic mathematical concepts were combined with simplistic (yet at the time, innovative) mechanical tools, leading to the first known attempts at craniocerebral topography. It is important to acknowledge the pioneers of this pre-imaging epoch, who applied creativity and ingenuity to tackle the challenge of reproducibly and reliably accessing a specific target in the brain. In particular, with the emergence of Broca's theory of cortical localization, in vivo craniometric tools, and the introduction of 3D coordinate systems, several innovative devices were conceived that subsequently paved the way for modern-day stereotactic techniques. In this context, the authors present a comprehensive and systematic review of the most popular craniometric tools developed during this time period (prior to the stereotactic era) for the purposes of craniocerebral measurement and target localization.

  6. Utopia Providing Trusted Social Network Relationships within an Un-trusted Environment

    NASA Astrophysics Data System (ADS)

    Gauvin, William; Liu, Benyuan; Fu, Xinwen; Wang, Jie

    This paper introduces an unobtrusive method and a distributed solution set that aid users of online social networking sites by creating a trusted environment in which every member can identify the others within their private social network by name, gender, age, location, and the specific usage patterns adopted by the group. Utopia protects members by understanding how the social network is created and the specific aspects of the group that make it unique and identifiable. The main focus of Utopia is the protection of the group and its privacy within a social network from predators and spammers that characteristically do not fit within the well-defined usage boundaries of the social network as a whole. The solution set provides defensive as well as offensive tools to identify these threats. Once identified, client desktop tools are used to prevent these predators from further interaction within the group. In addition, offensive tools are used to determine the origin of the predator, allowing automated tools and law enforcement to take action to alleviate the threat.

  7. The Rapid-Heat LAMPellet Method: A Potential Diagnostic Method for Human Urogenital Schistosomiasis

    PubMed Central

    Carranza-Rodríguez, Cristina; Pérez-Arellano, José Luis; Vicente, Belén; López-Abán, Julio; Muro, Antonio

    2015-01-01

    Background Urogenital schistosomiasis due to Schistosoma haematobium is a serious, underestimated public health problem affecting 112 million people, particularly in sub-Saharan Africa. Microscopic examination of urine samples to detect parasite eggs remains the definitive diagnosis. This work focused on developing a novel loop-mediated isothermal amplification (LAMP) assay for the detection of S. haematobium DNA in human urine samples as a high-throughput, simple, accurate and affordable diagnostic tool for urogenital schistosomiasis. Methodology/Principal Findings A LAMP assay targeting a species-specific sequence of the S. haematobium ribosomal intergenic spacer was designed. The effectiveness of our LAMP was assessed in a number of patients' urine samples with microscopy-confirmed S. haematobium infection. For potentially large-scale application in field conditions, different DNA extraction methods, including a commercial kit, a modified NaOH extraction method, and a rapid heating method, were tested using small volumes of urine fractions (whole urine, supernatants and pellets). The heating of pellets from clinical samples was the most efficient method to obtain good-quality DNA detectable by LAMP. The detection limit of our LAMP assay was 1 fg/µL of S. haematobium DNA in urine samples. When testing all patients' urine samples included in our study, diagnostic sensitivity and specificity were calculated for the LAMP assay, 100% sensitivity (95% CI: 81.32%-100%) and 86.67% specificity (95% CI: 75.40%-94.05%), and for microscopic detection of eggs in urine samples, 69.23% sensitivity (95% CI: 48.21%-85.63%) and 100% specificity (95% CI: 93.08%-100%). Conclusions/Significance We have developed and evaluated, for the first time, a LAMP assay for the detection of S. haematobium DNA in heated pellets from patients' urine samples without requiring a complicated DNA extraction procedure. The procedure has been named the Rapid-Heat LAMPellet method and has the potential to be developed further as a field diagnostic tool for use in urogenital schistosomiasis-endemic areas. PMID:26230990

  8. Real-time forecasts of tomorrow's earthquakes in California: a new mapping tool

    USGS Publications Warehouse

    Gerstenberger, Matt; Wiemer, Stefan; Jones, Lucy

    2004-01-01

    We have derived a multi-model approach to calculate time-dependent earthquake hazard resulting from earthquake clustering. This report explains the theoretical background behind the approach, the specific details used in applying the method to California, and the statistical testing performed to validate the technique. We have implemented our algorithm as a real-time tool that has been automatically generating short-term hazard maps for California since May 2002, at http://step.wr.usgs.gov

  9. Automated real-time software development

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.; Walker, Carrie K.; Turkovich, John J.

    1993-01-01

    A Computer-Aided Software Engineering (CASE) system has been developed at the Charles Stark Draper Laboratory (CSDL) under the direction of the NASA Langley Research Center. The CSDL CASE tool provides an automated method of generating source code and hard copy documentation from functional application engineering specifications. The goal is to significantly reduce the cost of developing and maintaining real-time scientific and engineering software while increasing system reliability. This paper describes CSDL CASE and discusses demonstrations that used the tool to automatically generate real-time application code.

  10. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.

    1984-01-01

    The project to automate the management of software production systems is described. The SAGA system is a software environment designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system have been completed in prototype form. The construction methods are described.

  11. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions

    PubMed Central

    Kashihara, Koji

    2014-01-01

    Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600–700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. In accordance with this classification, a face on a computer morphs into a sad or displeased countenance. The proposed method could be incorporated as a part of non-verbal communication tools to enable emotional expression. PMID:25206321
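
    The classification step described above can be sketched with a standard SVM pipeline; the features below are random stand-ins for late ERP amplitudes, not the study's data, and the hypothetical class separation is built into the simulation.

    ```python
    # Sketch of SVM classification of EEG feature vectors for faces that do
    # or do not trigger a conditioned negative emotion. Features are random
    # placeholders for per-channel late ERP amplitudes; not the study's data.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # 100 trials x 16 hypothetical features (e.g., 600-700 ms amplitudes).
    X_neutral = rng.normal(0.0, 1.0, (50, 16))
    X_conditioned = rng.normal(0.6, 1.0, (50, 16))   # shifted late response
    X = np.vstack([X_neutral, X_conditioned])
    y = np.array([0] * 50 + [1] * 50)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
    ```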

  12. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions.

    PubMed

    Kashihara, Koji

    2014-01-01

    Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600-700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. In accordance with this classification, a face on a computer morphs into a sad or displeased countenance. The proposed method could be incorporated as a part of non-verbal communication tools to enable emotional expression.

  13. Method selection for sustainability assessments: The case of recovery of resources from waste water.

    PubMed

    Zijp, M C; Waaijers-van der Loop, S L; Heijungs, R; Broeren, M L M; Peeters, R; Van Nieuwenhuijzen, A; Shen, L; Heugens, E H W; Posthuma, L

    2017-07-15

    Sustainability assessments provide scientific support in decision procedures towards sustainable solutions. However, in order to contribute to identifying and choosing sustainable solutions, the sustainability assessment has to fit the decision context. Two complicating factors exist. First, different stakeholders tend to have different views on what a sustainability assessment should encompass. Second, a plethora of sustainability assessment methods exists, due to the multi-dimensional character of the concept, and different methods provide different representations of sustainability. Based on a literature review, we present a protocol to facilitate method selection together with stakeholders. The protocol guides the exploration of (i) the decision context, (ii) the different views of stakeholders, and (iii) the selection of pertinent assessment methods. In addition, we present an online tool for method selection. This tool identifies assessment methods that meet the specifications obtained with the protocol and currently contains characteristics of 30 sustainability assessment methods. The utility of the protocol and the tool was tested in a case study on the recovery of resources from domestic waste water. In several iterations, a combination of methods was selected, followed by execution of the selected sustainability assessment methods. The assessment results can be used in the first phase of the decision procedure that leads to a strategic choice for sustainable resource recovery from waste water in the Netherlands. Copyright © 2017 Elsevier Ltd. All rights reserved.
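
    The tool's matching step can be pictured as filtering a catalogue of methods against the specifications gathered with the protocol; the method entries and attribute names below are invented for illustration, not the tool's actual schema.

    ```python
    # Sketch of specification-driven method selection: filter a catalogue of
    # assessment methods on required sustainability dimensions and whether a
    # quantitative method is wanted. Entries and attributes are hypothetical.
    METHODS = [
        {"name": "LCA",  "dimensions": {"environmental"}, "quantitative": True},
        {"name": "CBA",  "dimensions": {"economic"}, "quantitative": True},
        {"name": "MCDA", "dimensions": {"environmental", "economic", "social"},
         "quantitative": True},
    ]

    def select_methods(required_dimensions: set, quantitative: bool):
        """Return names of methods covering all required dimensions."""
        return [m["name"] for m in METHODS
                if required_dimensions <= m["dimensions"]
                and m["quantitative"] == quantitative]

    print(select_methods({"environmental", "social"}, quantitative=True))  # ['MCDA']
    ```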

  14. Recent Advances in Genome Editing Using CRISPR/Cas9.

    PubMed

    Ding, Yuduan; Li, Hong; Chen, Ling-Ling; Xie, Kabin

    2016-01-01

    The CRISPR (clustered regularly interspaced short palindromic repeat)-Cas9 (CRISPR-associated nuclease 9) system is a versatile tool for genome engineering that uses a guide RNA (gRNA) to target Cas9 to a specific sequence. This simple RNA-guided genome-editing technology has become a revolutionary tool in biology and has many innovative applications in different fields. In this review, we briefly introduce the Cas9-mediated genome-editing method, summarize the recent advances in CRISPR/Cas9 technology, and discuss their implications for plant research. To date, targeted gene knockout using the Cas9/gRNA system has been established in many plant species, and the targeting efficiency and capacity of Cas9 has been improved by optimizing its expression and that of its gRNA. The CRISPR/Cas9 system can also be used for sequence-specific mutagenesis/integration and transcriptional control of target genes. We also discuss off-target effects and the constraint that the protospacer-adjacent motif (PAM) puts on CRISPR/Cas9 genome engineering. To address these problems, a number of bioinformatic tools are available to help design specific gRNAs, and new Cas9 variants and orthologs with high fidelity and alternative PAM specificities have been engineered. Owing to these recent efforts, the CRISPR/Cas9 system is becoming a revolutionary and flexible tool for genome engineering. Adoption of the CRISPR/Cas9 technology in plant research would enable the investigation of plant biology at an unprecedented depth and create innovative applications in precise crop breeding.
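
    The PAM constraint mentioned above is easy to make concrete: for SpCas9, candidate targets are 20-nt protospacers lying immediately 5' of an NGG motif. The sketch below scans a single strand only, does no off-target scoring, and uses a made-up sequence.

    ```python
    # Minimal SpCas9 target scan: every 20-nt window immediately followed by
    # an NGG PAM is a candidate protospacer. Single strand, no scoring; the
    # demo sequence is invented.
    import re

    def find_spcas9_sites(seq: str):
        """Yield (protospacer, pam, position) for every NGG PAM on this strand."""
        seq = seq.upper()
        # Lookahead allows overlapping candidate sites to be reported.
        for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
            yield m.group(1), m.group(2), m.start()

    demo = "ATGCTGACCTTAGGCTAGCTAGGCTTACGGATCGATCGTAGCTAGCTAAGGT"
    for protospacer, pam, pos in find_spcas9_sites(demo):
        print(pos, protospacer, pam)
    ```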

  15. Rapid Identification of Pseudallescheria and Scedosporium Strains by Using Rolling Circle Amplification

    PubMed Central

    Lackner, Michaela; Najafzadeh, Mohammad Javad; Sun, Jiufeng; Lu, Qiaoyun

    2012-01-01

    The Pseudallescheria boydii complex, comprising environmental pathogens with Scedosporium anamorphs, has recently been subdivided into five main species: Scedosporium dehoogii, S. aurantiacum, Pseudallescheria minutispora, P. apiosperma, and P. boydii, while the validity of some other taxa is being debated. Several Pseudallescheria and Scedosporium species are indicator organisms of pollution in soil and water. Scedosporium dehoogii in particular is enriched in soils contaminated by aliphatic hydrocarbons. In addition, the fungi may cause life-threatening infections involving the central nervous system in severely impaired patients. For screening purposes, rapid and economic tools for species recognition are needed. Our aim is to establish rolling circle amplification (RCA) as a screening tool for species-specific identification of Pseudallescheria and Scedosporium. With this aim, a set of padlock probes was designed on the basis of the internal transcribed spacer (ITS) region, differing by up to 13 fixed mutations. Padlock probes were unique as judged from sequence comparison by BLAST search in GenBank and in dedicated research databases at CBS (Centraalbureau voor Schimmelcultures Fungal Biodiversity Centre). RCA was applied as an in vitro tool, tested with pure DNA amplified from cultures. The species-specific padlock probes designed in this study yielded 100% specificity. The method presented here was found to be an attractive alternative to identification by restriction fragment length polymorphism (RFLP) or sequencing. The rapidity (<1 day), specificity, and low costs make RCA a promising screening tool for environmentally and clinically relevant fungi. PMID:22057865

  16. Rapid identification of Pseudallescheria and Scedosporium strains by using rolling circle amplification.

    PubMed

    Lackner, Michaela; Najafzadeh, Mohammad Javad; Sun, Jiufeng; Lu, Qiaoyun; Hoog, G Sybren de

    2012-01-01

    The Pseudallescheria boydii complex, comprising environmental pathogens with Scedosporium anamorphs, has recently been subdivided into five main species: Scedosporium dehoogii, S. aurantiacum, Pseudallescheria minutispora, P. apiosperma, and P. boydii, while the validity of some other taxa is being debated. Several Pseudallescheria and Scedosporium species are indicator organisms of pollution in soil and water. Scedosporium dehoogii in particular is enriched in soils contaminated by aliphatic hydrocarbons. In addition, the fungi may cause life-threatening infections involving the central nervous system in severely impaired patients. For screening purposes, rapid and economic tools for species recognition are needed. Our aim is to establish rolling circle amplification (RCA) as a screening tool for species-specific identification of Pseudallescheria and Scedosporium. With this aim, a set of padlock probes was designed on the basis of the internal transcribed spacer (ITS) region, differing by up to 13 fixed mutations. Padlock probes were unique as judged from sequence comparison by BLAST search in GenBank and in dedicated research databases at CBS (Centraalbureau voor Schimmelcultures Fungal Biodiversity Centre). RCA was applied as an in vitro tool, tested with pure DNA amplified from cultures. The species-specific padlock probes designed in this study yielded 100% specificity. The method presented here was found to be an attractive alternative to identification by restriction fragment length polymorphism (RFLP) or sequencing. The rapidity (<1 day), specificity, and low costs make RCA a promising screening tool for environmentally and clinically relevant fungi.
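
    The padlock-probe design step described in the two records above reduces, computationally, to checking that a probe's target-complementary sequence matches the ITS of one species exactly while carrying several fixed mismatches against all others. The sketch below illustrates that check with a sliding Hamming-distance comparison; the species sequences and the probe are invented for the demo.

```python
# Minimal sketch of an in-silico specificity check for padlock probes:
# slide the probe-binding sequence along each ITS fragment and record the
# minimum Hamming distance. Sequences below are made up.
def min_hamming(probe, target):
    best = len(probe)
    for i in range(len(target) - len(probe) + 1):
        window = target[i:i + len(probe)]
        best = min(best, sum(a != b for a, b in zip(probe, window)))
    return best

its_db = {  # hypothetical ITS fragments
    "S. dehoogii":    "TTGGCACGGTATCAGGCCTTAACGTTAGCA",
    "S. aurantiacum": "TTGGCACGATATCTGGCATTAACGATAGCA",
    "P. boydii":      "TTGGCACAGTATCAGGCGTTAACGTTCGCA",
}

probe = "CACGGTATCAGGCCTTAACG"  # designed against S. dehoogii

for species, its in its_db.items():
    print(f"{species}: {min_hamming(probe, its)} mismatch(es)")
# A species-specific probe should show 0 mismatches for its target species
# and several fixed mismatches for all other species.
```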

  17. Taenia asiatica: the most neglected human Taenia and the possibility of cysticercosis.

    PubMed

    Galán-Puchades, M Teresa; Fuentes, Mario V

    2013-02-01

    Humans are infected not only by Taenia solium and Taenia saginata but also by Taenia asiatica. The last of these species is not included in evaluations of the specificity of immunodiagnostic techniques for taeniasis/cysticercosis, and no specific immunodiagnostic method for T. asiatica is currently available. Because molecular techniques (the only tools that can distinguish the 3 Taenia species) are normally not employed in routine diagnosis, the 2 open questions concerning T. asiatica (its definitive geographic distribution and its ability to cause human cysticercosis) remain unanswered, making T. asiatica the most neglected agent of human taeniasis-cysticercosis.

  18. miREE: miRNA recognition elements ensemble

    PubMed Central

    2011-01-01

    Background: Computational methods for microRNA target prediction are a fundamental step to understanding the miRNA role in gene regulation, a key process in molecular biology. In this paper we present miREE, a novel microRNA target prediction tool. miREE is an ensemble of two parts playing complementary but integrated roles in the prediction. The Ab-Initio module leverages a genetic algorithmic approach to generate a set of candidate sites on the basis of their microRNA-mRNA duplex stability properties. Then, a Support Vector Machine (SVM) learning module evaluates the impact of microRNA recognition elements on the target gene. As a result, the prediction takes into account information regarding both miRNA-target structural stability and accessibility. Results: The proposed method significantly improves on state-of-the-art prediction tools in terms of accuracy, with a better balance between specificity and sensitivity, as demonstrated by experiments conducted on several large datasets across different species. miREE achieves this result by tackling two of the main challenges of current prediction tools: (1) the reduced number of false positives for the Ab-Initio part, thanks to the integration of a machine learning module, and (2) the specificity of the machine learning part, obtained through an innovative technique for generating rich and representative negative records. The validation was conducted on experimental datasets where the miRNA:mRNA interactions had been obtained through (1) direct validation, where even the binding site is provided, or (2) indirect validation, based on gene expression variations obtained from high-throughput experiments, where the specific interaction is not validated in detail and consequently the specific binding site is not provided. Conclusions: The coupling of the two parts, a sensitive Ab-Initio module and a selective machine learning part capable of recognizing the false positives, leads to an improved balance between sensitivity and specificity; miREE obtains a reasonable trade-off between filtering false positives and identifying targets. The miREE tool is available online at http://didattica-online.polito.it/eda/miREE/ PMID:22115078
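
    The two-stage architecture described above (a permissive candidate generator followed by a discriminative filter) can be illustrated in a few lines of scikit-learn. This is a generic sketch under invented features and synthetic data, not miREE's actual pipeline: stage 1 keeps candidates passing a mock duplex-stability threshold, stage 2 removes false positives with an SVM.

```python
# Illustrative two-stage target-prediction pipeline in the spirit of miREE.
# Features ([duplex_stability, seed_match, accessibility]) and thresholds
# are invented for the demo; labels: 1 = validated interaction, 0 = negative.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2]
     + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1 ("ab initio"): keep only candidates above a stability threshold.
candidates = X_te[X_te[:, 0] > -0.5]

# Stage 2: an SVM trained on known positives/negatives filters false positives.
svm = SVC(kernel="rbf").fit(X_tr, y_tr)
predictions = svm.predict(candidates)
print(f"{predictions.sum()} of {len(candidates)} candidates retained")
```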

  19. A dynamic regression analysis tool for quantitative assessment of bacterial growth written in Python.

    PubMed

    Hoeflinger, Jennifer L; Hoeflinger, Daniel E; Miller, Michael J

    2017-01-01

    Herein, an open-source method to generate quantitative bacterial growth data from high-throughput microplate assays is described. The bacterial lag time, maximum specific growth rate, doubling time and delta OD are reported. Our method was validated by carbohydrate utilization of lactobacilli, and visual inspection deemed 94% of the regressions excellent. Copyright © 2016 Elsevier B.V. All rights reserved.
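
    The quantities reported above (lag time, maximum specific growth rate, doubling time, delta OD) are standard outputs of fitting a parametric growth model to OD readings. The following is a minimal sketch of that idea using SciPy and a Zwietering-style logistic model; it is not the authors' published code, and the data are simulated.

```python
# Minimal sketch: extract growth parameters from microplate OD readings by
# fitting a modified logistic model with SciPy (simulated data).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, A, mu, lam):
    """A = max delta OD, mu = max specific growth rate (1/h), lam = lag (h)."""
    return A / (1 + np.exp(4 * mu / A * (lam - t) + 2))

t = np.linspace(0, 24, 49)  # hours
od = logistic(t, 1.2, 0.35, 4.0) + np.random.default_rng(1).normal(0, 0.01, t.size)

(A, mu, lam), _ = curve_fit(logistic, t, od, p0=[1.0, 0.3, 3.0])
print(f"delta OD        = {A:.2f}")
print(f"max growth rate = {mu:.3f} 1/h")
print(f"lag time        = {lam:.1f} h")
print(f"doubling time   = {np.log(2) / mu:.2f} h")
```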

  20. [Andragogy: reality or utopy].

    PubMed

    Wautier, J L; Vileyn, F

    2004-07-01

    The education of adults differs from that of children, and the methods used should take into account that adults have specific goals and diverse prior knowledge. Whereas the teaching of children is called pedagogy, the teaching of adults is known as andragogy. Andragogy has led to the development of several approaches to improve continuing education, and several tools and methodologies have been created for adult education.

  1. WASP: a Web-based Allele-Specific PCR assay designing tool for detecting SNPs and mutations

    PubMed Central

    Wangkumhang, Pongsakorn; Chaichoompu, Kridsadakorn; Ngamphiw, Chumpol; Ruangrit, Uttapong; Chanprasert, Juntima; Assawamakin, Anunchai; Tongsima, Sissades

    2007-01-01

    Background: Allele-specific (AS) polymerase chain reaction is a convenient and inexpensive method for genotyping single nucleotide polymorphisms (SNPs) and mutations. It is applied in many recent studies including population genetics, molecular genetics and pharmacogenomics. Using existing AS primer design tools is a cumbersome process for inexperienced users, since information about the SNP/mutation must be acquired from public databases prior to the design. Furthermore, most of these tools do not offer mismatch enhancement for the designed primers, and the available web applications provide neither a user-friendly graphical input interface nor intuitive visualization of their primer results. Results: This work presents a web-based AS primer design application called WASP. This tool can efficiently design AS primers for human SNPs as well as mutations. To assist scientists with collecting the necessary information about target polymorphisms, the tool provides a local SNP database containing over 10 million SNPs of various populations, drawn from the public domain databases NCBI dbSNP, HapMap and JSNP. This database is tightly integrated with the tool so that users can perform designs for existing SNPs without leaving the site. To guarantee the specificity of AS primers, the proposed system incorporates a primer specificity enhancement technique widely used in experimental protocols. In particular, WASP exploits the destabilizing effect of introducing one deliberate 'mismatch' at the penultimate (second to last of the 3'-end) base of AS primers to improve the resulting primers. Furthermore, WASP offers a graphical user interface based on scalable vector graphics (SVG) that allows users to select SNPs and graphically visualize the designed primers and their conditions. Conclusion: WASP offers a tool for designing AS primers for both SNPs and mutations. By integrating a database of known SNPs (searchable by gene ID or rs number), the tool simplifies the awkward process of getting flanking sequences and other related information from public SNP databases. It takes into account the underlying destabilizing effect to ensure the effectiveness of the designed primers. With its user-friendly SVG interface, WASP intuitively presents the designed primers, helping users to export them or to make further adjustments to the design. This software can be freely accessed at . PMID:17697334
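
    The mismatch-enhancement technique described above is mechanical enough to sketch: terminate the allele-specific primer on the SNP base at the 3' end, then substitute the penultimate base to add one deliberate destabilizing mismatch. The substitution rule and flanking sequence below are invented for illustration and are not WASP's actual rules.

```python
# Minimal sketch of the described mismatch-enhancement idea for AS primers.
# The base-substitution rule here is a hypothetical placeholder.
SWAP = {"A": "C", "C": "A", "G": "T", "T": "G"}

def as_primer(flank_5p, allele, length=20):
    """flank_5p: sequence immediately 5' of the SNP; allele: targeted base."""
    core = flank_5p[-(length - 1):] + allele  # 3'-end base = SNP allele
    penult = SWAP[core[-2]]                   # deliberate penultimate mismatch
    return core[:-2] + penult + core[-1]

flank = "GATTACAGATTACAGATTACAGATT"
for allele in ("C", "T"):                     # e.g. a C/T SNP
    print(allele, as_primer(flank, allele))
```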

  2. A novel method for pair-matching using three-dimensional digital models of bone: mesh-to-mesh value comparison.

    PubMed

    Karell, Mara A; Langstaff, Helen K; Halazonetis, Demetrios J; Minghetti, Caterina; Frelat, Mélanie; Kranioti, Elena F

    2016-09-01

    The commingling of human remains often hinders forensic/physical anthropologists during the identification process, as there are limited methods to accurately sort these remains. This study investigates a new method for pair-matching, a common individualization technique, which uses digital three-dimensional models of bone: mesh-to-mesh value comparison (MVC). The MVC method digitally compares the entire three-dimensional geometry of two bones at once to produce a single value to indicate their similarity. Two different versions of this method, one manual and the other automated, were created and then tested for how well they accurately pair-matched humeri. Each version was assessed using sensitivity and specificity. The manual mesh-to-mesh value comparison method was 100 % sensitive and 100 % specific. The automated mesh-to-mesh value comparison method was 95 % sensitive and 60 % specific. Our results indicate that the mesh-to-mesh value comparison method overall is a powerful new tool for accurately pair-matching commingled skeletal elements, although the automated version still needs improvement.
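
    The core idea of reducing two whole 3D models to a single similarity value can be illustrated with a toy stand-in: the symmetric mean nearest-neighbour distance between two registered vertex clouds. This is not the published MVC algorithm (and it skips the registration step), just the general shape of the computation; the vertex clouds are simulated.

```python
# Toy stand-in for a mesh-to-mesh comparison value: symmetric mean
# nearest-neighbour distance between two already-registered vertex clouds.
import numpy as np
from scipy.spatial import cKDTree

def mesh_value(verts_a, verts_b):
    """Lower value = more similar geometry."""
    d_ab, _ = cKDTree(verts_b).query(verts_a)  # each A vertex -> nearest B
    d_ba, _ = cKDTree(verts_a).query(verts_b)  # each B vertex -> nearest A
    return (d_ab.mean() + d_ba.mean()) / 2

rng = np.random.default_rng(2)
bone = rng.normal(size=(1000, 3))              # stand-in vertex cloud
true_pair = bone + rng.normal(scale=0.02, size=(1000, 3))
non_pair = rng.normal(size=(1000, 3)) + 0.5

print("true pair:", mesh_value(bone, true_pair))
print("non-pair :", mesh_value(bone, non_pair))
```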

  3. Cognitive assessment: A challenge for occupational therapists in Brazil

    PubMed Central

    Conti, Juliana

    2017-01-01

    Cognitive impairment is a common dysfunction after neurological injury. Cognitive assessment tools can help the therapist understand how impairments are affecting functional status and quality of life. Objective: The aim of the study was to identify instruments for cognitive assessment that occupational therapists (OTs) can use in clinical practice. Methods: Instruments published in English and Portuguese between 1999 and 2016 were systematically reviewed. Results: The search identified 17 OT-specific instruments not validated in Brazilian Portuguese, 10 non-OT-specific instruments not validated in Brazilian Portuguese, and 25 instruments validated for Portuguese, only one of which was specific to OT (the Lowenstein Occupational Therapy Cognitive Assessment). Conclusion: There are few cognitive assessment tools validated for use in the Brazilian culture and language, and the majority of the instruments appear not to be validated for use by OTs in clinical practice. PMID:29213503

  4. The Power of Computer-aided Tomography to Investigate Marine Benthic Communities

    EPA Science Inventory

    Computer-aided tomography (CT) is a powerful tool for investigating benthic communities in aquatic systems. In this presentation, we will attempt to summarize our 15 years of experience in developing specific CT methods and applications for marine benthic co...

  5. Progress in Multi-Disciplinary Data Life Cycle Management

    NASA Astrophysics Data System (ADS)

    Jung, C.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.

    2015-12-01

    Modern science is most often driven by data. Improvements in state-of-the-art technologies and methods in many scientific disciplines lead not only to increasing data rates, but also to the need to improve or even completely overhaul their data life cycle management. Communities usually face two kinds of challenges: generic ones like federated authorization and authentication infrastructures and data preservation, and ones that are specific to their community and their respective data life cycle. In practice, the specific requirements often hinder the use of generic tools and methods. The German Helmholtz Association project "Large-Scale Data Management and Analysis" (LSDMA) addresses both challenges: its five Data Life Cycle Labs (DLCLs) closely collaborate with communities in joint research and development to optimize the communities' data life cycle management, while its Data Services Integration Team (DSIT) provides generic data tools and services. We present the most recent developments and results from the DLCLs, covering communities ranging from heavy ion physics and photon science to high-throughput microscopy, and from DSIT.

  6. Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry applied to virus identification

    PubMed Central

    Calderaro, Adriana; Arcangeletti, Maria-Cristina; Rodighiero, Isabella; Buttrini, Mirko; Gorrini, Chiara; Motta, Federica; Germini, Diego; Medici, Maria-Cristina; Chezzi, Carlo; De Conto, Flora

    2014-01-01

    Virus detection and/or identification traditionally relies on methods based on cell culture, electron microscopy and antigen or nucleic acid detection. These techniques work well but are often expensive and/or time-consuming; furthermore, they do not always lead to virus identification at the species and/or type level. In this study, Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry (MALDI-TOF MS) was tested as an innovative tool to identify human polioviruses and to identify specific viral protein biomarkers in infected cells. The results revealed MALDI-TOF MS to be an effective and inexpensive tool for the identification of the three poliovirus serotypes. The method was first applied to Sabin reference strains, and then to isolates from different clinical samples, highlighting its value as a time-saving, sensitive and specific technique when compared to the gold-standard neutralization assay and casting new light on its possible application to virus detection and/or identification. PMID:25354905

  7. Simplified estimation of age-specific reference intervals for skewed data.

    PubMed

    Wright, E M; Royston, P

    1997-12-30

    Age-specific reference intervals are commonly used in medical screening and clinical practice, where interest lies in the detection of extreme values. Many different statistical approaches have been published on this topic. The advantages of a parametric method are that it necessarily produces smooth centile curves, the entire density is estimated, and an explicit formula is available for the centiles. The method proposed here is a simplified version of a recent approach proposed by Royston and Wright. Basic transformations of the data and multiple regression techniques are combined to model the mean, standard deviation and skewness. Using these simple tools, which are implemented in almost all statistical computer packages, age-specific reference intervals may be obtained. The scope of the method is illustrated by fitting models to several real data sets and assessing each model using goodness-of-fit techniques.
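
    Stripped to its essentials, the approach described above regresses the mean and the standard deviation of a (possibly transformed) measurement on age and forms centiles as mean(age) + z * SD(age). The sketch below is a minimal illustration of that recipe on simulated data, not the authors' implementation.

```python
# Minimal sketch: model the age-specific mean and SD by polynomial
# regression, then form centiles as mean(age) + z * SD(age). Simulated data.
import numpy as np

rng = np.random.default_rng(3)
age = rng.uniform(16, 40, 500)                 # e.g. gestational weeks
y = 0.5 + 0.12 * age + rng.normal(0, 0.05 + 0.004 * age)

mean_coef = np.polyfit(age, y, 2)              # quadratic mean model
resid = y - np.polyval(mean_coef, age)

# SD model: regress scaled absolute residuals on age
# (for normal errors, E|e| = sigma * sqrt(2/pi)).
sd_coef = np.polyfit(age, np.abs(resid) * np.sqrt(np.pi / 2), 1)

def centile(a, z):
    return np.polyval(mean_coef, a) + z * np.polyval(sd_coef, a)

for a in (20, 30, 40):
    print(f"age {a}: 5th={centile(a, -1.645):.2f}  95th={centile(a, 1.645):.2f}")
```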

  8. Analysis of visual quality improvements provided by known tools for HDR content

    NASA Astrophysics Data System (ADS)

    Kim, Jaehwan; Alshina, Elena; Lee, JongSeok; Park, Youngo; Choi, Kwang Pyo

    2016-09-01

    In this paper, the visual quality of different solutions for high dynamic range (HDR) compression is analyzed using MPEG test contents. We also simulate a method for efficient HDR compression that is based on statistical properties of the signal. The method is compliant with the HEVC specification and is also easily compatible with other alternative methods that might require HEVC specification changes. It was subjectively tested on commercial TVs and compared with alternative solutions for HDR coding. Subjective visual quality tests were performed on a SAMSUNG JS9500 SUHD TV model with maximum luminance up to 1000 nits. The statistically based solution shows improvements not only in objective performance but also in visual quality compared with other HDR solutions, while remaining compatible with the HEVC specification.

  9. The visibility of QSEN competencies in clinical assessment tools in Swedish nurse education.

    PubMed

    Nygårdh, Annette; Sherwood, Gwen; Sandberg, Therese; Rehn, Jeanette; Knutsson, Susanne

    2017-12-01

    Prospective nurses need specific and sufficient knowledge to be able to provide quality care. The Swedish Society of Nursing has emphasized the importance of the six quality and safety competencies (QSEN), which originated in the US, in Swedish nursing education. AIM: To investigate the visibility of the QSEN competencies in the assessment tools used in clinical practice. METHOD: A quantitative descriptive method was used to analyze assessment tools from 23 universities. RESULTS: Teamwork and collaboration was the most visible competency. Patient-centered care was visible to a large degree but was not referred to by name. Informatics was the least visible, a notable concern since all nurses should be competent in informatics to provide quality and safety in care. These results provide guidance as academic and clinical programs around the world implement assessment of how well nurses have developed these essential quality and safety competencies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Strategies and tools for whole genome alignments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couronne, Olivier; Poliakov, Alexander; Bray, Nicolas

    2002-11-25

    The availability of the assembled mouse genome makes possible, for the first time, an alignment and comparison of two large vertebrate genomes. We have investigated different strategies of alignment for the subsequent analysis of conservation of genomes that are effective for different quality assemblies. These strategies were applied to the comparison of the working draft of the human genome with the Mouse Genome Sequencing Consortium assembly, as well as other intermediate mouse assemblies. Our methods are fast and the resulting alignments exhibit a high degree of sensitivity, covering more than 90 percent of known coding exons in the human genome. We have obtained such coverage while preserving specificity. With a view towards the end user, we have developed a suite of tools and websites for automatically aligning, and subsequently browsing and working with, whole genome comparisons. We describe the use of these tools to identify conserved non-coding regions between the human and mouse genomes, some of which have not been identified by other methods.

  11. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
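
    The primal-dual pairing mentioned above can be made concrete with the standard linear programs of rigid-plastic limit analysis. The notation below is the textbook form, assumed rather than taken from the paper: the static, lower-bound problem maximizes the load factor subject to equilibrium and yield constraints, and its LP dual is the kinematic, upper-bound problem of minimizing plastic dissipation at unit external work.

```latex
% Standard rigid-plastic limit-analysis pair (assumed notation):
% B = equilibrium matrix, f = load pattern, m = member moments,
% m_p = plastic moment capacities, \dot{u} = nodal velocities,
% \dot{\theta} = plastic hinge rotation rates.
\begin{align*}
\text{Static (lower bound):}\quad
  & \max_{\lambda,\, m}\ \lambda
    \quad\text{s.t.}\quad B m = \lambda f, \qquad -m_p \le m \le m_p,\\[4pt]
\text{Kinematic (upper bound):}\quad
  & \min_{\dot{u},\,\dot{\theta}}\ \sum_i m_{p,i}\,\lvert\dot{\theta}_i\rvert
    \quad\text{s.t.}\quad \dot{\theta} = B^{\mathsf{T}}\dot{u}, \qquad
    f^{\mathsf{T}}\dot{u} = 1.
\end{align*}
% By LP duality the two optima coincide at the collapse load factor,
% which is the quantity entering the system reliability evaluation.
```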

  12. A highly specific competitive direct enzyme immunoassay for sterigmatocystin as a tool for rapid immunochemotaxonomic differentiation of mycotoxigenic Aspergillus species.

    PubMed

    Wegner, S; Bauer, J I; Dietrich, R; Märtlbauer, E; Usleber, E; Gottschalk, C; Gross, M

    2017-02-01

    A simplified method to produce specific polyclonal rabbit antibodies against sterigmatocystin (STC) was established, using a STC-glycolic acid-ether derivative (STC-GE) conjugated to keyhole limpet haemocyanin (immunogen). The competitive direct enzyme immunoassay (EIA) established for STC had a detection limit (20% binding inhibition) of 130 pg ml⁻¹. The test was highly specific for STC, with minor cross-reactivity with O-methylsterigmatocystin (OMSTC, 0·87%) and negligible reactivity with aflatoxins (<0·02%). STC-EIA was used in combination with a previously developed specific EIA for aflatoxins (<0·1% cross-reactivity with STC and OMSTC), to study the STC/aflatoxin production profiles of reference strains of Aspergillus species. This immunochemotaxonomic procedure was found to be a convenient tool to identify STC- or aflatoxin-producing strains. The carcinogenic mycotoxin sterigmatocystin (STC) is produced by several Aspergillus species, either alone or together with aflatoxins. Here, we report a very simple and straightforward procedure to obtain highly sensitive and specific anti-STC antibodies, and their use in the first ever real STC-specific competitive direct enzyme immunoassay (EIA). In combination with a previous EIA for aflatoxins, this study for the first time demonstrates the potential of a STC/aflatoxin EIA pair for what is branded as 'immunochemotaxonomic' identification of mycotoxigenic Aspergillus species. This new analytical tool enhances analytical possibilities for differential analysis of STC and aflatoxins. © 2016 The Society for Applied Microbiology.
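
    Competitive EIA calibration of the kind described above is commonly summarized by a four-parameter logistic (4PL) curve, with the detection limit read off at 20% binding inhibition (B/B0 = 0.8). The sketch below fits a 4PL with SciPy and inverts it at that point; the calibration readings are invented, and the 4PL choice is a common convention rather than a detail given in the record.

```python
# Sketch of a competitive-EIA calibration: fit a four-parameter logistic
# to B/B0 readings and solve for the concentration giving 20% inhibition.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """a = upper asymptote, d = lower asymptote, c = IC50, b = slope."""
    return d + (a - d) / (1 + (x / c) ** b)

conc = np.array([10, 30, 100, 300, 1000, 3000, 10000])  # pg/ml, invented
bb0 = np.array([0.98, 0.93, 0.80, 0.55, 0.30, 0.12, 0.05])

(a, b, c, d), _ = curve_fit(four_pl, conc, bb0, p0=[1, 1, 300, 0])

# Invert the 4PL at B/B0 = 0.8 (20% binding inhibition).
target = 0.8
ic20 = c * ((a - d) / (target - d) - 1) ** (1 / b)
print(f"IC50 ~ {c:.0f} pg/ml, detection limit (IC20) ~ {ic20:.0f} pg/ml")
```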

  13. Isolation, amplification and characterization of foodborne pathogen disease bacteria gene for rapid kit test development

    NASA Astrophysics Data System (ADS)

    Nurjayadi, M.; Santoso, I.; Kartika, I. R.; Kurniadewi, F.; Saamia, V.; Sofihan, W.; Nurkhasanah, D.

    2017-07-01

    There is widespread public concern over food safety. Recent food-safety cases, including many food poisoning incidents in both developed and developing countries, are considered national security threats that have involved police investigation. Quick and accurate detection methods are needed to handle food poisoning cases with large numbers of sufferers at the same time. Therefore, this research aims to develop a specific, sensitive molecular detection tool with rapid results for foodborne pathogenic bacteria. We thus propose a genomic-level approach based on the polymerase chain reaction. The research has successfully produced specific primers for the amplification of the fim-C genes of S. typhi and E. coli and the pef gene of Salmonella typhimurium. Electrophoresis shows amplification products of 95 base pairs, 121 base pairs, and 139 base pairs, and all three genes are in accordance with their in silico sizes. In conclusion, the research has successfully produced a specific detection tool for the genes of three foodborne pathogenic bacteria. Further testing, and the use of real-time PCR for detection, are still in the trial process toward a better detection method.
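
    Checking that designed primers yield amplicons of the expected in-silico size, as described above, amounts to locating the forward primer and the reverse complement of the reverse primer on the template and measuring the span. The sketch below shows that computation; the template and primers are invented, not the fim-C or pef primers.

```python
# Minimal in-silico PCR sketch: locate the forward primer and the reverse
# complement of the reverse primer on a template, then report the predicted
# amplicon size. Template and primers below are invented.
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def amplicon_size(template, fwd, rev):
    start = template.find(fwd)
    end = template.find(revcomp(rev))    # reverse primer binds the other strand
    if start == -1 or end == -1 or end < start:
        return None                      # no product predicted
    return end + len(rev) - start        # product spans both primer sites

template = "ATGCCGTTAGCA" + "GATTTACCGGCATTAGCCAT" * 4 + "TTGACGATCCGA"
fwd = "GATTTACCGGCA"
rev = revcomp("TTAGCCATGATT")            # reverse primer as ordered (5'->3')
print(amplicon_size(template, fwd, rev), "bp")
```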

  14. Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.

    PubMed

    Lee, Seong-Hun

    2014-11-01

    There are about 80 biotech crop events that have been approved by safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. The DNA-based detection method has been used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. An event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. A screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for 12 events across the four crops: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip are efficient detection methods for screening, gene-specific and event-specific analysis of biotech crops, saving workload and time. © 2014 Society of Chemical Industry.

  15. GIANT API: an application programming interface for functional genomics.

    PubMed

    Roberts, Andrew M; Wong, Aaron K; Fisk, Ian; Troyanskaya, Olga G

    2016-07-08

    GIANT API provides biomedical researchers programmatic access to tissue-specific and global networks in humans and model organisms, and associated tools, which includes functional re-prioritization of existing genome-wide association study (GWAS) data. Using tissue-specific interaction networks, researchers are able to predict relationships between genes specific to a tissue or cell lineage, identify the changing roles of genes across tissues and uncover disease-gene associations. Additionally, GIANT API enables computational tools like NetWAS, which leverages tissue-specific networks for re-prioritization of GWAS results. The web services covered by the API include 144 tissue-specific functional gene networks in human, global functional networks for human and six common model organisms and the NetWAS method. GIANT API conforms to the REST architecture, which makes it stateless, cacheable and highly scalable. It can be used by a diverse range of clients including web browsers, command terminals, programming languages and standalone apps for data analysis and visualization. The API is freely available for use at http://giant-api.princeton.edu. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
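
    Because GIANT API is a stateless REST service, it can be queried from any HTTP client. The sketch below uses Python's requests library; the route, query parameters and response shape are assumptions for illustration only, since the abstract does not spell them out, so consult the documentation at http://giant-api.princeton.edu for the real interface.

```python
# Hypothetical sketch of querying a REST service like GIANT API.
# The endpoint path, parameters and response fields are assumptions.
import requests

BASE = "http://giant-api.princeton.edu"

def get_network_edges(tissue, genes):
    resp = requests.get(
        f"{BASE}/networks/{tissue}/edges",   # hypothetical route
        params={"genes": ",".join(genes)},   # hypothetical parameter
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()                       # assumed JSON payload

if __name__ == "__main__":
    edges = get_network_edges("blood", ["BRCA1", "TP53"])
    print(edges)
```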

  16. Postoperative pain assessment using four behavioral scales in Pakistani children undergoing elective surgery

    PubMed Central

    Shamim, Faisal; Ullah, Hameed; Khan, Fauzia A.

    2015-01-01

    Background: Several measurement tools have been used for assessment of postoperative pain in pediatric patients. Self-report methods have limitations in younger children and parent, nurse or physician assessment can be used as a surrogate measure. These tools should be tested in different cultures as pain can be influenced by sociocultural factors. The objective was to assess the inter-rater agreement on four different behavioral pain assessment scales in our local population. Materials and Methods: This prospective, descriptive, observational study was conducted in Pakistan. American Society of Anesthesiologists I and II children, 3-7 years of age, undergoing elective surgery were enrolled. Four pain assessment scales were used, Children's Hospital of Eastern Ontario Pain Scale (CHEOPS), Toddler Preschool Postoperative Pain Scale (TPPPS), objective pain scale (OPS), and Face, Legs, Activity, Cry, Consolability (FLACC). After 15 and 60 min of arrival in the postanesthesia care unit (PACU), each child evaluated his/her postoperative pain by self-reporting and was also independently assessed by the PACU nurse, PACU anesthetist and the parent. The sensitivity and specificity of the responses of the four pain assessment scales were compared to the response of the child. Results: At 15 min, sensitivity and specificity were >60% for doctors and nurses on FLACC, OPS, and CHEOPS scales and for FLACC and CHEOPS scale for the parents. Parents showed poor agreement on OPS and TPPS. At 60 min, sensitivity was poor on the OPS scale by all three observers. Nurses showed a lower specificity on FLACC tool. Parents had poor specificity on CHEOPS and rate of false negatives was high with TPPS. Conclusions: We recommend the use of FLACC scale for assessment by parents, nurses, and doctors in Pakistani children aged between 3 and 7. PMID:25829906
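
    The comparison carried out above, treating the child's self-report as the reference standard and computing each observer's sensitivity and specificity, reduces to a 2x2 tabulation of paired binary ratings. A minimal sketch with invented ratings:

```python
# Sensitivity/specificity of an observer's binary pain ratings against the
# child's self-report as the reference (1 = pain, 0 = no pain). Invented data.
def sens_spec(reference, observer):
    tp = sum(r == 1 and o == 1 for r, o in zip(reference, observer))
    tn = sum(r == 0 and o == 0 for r, o in zip(reference, observer))
    fn = sum(r == 1 and o == 0 for r, o in zip(reference, observer))
    fp = sum(r == 0 and o == 1 for r, o in zip(reference, observer))
    return tp / (tp + fn), tn / (tn + fp)

child = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0]
nurse = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
sens, spec = sens_spec(child, nurse)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```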

  17. Using cluster ensemble and validation to identify subtypes of pervasive developmental disorders.

    PubMed

    Shen, Jess J; Lee, Phil-Hyoun; Holden, Jeanette J A; Shatkay, Hagit

    2007-10-11

    Pervasive Developmental Disorders (PDD) are neurodevelopmental disorders characterized by impairments in social interaction, communication and behavior. Given the diversity and varying severity of PDD, diagnostic tools attempt to identify homogeneous subtypes within PDD. Identifying subtypes can lead to targeted etiology studies and to effective type-specific intervention. Cluster analysis can suggest coherent subsets in data; however, different methods and assumptions lead to different results. Several previous studies applied clustering to PDD data, varying in number and characteristics of the produced subtypes. Most studies used a relatively small dataset (fewer than 150 subjects), and all applied only a single clustering method. Here we study a relatively large dataset (358 PDD patients), using an ensemble of three clustering methods. The results are evaluated using several validation methods, and consolidated through an integration step. Four clusters are identified, analyzed and compared to subtypes previously defined by the widely used diagnostic tool DSM-IV.

  18. Using Cluster Ensemble and Validation to Identify Subtypes of Pervasive Developmental Disorders

    PubMed Central

    Shen, Jess J.; Lee, Phil Hyoun; Holden, Jeanette J.A.; Shatkay, Hagit

    2007-01-01

    Pervasive Developmental Disorders (PDD) are neurodevelopmental disorders characterized by impairments in social interaction, communication and behavior. Given the diversity and varying severity of PDD, diagnostic tools attempt to identify homogeneous subtypes within PDD. Identifying subtypes can lead to targeted etiology studies and to effective type-specific intervention. Cluster analysis can suggest coherent subsets in data; however, different methods and assumptions lead to different results. Several previous studies applied clustering to PDD data, varying in number and characteristics of the produced subtypes. Most studies used a relatively small dataset (fewer than 150 subjects), and all applied only a single clustering method. Here we study a relatively large dataset (358 PDD patients), using an ensemble of three clustering methods. The results are evaluated using several validation methods, and consolidated through an integration step. Four clusters are identified, analyzed and compared to subtypes previously defined by the widely used diagnostic tool DSM-IV. PMID:18693920
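
    The ensemble-plus-integration design described in the two records above can be sketched generically: run several clustering methods, accumulate a co-association matrix (the fraction of methods that place two subjects in the same cluster), and consolidate by clustering that matrix. The code below does this on synthetic data with scikit-learn; it illustrates the general technique, not the study's actual pipeline (the metric= keyword assumes scikit-learn >= 1.2).

```python
# Generic cluster-ensemble sketch: three methods -> co-association matrix ->
# consolidating clustering. Synthetic data, not the PDD dataset.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering, SpectralClustering

X, _ = make_blobs(n_samples=200, centers=4, random_state=0)

labelings = [
    KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X),
    AgglomerativeClustering(n_clusters=4).fit_predict(X),
    SpectralClustering(n_clusters=4, random_state=0).fit_predict(X),
]

# Co-association: fraction of methods placing each pair in the same cluster.
n = len(X)
coassoc = np.zeros((n, n))
for labels in labelings:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= len(labelings)

# Integration step: cluster the co-association matrix (1 - coassoc = distance).
final = AgglomerativeClustering(
    n_clusters=4, metric="precomputed", linkage="average"
).fit_predict(1 - coassoc)
print(np.bincount(final))
```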

  19. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-07-31

    real-time systems. This was accomplished by extending techniques, based on automata theory and temporal logic, that have been successful for the verification of time-independent reactive systems. As a system specification language for embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous environment variables. As requirements specification languages, we introduced temporal logics with clock variables for expressing timing constraints.

  20. From genomics to metabolomics, moving toward an integrated strategy for the discovery of fungal secondary metabolites.

    PubMed

    Hautbergue, T; Jamin, E L; Debrauwer, L; Puel, O; Oswald, I P

    2018-02-21

    Fungal secondary metabolites are defined by bioactive properties that ensure adaptation of the fungus to its environment. Although some of these natural products are promising sources of new lead compounds especially for the pharmaceutical industry, others pose risks to human and animal health. The identification of secondary metabolites is critical to assessing both the utility and risks of these compounds. Since fungi present biological specificities different from other microorganisms, this review covers the different strategies specifically used in fungal studies to perform this critical identification. Strategies focused on the direct detection of the secondary metabolites are firstly reported. Particularly, advances in high-throughput untargeted metabolomics have led to the generation of large datasets whose exploitation and interpretation generally require bioinformatics tools. Then, the genome-based methods used to study the entire fungal metabolic potential are reported. Transcriptomic and proteomic tools used in the discovery of fungal secondary metabolites are presented as links between genomic methods and metabolomic experiments. Finally, the influence of the culture environment on the synthesis of secondary metabolites by fungi is highlighted as a major factor to consider in research on fungal secondary metabolites. Through this review, we seek to emphasize that the discovery of natural products should integrate all of these valuable tools. Attention is also drawn to emerging technologies that will certainly revolutionize fungal research and to the use of computational tools that are necessary but whose results should be interpreted carefully.

  1. The Behavioural Profile of Psychiatric Disorders in Persons with Intellectual Disability

    ERIC Educational Resources Information Center

    Kishore, M. T.; Nizamie, S. H.; Nizamie, A.

    2005-01-01

    Background: Problems associated with psychiatric diagnoses could be minimized by identifying behavioural clusters of specific psychiatric disorders. Methods: Sixty persons with intellectual disability (ID) and behavioural problems, aged 12–55 years, were assessed with standardized Indian tools for intelligence and adaptive behaviour. Clinical…

  2. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  3. Deliver a set of tools for resolving bad inductive loops and correcting bad data.

    DOT National Transportation Integrated Search

    2012-04-01

    This project prototyped and demonstrated procedures to find and mitigate loop detector errors, and to derive more valuable data from loops. Specifically, methods were developed to find and isolate out loop data which is "bad" or invalid, so that miti...

  4. Deliver a set of tools for resolving bad inductive loops and correcting bad data

    DOT National Transportation Integrated Search

    2012-04-10

    This project prototyped and demonstrated procedures to find and mitigate loop detector errors, and to derive more valuable data from loops. Specifically, methods were developed to find and isolate out loop data which is "bad" or invalid, so that miti...

  5. Standardised Library Instruction Assessment: An Institution-Specific Approach

    ERIC Educational Resources Information Center

    Staley, Shannon M.; Branch, Nicole A.; Hewitt, Tom L.

    2010-01-01

    Introduction: We explore the use of a psychometric model for locally-relevant, information literacy assessment, using an online tool for standardised assessment of student learning during discipline-based library instruction sessions. Method: A quantitative approach to data collection and analysis was used, employing standardised multiple-choice…

  6. Quality of the parent-child interaction in young children with type 1 diabetes mellitus: study protocol.

    PubMed

    Nieuwesteeg, Anke M; Pouwer, Frans; van Bakel, Hedwig Ja; Emons, Wilco Hm; Aanstoot, Henk-Jan; Odink, Roelof; Hartman, Esther E

    2011-04-14

    In young children with type 1 diabetes mellitus (T1DM), parents have full responsibility for the diabetes management of their child (e.g. blood glucose monitoring and administering insulin). Behavioral tasks in childhood, such as developing autonomy, and oppositional behavior (e.g. refusing food) may interfere with the diabetes management needed to achieve optimal blood glucose control. Furthermore, higher blood glucose levels are related to more behavioral problems, so parents might need to negotiate with their child over the diabetes management to avoid this direct negative effect. This interference, the negotiations, and the parents' responsibility for diabetes may negatively affect the quality of parent-child interaction. Nevertheless, there is little knowledge about the quality of interaction between parents and young children with T1DM, and the possible impact this may have on glycemic control and psychosocial functioning of the child. While widely used global parent-child interaction observational methods are available, there is a need for an observational tool specifically tailored to the interaction patterns of parents and children with T1DM. The main aim of this study is to construct a disease-specific observational method to assess diabetes-specific parent-child interaction. An additional aim is to explore whether the quality of parent-child interactions is associated with glycemic control and psychosocial functioning (resilience, behavioral problems, and quality of life). First, we will examine which situations are most suitable for observing diabetes-specific interactions. Then, these situations will be video-taped in a pilot study (N = 15). Observed behaviors will be described in rating scales, each describing characteristics of parent-child interactional behaviors. Next, we will apply the observational tool on a larger scale for further evaluation of the instrument (N = 120). The parents will be asked twice (with two years in between) to fill out questionnaires about the psychosocial functioning of their child with T1DM. Furthermore, glycemic control (HbA1c) will be obtained from their medical records. A disease-specific observational tool will enable the detailed assessment of the quality of diabetes-specific parent-child interactions. The availability of such a tool will facilitate future (intervention) studies that will yield more knowledge about the impact of parent-child interactions on psychosocial functioning and glycemic control of children with T1DM.

  7. Application of bioinformatics tools and databases in microbial dehalogenation research (a review).

    PubMed

    Satpathy, R; Konkimalla, V B; Ratha, J

    2015-01-01

    Microbial dehalogenation is a biochemical process in which halogenated substances are catalyzed enzymatically into their non-halogenated forms. Microorganisms have a wide range of organohalogen degradation abilities, both specific and non-specific in nature. Most of these halogenated organic compounds are pollutants and need to be remediated; therefore, current approaches explore the potential of microbes at a molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge in the field of dehalogenation. To facilitate data mining, many tools have been developed to annotate these data from databases. Thus, with the discovery of a microorganism, one can predict a gene/protein, perform sequence analysis and structural modelling, and carry out metabolic pathway analysis, biodegradation studies and so on. This review highlights various bioinformatics approaches, describing the application of various databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to decipher some recent applications of in silico modeling methods comprising gene finding, protein modelling, Quantitative Structure-Biodegradability Relationship (QSBR) studies and the reconstruction of metabolic pathways employed in the dehalogenation research area.

  8. Evaluation of real-time PCR detection methods for detecting rice products contaminated by rice genetically modified with a CpTI-KDEL-T-nos transgenic construct.

    PubMed

    Nakamura, Kosuke; Akiyama, Hiroshi; Kawano, Noriaki; Kobayashi, Tomoko; Yoshimatsu, Kayo; Mano, Junichi; Kitta, Kazumi; Ohmori, Kiyomi; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko

    2013-12-01

    Genetically modified (GM) rice (Oryza sativa) lines, such as the insecticidal Kefeng and Kemingdao, have been developed and found unauthorised in processed rice products in many countries. Qualitative detection methods for GM rice are therefore required for GM food regulation. A transgenic construct for expressing cowpea (Vigna unguiculata) trypsin inhibitor (CpTI) was detected in some imported processed rice products contaminated with Kemingdao. The 3' terminal sequence of the identified transgenic construct for expression of CpTI included an endoplasmic reticulum retention signal coding sequence (KDEL) and the nopaline synthase terminator (T-nos); the sequence was identical to that in a report on Kefeng. A novel construct-specific real-time polymerase chain reaction (PCR) detection method was developed to detect the junction region sequence between CpTI-KDEL and T-nos. Imported processed rice products were evaluated for contamination with GM rice using the developed construct-specific real-time PCR method, and the detection frequency was compared with that of five event-specific detection methods. The construct-specific method detected GM rice at a higher frequency than the event-specific methods. We therefore propose the construct-specific detection method as a beneficial tool for screening processed rice products for contamination with GM rice lines, such as Kefeng, for GM food regulation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Nonlinear Shaping Architecture Designed with Using Evolutionary Structural Optimization Tools

    NASA Astrophysics Data System (ADS)

    Januszkiewicz, Krystyna; Banachowicz, Marta

    2017-10-01

    The paper explores the possibilities of using Evolutionary Structural Optimization (ESO) digital tools in integrated structural and architectural design, in response to current needs geared towards sustainability, combining ecological and economic efficiency. The first part of the paper defines the Evolutionary Structural Optimization tools, which were developed specifically for engineering purposes using finite element analysis as a framework. The development of ESO has led to several incarnations, which are all briefly discussed (Additive ESO, Bi-directional ESO, Extended ESO). The second part presents results of using these tools in structural and architectural design. Actual building projects in which optimization was part of the original design process are presented (the Crematorium in Kakamigahara, Gifu, Japan, 2006, and SANAA's Learning Centre at EPFL in Lausanne, Switzerland, 2008, among others). The conclusion emphasizes that structural engineering and architectural design mean directing attention to the solutions used by Nature, designing works that are optimally shaped and that form their own environments. Architectural forms never constitute the optimum shape derived through a form-finding process driven only by structural optimization, but rather embody and integrate a multitude of parameters. It might be assumed that there is a similarity between these processes in nature and the presented design methods. Contemporary digital methods make the simulation of such processes possible, and thus enable us to refer back to the empirical methods of previous generations.

  10. Accuracy of Nutritional Screening Tools in Assessing the Risk of Undernutrition in Hospitalized Children.

    PubMed

    Huysentruyt, Koen; Devreker, Thierry; Dejonckheere, Joachim; De Schepper, Jean; Vandenplas, Yvan; Cools, Filip

    2015-08-01

    The aim of the present study was to evaluate the predictive accuracy of screening tools for assessing nutritional risk in hospitalized children in developed countries. The study involved a systematic review of literature (MEDLINE, EMBASE, and Cochrane Central databases up to January 17, 2014) of studies on the diagnostic performance of pediatric nutritional screening tools. Methodological quality was assessed using a modified QUADAS tool. Sensitivity and specificity were calculated for each screening tool per validation method. A meta-analysis was performed to estimate the risk ratio of different screening result categories of being truly at nutritional risk. A total of 11 studies were included on ≥1 of the following screening tools: Pediatric Nutritional Risk Score, Screening Tool for the Assessment of Malnutrition in Paediatrics, Paediatric Yorkhill Malnutrition Score, and Screening Tool for Risk on Nutritional Status and Growth. Because of variation in reference standards, a direct comparison of the predictive accuracy of the screening tools was not possible. A meta-analysis was performed on 1629 children from 7 different studies. The risk ratio of being truly at nutritional risk was 0.349 (95% confidence interval [CI] 0.16-0.78) for children in the low versus moderate screening category and 0.292 (95% CI 0.19-0.44) in the moderate versus high screening category. There is insufficient evidence to choose 1 nutritional screening tool over another based on their predictive accuracy. The estimated risk of being at "true nutritional risk" increases with each category of screening test result. Each screening category should be linked to a specific course of action, although further research is needed.

  11. IgG responses to the gSG6-P1 salivary peptide for evaluating human exposure to Anopheles bites in urban areas of Dakar region, Sénégal

    PubMed Central

    2012-01-01

    Background: Urban malaria can be a serious public health problem in Africa. Human-landing catches of mosquitoes, a standard entomological method to assess human exposure to malaria vector bites, can lack sensitivity in areas where exposure is low. A simple and highly sensitive tool could be a complementary indicator for evaluating malaria exposure in such epidemiological contexts. The human antibody response to the specific Anopheles gSG6-P1 salivary peptide has been described as an adequate biomarker for a reliable assessment of human exposure levels to Anopheles bites. The aim of this study was to use this biomarker to evaluate human exposure to Anopheles mosquito bites in urban settings of Dakar (Senegal), one of the largest cities in West Africa, where Anopheles biting rates and malaria transmission are presumed to be low. Methods: A cross-sectional study of 1,010 individuals, 505 children and 505 adults from 505 households, living in 16 districts of downtown Dakar and its suburbs was performed from October to December 2008. The IgG responses to the gSG6-P1 peptide were assessed and compared with entomological data obtained in or near the same districts. Results: Considerable individual variation in anti-gSG6-P1 IgG levels was observed between and within districts. In spite of this individual heterogeneity, the median level of specific IgG and the percentage of immune responders differed significantly between districts. A positive and significant association was observed between the exposure level to Anopheles gambiae bites, estimated by classical entomological methods, and the median IgG level or the percentage of immune responders, both of which measure the contact between human populations and Anopheles mosquitoes. Interestingly, the immunological parameters seemed to better discriminate the exposure level to Anopheles bites between different exposure groups of districts. Conclusions: The specific human IgG response to the gSG6-P1 peptide biomarker represents, at the population and individual levels, a credible new alternative tool for accurately assessing the heterogeneity of exposure to Anopheles bites and malaria risk in low-transmission urban areas. The development of such a biomarker tool would be particularly relevant for mapping and monitoring malaria risk and for measuring the efficiency of vector control strategies in these specific settings. PMID:22424570

  12. Biosensors for spatiotemporal detection of reactive oxygen species in cells and tissues.

    PubMed

    Erard, Marie; Dupré-Crochet, Sophie; Nüße, Oliver

    2018-05-01

    Redox biology has become a major issue in numerous areas of physiology. Reactive oxygen species (ROS) have a broad range of roles from signal transduction to growth control and cell death. To understand the nature of these roles, accurate measurement of the reactive compounds is required. An increasing number of tools for ROS detection is available; however, the specificity and sensitivity of these tools are often insufficient. Furthermore, their specificity has been rarely evaluated in complex physiological conditions. Many ROS probes are sensitive to environmental conditions in particular pH, which may interfere with ROS detection and cause misleading results. Accurate detection of ROS in physiology and pathophysiology faces additional challenges concerning the precise localization of the ROS and the timing of their production and disappearance. Certain ROS are membrane permeable, and certain ROS probes move across cells and organelles. Targetable ROS probes such as fluorescent protein-based biosensors are required for accurate localization. Here we analyze these challenges in more detail, provide indications on the strength and weakness of current tools for ROS detection, and point out developments that will provide improved ROS detection methods in the future. There is no universal method that fits all situations in physiology and cell biology. A detailed knowledge of the ROS probes is required to choose the appropriate method for a given biological problem. The knowledge of the shortcomings of these probes should also guide the development of new sensors.

  13. Developing tools for the safety specification in risk management plans: lessons learned from a pilot project.

    PubMed

    Cooper, Andrew J P; Lettis, Sally; Chapman, Charlotte L; Evans, Stephen J W; Waller, Patrick C; Shakir, Saad; Payvandi, Nassrin; Murray, Alison B

    2008-05-01

    Following the adoption of the ICH E2E guideline, risk management plans (RMP) defining the cumulative safety experience and identifying limitations in safety information are now required for marketing authorisation applications (MAA). A collaborative research project was conducted to gain experience with tools for presenting and evaluating data in the safety specification. This paper presents those tools found to be useful and the lessons learned from their use. Archive data from a successful MAA were utilised. Methods were assessed for demonstrating the extent of clinical safety experience, evaluating the sensitivity of the clinical trial data to detect treatment differences and identifying safety signals from adverse event and laboratory data to define the extent of safety knowledge with the drug. The extent of clinical safety experience was demonstrated by plots of patient exposure over time. Adverse event data were presented using dot plots, which display the percentages of patients with the events of interest, the odds ratio, and 95% confidence interval. Power and confidence interval plots were utilised for evaluating the sensitivity of the clinical database to detect treatment differences. Box and whisker plots were used to display laboratory data. This project enabled us to identify new evidence-based methods for presenting and evaluating clinical safety data. These methods represent an advance in the way safety data from clinical trials can be analysed and presented. This project emphasises the importance of early and comprehensive planning of the safety package, including evaluation of the use of epidemiology data.
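
    The data presentations found useful in the project above are straightforward to reproduce. The sketch below draws the adverse-event dot plot described there, event percentages per arm next to the odds ratio with its 95% confidence interval, using matplotlib; the event names and counts are invented, and the Woolf log-scale confidence interval is a standard choice rather than one stated in the paper.

```python
# Sketch of an adverse-event dot plot: % of patients per arm alongside the
# odds ratio and its 95% CI. Event names and counts are invented.
import numpy as np
import matplotlib.pyplot as plt

events = ["Headache", "Nausea", "Rash"]
drug_n, placebo_n = 400, 400
a = np.array([60, 35, 12])        # cases on drug
c = np.array([45, 30, 5])         # cases on placebo
b, d = drug_n - a, placebo_n - c

# Odds ratios with Woolf 95% CIs on the log scale.
or_ = (a * d) / (b * c)
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = or_ * np.exp(-1.96 * se), or_ * np.exp(1.96 * se)

fig, (ax1, ax2) = plt.subplots(1, 2, sharey=True, figsize=(8, 3))
y = np.arange(len(events))
ax1.plot(100 * a / drug_n, y, "o", label="drug")
ax1.plot(100 * c / placebo_n, y, "s", label="placebo")
ax1.set(yticks=y, yticklabels=events, xlabel="% of patients")
ax1.legend()
ax2.errorbar(or_, y, xerr=[or_ - lo, hi - or_], fmt="o")
ax2.axvline(1, linestyle="--")
ax2.set(xscale="log", xlabel="odds ratio (95% CI)")
plt.tight_layout()
plt.show()
```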

  14. AncestrySNPminer: A bioinformatics tool to retrieve and develop ancestry informative SNP panels

    PubMed Central

    Amirisetty, Sushil; Khurana Hershey, Gurjit K.; Baye, Tesfaye M.

    2012-01-01

    A wealth of genomic information is available in public and private databases. However, this information is underutilized for uncovering population-specific and functionally relevant markers underlying complex human traits. Given the huge amount of SNP data available from the annotation of human genetic variation, data mining is a faster and more cost-effective approach for investigating the number of SNPs that are informative for ancestry. In this study, we present AncestrySNPminer, the first web-based bioinformatics tool specifically designed to retrieve Ancestry Informative Markers (AIMs) from genomic data sets and link these informative markers to genes and ontological annotation classes. The tool includes an automated and simple “scripting at the click of a button” functionality that enables researchers to perform various population genomics statistical analysis methods with user-friendly querying and filtering of data sets across various populations through a single web interface. AncestrySNPminer can be freely accessed at https://research.cchmc.org/mershalab/AncestrySNPminer/login.php. PMID:22584067
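
    At its statistical core, selecting AIMs means ranking SNPs by how strongly their allele frequencies differ between populations. The sketch below ranks a handful of invented SNPs by the absolute allele-frequency difference (delta), one common informativeness measure; the abstract does not enumerate which statistics the tool offers, so treating delta as representative is an assumption.

```python
# Sketch of the statistic behind AIM panels: rank SNPs by the absolute
# allele-frequency difference (delta) between two populations. Invented data.
snp_freqs = {  # SNP id: (frequency in population 1, frequency in population 2)
    "rs0001": (0.10, 0.85),
    "rs0002": (0.50, 0.55),
    "rs0003": (0.95, 0.20),
    "rs0004": (0.30, 0.35),
}

aims = sorted(snp_freqs.items(),
              key=lambda kv: abs(kv[1][0] - kv[1][1]), reverse=True)
for rsid, (p1, p2) in aims:
    print(f"{rsid}: delta = {abs(p1 - p2):.2f}")
```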

  15. CHOgenome.org 2.0: Genome resources and website updates.

    PubMed

    Kremkow, Benjamin G; Baik, Jong Youn; MacDonald, Madolyn L; Lee, Kelvin H

    2015-07-01

    Chinese hamster ovary (CHO) cells are a major host cell line for the production of therapeutic proteins, and CHO cell and Chinese hamster (CH) genomes have recently been sequenced using next-generation sequencing methods. CHOgenome.org was launched in 2011 (version 1.0) to serve as a database repository and to provide bioinformatics tools for the CHO community. CHOgenome.org (version 1.0) maintained GenBank CHO-K1 genome data, identified CHO-omics literature, and provided a CHO-specific BLAST service. Recent major updates to CHOgenome.org (version 2.0) include new sequence and annotation databases for both CHO and CH genomes, a more user-friendly website, and new research tools, including a proteome browser and a genome viewer. CHO cell-line specific sequences and annotations facilitate cell line development opportunities, several of which are discussed. Moving forward, CHOgenome.org will host the increasing amount of CHO-omics data and continue to make useful bioinformatics tools available to the CHO community. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, called plates, which is used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, it also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  17. Content and functional specifications for a standards-based multidisciplinary rounding tool to maintain continuity across acute and critical care

    PubMed Central

    Collins, Sarah; Hurley, Ann C; Chang, Frank Y; Illa, Anisha R; Benoit, Angela; Laperle, Sarah; Dykes, Patricia C

    2014-01-01

    Background: Maintaining continuity of care (CoC) in the inpatient setting is dependent on aligning goals and tasks with the plan of care (POC) during multidisciplinary rounds (MDRs). A number of locally developed rounding tools exist, yet there is a lack of standard content and functional specifications for electronic tools to support MDRs within and across settings. Objective: To identify content and functional requirements for an MDR tool to support CoC. Materials and methods: We collected discrete clinical data elements (CDEs) discussed during rounds for 128 acute and critical care patients. To capture CDEs, we developed and validated an iPad-based observational tool based on informatics CoC standards. We observed 19 days of rounds and conducted eight group and individual interviews. Descriptive and bivariate statistics and network visualization were used to understand associations between CDEs discussed during rounds, with a particular focus on the POC. Qualitative data were thematically analyzed. All analyses were triangulated. Results: We identified the need for universal and configurable MDR tool views across settings and users and the provision of messaging capability. Eleven empirically derived universal CDEs were identified, including four POC CDEs: problems, plan, goals, and short-term concerns. Configurable POC CDEs were: rationale, tasks/'to dos', pending results and procedures, discharge planning, patient preferences, need for urgent review, prognosis, and advice/guidance. Discussion: Some requirements differed between settings; yet, there was overlap between POC CDEs. Conclusions: We recommend an initial list of 11 universal CDEs for continuity in MDRs across settings and 27 CDEs that can be configured to meet setting-specific needs. PMID:24081019

  18. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    PubMed

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
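
    The top-down logic is simple to reproduce: allocate each subarea's labor and overhead over its slide volume, rather than applying a flat relative-value multiplier. The figures in this sketch are invented placeholders; only the structure of the calculation follows the approach described.

      def cost_per_slide(materials, labor_hours, labor_rate, overhead, n_slides):
          """Top-down per-slide cost: direct costs plus allocated labor and overhead."""
          total = materials + labor_hours * labor_rate + overhead
          return total / n_slides

      # Hypothetical monthly figures for one subarea (e.g., immunohistochemistry).
      top_down = cost_per_slide(materials=12_000, labor_hours=640,
                                labor_rate=35.0, overhead=18_000, n_slides=4_000)

      rvu_estimate = 1.2 * 5.0   # e.g., 1.2 RVUs/slide at $5.00 per RVU (illustrative)
      print(f"top-down: ${top_down:.2f}/slide vs RVU-based: ${rvu_estimate:.2f}/slide")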

  19. Quokka: a comprehensive tool for rapid and accurate prediction of kinase family-specific phosphorylation sites in the human proteome.

    PubMed

    Li, Fuyi; Li, Chen; Marquez-Lago, Tatiana T; Leier, André; Akutsu, Tatsuya; Purcell, Anthony W; Smith, A Ian; Lithgow, Trevor; Daly, Roger J; Song, Jiangning; Chou, Kuo-Chen

    2018-06-27

    Kinase-regulated phosphorylation is a ubiquitous type of post-translational modification (PTM) in both eukaryotic and prokaryotic cells. Phosphorylation plays fundamental roles in many signalling pathways and biological processes, such as protein degradation and protein-protein interactions. Experimental studies have revealed that signalling defects caused by aberrant phosphorylation are highly associated with a variety of human diseases, especially cancers. In light of this, a number of computational methods aiming to accurately predict protein kinase family-specific or kinase-specific phosphorylation sites have been established, thereby facilitating phosphoproteomic data analysis. In this work, we present Quokka, a novel bioinformatics tool that allows users to rapidly and accurately identify human kinase family-regulated phosphorylation sites. Quokka was developed by using a variety of sequence scoring functions combined with an optimized logistic regression algorithm. We evaluated Quokka based on well-prepared up-to-date benchmark and independent test datasets, curated from the Phospho.ELM and UniProt databases, respectively. The independent test demonstrates that Quokka improves the prediction performance compared with state-of-the-art computational tools for phosphorylation prediction. In summary, our tool provides users with high-quality predicted human phosphorylation sites for hypothesis generation and biological validation. The Quokka webserver and datasets are freely available at http://quokka.erc.monash.edu/. Supplementary data are available at Bioinformatics online.
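
    At its core, the approach scores a sequence window around each candidate site and feeds the scores to a logistic regression classifier. The sketch below uses a plain one-hot window encoding with scikit-learn as a stand-in for Quokka's sequence scoring functions, which it does not reproduce; the training windows and labels are fabricated placeholders.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      AA = "ACDEFGHIKLMNPQRSTVWY"

      def encode_window(window):
          """One-hot encode a +/-7 residue window (15 aa) around a candidate S/T/Y site."""
          x = np.zeros((len(window), len(AA)))
          for i, res in enumerate(window):
              if res in AA:
                  x[i, AA.index(res)] = 1.0
          return x.ravel()

      # Placeholder training windows and kinase-family labels (1 = phosphorylated).
      windows = ["RRASVAGLKSSDFEQ", "PLSNDEQKRTGVAMW", "KKRSRSPSPAGLKQE", "GVDEQNMLTAWFHCY"]
      labels = [1, 0, 1, 0]

      X = np.array([encode_window(w) for w in windows])
      clf = LogisticRegression(max_iter=1000).fit(X, labels)
      print(clf.predict_proba([encode_window("RRQSVSGLKTSDFEQ")])[0, 1])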

  20. Evaluation of Time Domain EM Coupling Techniques. Volume II.

    DTIC Science & Technology

    1980-08-01

    tool for the analysis of electromagnetic coupling and shielding problems: the finite-difference, time-domain (FD-TD) solution of Maxwell's equations...The objective of the program was to evaluate the suitability of the FD-TD method to determine the amount of electromagnetic coupling through an...specific questions were addressed during this program: 1. Can the FD-TD method accurately model electromagnetic coupling into a conducting structure for
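
    The FD-TD method referenced in these excerpts is easy to state: electric and magnetic fields are staggered on a grid and advanced in alternating half-steps. The following one-dimensional, normalized-unit sketch of the update loop is a generic textbook illustration, not the evaluation code from this report.

      import numpy as np

      nz, nt = 200, 500          # grid cells, time steps
      ez = np.zeros(nz)          # electric field
      hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
      courant = 0.5              # normalized Courant number (1D stability requires <= 1)

      for t in range(nt):
          # Leapfrog: update H from the curl of E, then E from the curl of H.
          hy += courant * (ez[1:] - ez[:-1])
          ez[1:-1] += courant * (hy[1:] - hy[:-1])
          ez[nz // 4] += np.exp(-((t - 30) ** 2) / 100.0)  # soft Gaussian source

      print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")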

  1. Wire EDM for Refractory Materials

    NASA Technical Reports Server (NTRS)

    Zellars, G. R.; Harris, F. E.; Lowell, C. E.; Pollman, W. M.; Rys, V. J.; Wills, R. J.

    1982-01-01

    In an attempt to reduce fabrication time and costs, the Wire Electrical Discharge Machine (Wire EDM) method was investigated as a tool for fabricating matched blade roots and disk slots. Eight high-strength nickel-base superalloys were used. The computer-controlled Wire EDM technique provided high-quality surfaces with excellent dimensional tolerances. The Wire EDM method offers potential for substantial reductions in fabrication costs for "hard to machine" alloys and electrically conductive materials in specific high-precision applications.

  2. An overview of health forecasting.

    PubMed

    Soyiri, Ireneous N; Reidpath, Daniel D

    2013-01-01

    Health forecasting is a novel area of forecasting, and a valuable tool for predicting future health events or situations such as demands for health services and healthcare needs. It facilitates preventive medicine and health care intervention strategies, by pre-informing health service providers to take appropriate mitigating actions to minimize risks and manage demand. Health forecasting requires reliable data, information and appropriate analytical tools for the prediction of specific health conditions or situations. There is no single approach to health forecasting, and so various methods have often been adopted to forecast aggregate or specific health conditions. Meanwhile, there are no defined health forecasting horizons (time frames) to match the choices of health forecasting methods/approaches that are often applied. The key principles of health forecasting have also not been adequately described to guide the process. This paper provides a brief introduction and theoretical analysis of health forecasting. It describes the key issues that are important for health forecasting, including: definitions, principles of health forecasting, and the properties of health data, which influence the choices of health forecasting methods. Other matters related to the value of health forecasting, and the general challenges associated with developing and using health forecasting services are discussed. This overview is a stimulus for further discussions on standardizing health forecasting approaches and methods that will facilitate health care and health services delivery.

  3. The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.

    PubMed

    Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A

    2010-03-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).

  4. The Java Image Science Toolkit (JIST) for Rapid Prototyping and Publishing of Neuroimaging Software

    PubMed Central

    Lucas, Blake C.; Bogovic, John A.; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L.; Pham, Dzung

    2010-01-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC). PMID:20077162

  5. A patient-centered electronic tool for weight loss outcomes after Roux-en-Y gastric bypass.

    PubMed

    Wood, G Craig; Benotti, Peter; Gerhard, Glenn S; Miller, Elaina K; Zhang, Yushan; Zaccone, Richard J; Argyropoulos, George A; Petrick, Anthony T; Still, Christopher D

    2014-01-01

    BACKGROUND. Current patient education and informed consent regarding weight loss expectations for bariatric surgery candidates are largely based on averages from large patient cohorts. The variation in weight loss outcomes illustrates the need for establishing more realistic weight loss goals for individual patients. This study was designed to develop a simple web-based tool which provides patient-specific weight loss expectations. METHODS. Postoperative weight measurements after Roux-en-Y gastric bypass (RYGB) were collected and analyzed with patient characteristics known to influence weight loss outcomes. Quantile regression was used to create expected weight loss curves (25th, 50th, and 75th percentiles) for the 24 months after RYGB. The resulting equations were validated and used to develop a web-based tool for predicting weight loss outcomes. RESULTS. Weight loss data from 2986 patients (2608 in the primary cohort and 378 in the validation cohort) were included. Preoperative body mass index (BMI) and age were found to have a high correlation with weight loss accomplishment (P < 0.0001 for each). An electronic tool was created that provides easy access to patient-specific, 24-month weight loss trajectories based on initial BMI and age. CONCLUSIONS. This validated, patient-centered electronic tool will assist patients and providers in patient teaching, informed consent, and postoperative weight loss management.
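
    Quantile regression is what turns a cohort into percentile curves: fitting the same model at q = 0.25, 0.50, and 0.75 yields the expected-range band for a given BMI and age. The sketch below uses statsmodels on synthetic data; the variable names, coefficients, and data are illustrative assumptions, not those of the published tool.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 500
      df = pd.DataFrame({"bmi0": rng.uniform(35, 60, n), "age": rng.uniform(20, 70, n)})
      # Synthetic percent weight loss, loosely decreasing with preoperative BMI and age.
      df["pct_wl"] = 45 - 0.25 * df["bmi0"] - 0.10 * df["age"] + rng.normal(0, 4, n)

      model = smf.quantreg("pct_wl ~ bmi0 + age", df)
      for q in (0.25, 0.50, 0.75):
          fit = model.fit(q=q)
          pred = fit.predict(pd.DataFrame({"bmi0": [45], "age": [40]}))[0]
          print(f"q={q:.2f}: expected weight loss {pred:.1f}% for BMI 45, age 40")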

  6. Nested PCR Assay for Eight Pathogens: A Rapid Tool for Diagnosis of Bacterial Meningitis.

    PubMed

    Bhagchandani, Sharda P; Kubade, Sushant; Nikhare, Priyanka P; Manke, Sonali; Chandak, Nitin H; Kabra, Dinesh; Baheti, Neeraj N; Agrawal, Vijay S; Sarda, Pankaj; Mahajan, Parikshit; Ganjre, Ashish; Purohit, Hemant J; Singh, Lokendra; Taori, Girdhar M; Daginawala, Hatim F; Kashyap, Rajpal S

    2016-02-01

    Bacterial meningitis is a dreadful infectious disease with high mortality and morbidity if it remains undiagnosed. Traditional diagnostic methods for bacterial meningitis pose a challenge in accurate identification of the pathogen, making prognosis difficult. The present study therefore aimed to design and evaluate a specific and sensitive nested 16S rDNA genus-based polymerase chain reaction (PCR) assay using clinical cerebrospinal fluid (CSF) for rapid diagnosis of eight pathogens causing the disease. The work was dedicated to the development of an in-house genus-specific 16S rDNA nested PCR covering pathogens of eight genera responsible for bacterial meningitis, using newly designed as well as literature-based primers for each genus. A total of 150 CSF samples, obtained from patients with suspected meningitis admitted to the Central India Institute of Medical Sciences (CIIMS), India, from August 2011 to May 2014, were used to evaluate the clinical sensitivity and specificity of the optimized PCR assays. The analytical sensitivity and specificity of the newly designed genus-specific 16S rDNA PCR were found to be ≥92%. With such high sensitivity and specificity, the in-house nested PCR gave 100% sensitivity in clinically confirmed positive cases and 100% specificity in clinically confirmed negative cases, indicating its applicability in clinical diagnosis. The in-house nested PCR system can therefore identify the pathogen causing bacterial meningitis accurately and be useful in selecting a specific treatment line to minimize morbidity. Results are obtained within 24 h, and the high sensitivity makes this nested PCR assay a rapid and accurate diagnostic tool compared to traditional culture-based methods.

  7. Occurrence of and Sequence Variation among F-Specific RNA Bacteriophage Subgroups in Feces and Wastewater of Urban and Animal Origins

    PubMed Central

    Hartard, C.; Rivet, R.; Banas, S.

    2015-01-01

    F-specific RNA bacteriophages (FRNAPH) have been widely studied as tools for evaluating fecal or viral pollution in water. It has also been proposed that they can be used to differentiate human from animal fecal contamination. While FRNAPH subgroup I (FRNAPH-I) and FRNAPH-IV are often associated with animal pollution, FRNAPH-II and -III prevail in human wastewater. However, this distribution is not absolute, and variable survival rates in these subgroups lead to misinterpretation of the original distribution. In this context, we studied FRNAPH distribution in urban wastewater and animal feces/wastewater. To increase the specificity, we partially sequenced the genomes of phages of urban and animal origins. The persistence of the genomes and infectivity were also studied, over time in wastewater and during treatment, for each subgroup. FRNAPH-I genome sequences did not show any specific urban or animal clusters to allow development of molecular tools for differentiation. They were the most resistant and as such may be used as fecal or viral indicators. FRNAPH-II's low prevalence and low sequence variability in animal stools, combined with specific clusters formed by urban strains, allowed differentiation between urban and animal pollution by using a specific reverse transcription-PCR (RT-PCR) method. The subgroup's resistance over time was comparable to that of FRNAPH-I, but its surface properties allowed higher elimination rates during activated-sludge treatment. FRNAPH-III's low sequence variability in animal wastewater and specific cluster formation by urban strains also allowed differentiation by using a specific RT-PCR method. Nevertheless, its low resistance restricted it to being used only for recent urban pollution detection. FRNAPH-IV was too rare to be used. PMID:26162878

  8. Occurrence of and Sequence Variation among F-Specific RNA Bacteriophage Subgroups in Feces and Wastewater of Urban and Animal Origins.

    PubMed

    Hartard, C; Rivet, R; Banas, S; Gantzer, C

    2015-09-01

    F-specific RNA bacteriophages (FRNAPH) have been widely studied as tools for evaluating fecal or viral pollution in water. It has also been proposed that they can be used to differentiate human from animal fecal contamination. While FRNAPH subgroup I (FRNAPH-I) and FRNAPH-IV are often associated with animal pollution, FRNAPH-II and -III prevail in human wastewater. However, this distribution is not absolute, and variable survival rates in these subgroups lead to misinterpretation of the original distribution. In this context, we studied FRNAPH distribution in urban wastewater and animal feces/wastewater. To increase the specificity, we partially sequenced the genomes of phages of urban and animal origins. The persistence of the genomes and infectivity were also studied, over time in wastewater and during treatment, for each subgroup. FRNAPH-I genome sequences did not show any specific urban or animal clusters to allow development of molecular tools for differentiation. They were the most resistant and as such may be used as fecal or viral indicators. FRNAPH-II's low prevalence and low sequence variability in animal stools, combined with specific clusters formed by urban strains, allowed differentiation between urban and animal pollution by using a specific reverse transcription-PCR (RT-PCR) method. The subgroup's resistance over time was comparable to that of FRNAPH-I, but its surface properties allowed higher elimination rates during activated-sludge treatment. FRNAPH-III's low sequence variability in animal wastewater and specific cluster formation by urban strains also allowed differentiation by using a specific RT-PCR method. Nevertheless, its low resistance restricted it to being used only for recent urban pollution detection. FRNAPH-IV was too rare to be used. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  9. Survey of Non-Rigid Registration Tools in Medicine.

    PubMed

    Keszei, András P; Berkels, Benjamin; Deserno, Thomas M

    2017-02-01

    We catalogue available software solutions for non-rigid image registration to support scientists in selecting suitable tools for specific medical registration purposes. Registration tools were identified using a non-systematic search in Pubmed, Web of Science, IEEE Xplore® Digital Library, Google Scholar, and through references in identified sources (n = 22). Tools were excluded due to unavailability or inappropriateness. The remaining (n = 18) tools were classified by (i) access and technology, (ii) interfaces and application, (iii) living community, (iv) supported file formats, and (v) types of registration methodologies, emphasizing the similarity measures implemented. Out of the 18 tools, (i) 12 are open source, 8 are released under a permissive free license, which imposes the least restrictions on the use and further development of the tool, and 8 provide graphics processing unit (GPU) support; (ii) 7 are built on software platforms, and 5 were developed for brain image registration; (iii) 6 are under active development but only 3 have had their last update in 2015 or 2016; (iv) 16 support the Analyze format, while 7 file formats can be read with only one of the tools; and (v) 6 provide multiple registration methods and 6 provide landmark-based registration methods. Based on open-source availability, licensing, GPU support, community activity, supported file formats, algorithms, and similarity measures, the tools Elastix and Plastimatch are chosen, for the ITK platform and for platform-independent use, respectively. Researchers in medical image analysis already have a large choice of registration tools freely available. However, the most recently published algorithms may not yet be included in the tools.

  10. Application of stable isotope tools for evaluating natural and stimulated biodegradation of organic pollutants in field studies.

    PubMed

    Fischer, Anko; Manefield, Mike; Bombach, Petra

    2016-10-01

    Stable isotope tools are increasingly applied for in-depth evaluation of the biodegradation of organic pollutants at contaminated field sites. They can be divided into three methods: i) determination of changes in the natural abundance of stable isotopes using compound-specific stable isotope analysis (CSIA), ii) detection of the incorporation of stable-isotope label from a stable-isotope-labelled target compound into degradation and/or mineralisation products, and iii) determination of stable-isotope label incorporation into biomarkers using stable isotope probing (SIP). Stable isotope tools have been applied as key monitoring tools in multiple-lines-of-evidence approaches (MLEA) for sensitive evaluation of pollutant biodegradation. This review highlights the application of CSIA, SIP, and MLEA including stable isotope tools for assessing natural and stimulated biodegradation of organic pollutants in field studies dealing with soil and groundwater contamination. Copyright © 2016 Elsevier Ltd. All rights reserved.
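
    For the CSIA method in particular, the standard quantitative step (not detailed in this abstract) is the Rayleigh equation, which converts a measured enrichment in the heavy isotope into an estimate of the biodegraded fraction. A minimal sketch, assuming a known isotope enrichment factor ε for the compound and pathway; the example values are illustrative only.

      def biodegraded_fraction(delta0, delta_t, epsilon):
          """Rayleigh-based extent of biodegradation.

          delta0, delta_t: initial and measured isotope signatures (per mil, e.g. d13C)
          epsilon: isotope enrichment factor (per mil, negative for normal isotope effects)
          """
          remaining = ((delta_t + 1000.0) / (delta0 + 1000.0)) ** (1000.0 / epsilon)
          return 1.0 - remaining

      # Example with benzene-like, purely illustrative numbers.
      print(f"{biodegraded_fraction(delta0=-28.0, delta_t=-25.0, epsilon=-2.0):.1%} degraded")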

  11. Improving smoothing efficiency of rigid conformal polishing tool using time-dependent smoothing evaluation model

    NASA Astrophysics Data System (ADS)

    Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng

    2017-06-01

    A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency is poorer than expected, and existing smoothing models cannot explicitly specify the methods to improve this efficiency. We presented an explicit time-dependent smoothing evaluation model that contained specific smoothing parameters directly derived from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we proposed a strategy to improve the RC-lap smoothing efficiency, which incorporated the theoretical model, tool optimization, and efficiency limit determination. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 RPM were obtained.

  12. Evaluating the integration of cultural competence skills into health and physical assessment tools: a survey of Canadian schools of nursing.

    PubMed

    Chircop, Andrea; Edgecombe, Nancy; Hayward, Kathryn; Ducey-Gilbert, Cherie; Sheppard-Lemoine, Debbie

    2013-04-01

    Audiovisual (AV) teaching tools currently used to teach health and physical assessment reflect a Eurocentric bias grounded in the biomedical model. The purpose of our study was to (a) identify commonly used AV teaching tools in Canadian schools of nursing and (b) evaluate the identified tools. A two-part descriptive quantitative method design was used. First, we surveyed schools of nursing across Canada. Second, the identified AV teaching tools were evaluated for content and modeling of cultural competence. The majority of the schools (67%) used publisher-produced videos associated with a physical assessment textbook. Major findings included minimal demonstration of negotiation with a client around cultural aspects of the interview, including the need for an interpreter, modesty, and the inclusion of support persons. The culturally specific examples given during the videos were superficial and did not provide students with a comprehensive understanding of the necessary culturally competent skills.

  13. Measurement of breast volume using body scan technology (computer-aided anthropometry).

    PubMed

    Veitch, Daisy; Burford, Karen; Dench, Phil; Dean, Nicola; Griffin, Philip

    2012-01-01

    Assessment of breast volume is an important tool for preoperative planning in various breast surgeries and other applications, such as bra development. Accurate assessment can improve the consistency and quality of surgery outcomes. This study outlines a non-invasive method to measure breast volume using a whole body 3D laser surface anatomy scanner, the Cyberware WBX. It expands on a previous publication where this method was validated against patients undergoing mastectomy. It specifically outlines and expands the computer-aided anthropometric (CAA) method for extracting breast volumes in a non-invasive way from patients enrolled in a breast reduction study at Flinders Medical Centre, South Australia. This step-by-step description allows others to replicate this work and provides an additional tool to assist them in their own clinical practice and development of designs.

  14. The PDS4 Data Dictionary Tool - Metadata Design for Data Preparers

    NASA Astrophysics Data System (ADS)

    Raugh, A.; Hughes, J. S.

    2017-12-01

    One of the major design goals of the PDS4 development effort was to create an extendable Information Model (IM) for the archive, and to allow mission data designers/preparers to create extensions for metadata definitions specific to their own contexts. This capability is critical for the Planetary Data System - an archive that deals with a data collection that is diverse along virtually every conceivable axis. Amid such diversity in the data itself, it is in the best interests of the PDS archive and its users that all extensions to the IM follow the same design techniques, conventions, and restrictions as the core implementation itself. But it is unrealistic to expect mission data designers to acquire expertise in information modeling, model-driven design, ontology, schema formulation, and PDS4 design conventions and philosophy in order to define their own metadata. To bridge that expertise gap and bring the power of information modeling to the data label designer, the PDS Engineering Node has developed the data dictionary creation tool known as "LDDTool". This tool incorporates the same software used to maintain and extend the core IM, packaged with an interface that enables a developer to create his extension to the IM using the same, standards-based metadata framework PDS itself uses. Through this interface, the novice dictionary developer has immediate access to the common set of data types and unit classes for defining attributes, and a straightforward method for constructing classes. The more experienced developer, using the same tool, has access to more sophisticated modeling methods like abstraction and extension, and can define context-specific validation rules. We present the key features of the PDS Local Data Dictionary Tool, which both supports the development of extensions to the PDS4 IM and ensures their compatibility with the IM.

  15. A simple SDS-PAGE protein pattern from pitcher secretions as a new tool to distinguish Nepenthes species (Nepenthaceae).

    PubMed

    Biteau, Flore; Nisse, Estelle; Miguel, Sissi; Hannewald, Paul; Bazile, Vincent; Gaume, Laurence; Mignard, Benoit; Hehn, Alain; Bourgaud, Frederic

    2013-12-01

    Carnivorous plants have always fascinated scientists because these plants are able to attract, capture, and digest animal prey using their remarkable traps that contain digestive secretions. Nepenthes is one of the largest genera of carnivorous plants, with 120 species described thus far. Despite an outstanding diversity of trap designs, many species are often confused with each other and remain difficult to classify, because their pitchers resemble one another or because of the occurrence of interspecific hybrids. Here, we propose a new method to easily distinguish Nepenthes species based on an SDS-PAGE protein pattern analysis of their pitcher secretions. Intraspecific comparisons were performed among specimens growing in different environmental conditions to ascertain the robustness of this method. Our results show that, at the juvenile stage and in the absence of prey in the pitcher, an examined species is characterized by a specific and stable profile, whatever the environmental conditions. The method we describe here can be used as a reliable tool to easily distinguish between Nepenthes species and to help with potential identification based on the species-specific protein pattern of their pitcher secretions, which is complementary to the monograph information.

  16. Simplified, inverse, ejector design tool

    NASA Technical Reports Server (NTRS)

    Dechant, Lawrence J.

    1993-01-01

    A simple lumped-parameter-based inverse design tool has been developed which provides flow path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow path cross-sectional areas. Entrainment is computed by back substitution. Initial comparisons with experimental data and analogous one-dimensional methods show good agreement. Thus, this simple inverse design code provides an analytically based, preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
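
    Because the constraints enter linearly once the integral conservation equations are written in terms of the unknown areas, the inversion step reduces to a single linear solve followed by back substitution. The sketch below shows only that algebraic structure: the coefficient matrix, right-hand side, and entrainment expression are hypothetical placeholders, not the actual ejector equations.

      import numpy as np

      # Hypothetical linearized system A @ areas = b, standing in for the integral
      # conservation equations plus the design constraints (thrust, exit velocity,
      # static pressure matching). All values are placeholders for illustration.
      A = np.array([[1.0, -1.0, 0.0],
                    [0.4,  1.0, -1.0],
                    [0.0,  0.3,  1.0]])
      b = np.array([-0.02, 0.15, 0.40])

      areas = np.linalg.solve(A, b)            # flow-path cross-sectional areas
      entrainment = areas[1] / areas[0] - 1.0  # back-substituted estimate (illustrative)
      print(areas, f"entrainment ratio ~ {entrainment:.2f}")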

  17. Rosetta Structure Prediction as a Tool for Solving Difficult Molecular Replacement Problems.

    PubMed

    DiMaio, Frank

    2017-01-01

    Molecular replacement (MR), a method for solving the crystallographic phase problem using phases derived from a model of the target structure, has proven extremely valuable, accounting for the vast majority of structures solved by X-ray crystallography. However, when the resolution of data is low, or the starting model is very dissimilar to the target protein, solving structures via molecular replacement may be very challenging. In recent years, protein structure prediction methodology has emerged as a powerful tool in model building and model refinement for difficult molecular replacement problems. This chapter describes some of the tools available in Rosetta for model building and model refinement specifically geared toward difficult molecular replacement cases.

  18. Developing Decontamination Tools and Approaches to ...

    EPA Pesticide Factsheets

    Developing Decontamination Tools and Approaches to Address Indoor Pesticide Contamination from Improper Bed Bug Treatments. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for the EPA.

  19. Observation weights unlock bulk RNA-seq tools for zero inflation and single-cell applications.

    PubMed

    Van den Berge, Koen; Perraudeau, Fanny; Soneson, Charlotte; Love, Michael I; Risso, Davide; Vert, Jean-Philippe; Robinson, Mark D; Dudoit, Sandrine; Clement, Lieven

    2018-02-26

    Dropout events in single-cell RNA sequencing (scRNA-seq) cause many transcripts to go undetected and induce an excess of zero read counts, leading to power issues in differential expression (DE) analysis. This has triggered the development of bespoke scRNA-seq DE methods to cope with zero inflation. Recent evaluations, however, have shown that dedicated scRNA-seq tools provide no advantage compared to traditional bulk RNA-seq tools. We introduce a weighting strategy, based on a zero-inflated negative binomial model, that identifies excess zero counts and generates gene- and cell-specific weights to unlock bulk RNA-seq DE pipelines for zero-inflated data, boosting performance for scRNA-seq.
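
    The weighting idea can be written in a few lines: under a zero-inflated negative binomial (ZINB), each zero count is assigned the posterior probability that it came from the count (NB) component rather than the dropout component, and that probability serves as an observation weight downstream. The sketch below assumes the ZINB parameters have already been estimated (e.g., by a model such as ZINB-WaVE); the parameter values here are placeholders.

      import numpy as np
      from scipy.stats import nbinom

      def zinb_weights(counts, pi0, size, prob):
          """Posterior probability each observation belongs to the NB component.

          pi0: zero-inflation probability; size, prob: NB parameters (scipy convention).
          Nonzero counts get weight 1; zeros are down-weighted.
          """
          counts = np.asarray(counts)
          nb_zero = nbinom.pmf(0, size, prob)                 # P(zero | NB component)
          w_zero = (1 - pi0) * nb_zero / (pi0 + (1 - pi0) * nb_zero)
          return np.where(counts == 0, w_zero, 1.0)

      print(zinb_weights([0, 0, 3, 10], pi0=0.3, size=2.0, prob=0.4))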

  20. Molecular beacon sequence design algorithm.

    PubMed

    Monroe, W Todd; Haselton, Frederick R

    2003-01-01

    A method based on Web-based tools is presented to design optimally functioning molecular beacons. Molecular beacons, fluorogenic hybridization probes, are a powerful tool for the rapid and specific detection of a particular nucleic acid sequence. However, their synthesis costs can be considerable. Since molecular beacon performance is based on its sequence, it is imperative to rationally design an optimal sequence before synthesis. The algorithm presented here uses simple Microsoft Excel formulas and macros to rank candidate sequences. This analysis is carried out using mfold structural predictions along with other free Web-based tools. For smaller laboratories where molecular beacons are not the focus of research, the public domain algorithm described here may be usefully employed to aid in molecular beacon design.
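
    The ranking step is just a scoring function over candidate sequences. As a simplified stand-in for the Excel/mfold workflow described above, the sketch below ranks candidates by GC content and a crude self-complementarity count as a proxy for unwanted internal hairpins; the candidates, weights, and thresholds are invented for illustration.

      COMP = str.maketrans("ACGT", "TGCA")

      def gc_fraction(seq):
          return (seq.count("G") + seq.count("C")) / len(seq)

      def self_complement_kmers(seq, k=4):
          """Count k-mers whose reverse complement occurs downstream (hairpin-stem proxy)."""
          n = 0
          for i in range(len(seq) - k + 1):
              kmer = seq[i:i + k]
              if kmer.translate(COMP)[::-1] in seq[i + k:]:
                  n += 1
          return n

      def score(seq):
          # Prefer moderate GC; penalize internal stems competing with the beacon's own stem.
          return -abs(gc_fraction(seq) - 0.5) - 0.2 * self_complement_kmers(seq)

      candidates = ["ATCGGCTAAGCTGATCGTAC", "GGGGCCCCAATTGGCCGGCC", "ATATATATCGCGATATATAT"]
      for seq in sorted(candidates, key=score, reverse=True):
          print(f"{seq}  GC={gc_fraction(seq):.2f}  stems={self_complement_kmers(seq)}")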

  1. Computerization of guidelines: a knowledge specification method to convert text to detailed decision tree for electronic implementation.

    PubMed

    Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles

    2004-01-01

    The initial step in the computerization of guidelines is knowledge specification from their prose text. We describe a method of knowledge specification based on a structured and systematic analysis of text, allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Editing tools are also necessary to facilitate the process of validation and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allowed quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). However, the quality of the text guidelines themselves still needs further development. The method used for computerization could help define a framework usable at the initial step of guideline development, in order to produce guidelines ready for electronic implementation.
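
    Once specified, such a decision tree is trivial to represent and execute electronically. A minimal sketch of the target structure follows; the node predicates, thresholds, and recommendation texts are hypothetical, not taken from the EsPeR system.

      from dataclasses import dataclass
      from typing import Callable, Optional

      @dataclass
      class Node:
          question: Optional[Callable[[dict], bool]] = None  # predicate over patient data
          yes: Optional["Node"] = None
          no: Optional["Node"] = None
          recommendation: Optional[str] = None               # set only on leaves

      def evaluate(node: Node, patient: dict) -> str:
          """Walk the decision tree from the root to an elementary recommendation."""
          while node.recommendation is None:
              node = node.yes if node.question(patient) else node.no
          return node.recommendation

      # Hypothetical fragment of a computerized guideline.
      tree = Node(
          question=lambda p: p["systolic_bp"] >= 140,
          yes=Node(question=lambda p: p["diabetic"],
                   yes=Node(recommendation="Start treatment; reassess in 1 month."),
                   no=Node(recommendation="Lifestyle advice; recheck in 3 months.")),
          no=Node(recommendation="No action; routine follow-up."),
      )
      print(evaluate(tree, {"systolic_bp": 150, "diabetic": True}))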

  2. Hypermedia and Vocabulary Acquisition for Second Language

    ERIC Educational Resources Information Center

    Meli, Rocio

    2009-01-01

    The purpose of this study was to examine the impact of multimedia as a delivery tool for enhancing vocabulary in second-language classrooms. The mixed method design focused on specific techniques to help students acquire Spanish vocabulary and communication skills. The theoretical framework for this study consisted of second language theories…

  3. Random Item Generation Is Affected by Age

    ERIC Educational Resources Information Center

    Multani, Namita; Rudzicz, Frank; Wong, Wing Yiu Stephanie; Namasivayam, Aravind Kumar; van Lieshout, Pascal

    2016-01-01

    Purpose: Random item generation (RIG) involves central executive functioning. Measuring aspects of random sequences can therefore provide a simple method to complement other tools for cognitive assessment. We examine the extent to which RIG relates to specific measures of cognitive function, and whether those measures can be estimated using RIG…

  4. Multidisciplinary and Active/Collaborative Approaches in Teaching Requirements Engineering

    ERIC Educational Resources Information Center

    Rosca, Daniela

    2005-01-01

    The requirements engineering course is a core component of the curriculum for the Master's in Software Engineering programme, at Monmouth University (MU). It covers the process, methods and tools specific to this area, together with the corresponding software quality issues. The need to produce software engineers with strong teamwork and…

  5. Understanding the Effects of Infrastructure Changes on Subpopulations: Survey of Current Methods, Models, and Tools

    DTIC Science & Technology

    2016-04-01

    key leaders, government services, and businesses, while the cultural/historical/religious category focuses on specific cultural sites. These cards were...labeled “intensive,” because these impacts were severe but localized (Figure 21). Environmental impacts (such as raw sewage dumping following

  6. Vector coding of wavelet-transformed images

    NASA Astrophysics Data System (ADS)

    Zhou, Jun; Zhi, Cheng; Zhou, Yuanhua

    1998-09-01

    Wavelet, as a brand new tool in signal processing, has gained broad recognition. Using the wavelet transform, we can obtain octave-divided frequency bands with specific orientations, which combine well with the properties of the Human Visual System. In this paper, we discuss the classified vector quantization method for multiresolution-represented images.
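
    The classified vector quantization idea: transform the image, group subband coefficients into small vectors, and quantize each orientation class with its own codebook. A minimal sketch assuming PyWavelets and scikit-learn, with the block size and codebook size chosen arbitrarily for illustration:

      import numpy as np
      import pywt
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      img = rng.normal(size=(64, 64))                    # stand-in for an image

      cA, (cH, cV, cD) = pywt.dwt2(img, "haar")          # one octave, three orientations

      def vq_code(subband, block=2, codebook_size=16):
          """Vector-quantize a subband: tile into block x block vectors, k-means codebook."""
          h, w = (d - d % block for d in subband.shape)
          tiles = (subband[:h, :w]
                   .reshape(h // block, block, w // block, block)
                   .swapaxes(1, 2)
                   .reshape(-1, block * block))
          km = KMeans(n_clusters=codebook_size, n_init=4, random_state=0).fit(tiles)
          return km.labels_, km.cluster_centers_          # indices + per-class codebook

      for name, sb in [("H", cH), ("V", cV), ("D", cD)]:  # one codebook per orientation class
          labels, codebook = vq_code(sb)
          print(name, labels.shape, codebook.shape)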

  7. Watching What We Say: Using Video to Learn about Discussions

    ERIC Educational Resources Information Center

    Basmadjian, Kevin G.

    2008-01-01

    This article considers the benefits and challenges of using English teacher candidates' videotaped discussions of literature as tools to facilitate authentic and engaging discussions of literature. More specifically, this article examines the use of teacher candidates' videotaped discussions in a secondary English methods course to expand…

  8. Strategies for Teaching Fractions: Using Error Analysis for Intervention and Assessment

    ERIC Educational Resources Information Center

    Spangler, David B.

    2011-01-01

    Many students struggle with fractions and must understand them before learning higher-level math. Veteran educator David B. Spangler provides research-based tools that are aligned with NCTM and Common Core State Standards. He outlines powerful diagnostic methods for analyzing student work and providing timely, specific, and meaningful…

  9. How to Engage Medical Students in Chronobiology: An Example on Autorhythmometry

    ERIC Educational Resources Information Center

    Rol de Lama, M. A.; Lozano, J. P.; Ortiz, V.; Sanchez-Vazquez, F. J.; Madrid, J. A.

    2005-01-01

    This contribution describes a new laboratory experience that improves medical students' learning of chronobiology by introducing them to basic chronobiology concepts as well as to methods and statistical analysis tools specific for circadian rhythms. We designed an autorhythmometry laboratory session where students simultaneously played the role…

  10. Training Programs for Observers of Behavior; A Review.

    ERIC Educational Resources Information Center

    Spool, Mark D.

    1978-01-01

    This review covers the past 25 years of research literature on training observers of behavior, specifically in the areas of interviewing, reducing rater bias, interpersonal perception and observation as a research tool. The focus is on determining the most successful training methods and their theoretical bases. (Author/SJL)

  11. Usage-Based Collection Evaluation with a Curricular Focus

    ERIC Educational Resources Information Center

    Kohn, Karen C.

    2013-01-01

    Systematic evaluation of a library's collection can be a useful tool for collection development. After reviewing three evaluation methods and their usefulness for our small academic library, I undertook a usage-based evaluation, focusing on narrow segments of our collection that served specific undergraduate courses. For each section, I collected…

  12. Pig Mandible as a Valuable Tool to Improve Periodontal Surgery Techniques

    ERIC Educational Resources Information Center

    Zangrando, Mariana S. Ragghianti; Sant'Ana, Adriana C. P.; Greghi, Sebastião L. A.; de Rezende, Maria Lucia R.; Damante, Carla A.

    2014-01-01

    Clinical education in dental practice is a challenge for professionals and students. The traditional method of clinical training in Periodontology usually is based on following the procedure and practicing under supervision, until achieving proficiency. However, laboratory practice is required before direct care in patients. Specific anatomic…

  13. Prospective performance evaluation of selected common virtual screening tools. Case study: Cyclooxygenase (COX) 1 and 2.

    PubMed

    Kaserer, Teresa; Temml, Veronika; Kutil, Zsofia; Vanek, Tomas; Landa, Premysl; Schuster, Daniela

    2015-01-01

    Computational methods can be applied in drug development for the identification of novel lead candidates, but also for the prediction of pharmacokinetic properties and potential adverse effects, thereby aiding to prioritize and identify the most promising compounds. In principle, several techniques are available for this purpose; however, which one is the most suitable for a specific research objective still requires further investigation. Within this study, the performance of several programs, representing common virtual screening methods, was compared in a prospective manner. First, we selected top-ranked virtual screening hits from the three methods pharmacophore modeling, shape-based modeling, and docking. For comparison, these hits were then additionally predicted by external pharmacophore- and 2D similarity-based bioactivity profiling tools. Subsequently, the biological activities of the selected hits were assessed in vitro, which allowed for evaluating and comparing the prospective performance of the applied tools. Although all methods performed well, considerable differences were observed concerning hit rates, true positive and true negative hits, and hitlist composition. Our results suggest that a rational selection of the applied method represents a powerful strategy to maximize the success of a research project, tightly linked to its aims. We employed cyclooxygenase as an application example; however, the focus of this study lay in highlighting the differences in the virtual screening tool performances and not in the identification of novel COX inhibitors. Copyright © 2015 The Authors. Published by Elsevier Masson SAS. All rights reserved.

  14. Distribution of contact loads over the flank-land of the cutter with a rounded cutting edge

    NASA Astrophysics Data System (ADS)

    Kozlov, V.; Gerasimov, A.; Kim, A.

    2016-04-01

    In this paper, contact conditions between the tool and the workpiece material are analysed for wear-simulating turning with a cutter having a sharp-cornered edge and with one having a rounded cutting edge. The results of an experimental study of the specific contact load distribution over the artificial flank wear-land of the cutter in free orthogonal turning of disks made from a titanium alloy (Ti6Al2Mo2Cr) and from ductile (63Cu) and brittle (57Cu1Al3Mn) brasses are described. Investigations were carried out by the ‘split cutter’ method and by the method of an artificial flank-land of variable width. Experiments with varied feed rate and cutting speed show that in titanium alloy machining with a sharp-cornered cutting edge the highest normal contact load (σh max = 3400…2200 MPa) is observed immediately at the cutting edge, and the curve has a horizontal region 0.2…0.6 mm long. At a distance from the cutting edge, the specific normal contact load drops sharply to 1100…500 MPa. The character of the normal contact load for a rounded cutting edge is different: it is uniform, and its value is approximately 2 times smaller than in machining with a sharp-cornered cutting edge. In the authors' opinion this is connected with the generation of a seizure zone in the chip formation region, and it explains the capacity of highly worn-out cutting tools to machine titanium alloys. The paper also analyses the distribution of tangential contact loads over the flank land, the pattern of which differs considerably between machining with a sharp-cornered edge and with a rounded cutting edge.
    Abbreviations and symbols:
    m/s - meter per second (cutting speed v);
    mm/r - millimeter per revolution (feed rate f);
    MPa - megapascal (specific contact load as a stress σ or τ);
    hf - width of the flank wear land (chamfer) of the cutting tool; the flank wear land can be natural or artificial, as in this paper [mm];
    xh - distance from the cutting edge along the surface of the flank land [mm];
    σh - normal specific contact load on the flank land [MPa];
    τh - tangential (shear) specific contact load on the flank land [MPa];
    HSS - high-speed steel (cutting tool material);
    Py - radial component of cutting force [N];
    Py r - radial component of cutting force on the rake face [N];
    Pz - tangential component of cutting force [N];
    γ - rake angle of the cutting tool [°];
    α - clearance angle of the sharp cutting tool [°];
    αh - clearance angle of the flank wear land [°];
    ρ - rounding-off radius of the cutting edge [mm];
    b - width of the machined disk [mm].

  15. Toward designing for trust in database automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duez, P. P.; Jamieson, G. A.

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation.

  16. KISS for STRAP: user extensions for a protein alignment editor.

    PubMed

    Gille, Christoph; Lorenzen, Stephan; Michalsky, Elke; Frömmel, Cornelius

    2003-12-12

    The Structural Alignment Program STRAP is a convenient, comprehensive editor and analysis tool for protein alignments. A wide range of functions related to protein sequences and protein structures is accessible through an intuitive graphical interface. Recent features include mapping of mutations and polymorphisms onto structures and production of high-quality figures for publication. Here we address the general problem of keeping multi-purpose program packages abreast of the rapid development of bioinformatics methods and the demand for specific program functions. STRAP was remade with a novel design which aims at Keeping Interfaces in STRAP Simple (KISS). KISS renders STRAP extendable by bio-scientists as well as bio-informaticians. Scientists with basic computer skills are capable of implementing statistical methods or embedding existing bioinformatics tools in STRAP themselves. For bio-informaticians, STRAP may serve as an environment for rapid prototyping and testing of complex algorithms such as automatic alignment algorithms or phylogenetic methods. Further, STRAP can be applied as an interactive web applet to present data related to a particular protein family and as a teaching tool. Requires JAVA-1.4 or higher. http://www.charite.de/bioinf/strap/

  17. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis

    PubMed Central

    Lal, Aparna

    2016-01-01

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change. PMID:26848669

  18. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis.

    PubMed

    Lal, Aparna

    2016-02-02

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  19. Visualization of nuclear particle trajectories in nuclear oil-well logging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Case, C.R.; Chiaramonte, J.M.

    Nuclear oil-well logging measures specific properties of subsurface geological formations as a function of depth in the well. The knowledge gained is used to evaluate the hydrocarbon potential of the surrounding oil field. The measurements are made by lowering an instrument package into an oil well and slowly extracting it at a constant speed. During the extraction phase, neutrons or gamma rays are emitted from the tool, interact with the formation, and scatter back to the detectors located within the tool. Even though only a small percentage of the emitted particles ever reach the detectors, mathematical modeling has been very successful in the accurate prediction of these detector responses. The two dominant methods used to model these devices have been the two-dimensional discrete ordinates method and the three-dimensional Monte Carlo method; the latter has routinely been used to investigate the response characteristics of nuclear tools. A special Los Alamos National Laboratory version of their standard MCNP Monte Carlo code retains the details of each particle history for later viewing within SABRINA, a companion three-dimensional geometry modeling and debugging code.

  20. Advancements in nano-enabled therapeutics for neuroHIV management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan

    This viewpoint is a global call to promote fundamental and applied research aiming toward designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single/multiple therapeutic agents across the BBB to eradicate neuro-human immunodeficiency virus (neuroHIV), strategies for on-demand site-specific release of antiretroviral therapy, developing novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs, and developing novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance for eradicating and monitoring neuro-acquired immunodeficiency syndrome (neuroAIDS). Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and smart HIV-monitoring analytical systems for disease management.

  1. Taenia asiatica: the Most Neglected Human Taenia and the Possibility of Cysticercosis

    PubMed Central

    2013-01-01

    Not only Taenia solium and Taenia saginata, but also Taenia asiatica infects humans. The last species is not included in the evaluation of the specificity of the immunodiagnostic techniques for taeniasis/cysticercosis. There is currently no specific immunodiagnostic method available for T. asiatica. Therefore, given that molecular techniques (the only tools able to distinguish the 3 Taenia species) are normally not employed in routine diagnostic methods, the 2 questions concerning T. asiatica (its definite geographic distribution and its ability to cause human cysticercosis) remain open, making T. asiatica the most neglected agent of human taeniasis-cysticercosis. PMID:23467406

  2. Regulation of proteasomal degradation by modulating proteasomal initiation regions

    PubMed Central

    Takahashi, Kazunobu; Matouschek, Andreas; Inobe, Tomonao

    2016-01-01

    Methods for regulating the concentrations of specific cellular proteins are valuable tools for biomedical studies. Artificial regulation of protein degradation by the proteasome is receiving increasing attention. Efficient proteasomal protein degradation requires a degron with two components: a ubiquitin tag that is recognized by the proteasome and a disordered region at which the proteasome engages the substrate and initiates degradation. Here we show that degradation rates can be regulated, in vitro and in vivo, by modulating the disordered initiation region through the binding of modifier molecules. These results suggest that artificial modulation of proteasome initiation is a versatile method for conditionally inhibiting the proteasomal degradation of specific proteins. PMID:26278914

  3. Recent Advances in Genome Editing Using CRISPR/Cas9

    PubMed Central

    Ding, Yuduan; Li, Hong; Chen, Ling-Ling; Xie, Kabin

    2016-01-01

    The CRISPR (clustered regularly interspaced short palindromic repeat)-Cas9 (CRISPR-associated nuclease 9) system is a versatile tool for genome engineering that uses a guide RNA (gRNA) to target Cas9 to a specific sequence. This simple RNA-guided genome-editing technology has become a revolutionary tool in biology and has many innovative applications in different fields. In this review, we briefly introduce the Cas9-mediated genome-editing method, summarize the recent advances in CRISPR/Cas9 technology, and discuss their implications for plant research. To date, targeted gene knockout using the Cas9/gRNA system has been established in many plant species, and the targeting efficiency and capacity of Cas9 has been improved by optimizing its expression and that of its gRNA. The CRISPR/Cas9 system can also be used for sequence-specific mutagenesis/integration and transcriptional control of target genes. We also discuss off-target effects and the constraint that the protospacer-adjacent motif (PAM) puts on CRISPR/Cas9 genome engineering. To address these problems, a number of bioinformatic tools are available to help design specific gRNAs, and new Cas9 variants and orthologs with high fidelity and alternative PAM specificities have been engineered. Owing to these recent efforts, the CRISPR/Cas9 system is becoming a revolutionary and flexible tool for genome engineering. Adoption of the CRISPR/Cas9 technology in plant research would enable the investigation of plant biology at an unprecedented depth and create innovative applications in precise crop breeding. PMID:27252719
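
    As a concrete illustration of the PAM constraint discussed above, the sketch below scans a sequence for candidate SpCas9 targets, i.e. 20-nt protospacers immediately followed by an NGG PAM on the forward strand. Production gRNA-design tools additionally score off-targets genome-wide; the demo sequence here is made up.

    ```python
    import re

    def find_spcas9_sites(seq: str):
        """Candidate (position, protospacer, PAM) triples: a 20-nt protospacer
        immediately followed by an NGG PAM (SpCas9), forward strand only."""
        seq = seq.upper()
        # Lookahead so overlapping candidate sites are not skipped.
        return [(m.start(), m.group(1), m.group(2))
                for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq)]

    demo = "ATGCGTACGTTAGCATGCATGCCGGATCGATCGATCGTAGCTAGCGGTACG"
    for pos, spacer, pam in find_spcas9_sites(demo):
        print(pos, spacer, pam)
    ```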

  4. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    to an XML file, a code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a... C. Bonine, M. Shing, T.W. Otani, "Computer-aided process and tools for mobile software acquisition," NPS, Monterey, CA, Tech. Rep. NPS-SE-13... C10P07R05-075, 2013. [21] C. Bonine, "Specification, validation and verification of mobile application behavior," M.S. thesis, Dept. Comp. Science, NPS

  5. The application of systems thinking concepts, methods, and tools to global health practices: An analysis of case studies.

    PubMed

    Wilkinson, Jessica; Goff, Morgan; Rusoja, Evan; Hanson, Carl; Swanson, Robert Chad

    2018-06-01

    This review of systems thinking (ST) case studies seeks to compile and analyse cases from ST literature and provide practitioners with a reference for ST in health practice. Particular attention was given to (1) reviewing the frequency and use of key ST terms, methods, and tools in the context of health, and (2) extracting and analysing longitudinal themes across cases. A systematic search of databases was conducted, and a total of 36 case studies were identified. A combination of integrative and inductive qualitative approaches to analysis was used. Most cases identified took place in high-income countries and applied ST retrospectively. The most commonly used ST terms were agent/stakeholder/actor (n = 29), interdependent/interconnected (n = 28), emergence (n = 26), and adaptability/adaptation (n = 26). Common ST methods and tools were largely underutilized. Social network analysis was the most commonly used method (n = 4), and innovation or change management history was the most frequently used tool (n = 11). Four overarching themes were identified: the importance of the interdependent and interconnected nature of a health system, characteristics of leaders in a complex adaptive system, the benefits of using ST, and barriers to implementing ST. This review revealed that while much has been written about the potential benefits of applying ST to health, it has yet to completely transition from theory to practice. There is, however, evidence of the practical use of an ST lens as well as specific methods and tools. With clear examples of ST applications, the global health community will be better equipped to understand and address key health challenges. © 2017 John Wiley & Sons, Ltd.

  6. Planetary Data Workshop, Part 2

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Technical aspects of the Planetary Data System (PDS) are addressed. Methods and tools for maintaining and accessing large, complex sets of data are discussed. The specific software and applications needed for processing imaging and non-imaging science data are reviewed. The need for specific software that provides users with information on the location and geometry of scientific observations is discussed. Computer networks and user interfaces to the PDS are covered, along with the computer hardware available to this data system.

  7. Molecular-Level Study of the Effect of Prior Axial Compression/Torsion on the Axial-Tensile Strength of PPTA Fibers

    DTIC Science & Technology

    2013-07-16

    Twaron, etc., which are characterized by high specific strength and high specific stiffness. Fibers of this type are often referred to as "ballistic... high level of penetration resistance against large kinetic energy projectiles, such as bullets, detonated-mine-induced soil ejecta, improvised... increasingly being designed and developed through an extensive use of computer-aided engineering (CAE) methods and tools. The utility of these

  8. Axial-Compressive Behavior, Including Kink-Band Formation and Propagation, of Single p-Phenylene Terephthalamide (PPTA) Fibers

    DTIC Science & Technology

    2013-01-01

    material models to describe the behavior of fibers and structures under high-rate loading conditions. With the utility of the CAE methods and tools largely... phenylene terephthalamide (PPTA), available commercially as Kevlar, Twaron, Technora, and so forth, are characterized by high specific axial stiffness... and high specific tensile strength. These fibers are often referred to as "ballistic fibers" since they are commonly used in different ballistic- and

  9. Development of loop-mediated isothermal amplification assay for specific and rapid detection of differential goat pox virus and sheep pox virus.

    PubMed

    Zhao, Zhixun; Fan, Bin; Wu, Guohua; Yan, Xinmin; Li, Yingguo; Zhou, Xiaoli; Yue, Hua; Dai, Xueling; Zhu, Haixia; Tian, Bo; Li, Jian; Zhang, Qiang

    2014-01-17

    Capripox viruses are economically important pathogens in goat and sheep producing areas of the world, with specific focus on goat pox virus (GTPV), sheep pox virus (SPPV) and the Lumpy Skin Disease virus (LSDV). Clinically, sheep pox and goat pox have the same symptoms and cannot be distinguished serologically. This presents a real need for a rapid, inexpensive, and easy to operate and maintain genotyping tool to facilitate accurate disease diagnosis and surveillance for better management of Capripox outbreaks. A LAMP method was developed for the specific differential detection of GTPV and SPPV using three sets of LAMP primers designed on the basis of ITR sequences. Reactions were performed at 62°C for either 45 or 60 min, and specificity was confirmed by successful differential detection of several GTPV and SPPV isolates. No cross-reactivity with Orf virus, foot-and-mouth disease virus (FMDV), A. marginale Lushi isolate, Mycoplasma mycoides subsp. capri, Chlamydophila psittaci, Theileria ovis, T. luwenshuni, T. uilenbergi or Babesia sp. was noted. RFLP-PCR analysis of 135 preserved epidemic materials revealed 48 samples infected with goat pox and 87 infected with sheep pox, with LAMP test results showing a positive detection for all samples. When utilizing GTPV and SPPV genomic DNA, the universal LAMP primers (GSPV) and GTPV LAMP primers displayed a 100% detection rate, while the SPPV LAMP detection rate was 98.8%, consistent with the laboratory-tested results. In summary, the three sets of LAMP primers, when combined, provide an analytically robust method able to fully distinguish between GTPV and SPPV. The presented LAMP method provides a specific, sensitive and rapid diagnostic tool for the distinction of GTPV and SPPV infections, with the potential to be standardized as a detection method for Capripox viruses in endemic areas.

  10. Knowledge acquisition for temporal abstraction.

    PubMed

    Stein, A; Musen, M A; Shahar, Y

    1996-01-01

    Temporal abstraction is the task of detecting relevant patterns in data over time. The knowledge-based temporal-abstraction method uses knowledge about a clinical domain's contexts, external events, and parameters to create meaningful interval-based abstractions from raw time-stamped clinical data. In this paper, we describe the acquisition and maintenance of domain-specific temporal-abstraction knowledge. Using the PROTEGE-II framework, we have designed a graphical tool for acquiring temporal knowledge directly from expert physicians, maintaining the knowledge in a sharable form, and converting the knowledge into a suitable format for use by an appropriate problem-solving method. In initial tests, the tool offered significant gains in our ability to rapidly acquire temporal knowledge and to use that knowledge to perform automated temporal reasoning.
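
    The interval-creation step at the heart of temporal abstraction can be sketched independently of PROTEGE-II: classify each time-stamped value into a qualitative state and merge consecutive points sharing a state into intervals. The glucose thresholds and readings below are hypothetical.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Interval:
        start: int
        end: int
        state: str

    def abstract_intervals(samples, classify):
        """Merge consecutive time-stamped values that map to the same
        qualitative state into interval-based abstractions (the domain
        knowledge about contexts and external events is omitted here)."""
        intervals = []
        for t, value in samples:
            state = classify(value)
            if intervals and intervals[-1].state == state:
                intervals[-1].end = t          # extend the current interval
            else:
                intervals.append(Interval(t, t, state))
        return intervals

    classify = lambda v: "high" if v > 180 else ("low" if v < 70 else "normal")
    readings = [(0, 95), (1, 110), (2, 200), (3, 210), (4, 150)]
    for iv in abstract_intervals(readings, classify):
        print(iv)
    ```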

  11. Emerging applications of fluorescence spectroscopy in medical microbiology field.

    PubMed

    Shahzad, Aamir; Köhler, Gottfried; Knapp, Martin; Gaubitzer, Erwin; Puchinger, Martin; Edetsberger, Michael

    2009-11-26

    There are many diagnostic techniques and methods available for the diagnosis of medically important microorganisms such as bacteria, viruses, fungi and parasites, but almost all of these techniques and methods have limitations or inconveniences. Most are laborious and time consuming, with chances of false positive or false negative results. This warrants a diagnostic technique that can overcome these limitations and problems. At present, there is an emerging trend to use fluorescence spectroscopy as a diagnostic as well as a research tool in many fields of medical science. Here, we critically discuss research studies which propose that fluorescence spectroscopy may be an excellent diagnostic and research tool in the medical microbiology field, with high sensitivity and specificity.

  12. XMI2USE: A Tool for Transforming XMI to USE Specifications

    NASA Astrophysics Data System (ADS)

    Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.

    The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.

  13. Googling DNA sequences on the World Wide Web.

    PubMed

    Hajibabaei, Mehrdad; Singer, Gregory A C

    2009-11-10

    New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm, and implemented it for searching species-specific genomic sequences (DNA barcodes), using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and a query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages, with pre- and post-processing software providing customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data and provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
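
    The core of the described algorithm, dividing sequences into words and matching them with a generic search engine, can be approximated as follows. Here a set intersection of overlapping words stands in for the Google Desktop Search step; the word length and the tiny library are assumptions.

    ```python
    def words(seq: str, k: int = 8) -> set:
        """Overlapping fixed-length words (k-mers) of a sequence."""
        seq = seq.upper()
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def barcode_search(query: str, library: dict, k: int = 8):
        """Rank library entries by the number of words shared with the
        query, an alignment-free proxy for a search-engine lookup."""
        q = words(query, k)
        scores = {name: len(q & words(seq, k)) for name, seq in library.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    library = {
        "species_A": "ATGCGTACGTTAGCATGCATGCCGGATCGATCG",
        "species_B": "TTGCAAACGGTAGCTTGAATGCAGGATCAATGG",
    }
    print(barcode_search("ATGCGTACGTTAGCATGCAT", library))
    ```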

  14. Osteoporosis risk prediction using machine learning and conventional methods.

    PubMed

    Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won

    2013-01-01

    A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared with the ability of a conventional clinical decision tool, osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) based on various predictors associated with low bone density. The learning models were compared with OST. SVM had significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We were the first to perform comparisons of the performance of osteoporosis prediction between the machine learning and conventional methods using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
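
    A compact sketch of this kind of model comparison using scikit-learn, run on synthetic data rather than the KNHANES records: fit SVM, random forest, and logistic regression, then compare test-set ROC AUC.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Synthetic stand-in for clinical predictors of low bone density.
    X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    models = {
        "SVM": SVC(probability=True, random_state=0),
        "RF": RandomForestClassifier(random_state=0),
        "LR": LogisticRegression(max_iter=1000),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"{name}: test AUC = {auc:.3f}")
    ```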

  15. Determination of Specific Forces and Tool Deflections in Micro-milling of Ti-6Al-4V alloy using Finite Element Simulations and Analysis

    NASA Astrophysics Data System (ADS)

    Farina, Simone; Thepsonti, Thanongsak; Ceretti, Elisabetta; Özel, Tugrul

    2011-05-01

    Titanium alloys offer superb properties in strength, corrosion resistance and biocompatibility and are commonly utilized in medical devices and implants. The micro-end milling process is a direct and rapid fabrication method for manufacturing medical devices and implants in titanium alloys. Process performance and quality depend upon an understanding of the relationship between cutting parameters, forces, and resultant tool deflections, so as to avoid tool breakage. For this purpose, FE simulations of chip formation during micro-end milling of Ti-6Al-4V alloy with an ultra-fine grain solid carbide two-flute micro-end mill are investigated using DEFORM software. At first, specific forces in the tangential and radial directions of cutting during micro-end milling for varying feed advance and rotational speeds have been determined using designed FE simulations of the chip formation process. Later, these forces are applied to the micro-end mill geometry along the axial depth of cut in 3D analysis in ABAQUS. Consequently, 3D distributions of tool deflections and von Mises stress are determined. These analyses will aid in establishing integrated multi-physics process models for high performance micro-end milling and provide a leap forward in process improvement.

  16. PASTA: splice junction identification from RNA-Sequencing data

    PubMed Central

    2013-01-01

    Background Next generation transcriptome sequencing (RNA-Seq) is emerging as a powerful experimental tool for the study of alternative splicing and its regulation, but requires ad-hoc analysis methods and tools. PASTA (Patterned Alignments for Splicing and Transcriptome Analysis) is a splice junction detection algorithm specifically designed for RNA-Seq data, relying on a highly accurate alignment strategy and on a combination of heuristic and statistical methods to identify exon-intron junctions with high accuracy. Results Comparisons against TopHat and other splice junction prediction software on real and simulated datasets show that PASTA exhibits high specificity and sensitivity, especially at lower coverage levels. Moreover, PASTA is highly configurable and flexible, and can therefore be applied in a wide range of analysis scenarios: it is able to handle both single-end and paired-end reads, it does not rely on the presence of canonical splicing signals, and it uses organism-specific regression models to accurately identify junctions. Conclusions PASTA is a highly efficient and sensitive tool to identify splicing junctions from RNA-Seq data. Compared to similar programs, it has the ability to identify a higher number of real splicing junctions, and provides highly annotated output files containing detailed information about their location and characteristics. Accurate junction data in turn facilitates the reconstruction of the splicing isoforms and the analysis of their expression levels, which will be performed by the remaining modules of the PASTA pipeline, still under development. Use of PASTA can therefore enable the large-scale investigation of transcription and alternative splicing. PMID:23557086

  17. Creating and parameterizing patient-specific deep brain stimulation pathway-activation models using the hyperdirect pathway as an example

    PubMed Central

    Gunalan, Kabilar; Chaturvedi, Ashutosh; Howell, Bryan; Duchin, Yuval; Lempka, Scott F.; Patriat, Remi; Sapiro, Guillermo; Harel, Noam; McIntyre, Cameron C.

    2017-01-01

    Background Deep brain stimulation (DBS) is an established clinical therapy and computational models have played an important role in advancing the technology. Patient-specific DBS models are now common tools in both academic and industrial research, as well as clinical software systems. However, the exact methodology for creating patient-specific DBS models can vary substantially and important technical details are often missing from published reports. Objective Provide a detailed description of the assembly workflow and parameterization of a patient-specific DBS pathway-activation model (PAM) and predict the response of the hyperdirect pathway to clinical stimulation. Methods Integration of multiple software tools (e.g. COMSOL, MATLAB, FSL, NEURON, Python) enables the creation and visualization of a DBS PAM. An example DBS PAM was developed using 7T magnetic resonance imaging data from a single unilaterally implanted patient with Parkinson’s disease (PD). This detailed description implements our best computational practices and most elaborate parameterization steps, as defined from over a decade of technical evolution. Results Pathway recruitment curves and strength-duration relationships highlight the non-linear response of axons to changes in the DBS parameter settings. Conclusion Parameterization of patient-specific DBS models can be highly detailed and constrained, thereby providing confidence in the simulation predictions, but at the expense of time demanding technical implementation steps. DBS PAMs represent new tools for investigating possible correlations between brain pathway activation patterns and clinical symptom modulation. PMID:28441410

  18. The Development and Comparison of Molecular Dynamics Simulation and Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Chen, Jundong

    2018-03-01

    Molecular dynamics is an integrated technology that combines physics, mathematics and chemistry. The molecular dynamics method is a computer simulation technique and a powerful tool for studying condensed matter systems. The technique not only yields the trajectories of the atoms, but also allows the microscopic details of atomic motion to be observed. By studying the numerical integration algorithms used in molecular dynamics simulation, we can analyze the microstructure and the motion of particles, relate them to the macroscopic properties of the material, and study the relationship between interactions and macroscopic properties more conveniently. Monte Carlo simulation, similar to molecular dynamics, is a tool for studying the nature of molecules and particles at the microscopic level. In this paper, the theoretical background of computer numerical simulation is introduced, and the specific methods of numerical integration are summarized, including the Verlet method, the leap-frog method and the velocity Verlet method. At the same time, the method and principle of Monte Carlo simulation are introduced. Finally, similarities and differences between Monte Carlo simulation and molecular dynamics simulation are discussed.
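
    Of the three integrators named, velocity Verlet is the easiest to sketch. The toy below integrates a harmonic oscillator (a = -x), chosen only because its bounded energy makes the good long-term behavior of Verlet-type schemes easy to check.

    ```python
    import numpy as np

    def velocity_verlet(x, v, accel, dt, n_steps):
        """Velocity Verlet: advance positions with the current acceleration,
        then velocities with the average of old and new accelerations."""
        a = accel(x)
        traj = [x]
        for _ in range(n_steps):
            x = x + v * dt + 0.5 * a * dt**2
            a_new = accel(x)
            v = v + 0.5 * (a + a_new) * dt
            a = a_new
            traj.append(x)
        return np.array(traj)

    traj = velocity_verlet(x=1.0, v=0.0, accel=lambda x: -x, dt=0.01, n_steps=1000)
    print(f"x after 10 time units: {traj[-1]:.4f}")  # close to cos(10)
    ```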

  19. Recent trends related to the use of formal methods in software engineering

    NASA Technical Reports Server (NTRS)

    Prehn, Soren

    1986-01-01

    An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focussed on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.

  20. Robust detection of rare species using environmental DNA: the importance of primer specificity.

    PubMed

    Wilcox, Taylor M; McKelvey, Kevin S; Young, Michael K; Jane, Stephen F; Lowe, Winsor H; Whiteley, Andrew R; Schwartz, Michael K

    2013-01-01

    Environmental DNA (eDNA) is being rapidly adopted as a tool to detect rare animals. Quantitative PCR (qPCR) using probe-based chemistries may represent a particularly powerful tool because of the method's sensitivity, specificity, and potential to quantify target DNA. However, there has been little work understanding the performance of these assays in the presence of closely related, sympatric taxa. If related species cause any cross-amplification or interference, false positives and negatives may be generated. These errors can be disastrous if false positives lead to overestimation of the abundance of an endangered species or if false negatives prevent detection of an invasive species. In this study we test factors that influence the specificity and sensitivity of TaqMan MGB assays using co-occurring, closely related brook trout (Salvelinus fontinalis) and bull trout (S. confluentus) as a case study. We found qPCR to be substantially more sensitive than traditional PCR, with a high probability of detection at concentrations as low as 0.5 target copies/µl. We also found that the number and placement of base pair mismatches between the TaqMan MGB assay and non-target templates was important to target specificity, and that specificity was most influenced by base pair mismatches in the primers, rather than in the probe. We found that insufficient specificity can result in both false positive and false negative results, particularly in the presence of abundant related species. Our results highlight the utility of qPCR as a highly sensitive eDNA tool, and underscore the importance of careful assay design.
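
    The mismatch bookkeeping that drives assay specificity can be illustrated simply: count where a primer disagrees with a non-target template, and flag 3'-proximal mismatches, which disrupt primer extension most. Both sequences below are hypothetical, not the published trout assays.

    ```python
    def mismatch_positions(primer: str, template_site: str):
        """0-based positions where the primer and template disagree."""
        assert len(primer) == len(template_site)
        return [i for i, (p, t) in enumerate(zip(primer, template_site)) if p != t]

    target     = "ACGTTGCAGGTCAAGT"   # hypothetical target binding site
    non_target = "ACGTTGCAGATCAAGA"   # hypothetical related-species site
    mm = mismatch_positions(target, non_target)
    three_prime = [i for i in mm if i >= len(target) - 5]
    print("mismatches:", mm, "| within 5 nt of 3' end:", three_prime)
    ```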

  1. Examining the effectiveness of discriminant function analysis and cluster analysis in species identification of male field crickets based on their calling songs.

    PubMed

    Jaiswara, Ranjana; Nandi, Diptarup; Balakrishnan, Rohini

    2013-01-01

    Traditional taxonomy based on morphology has often failed in accurate species identification owing to the occurrence of cryptic species, which are reproductively isolated but morphologically identical. Molecular data have thus been used to complement morphology in species identification. The sexual advertisement calls in several groups of acoustically communicating animals are species-specific and can thus complement molecular data as non-invasive tools for identification. Several statistical tools and automated identifier algorithms have been used to investigate the efficiency of acoustic signals in species identification. Despite a plethora of such methods, there is a general lack of knowledge regarding the appropriate usage of these methods in specific taxa. In this study, we investigated the performance of two commonly used statistical methods, discriminant function analysis (DFA) and cluster analysis, in identification and classification based on acoustic signals of field cricket species belonging to the subfamily Gryllinae. Using a comparative approach we evaluated the optimal number of species and calling song characteristics for both the methods that lead to most accurate classification and identification. The accuracy of classification using DFA was high and was not affected by the number of taxa used. However, a constraint in using discriminant function analysis is the need for a priori classification of songs. Accuracy of classification using cluster analysis, which does not require a priori knowledge, was maximum for 6-7 taxa and decreased significantly when more than ten taxa were analysed together. We also investigated the efficacy of two novel derived acoustic features in improving the accuracy of identification. Our results show that DFA is a reliable statistical tool for species identification using acoustic signals. Our results also show that cluster analysis of acoustic signals in crickets works effectively for species classification and identification.
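
    The two analyses compared in the study map directly onto standard library calls; the sketch below runs both on synthetic "song features", since the cricket recordings themselves are not available here.

    ```python
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for calling-song features (e.g. syllable rate,
    # carrier frequency) from four "species".
    X, y = make_blobs(n_samples=200, centers=4, n_features=3, random_state=0)

    # DFA-style classification: requires a priori species labels.
    dfa = LinearDiscriminantAnalysis()
    print("DFA cross-validated accuracy:", cross_val_score(dfa, X, y, cv=5).mean())

    # Cluster analysis: no labels needed, but the number of groups must be
    # chosen, echoing the paper's finding that accuracy degrades with many taxa.
    clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
    print("first clustered labels:", clusters[:10])
    ```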

  2. Preliminary Evaluation of Method to Monitor Landfills Resilience against Methane Emission

    NASA Astrophysics Data System (ADS)

    Chusna, Noor Amalia; Maryono, Maryono

    2018-02-01

    Methane emissions from landfill sites contribute to global warming, and improper methane treatment can pose an explosion hazard. Stakeholders and governments in Indonesian cities have found it significantly difficult to monitor the resilience of landfills against methane emission. Moreover, the management of methane gas has always been a challenging issue for long-term waste management service and operations. Landfills are a significant contributor to anthropogenic methane emissions. This study conducted a preliminary evaluation of methods to manage methane gas emission by assessing the LandGEM and IPCC methods. The evaluation found that the IPCC method is based on the availability of current and historical country-specific data regarding the waste disposed of in landfills, while LandGEM is an automated tool for estimating emission rates for total landfill gas, accounting for methane, carbon dioxide, and other gases. The methods can be used either with site-specific data to estimate emissions or with default parameters if no site-specific data are available. Both methods could be utilized to monitor methane emission from landfill sites in the cities of Central Java.
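
    Both inventory approaches rest on first-order decay of degradable waste; the simplified annual form below illustrates the calculation. The decay constant k and generation potential L0 are placeholder values, and LandGEM itself further splits each year into sub-annual increments.

    ```python
    import math

    def ch4_first_order_decay(waste_by_year, year, k=0.05, L0=170.0):
        """Simplified annual first-order decay estimate of methane generation
        (m^3/yr): waste M_i (Mg) accepted in year i contributes
        k * L0 * M_i * exp(-k * age). k (1/yr) and L0 (m^3 CH4 per Mg)
        are illustrative defaults only."""
        return sum(
            k * L0 * m_i * math.exp(-k * (year - i))
            for i, m_i in waste_by_year.items()
            if year >= i
        )

    waste = {2010: 50_000, 2011: 52_000, 2012: 55_000}  # Mg accepted per year
    print(f"CH4 generated in 2020: {ch4_first_order_decay(waste, 2020):,.0f} m^3/yr")
    ```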

  3. Evaluation of Targeted Next-Generation Sequencing for Detection of Bovine Pathogens in Clinical Samples.

    PubMed

    Anis, Eman; Hawkins, Ian K; Ilha, Marcia R S; Woldemeskel, Moges W; Saliki, Jeremiah T; Wilkes, Rebecca P

    2018-07-01

    The laboratory diagnosis of infectious diseases, especially those caused by mixed infections, is challenging. Routinely, it requires submission of multiple samples to separate laboratories. Advances in next-generation sequencing (NGS) have provided the opportunity for development of a comprehensive method to identify infectious agents. This study describes the use of target-specific primers for PCR-mediated amplification with the NGS technology in which pathogen genomic regions of interest are enriched and selectively sequenced from clinical samples. In the study, 198 primers were designed to target 43 common bovine and small-ruminant bacterial, fungal, viral, and parasitic pathogens, and a bioinformatics tool was specifically constructed for the detection of targeted pathogens. The primers were confirmed to detect the intended pathogens by testing reference strains and isolates. The method was then validated using 60 clinical samples (including tissues, feces, and milk) that were also tested with other routine diagnostic techniques. The detection limits of the targeted NGS method were evaluated using 10 representative pathogens that were also tested by quantitative PCR (qPCR), and the NGS method was able to detect the organisms from samples with qPCR threshold cycle (CT) values in the 30s. The method was successful for the detection of multiple pathogens in the clinical samples, including some additional pathogens missed by the routine techniques because the specific tests needed for the particular organisms were not performed. The results demonstrate the feasibility of the approach and indicate that it is possible to incorporate NGS as a diagnostic tool in a cost-effective manner into a veterinary diagnostic laboratory. Copyright © 2018 Anis et al.

  4. EBIC Characterization and Hydrogen Passivation in Silicon Sheet

    NASA Technical Reports Server (NTRS)

    Hanoka, J. I.

    1985-01-01

    As a general qualitative tool, the electron beam induced current (EBIC) method can be very useful in imaging recombination in silicon sheet used for solar cells. Work using EBIC on EFG silicon ribbon is described. In particular, some efforts at making the technique more quantitative and hence more useful, some limitations of the method, and finally the specific application to hydrogen passivation are treated. Some brief remarks are made regarding the technique itself.

  5. DESIGN AND CONSTRUCTION OF SCHOOL BUILDINGS. PROCEEDINGS, ASSOCIATION OF SCHOOL BUSINESS OFFICIALS OF THE UNITED STATES AND CANADA, ANNUAL MEETING AND EDUCATIONAL EXHIBIT (50TH, SAN FRANCISCO, CALIFORNIA, OCTOBER 21-22, 1964).

    ERIC Educational Resources Information Center

    LIEBESKIND, MORRIS

    PROBLEMS IN THE SCHEDULING AND COMPLETION OF SCHOOL BUILDING DESIGN AND CONSTRUCTION PROJECTS ARE DISCUSSED WITH REFERENCE TO THE CRITICAL PATH METHOD OF PROGRAMING. THE DISCUSSION GIVES A BROAD OVERVIEW OF THE METHOD WITH DETAILED SUGGESTIONS FOR SCHOOL ADMINISTRATORS. SPECIFIC SUBJECT AREAS INCLUDE--(1) CPM, A NEW MANAGEMENT TOOL, (2) CPM…

  6. On the Use of Accelerated Aging Methods for Screening High Temperature Polymeric Composite Materials

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Grayson, Michael A.

    1999-01-01

    A rational approach to the problem of accelerated testing of high temperature polymeric composites is discussed. The methods provided are considered tools useful in the screening of new materials systems for long-term application to extreme environments that include elevated temperature, moisture, oxygen, and mechanical load. The need for reproducible mechanisms, indicator properties, and real-time data is outlined, as are the methodologies for specific aging mechanisms.

  7. Sensitivity and specificity of the Eating Assessment Tool and the Volume-Viscosity Swallow Test for clinical evaluation of oropharyngeal dysphagia.

    PubMed

    Rofes, L; Arreola, V; Mukherjee, R; Clavé, P

    2014-09-01

    Oropharyngeal dysphagia (OD) is an underdiagnosed digestive disorder that causes severe nutritional and respiratory complications. Our aim was to determine the accuracy of the Eating Assessment Tool (EAT-10) and the Volume-Viscosity Swallow Test (V-VST) for clinical evaluation of OD. We studied 120 patients with swallowing difficulties and 14 healthy subjects. OD was evaluated by the 10-item screening questionnaire EAT-10 and the bedside method V-VST, videofluoroscopy (VFS) being the reference standard. The V-VST is an effort test that uses boluses of different volumes and viscosities to identify clinical signs of impaired efficacy (impaired labial seal, piecemeal deglutition, and residue) and impaired safety of swallow (cough, voice changes, and oxygen desaturation ≥3%). Discriminating ability was assessed by the AUC of the ROC curve and sensitivity and specificity values. According to VFS, prevalence of OD was 87%, 75.6% with impaired efficacy and 80.9% with impaired safety of swallow including 17.6% aspirations. The EAT-10 showed a ROC AUC of 0.89 for OD with an optimal cut-off at 2 (0.89 sensitivity and 0.82 specificity). The V-VST showed 0.94 sensitivity and 0.88 specificity for OD, 0.79 sensitivity and 0.75 specificity for impaired efficacy, 0.87 sensitivity and 0.81 specificity for impaired safety, and 0.91 sensitivity and 0.28 specificity for aspirations. Clinical methods for screening (EAT-10) and assessment (V-VST) of OD offer excellent psychometric properties that allow adequate management of OD. Their universal application among at-risk populations will improve the identification of patients with OD at risk for malnutrition and aspiration pneumonia. © 2014 The Authors. Neurogastroenterology & Motility published by John Wiley & Sons Ltd.
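
    For readers unfamiliar with the reported metrics, the sketch below shows how sensitivity and specificity fall out of a 2x2 table against the reference standard; the counts are made up, not the study's data.

    ```python
    def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
        """Sensitivity and specificity from a 2x2 confusion table, with the
        reference standard (here videofluoroscopy) defining truth."""
        return {
            "sensitivity": tp / (tp + fn),   # detected among truly impaired
            "specificity": tn / (tn + fp),   # cleared among truly unimpaired
        }

    # Illustrative counts only.
    print(diagnostic_metrics(tp=94, fp=3, fn=6, tn=22))
    ```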

  8. Inverse current source density method in two dimensions: inferring neural activation from multielectrode recordings.

    PubMed

    Łęski, Szymon; Pettersen, Klas H; Tunstall, Beth; Einevoll, Gaute T; Gigg, John; Wójcik, Daniel K

    2011-12-01

    The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multi-electrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets.
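
    The forward-modeling logic of inverse CSD can be shown in one dimension (the paper extends it to 2D grids with more careful source models): build a matrix mapping sources at grid nodes to electrode potentials, then invert it. Conductivity, spacing, and the smoothing thickness below are assumed values.

    ```python
    import numpy as np

    sigma = 0.3                    # assumed extracellular conductivity, S/m
    pos = np.arange(8) * 1e-4      # 8 electrodes at 100-micrometer spacing, m
    h = 1e-4                       # regularizing "source thickness", m

    # Forward matrix F[i, j]: potential at electrode i from a unit source at
    # node j, using a smoothed point-source kernel 1 / (4*pi*sigma*r).
    r = np.sqrt((pos[:, None] - pos[None, :]) ** 2 + h ** 2)
    F = 1.0 / (4 * np.pi * sigma * r)

    true_csd = np.array([0.0, 0.0, 1.0, -1.0, 0.0, 0.5, -0.5, 0.0])
    potentials = F @ true_csd                        # simulate a recording
    estimated_csd = np.linalg.solve(F, potentials)   # the inverse step
    print(np.round(estimated_csd, 3))                # recovers true_csd
    ```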

  9. ADAM: analysis of discrete models of biological systems using computer algebra.

    PubMed

    Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard

    2011-07-20

    Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
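
    The attractor-identification problem ADAM solves algebraically can be stated with a brute-force sketch for intuition: a steady state of a Boolean network is a state that every update function maps to itself. ADAM's polynomial-system approach avoids this exponential enumeration; the 3-node network below is a toy example.

    ```python
    from itertools import product

    def steady_states(update_fns, n):
        """All one-point attractors of a Boolean network by exhaustive
        search (exponential in n; ADAM instead solves the equivalent
        system of polynomial equations)."""
        return [s for s in product((0, 1), repeat=n)
                if all(f(s) == s[i] for i, f in enumerate(update_fns))]

    # Toy network: x0' = x1 AND x2, x1' = x0, x2' = NOT x0.
    fns = [lambda s: s[1] & s[2], lambda s: s[0], lambda s: 1 - s[0]]
    print(steady_states(fns, 3))   # [(0, 0, 1)]
    ```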

  10. Validation of a modified FRAX® tool for improving outpatient efficiency--part of the "Catch Before a Fall" initiative.

    PubMed

    Parker, Simon; Ciaccio, Maria; Cook, Erica; Davenport, Graham; Cooper, Alun; Grange, Simon; Smitham, Peter

    2015-01-01

    We have validated our touch-screen-modified FRAX® tool against the traditional healthcare professional-led questionnaire, demonstrating strong concordance between doctor- and patient-derived results. We will use this in outpatient clinics and general practice to increase our capture rate of at-risk patients, making valuable use of otherwise wasted patient waiting times. Outpatient clinics offer an opportunity to collect valuable health information from a captive population. We have previously developed a modified fracture risk assessment (FRAX®) tool, enabling patients to self-assess their osteoporotic fracture risk in a touch-screen computer format and demonstrated its acceptability with patients. We aim to validate the accuracy of our tool against the traditional questionnaire. Fifty patients over 50 years of age within the fracture clinic independently completed a paper equivalent of our touch-screen-modified FRAX® questionnaire. Responses were analysed against the traditional healthcare professional (HCP)-led questionnaire which was carried out afterwards. Correlation was assessed by sensitivity, specificity, Cohen's kappa statistic and Fisher's exact test for each potential FRAX® outcome of "treat", "measure BMD" and "lifestyle advice". Age range was 51-98 years. The FRAX® tool was completed by 88 % of patients; six patients lacked confidence in estimating either their height or weight. Following question adjustment according to patient response and feedback, our tool achieved >95 % sensitivity and specificity for the "treat" and "lifestyle advice" groups, and 79 % sensitivity and 100 % specificity in the "measure BMD" group. Cohen's kappa value ranged from 0.823 to 0.995 across all groups, demonstrating "very good" agreement for all. Fisher's exact test demonstrated significant concordance between doctor and patient decisions. Our modified tool provides a simple, accurate and reliable method for patients to self-report their own FRAX® score outside the clinical contact period, thus releasing the HCP from the time required to complete the questionnaire and potentially increasing our capture rate of at-risk patients.
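
    The agreement statistic used in the validation is straightforward to compute from paired decisions; the sketch below implements Cohen's kappa over hypothetical doctor/patient FRAX outcomes.

    ```python
    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: chance-corrected agreement between two raters."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
        cats = set(rater_a) | set(rater_b)
        expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                       for c in cats)
        return (observed - expected) / (1 - expected)

    doctor  = ["treat", "treat", "BMD", "advice", "treat", "advice"]
    patient = ["treat", "treat", "BMD", "advice", "BMD", "advice"]
    print(f"kappa = {cohens_kappa(doctor, patient):.2f}")  # 0.75
    ```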

  11. A Thermal Management Systems Model for the NASA GTX RBCC Concept

    NASA Technical Reports Server (NTRS)

    Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)

    2002-01-01

    The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.

  12. The BMPix and PEAK Tools: New Methods for Automated Laminae Recognition and Counting - Application to Glacial Varves From Antarctic Marine Sediment

    NASA Astrophysics Data System (ADS)

    Weber, M. E.; Reichelt, L.; Kuhn, G.; Thurow, J. W.; Ricken, W.

    2009-12-01

    We present software-based tools for rapid and quantitative detection of sediment lamination. The BMPix tool extracts color and gray-scale curves from images at ultrahigh (pixel) resolution. The PEAK tool uses the gray-scale curve and performs, for the first time, fully automated counting of laminae based on three methods. The maximum count algorithm counts every bright peak of a couplet of two laminae (annual resolution) in a Gaussian smoothed gray-scale curve. The zero-crossing algorithm counts every positive and negative halfway-passage of the gray-scale curve through a wide moving average. Hence, the record is separated into bright and dark intervals (seasonal resolution). The same is true for the frequency truncation method, which uses Fourier transformation to decompose the gray-scale curve into its frequency components, before positive and negative passages are counted. We applied the new methods successfully to tree rings and to well-dated and already manually counted marine varves from Saanich Inlet before we adapted the tools to the rather complex marine laminae from the Antarctic continental margin. In combination with AMS 14C dating, we found convincing evidence that the laminations from three Weddell Sea sites represent true varves that were deposited on sediment ridges over several millennia during the last glacial maximum (LGM). There are apparently two seasonal layers of terrigenous composition, a coarser-grained bright layer, and a finer-grained dark layer. The new tools offer several advantages over previous tools. The counting procedures are based on a moving average generated from gray-scale curves instead of manual counting. Hence, results are highly objective and rely on reproducible mathematical criteria. Since PEAK associates counts with a specific depth, the thickness of each year or each season is also measured, which is an important prerequisite for later spectral analysis. Since all information required to conduct the analysis is displayed graphically, interactive optimization of the counting algorithms can be achieved quickly and conveniently.
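
    Two of the three counting methods can be sketched on a synthetic gray-scale curve with standard SciPy tools; the sinusoidal "couplets", noise level, and window sizes are all assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter1d
    from scipy.signal import find_peaks

    # Synthetic gray-scale curve: ~50 bright/dark couplets plus noise.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 50, 2000)
    gray = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.size)

    smoothed = gaussian_filter1d(gray, sigma=10)

    # Maximum-count method: each bright peak of the smoothed curve = one couplet.
    peaks, _ = find_peaks(smoothed)
    print("maximum-count varve years:", len(peaks))

    # Zero-crossing method: passages through a wide moving average split the
    # record into bright and dark (seasonal) intervals.
    baseline = np.convolve(smoothed, np.ones(200) / 200, mode="same")
    half_layers = int(np.sum(np.diff(np.sign(smoothed - baseline)) != 0))
    print("seasonal half-layers:", half_layers)
    ```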

  13. EFS: an ensemble feature selection tool implemented as R-package and web-application.

    PubMed

    Neumann, Ursula; Genze, Nikita; Heider, Dominik

    2017-01-01

    Feature selection methods aim at identifying a subset of features that improve the prediction performance of subsequent classification models and thereby also simplify their interpretability. Preceding studies demonstrated that single feature selection methods can have specific biases, whereas an ensemble feature selection has the advantage to alleviate and compensate for these biases. The software EFS (Ensemble Feature Selection) makes use of multiple feature selection methods and combines their normalized outputs to a quantitative ensemble importance. Currently, eight different feature selection methods have been integrated in EFS, which can be used separately or combined in an ensemble. EFS identifies relevant features while compensating specific biases of single methods due to an ensemble approach. Thereby, EFS can improve the prediction accuracy and interpretability in subsequent binary classification models. EFS can be downloaded as an R-package from CRAN or used via a web application at http://EFS.heiderlab.de.
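
    The ensemble idea, normalizing each selector's scores and combining them, can be reproduced in a few lines; the three selectors below are stand-ins chosen for availability in scikit-learn, not necessarily the eight methods integrated in EFS.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=300, n_features=6, random_state=0)

    # Three single feature-selection scores with different biases.
    rf_imp = RandomForestClassifier(random_state=0).fit(X, y).feature_importances_
    lr_imp = np.abs(LogisticRegression(max_iter=1000).fit(X, y).coef_).ravel()
    mi_imp = mutual_info_classif(X, y, random_state=0)

    # Normalize each score vector to [0, 1], then average into an
    # ensemble importance per feature.
    ensemble = np.mean([s / s.max() for s in (rf_imp, lr_imp, mi_imp)], axis=0)
    print("ensemble importance per feature:", np.round(ensemble, 3))
    ```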

  14. In situ induction of dendritic cell–based T cell tolerance in humanized mice and nonhuman primates

    PubMed Central

    Jung, Kyeong Cheon; Jeon, Yoon Kyung; Ban, Young Larn; Min, Hye Sook; Kim, Eun Ji; Kim, Ju Hyun; Kang, Byung Hyun; Bae, Youngmee; Yoon, Il-Hee; Kim, Yong-Hee; Lee, Jae-Il; Kim, Jung-Sik; Shin, Jun-Seop; Yang, Jaeseok; Kim, Sung Joo; Rostlund, Emily; Muller, William A.

    2011-01-01

    Induction of antigen-specific T cell tolerance would aid treatment of diverse immunological disorders and help prevent allograft rejection and graft versus host disease. In this study, we establish a method of inducing antigen-specific T cell tolerance in situ in diabetic humanized mice and Rhesus monkeys receiving porcine islet xenografts. Antigen-specific T cell tolerance is induced by administration of an antibody ligating a particular epitope on ICAM-1 (intercellular adhesion molecule 1). Antibody-mediated ligation of ICAM-1 on dendritic cells (DCs) led to the arrest of DCs in a semimature stage in vitro and in vivo. Ablation of DCs from mice completely abrogated anti–ICAM-1–induced antigen-specific T cell tolerance. T cell responses to unrelated antigens remained unaffected. In situ induction of DC-mediated T cell tolerance using this method may represent a potent therapeutic tool for preventing graft rejection. PMID:22025302

  15. Textpresso site-specific recombinases: A text-mining server for the recombinase literature including Cre mice and conditional alleles.

    PubMed

    Urbanski, William M; Condie, Brian G

    2009-12-01

    Textpresso Site Specific Recombinases (http://ssrc.genetics.uga.edu/) is a text-mining web server for searching a database of more than 9,000 full-text publications. The papers and abstracts in this database represent a wide range of topics related to site-specific recombinase (SSR) research tools. Included in the database are most of the papers that report the characterization or use of mouse strains that express Cre recombinase as well as papers that describe or analyze mouse lines that carry conditional (floxed) alleles or SSR-activated transgenes/knockins. The database also includes reports describing SSR-based cloning methods such as the Gateway or the Creator systems, papers reporting the development or use of SSR-based tools in systems such as Drosophila, bacteria, parasites, stem cells, yeast, plants, zebrafish, and Xenopus as well as publications that describe the biochemistry, genetics, or molecular structure of the SSRs themselves. Textpresso Site Specific Recombinases is the only comprehensive text-mining resource available for the literature describing the biology and technical applications of SSRs. (c) 2009 Wiley-Liss, Inc.

  16. Impact of mHealth Chronic Disease Management on Treatment Adherence and Patient Outcomes: A Systematic Review

    PubMed Central

    Hamine, Saee; Faulx, Dunia; Green, Beverly B; Ginsburg, Amy Sarah

    2015-01-01

    Background Adherence to chronic disease management is critical to achieving improved health outcomes, quality of life, and cost-effective health care. As the burden of chronic diseases continues to grow globally, so does the impact of non-adherence. Mobile technologies are increasingly being used in health care and public health practice (mHealth) for patient communication, monitoring, and education, and to facilitate adherence to chronic diseases management. Objective We conducted a systematic review of the literature to evaluate the effectiveness of mHealth in supporting the adherence of patients to chronic diseases management (“mAdherence”), and the usability, feasibility, and acceptability of mAdherence tools and platforms in chronic disease management among patients and health care providers. Methods We searched PubMed, Embase, and EBSCO databases for studies that assessed the role of mAdherence in chronic disease management of diabetes mellitus, cardiovascular disease, and chronic lung diseases from 1980 through May 2014. Outcomes of interest included effect of mHealth on patient adherence to chronic diseases management, disease-specific clinical outcomes after intervention, and the usability, feasibility, and acceptability of mAdherence tools and platforms in chronic disease management among target end-users. Results In all, 107 articles met all inclusion criteria. Short message service was the most commonly used mAdherence tool in 40.2% (43/107) of studies. Usability, feasibility, and acceptability or patient preferences for mAdherence interventions were assessed in 57.9% (62/107) of studies and found to be generally high. A total of 27 studies employed randomized controlled trial (RCT) methods to assess impact on adherence behaviors, and significant improvements were observed in 15 of those studies (56%). Of the 41 RCTs that measured effects on disease-specific clinical outcomes, significant improvements between groups were reported in 16 studies (39%). Conclusions There is potential for mHealth tools to better facilitate adherence to chronic disease management, but the evidence supporting its current effectiveness is mixed. Further research should focus on understanding and improving how mHealth tools can overcome specific barriers to adherence. PMID:25803266

  17. Species-specific diagnostic assays for Bonamia ostreae and B. exitiosa in European flat oyster Ostrea edulis: conventional, real-time and multiplex PCR.

    PubMed

    Ramilo, Andrea; Navas, J Ignacio; Villalba, Antonio; Abollo, Elvira

    2013-05-27

    Bonamia ostreae and B. exitiosa have caused mass mortalities of various oyster species around the world and co-occur in some European areas. The World Organisation for Animal Health (OIE) has included infections with both species in the list of notifiable diseases. However, official methods for species-specific diagnosis of either parasite have certain limitations. In this study, new species-specific conventional PCR (cPCR) and real-time PCR techniques were developed to diagnose each parasite species. Moreover, a multiplex PCR method was designed to detect both parasites in a single assay. The analytical sensitivity and specificity of each new method were evaluated. These new procedures were compared with 2 OIE-recommended methods, viz. standard histology and PCR-RFLP. The new procedures showed higher sensitivity than the OIE recommended ones for the diagnosis of both species. The sensitivity of tests with the new primers was higher using oyster gills and gonad tissue, rather than gills alone. The lack of a 'gold standard' prevented accurate estimation of sensitivity and specificity of the new methods. The implementation of statistical tools (maximum likelihood method) for the comparison of the diagnostic tests showed the possibility of false positives with the new procedures, although the absence of a gold standard precluded certainty. Nevertheless, all procedures showed negative results when used for the analysis of oysters from a Bonamia-free area.

  18. Assessment of eutrophication in estuaries: Pressure-state-response and source apportionment

    Treesearch

    David Whitall; Suzanne Bricker

    2006-01-01

    The National Estuarine Eutrophication Assessment (NEEA) Update Program is a management oriented program designed to improve monitoring and assessment efforts through the development of type specific classification of estuaries that will allow improved assessment methods and development of analytical and research models and tools for managers which will help guide and...

  19. Photovoice as Participatory Action Research Tool for Engaging People with Intellectual Disabilities in Research and Program Development

    ERIC Educational Resources Information Center

    Jurkowski, Janine M.

    2008-01-01

    People with intellectual disabilities have few opportunities to actively participate in research affecting programs and policies. Employment of participatory action research has been recommended. Although use of this approach with people who have intellectual disabilities is growing, articles on specific participatory research methods are rare.…

  20. Undergrad and Overweight: An Online Behavioral Weight Management Program for College Students

    ERIC Educational Resources Information Center

    Harvey-Berino, Jean; Pope, Lizzy; Gold, Beth Casey; Leonard, Heather; Belliveau, Cynthia

    2012-01-01

    Objective: Explore the feasibility of an online behavioral weight management program for college students. Methods: The program focused on behavioral strategies to modify eating and exercise behaviors of students interested in losing weight and/or developing a healthy lifestyle. Specific tools included weekly chat meetings with a facilitator,…

  1. A Rhetorical Analysis of the Self in an Organization: The Production and Reception of Discourse in a Bank.

    ERIC Educational Resources Information Center

    Roberts, Joy S.

    1999-01-01

    Describes briefly the author's research (contributing to scholarship on successful language practices in organizations) examining the conflicts, and specifically the discursive methods of solving these conflicts, faced by individuals within an organization as they negotiate competing demands. Offers a new tool (called Bracketing, Ranking, and…

  2. How Do Teachers Prioritize the Adoption of Technology in the Classroom?

    ERIC Educational Resources Information Center

    Kurt, Serhat

    2012-01-01

    This study examined whether teachers prioritize the use of technology. More specifically, this paper focused on how Turkish teachers think about the importance of technology and technological tools for their daily routines. The research design employed both qualitative and quantitative methods. The data were collected through document analyses,…

  3. A Web-Based Treatment Decision Support Tool for Patients With Advanced Knee Arthritis: Evaluation of User Interface and Content Design

    PubMed Central

    Zheng, Hua; Rosal, Milagros C; Li, Wenjun; Borg, Amy; Yang, Wenyun; Ayers, David C

    2018-01-01

    Background Data-driven surgical decisions will ensure proper use and timing of surgical care. We developed a Web-based patient-centered treatment decision and assessment tool to guide treatment decisions among patients with advanced knee osteoarthritis who are considering total knee replacement surgery. Objective The aim of this study was to examine user experience and acceptance of the Web-based treatment decision support tool among older adults. Methods User-centered formative and summative evaluations were conducted for the tool. A sample of 28 patients who were considering total knee replacement participated in the study. Participants’ responses to the user interface design, the clarity of information, as well as usefulness, satisfaction, and acceptance of the tool were collected through qualitative (ie, individual patient interviews) and quantitative (ie, standardized Computer System Usability Questionnaire) methods. Results Participants were older adults with a mean age of 63 (SD 11) years. Three-quarters of them had no technical questions using the tool. User interface design recommendations included larger fonts, bigger buttons, less colors, simpler navigation without extra “next page” click, less mouse movement, and clearer illustrations with simple graphs. Color-coded bar charts and outcome-specific graphs with positive action were easiest for them to understand the outcomes data. Questionnaire data revealed high satisfaction with the tool usefulness and interface quality, and also showed ease of use of the tool, regardless of age or educational status. Conclusions We evaluated the usability of a patient-centered decision support tool designed for advanced knee arthritis patients to facilitate their knee osteoarthritis treatment decision making. The lessons learned can inform other decision support tools to improve interface and content design for older patients’ use. PMID:29712620

  4. A General Tool for Engineering the NAD/NADP Cofactor Preference of Oxidoreductases.

    PubMed

    Cahn, Jackson K B; Werlang, Caroline A; Baumschlager, Armin; Brinkmann-Chen, Sabine; Mayo, Stephen L; Arnold, Frances H

    2017-02-17

    The ability to control enzymatic nicotinamide cofactor utilization is critical for engineering efficient metabolic pathways. However, the complex interactions that determine cofactor-binding preference render this engineering particularly challenging. Physics-based models have been insufficiently accurate and blind directed evolution methods too inefficient to be widely adopted. Building on a comprehensive survey of previous studies and our own prior engineering successes, we present a structure-guided, semirational strategy for reversing enzymatic nicotinamide cofactor specificity. This heuristic-based approach leverages the diversity and sensitivity of catalytically productive cofactor binding geometries to limit the problem to an experimentally tractable scale. We demonstrate the efficacy of this strategy by inverting the cofactor specificity of four structurally diverse NADP-dependent enzymes: glyoxylate reductase, cinnamyl alcohol dehydrogenase, xylose reductase, and iron-containing alcohol dehydrogenase. The analytical components of this approach have been fully automated and are available in the form of an easy-to-use web tool: Cofactor Specificity Reversal-Structural Analysis and Library Design (CSR-SALAD).

  5. The multi-copy simultaneous search methodology: a fundamental tool for structure-based drug design.

    PubMed

    Schubert, Christian R; Stultz, Collin M

    2009-08-01

    Fragment-based ligand design approaches, such as the multi-copy simultaneous search (MCSS) methodology, have proven to be useful tools in the search for novel therapeutic compounds that bind pre-specified targets of known structure. MCSS offers a variety of advantages over more traditional high-throughput screening methods, and has been applied successfully to challenging targets. The methodology is quite general and can be used to construct functionality maps for proteins, DNA, and RNA. In this review, we describe the main aspects of the MCSS method and outline the general use of the methodology as a fundamental tool to guide the design of de novo lead compounds. We focus our discussion on the evaluation of MCSS results and the incorporation of protein flexibility into the methodology. In addition, we demonstrate on several specific examples how the information arising from the MCSS functionality maps has been successfully used to predict ligand binding to protein targets and RNA.

  6. The National Clinical Assessment Tool for Medical Students in the Emergency Department (NCAT-EM)

    PubMed Central

    Jung, Julianna; Franzen, Douglas; Lawson, Luan; Manthey, David; Tews, Matthew; Dubosh, Nicole; Fisher, Jonathan; Haughey, Marianne; House, Joseph B.; Trainor, Arleigh; Wald, David A.; Hiller, Katherine

    2018-01-01

    Introduction Clinical assessment of medical students in emergency medicine (EM) clerkships is a highly variable process that presents unique challenges and opportunities. Currently, clerkship directors use institution-specific tools with unproven validity and reliability that may or may not address competencies valued most highly in the EM setting. Standardization of assessment practices and development of a common, valid, specialty-specific tool would benefit EM educators and students. Methods A two-day national consensus conference was held in March 2016 in the Clerkship Directors in Emergency Medicine (CDEM) track at the Council of Residency Directors in Emergency Medicine (CORD) Academic Assembly in Nashville, TN. The goal of this conference was to standardize assessment practices and to create a national clinical assessment tool for use in EM clerkships across the country. Conference leaders synthesized the literature, articulated major themes and questions pertinent to clinical assessment of students in EM, clarified the issues, and outlined the consensus-building process prior to consensus-building activities. Results The first day of the conference was dedicated to developing consensus on these key themes in clinical assessment. The second day of the conference was dedicated to discussing and voting on proposed domains to be included in the national clinical assessment tool. A modified Delphi process was initiated after the conference to reconcile questions and items that did not reach an a priori level of consensus. Conclusion The final tool, the National Clinical Assessment Tool for Medical Students in Emergency Medicine (NCAT-EM) is presented here. PMID:29383058

  7. The Promise and the Challenge of Technology-Facilitated Methods for Assessing Behavioral and Cognitive Markers of Risk for Suicide among U.S. Army National Guard Personnel.

    PubMed

    Baucom, Brian R W; Georgiou, Panayiotis; Bryan, Craig J; Garland, Eric L; Leifker, Feea; May, Alexis; Wong, Alexander; Narayanan, Shrikanth S

    2017-03-31

    Suicide was the 10th leading cause of death for Americans in 2015 and rates have been steadily climbing over the last 25 years. Rates are particularly high amongst U.S. military personnel. Suicide prevention efforts in the military are significantly hampered by the lack of: (1) assessment tools for measuring baseline risk and (2) methods to detect periods of particularly heightened risk. Two specific barriers to assessing suicide risk in military personnel that call for innovation are: (1) the geographic dispersion of military personnel from healthcare settings, particularly amongst components like the Reserves; and (2) professional and social disincentives to acknowledging psychological distress. The primary aim of this paper is to describe recent technological developments that could contribute to risk assessment tools that are not subject to the limitations mentioned above. More specifically, Behavioral Signal Processing can be used to assess behaviors during interaction and conversation that likely indicate increased risk for suicide, and computer-administered, cognitive performance tasks can be used to assess activation of the suicidal mode. These novel methods can be used remotely and do not require direct disclosure or endorsement of psychological distress, solving two challenges to suicide risk assessment in military and other sensitive settings. We present an introduction to these technologies, describe how they can specifically be applied to assessing behavioral and cognitive risk for suicide, and close with recommendations for future research.

  8. The Promise and the Challenge of Technology-Facilitated Methods for Assessing Behavioral and Cognitive Markers of Risk for Suicide among U.S. Army National Guard Personnel

    PubMed Central

    Baucom, Brian R. W.; Georgiou, Panayiotis; Bryan, Craig J.; Garland, Eric L.; Leifker, Feea; May, Alexis; Wong, Alexander; Narayanan, Shrikanth S.

    2017-01-01

    Suicide was the 10th leading cause of death for Americans in 2015 and rates have been steadily climbing over the last 25 years. Rates are particularly high amongst U.S. military personnel. Suicide prevention efforts in the military are significantly hampered by the lack of: (1) assessment tools for measuring baseline risk and (2) methods to detect periods of particularly heightened risk. Two specific barriers to assessing suicide risk in military personnel that call for innovation are: (1) the geographic dispersion of military personnel from healthcare settings, particularly amongst components like the Reserves; and (2) professional and social disincentives to acknowledging psychological distress. The primary aim of this paper is to describe recent technological developments that could contribute to risk assessment tools that are not subject to the limitations mentioned above. More specifically, Behavioral Signal Processing can be used to assess behaviors during interaction and conversation that likely indicate increased risk for suicide, and computer-administered, cognitive performance tasks can be used to assess activation of the suicidal mode. These novel methods can be used remotely and do not require direct disclosure or endorsement of psychological distress, solving two challenges to suicide risk assessment in military and other sensitive settings. We present an introduction to these technologies, describe how they can specifically be applied to assessing behavioral and cognitive risk for suicide, and close with recommendations for future research. PMID:28362333

  9. Neuronavigation using three-dimensional proton magnetic resonance spectroscopy data.

    PubMed

    Kanberoglu, Berkay; Moore, Nina Z; Frakes, David; Karam, Lina J; Debbins, Josef P; Preul, Mark C

    2014-01-01

    Applications in clinical medicine can benefit from fusion of spectroscopy data with anatomical imagery. For example, new 3-dimensional (3D) spectroscopy techniques allow for improved correlation of metabolite profiles with specific regions of interest in anatomical tumor images, which can be useful in characterizing and treating heterogeneous tumors that appear structurally homogeneous. We sought to develop a clinical workflow and uniquely capable custom software tool to integrate advanced 3-tesla 3D proton magnetic resonance spectroscopic imaging (¹H-MRSI) into industry standard image-guided neuronavigation systems, especially for use in brain tumor surgery. ¹H-MRSI spectra from preoperative scanning on 15 patients with recurrent or newly diagnosed meningiomas were processed and analyzed, and specific voxels were selected based on their chemical contents. 3D neuronavigation overlays were then generated and applied to anatomical image data in the operating room. The proposed 3D methods fully account for scanner calibration and comprise tools that we have now made publicly available. The new methods were quantitatively validated through a phantom study and applied successfully to mitigate biopsy uncertainty in a clinical study of meningiomas. The proposed methods improve upon the current state of the art in neuronavigation through the use of detailed 3D ¹H-MRSI data. Specifically, 3D MRSI-based overlays provide comprehensive, quantitative visual cues and location information during neurosurgery, enabling a progressive new form of online spectroscopy-guided neuronavigation. © 2014 S. Karger AG, Basel.

  10. Student Use of NABPLaw Online in a Pharmacy Laws Project

    PubMed Central

    Hammer, Dana P.; Hartnett, Cassandra J.; Williams, Donald H.

    2006-01-01

    Objectives To evaluate students’ frequency of use and degree of usefulness of NABPLaw Online, a pharmacy-specific, online, licensed resource produced by the National Association of Boards of Pharmacy (NABP). Methods Students’ usage of various information resources, including NABPLaw Online, was evaluated through (1) usage statistics gathered by NABP, (2) students’ responses to a questionnaire, and (3) citation analysis performed on students’ project reports. Results Students used NABPLaw Online less frequently than other online tools, partly owing to the relevance of the tool to their projects, and partly owing to its ease of use in comparison to other tools. Conclusions Although it was not extensively used, NABPLaw Online represents a unique resource for students researching multistate aspects of pharmacy practice law. PMID:17149444

  11. GenomePeek—an online tool for prokaryotic genome and metagenome analysis

    DOE PAGES

    McNair, Katelyn; Edwards, Robert A.

    2015-06-16

    As increases in prokaryotic sequencing take place, a method to quickly and accurately analyze these data is needed. Previous tools are mainly designed for metagenomic analysis and have limitations, such as long runtimes and significant false-positive error rates. The online tool GenomePeek (edwards.sdsu.edu/GenomePeek) was developed to analyze both single-genome and metagenome sequencing files quickly and with low error rates. GenomePeek uses a sequence assembly approach in which reads matching a set of conserved genes are extracted, assembled, and then aligned against a highly specific reference database. GenomePeek was found to be faster than traditional approaches while still keeping error rates low, as well as offering unique data visualization options.
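
    As a rough, self-contained illustration of that extract-then-classify idea, the Python sketch below keeps only reads sharing a seed k-mer with a conserved marker gene and then assigns each surviving read to the reference sequence it shares the most k-mers with. The function names, the k=8 seed size, and the tiny sequences are our own inventions; GenomePeek's actual assembler and aligner are not shown, and the assembly step is skipped entirely.

        def kmers(seq, k=8):
            """Set of all length-k substrings of seq."""
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        def classify_reads(reads, marker_genes, reference_db, k=8):
            # step 1: extract reads that hit the conserved marker genes
            marker_kmers = set().union(*(kmers(g, k) for g in marker_genes))
            hits = [r for r in reads if kmers(r, k) & marker_kmers]
            # step 2 (assembly into contigs) is skipped in this toy;
            # step 3: classify each read by shared-k-mer count
            return [max(reference_db,
                        key=lambda name: len(kmers(reference_db[name], k) & kmers(r, k)))
                    for r in hits]

        markers = ["ATGGCGTACGTTAGC"]
        refs = {"E. coli": "ATGGCGTACGTTAGCCTG", "B. subtilis": "TTTTCCCCGGGGAAAA"}
        print(classify_reads(["GCGTACGTTAGCCT", "AAAACCCC"], markers, refs))  # ['E. coli']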

  12. Microfluidic tools toward industrial biotechnology.

    PubMed

    Oliveira, Aline F; Pessoa, Amanda C S N; Bastos, Reinaldo G; de la Torre, Lucimara G

    2016-11-01

    Microfluidics is a technology that operates with small amounts of fluids and makes possible the investigation of cells, enzymes, and biomolecules, as well as the encapsulation of biocatalysts, in a greater variety of conditions than is permitted by conventional methods. This review discusses technological possibilities that can be applied in the field of industrial biotechnology, presenting the principal definitions and fundamental aspects of microfluidic parameters to better understand advanced approaches. Specifically, concentration gradient generators, droplet-based microfluidics, and microbioreactors are explored as useful tools that can contribute to industrial biotechnology. These tools have potential applications, including as commercial platforms for optimizing bioprocess development, such as screening cells, encapsulating biocatalysts, and determining critical kinetic parameters. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1372-1389, 2016. © 2016 American Institute of Chemical Engineers.

  13. Computer-enhanced visual learning method: a paradigm to teach and document surgical skills.

    PubMed

    Maizels, Max; Mickelson, Jennie; Yerkes, Elizabeth; Maizels, Evelyn; Stork, Rachel; Young, Christine; Corcoran, Julia; Holl, Jane; Kaplan, William E

    2009-09-01

    Changes in health care are stimulating residency training programs to develop new methods for teaching surgical skills. We developed Computer-Enhanced Visual Learning (CEVL) as an innovative Internet-based learning and assessment tool. The CEVL method uses the educational procedures of deliberate practice and performance to teach and learn surgery in a stylized manner. CEVL is a learning and assessment tool that can provide students and educators with quantitative feedback on learning a specific surgical procedure. Methods involved examine quantitative data of improvement in surgical skills. Herein, we qualitatively describe the method and show how program directors (PDs) may implement this technique in their residencies. CEVL allows an operation to be broken down into teachable components. The process relies on feedback and remediation to improve performance, with a focus on learning that is applicable to the next case being performed. CEVL has been shown to be effective for teaching pediatric orchiopexy and is being adapted to additional adult and pediatric procedures and to office examination skills. The CEVL method is available to other residency training programs.

  14. Computer-Enhanced Visual Learning Method: A Paradigm to Teach and Document Surgical Skills

    PubMed Central

    Maizels, Max; Mickelson, Jennie; Yerkes, Elizabeth; Maizels, Evelyn; Stork, Rachel; Young, Christine; Corcoran, Julia; Holl, Jane; Kaplan, William E.

    2009-01-01

    Innovation Changes in health care are stimulating residency training programs to develop new methods for teaching surgical skills. We developed Computer-Enhanced Visual Learning (CEVL) as an innovative Internet-based learning and assessment tool. The CEVL method uses the educational procedures of deliberate practice and performance to teach and learn surgery in a stylized manner. Aim of Innovation CEVL is a learning and assessment tool that can provide students and educators with quantitative feedback on learning a specific surgical procedure. Methods involved examine quantitative data of improvement in surgical skills. Herein, we qualitatively describe the method and show how program directors (PDs) may implement this technique in their residencies. Results CEVL allows an operation to be broken down into teachable components. The process relies on feedback and remediation to improve performance, with a focus on learning that is applicable to the next case being performed. CEVL has been shown to be effective for teaching pediatric orchiopexy and is being adapted to additional adult and pediatric procedures and to office examination skills. The CEVL method is available to other residency training programs. PMID:21975716

  15. Diagnostic accuracy of quantitative real-time PCR assay versus clinical and Gram stain identification of bacterial vaginosis.

    PubMed

    Menard, J-P; Mazouni, C; Fenollar, F; Raoult, D; Boubli, L; Bretelle, F

    2010-12-01

    The purpose of this investigation was to determine the diagnostic accuracy of a quantitative real-time polymerase chain reaction (PCR) assay in diagnosing bacterial vaginosis versus the standard methods, the Amsel criteria and the Nugent score. The Amsel criteria, the Nugent score, and results from the molecular tool were obtained independently from vaginal samples of 163 pregnant women who reported abnormal vaginal symptoms before 20 weeks gestation. To determine the performance of the molecular tool, we calculated the kappa value, sensitivity, specificity, and positive and negative predictive values. Either or both of the Amsel criteria (≥3 criteria) and the Nugent score (score ≥7) indicated that 25 women (15%) had bacterial vaginosis, and the remaining 138 women did not. DNA levels of Gardnerella vaginalis or Atopobium vaginae exceeded 10⁹ copies/mL or 10⁸ copies/mL, respectively, in 34 (21%) of the 163 samples. Complete agreement between both reference methods and high concentrations of G. vaginalis and A. vaginae was found in 94.5% of women (154/163 samples, kappa value = 0.81, 95% confidence interval 0.70-0.81). The nine samples with discordant results were categorized as intermediate flora by the Nugent score. The molecular tool predicted bacterial vaginosis with a sensitivity of 100%, a specificity of 93%, a positive predictive value of 73%, and a negative predictive value of 100%. The quantitative real-time PCR assay shows excellent agreement with the results of both reference methods for the diagnosis of bacterial vaginosis.
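
    The accuracy figures above follow directly from a 2x2 table against the combined reference standard. Below is a minimal Python sketch, assuming counts consistent with the abstract (25 reference-positive women, 9 discordant reference-negative samples called positive by the PCR assay); the function and variable names are our own.

        def diagnostic_metrics(tp, fp, fn, tn):
            """Accuracy statistics from a 2x2 table of test vs. reference."""
            n = tp + fp + fn + tn
            sens = tp / (tp + fn)   # true positives among reference-positives
            spec = tn / (tn + fp)   # true negatives among reference-negatives
            ppv = tp / (tp + fp)    # positive predictive value
            npv = tn / (tn + fn)    # negative predictive value
            po = (tp + tn) / n      # observed agreement
            # chance-expected agreement, for Cohen's kappa
            pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
            return sens, spec, ppv, npv, (po - pe) / (1 - pe)

        print(diagnostic_metrics(tp=25, fp=9, fn=0, tn=129))

    Run on those counts, this reproduces the reported values: sensitivity 1.00, specificity 0.93, PPV 0.73, NPV 1.00, and kappa 0.81.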

  16. Development of National Program of Cancer Registries SAS Tool for Population-Based Cancer Relative Survival Analysis.

    PubMed

    Dong, Xing; Zhang, Kevin; Ren, Yuan; Wilson, Reda; O'Neil, Mary Elizabeth

    2016-01-01

    Studying population-based cancer survival by leveraging the high-quality cancer incidence data collected by the Centers for Disease Control and Prevention's National Program of Cancer Registries (NPCR) can offer valuable insight into the cancer burden and impact in the United States. We describe the development and validation of a SAS macro tool that calculates population-based, cancer site-specific relative survival estimates comparable to those obtained through SEER*Stat. The NPCR relative survival analysis SAS tool (NPCR SAS tool) was developed based on the relative survival method and SAS macros developed by Paul Dickman. NPCR cancer incidence data from 25 states submitted in November 2012 were used, specifically cases diagnosed from 2003 to 2010 with follow-up through 2010. Decennial and annual complete life tables published by the National Center for Health Statistics (NCHS) for 2000 through 2009 were used. To assess comparability between the 2 tools, 5-year relative survival rates were calculated for 25 cancer sites by sex, race, and age group using the NPCR SAS tool and the National Cancer Institute's SEER*Stat 8.1.5 software. A module to create data files for SEER*Stat was also developed for the NPCR SAS tool. Comparison of the results produced by both the SAS tool and SEER*Stat showed comparable and reliable relative survival estimates for NPCR data. For a majority of the sites, the net differences between the relative survival estimates produced by the NPCR SAS tool and by SEER*Stat ranged from -0.1% to 0.1%. The estimated standard errors were highly comparable between the 2 tools as well. The NPCR SAS tool will allow researchers to produce accurate 5-year relative survival estimates for NPCR data that are comparable to those produced by SEER*Stat. Comparison of output from the NPCR SAS tool and SEER*Stat provided additional quality control capabilities for evaluating data prior to producing NPCR relative survival estimates.
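
    For orientation, the core of the relative survival method the tool implements can be stated in a few lines: cumulative observed (all-cause) survival of the cancer cohort is divided by the expected survival of a demographically matched general population taken from the life tables. A minimal Python sketch under that definition; the interval proportions in the example are illustrative, not NPCR data.

        def relative_survival(observed_interval_surv, expected_interval_surv):
            """Cumulative relative survival from per-interval survival proportions."""
            obs = exp = 1.0
            out = []
            for o, e in zip(observed_interval_surv, expected_interval_surv):
                obs *= o   # cumulative observed survival of the cohort
                exp *= e   # cumulative expected survival from life tables
                out.append(obs / exp)
            return out

        # five annual intervals: cohort vs. life-table expected proportions
        print(relative_survival([0.80, 0.90, 0.93, 0.95, 0.96],
                                [0.97, 0.97, 0.96, 0.96, 0.95]))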

  17. Application of the denaturing gradient gel electrophoresis (DGGE) technique as an efficient diagnostic tool for ciliate communities in soil.

    PubMed

    Jousset, Alexandre; Lara, Enrique; Nikolausz, Marcell; Harms, Hauke; Chatzinotas, Antonis

    2010-02-01

    Ciliates (or Ciliophora) are ubiquitous organisms which can be widely used as bioindicators in ecosystems exposed to anthropogenic and industrial influences. The evaluation of the environmental impact on soil ciliate communities with methods relying on morphology-based identification may be hampered by the large number of samples usually required for a statistically supported, reliable conclusion. Cultivation-independent molecular-biological diagnostic tools are a promising alternative that could greatly simplify and accelerate such studies. In the present work a ciliate-specific fingerprint method based on the amplification of a phylogenetic marker gene (i.e. the 18S ribosomal RNA gene) with subsequent analysis by denaturing gradient gel electrophoresis (DGGE) was developed and used to monitor community shifts in a polycyclic aromatic hydrocarbon (PAH) polluted soil. The semi-nested approach generated ciliate-specific amplification products from all soil samples and made it possible to distinguish community profiles from a PAH-polluted and a non-polluted control soil. Subsequent sequence analysis of excised bands provided evidence that the polluted soil samples were dominated by organisms belonging to the class Colpodea. The general DGGE approach presented in this study might thus in principle serve as a fast and reproducible diagnostic tool, complementing and facilitating future ecological and ecotoxicological monitoring of ciliates in polluted habitats. Copyright 2009 Elsevier B.V. All rights reserved.

  18. Evaluation of the efficacy of nutritional screening tools to predict malnutrition in the elderly at a geriatric care hospital

    PubMed Central

    Baek, Myoung-Ha

    2015-01-01

    BACKGROUND/OBJECTIVES Malnutrition in the elderly is a serious problem, prevalent in both hospitals and care homes. Given the absence of a gold standard for malnutrition, herein we evaluate the efficacy of five nutritional screening tools developed for or used with the elderly. SUBJECTS/METHODS Selected medical records of 141 elderly patients (86 men and 55 women, aged 73.5 ± 5.2 years) hospitalized at a geriatric care hospital were analyzed. Nutritional screening was performed using the following tools: Mini Nutrition Assessment (MNA), Mini Nutrition Assessment-Short Form (MNA-SF), Geriatric Nutritional Risk Index (GNRI), Malnutrition Universal Screening Tool (MUST) and Nutritional Risk Screening 2002 (NRS 2002). A combined index for malnutrition was also calculated as a reference tool. Each patient evaluated as malnourished to any degree or at risk of malnutrition according to at least four out of five of the aforementioned tools was categorized as malnourished in the combined index classification. RESULTS According to the combined index, 44.0% of the patients were at risk of malnutrition to some degree, while the prevalence of nutritional risk and/or malnutrition varied greatly depending on the tool applied, ranging from 36.2% (MUST) to 72.3% (MNA-SF). MUST showed good validity (sensitivity 80.6%, specificity 98.7%) and almost perfect agreement (k = 0.81) with the combined index. In contrast, MNA-SF showed poor validity (sensitivity 100%, specificity 49.4%) and only moderate agreement (k = 0.46) with the combined index. CONCLUSIONS MNA-SF was found to overestimate the nutritional risk in the elderly. MUST appeared to be the most valid and useful screening tool to predict malnutrition in the elderly at a geriatric care hospital. PMID:26634053
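
    The combined-index rule described above is simple enough to state in code: a patient counts as malnourished when at least four of the five screening tools flag any degree of risk. A minimal Python sketch; the tool names come from the abstract, while the flags shown are illustrative.

        def combined_index(flags, threshold=4):
            """flags: dict of tool name -> True if that tool flags any risk."""
            return sum(flags.values()) >= threshold

        patient = {"MNA": True, "MNA-SF": True, "GNRI": True,
                   "MUST": False, "NRS 2002": True}
        print(combined_index(patient))  # True: 4 of 5 tools agree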

  19. Assessing the effects of adsorptive polymeric resin additions on fungal secondary metabolite chemical diversity.

    PubMed

    González-Menéndez, Víctor; Asensio, Francisco; Moreno, Catalina; de Pedro, Nuria; Monteiro, Maria Candida; de la Cruz, Mercedes; Vicente, Francisca; Bills, Gerald F; Reyes, Fernando; Genilloud, Olga; Tormo, José R

    2014-07-03

    Adsorptive polymeric resins have been occasionally described to enhance the production of specific secondary metabolites (SMs) of interest. Methods that induce the expression of new chemical entities in fungal fermentations may lead to the discovery of new bioactive molecules and should be addressed as possible tools for the creation of new microbial chemical libraries for drug lead discovery. Herein, we apply both biological activity and chemical evaluations to assess the use of adsorptive resins as tools for the differential expression of SMs in fungal strain sets. Data automation approaches were applied to ultra high performance liquid chromatography analysis of extracts to evaluate the general influence in generating new chemical entities or in changing the production of specific SMs by fungi grown in the presence of resins and different base media.

  20. eLoom and Flatland: specification, simulation and visualization engines for the study of arbitrary hierarchical neural architectures.

    PubMed

    Caudell, Thomas P; Xiao, Yunhai; Healy, Michael J

    2003-01-01

    eLoom is an open source graph simulation software tool, developed at the University of New Mexico (UNM), that enables users to specify and simulate neural network models. Its specification language and libraries enable users to construct and simulate arbitrary, potentially hierarchical network structures on serial and parallel processing systems. In addition, eLoom is integrated with UNM's Flatland, an open source virtual environments development tool, to provide real-time visualizations of network structure and activity. Visualization is a useful method for understanding both learning and computation in artificial neural networks. Through animated 3D pictorial representations of the state and flow of information in the network, a better understanding of network functionality is achieved. ART-1, LAPART-II, MLP, and SOM neural networks are presented to illustrate eLoom and Flatland's capabilities.

  1. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    NASA Technical Reports Server (NTRS)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees and 2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
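
    Brent's method, listed among the improvements, is the classic bracketing root finder typically used to pin down contact-window boundaries: a window opens where the asset's elevation above the horizon mask crosses zero. A minimal sketch, assuming a toy sinusoidal elevation profile in place of real orbital geometry; SciPy's brentq is a standard implementation, and nothing here is ITACA's actual code.

        import math
        from scipy.optimize import brentq

        def elevation_deg(t_min):
            """Toy elevation-above-mask profile over one pass (degrees vs. minutes)."""
            return 25.0 * math.sin(2 * math.pi * t_min / 90.0) - 5.0

        # bracket the rising zero crossing: below mask at t=0, above at t=20
        t_rise = brentq(elevation_deg, 0.0, 20.0)
        print(f"contact window opens at t = {t_rise:.2f} min")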

  2. The revised NEUROGES-ELAN system: An objective and reliable interdisciplinary analysis tool for nonverbal behavior and gesture.

    PubMed

    Lausberg, Hedda; Sloetjes, Han

    2016-09-01

    As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods.

  3. Internal scanning method as unique imaging method of optical vortex scanning microscope

    NASA Astrophysics Data System (ADS)

    Popiołek-Masajada, Agnieszka; Masajada, Jan; Szatkowski, Mateusz

    2018-06-01

    The internal scanning method is specific to the optical vortex microscope. It allows the vortex point to be moved inside the focused vortex beam with nanometer resolution while the whole beam stays in place. Thus the sample illuminated by the focused vortex beam can be scanned by the vortex point alone. We show that this method enables high-resolution imaging. The paper presents the preliminary experimental results obtained with the first basic image recovery procedure. A prospect of developing more powerful tools for topography recovery with the optical vortex scanning microscope is discussed briefly.

  4. Development of a Novel Loop-Mediated Isothermal Amplification (LAMP) Assay for the Detection of Rickettsia spp.

    PubMed

    Hanaoka, Nozomu; Matsutani, Minenosuke; Satoh, Masaaki; Ogawa, Motohiko; Shirai, Mutsunori; Ando, Shuji

    2017-01-24

    We developed a novel loop-mediated isothermal amplification (LAMP) method to detect Rickettsia spp., including Rickettsia prowazekii and R. typhi. Species-specific LAMP primers were developed for orthologous genes conserved among Rickettsia spp. The selected modified primers could detect all the Rickettsia spp. tested. The LAMP method was successfully used to detect 100 DNA copies of Rickettsia spp. within approximately 60 min at 63℃. Therefore, this method may be an excellent tool for the early diagnosis of rickettsiosis in a laboratory or in the field.

  5. Optical Brain Imaging: A Powerful Tool for Neuroscience.

    PubMed

    Zhu, Xinpei; Xia, Yanfang; Wang, Xuecen; Si, Ke; Gong, Wei

    2017-02-01

    As the control center of organisms, the brain remains little understood due to its complexity. Taking advantage of imaging methods, scientists have found an accessible approach to unraveling the mystery of neuroscience. Among these methods, optical imaging techniques are widely used due to their high molecular specificity and single-molecule sensitivity. Here, we provide an overview of several optical imaging techniques developed for neuroscience in recent years, including brain clearing, the micro-optical sectioning tomography system, and deep tissue imaging.

  6. Microbial Ecology: Where are we now?

    PubMed

    Boughner, Lisa A; Singh, Pallavi

    2016-11-01

    Conventional microbiological methods have been readily taken over by newer molecular techniques due to the ease of use, reproducibility, sensitivity and speed of working with nucleic acids. These tools allow high-throughput analysis of complex and diverse microbial communities, such as those in soil, freshwater, saltwater, or the microbiota living in collaboration with a host organism (plant, mouse, human, etc.). For instance, these methods have been robustly used for characterizing the plant (rhizosphere), animal and human microbiomes, specifically the complex intestinal microbiota. The human body has been referred to as a superorganism, since microbial genes are more numerous than human genes and are essential to the health of the host. In this review we provide an overview of the next-generation tools currently available to study microbial ecology, along with their limitations and advantages.

  7. Molecular Tools for the Detection of Nitrogen Cycling Archaea

    PubMed Central

    Rusch, Antje

    2013-01-01

    Archaea are widespread in extreme and temperate environments, and cultured representatives cover a broad spectrum of metabolic capacities, which sets them up for potentially major roles in the biogeochemistry of their ecosystems. The detection, characterization, and quantification of archaeal functions in mixed communities require Archaea-specific primers or probes for the corresponding metabolic genes. Five pairs of degenerate primers were designed to target archaeal genes encoding key enzymes of nitrogen cycling: nitrite reductases NirA and NirB, nitrous oxide reductase (NosZ), nitrogenase reductase (NifH), and nitrate reductases NapA/NarG. Sensitivity towards their archaeal target gene, phylogenetic specificity, and gene specificity were evaluated in silico and in vitro. Owing to their moderate sensitivity/coverage, the novel nirB-targeted primers are suitable for pure culture studies only. The nirA-targeted primers showed sufficient sensitivity and phylogenetic specificity, but poor gene specificity. The primers designed for amplification of archaeal nosZ performed well in all 3 criteria; their discrimination against bacterial homologs appears to be weakened when Archaea are strongly outnumbered by bacteria in a mixed community. The novel nifH-targeted primers showed high sensitivity and gene specificity, but failed to discriminate against bacterial homologs. Despite limitations, 4 of the new primer pairs are suitable tools in several molecular methods applied in archaeal ecology. PMID:23365509

  8. Effects of Different Cutting Patterns and Experimental Conditions on the Performance of a Conical Drag Tool

    NASA Astrophysics Data System (ADS)

    Copur, Hanifi; Bilgin, Nuh; Balci, Cemal; Tumac, Deniz; Avunduk, Emre

    2017-06-01

    This study aims at determining the effects of single-, double-, and triple-spiral cutting patterns; the effects of tool cutting speeds on the experimental scale; and the effects of the method of yield estimation on cutting performance by performing a set of full-scale linear cutting tests with a conical cutting tool. The average and maximum normal, cutting and side forces; specific energy; yield; and coarseness index are measured and compared in each cutting pattern at a 25-mm line spacing, at varying depths of cut per revolution, and using two cutting speeds on five different rock samples. The results indicate that the optimum specific energy decreases by approximately 25% with an increasing number of spirals from the single- to the double-spiral cutting pattern for the hard rocks, whereas generally little effect was observed for the soft- and medium-strength rocks. The double-spiral cutting pattern appeared to be more effective than the single- or triple-spiral cutting pattern and had an advantage of lower side forces. The tool cutting speed had no apparent effect on the cutting performance. The estimation of the specific energy by the yield based on the theoretical swept area was not significantly different from that estimated by the yield based on the muck weighing, especially for the double- and triple-spiral cutting patterns and with the optimum ratio of line spacing to depth of cut per revolution. This study also demonstrated that the cutterhead and mechanical miner designs, semi-theoretical deterministic computer simulations and empirical performance predictions and optimization models should be based on realistic experimental simulations. Studies should be continued to obtain more reliable results by creating a larger database of laboratory tests and field performance records for mechanical miners using drag tools.
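
    For reference, the specific energy reported in such linear cutting tests is conventionally the cutting work per unit volume of excavated rock, i.e. the mean cutting force divided by the yield. A standard form, with symbols following the usual rock-cutting convention rather than anything defined in this abstract:

        SE = \frac{\bar{F}_C}{Q}

    where \(SE\) is the specific energy, \(\bar{F}_C\) the mean cutting force, and \(Q\) the yield (volume of rock cut per unit cutting distance); the optimum spacing-to-depth ratio is the one that minimizes \(SE\).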

  9. Treatment default amongst patients with tuberculosis in urban Morocco: predicting and explaining default and post-default sputum smear and drug susceptibility results.

    PubMed

    Cherkaoui, Imad; Sabouni, Radia; Ghali, Iraqi; Kizub, Darya; Billioux, Alexander C; Bennani, Kenza; Bourkadi, Jamal Eddine; Benmamoun, Abderrahmane; Lahlou, Ouafae; Aouad, Rajae El; Dooley, Kelly E

    2014-01-01

    Setting: Public tuberculosis (TB) clinics in urban Morocco. Objectives: To explore risk factors for TB treatment default and develop a prediction tool, and to assess consequences of default, specifically the risk of transmission or development of drug resistance. Methods: Case-control study comparing patients who defaulted from TB treatment and patients who completed it, using quantitative methods and open-ended questions. Results were interpreted in light of health professionals' perspectives from a parallel study. A predictive model and simple tool to identify patients at high risk of default were developed. Sputum from cases with pulmonary TB was collected for smear and drug susceptibility testing. Results: 91 cases and 186 controls were enrolled. Independent risk factors for default included current smoking, retreatment, work interference with adherence, daily directly observed therapy, side effects, quick symptom resolution, and not knowing one's treatment duration. Age >50 years, never smoking, and having friends who knew one's diagnosis were protective. A simple scoring tool incorporating these factors was 82.4% sensitive and 87.6% specific for predicting default in this population. Clinicians and patients described additional contributors to default and suggested locally relevant intervention targets. Among 89 cases with pulmonary TB, 71% had sputum that was smear-positive for TB. Drug resistance was rare. Conclusions: The causes of default from TB treatment were explored through synthesis of qualitative and quantitative data from patients and health professionals. A scoring tool with high sensitivity and specificity to predict default was developed. Prospective evaluation of this tool coupled with targeted interventions based on our findings is warranted. Of note, the risk of TB transmission from patients who default treatment is likely to be high. The commonly feared risk of drug resistance, though, may be low; a larger study is required to confirm these findings.
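
    To make the shape of such a scoring tool concrete, here is an illustrative Python sketch built from the factors listed above; the point values and the cut-off are invented for illustration and are not the study's validated weights.

        RISK_POINTS = {
            "current_smoker": 2, "retreatment": 2, "work_interferes": 1,
            "daily_DOT": 1, "side_effects": 1, "quick_symptom_resolution": 1,
            "unknown_treatment_duration": 1,
        }
        PROTECTIVE_POINTS = {"age_over_50": -1, "never_smoked": -1,
                             "friends_know_diagnosis": -1}

        def default_risk_score(patient):
            """patient: dict of factor name -> bool; returns an additive score."""
            table = {**RISK_POINTS, **PROTECTIVE_POINTS}
            return sum(pts for factor, pts in table.items() if patient.get(factor))

        # flag as high risk above a chosen (also hypothetical) cut-off
        print(default_risk_score({"current_smoker": True, "retreatment": True}) >= 3)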

  10. Pointo - a Low Cost Solution to Point Cloud Processing

    NASA Astrophysics Data System (ADS)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, becomes more and more an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as very large packages containing a variety of methods and tools. This results in software that is very expensive to acquire and also very difficult to use. The difficulty of use is caused by the complicated user interfaces required to accommodate a large list of features. The aim of these complex packages is to provide a powerful tool for a specific group of specialists. However, they are not necessarily required by the majority of upcoming average users of point clouds. In addition to their complexity and high cost, these packages generally rely on expensive, modern hardware and are compatible with only one specific operating system. Many point cloud customers are not point cloud processing experts or willing to bear the high acquisition costs of this expensive software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of software, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. Our simple and user-oriented design improves the user experience and enables us to optimize our methods for the creation of efficient software. In this paper we introduce the Pointo family as a series of connected programs providing easy-to-use tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to point clouds.

  11. Patient Data Synchronization Process in a Continuity of Care Environment

    PubMed Central

    Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice

    2005-01-01

    In a distributed patient record environment, we analyze the processes needed to ensure exchange of and access to EHR data. We propose an adapted method and tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, local network clients, workstations running users’ interfaces, and data exchange and synchronization tools. PMID:16779049
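
    To sketch what such a synchronization record might look like, the snippet below builds a minimal XML element carrying a record identifier, a version counter for change detection, and an access-rights attribute for rights filtering. The element names, version scheme, and roles attribute are our assumptions, not the authors' schema.

        import xml.etree.ElementTree as ET

        def make_sync_record(record_id, version, payload, readable_by):
            rec = ET.Element("syncRecord", id=record_id, version=str(version))
            # rights filtering: only the roles listed may receive this record
            ET.SubElement(rec, "access", roles=",".join(readable_by))
            ET.SubElement(rec, "data").text = payload
            return rec

        def needs_push(local_version, remote_version):
            # only records newer than the remote copy cross the network,
            # which is one way to limit the volume of exchanged data
            return local_version > remote_version

        rec = make_sync_record("obs-42", 3, "BP 120/80", ["physician", "nurse"])
        print(ET.tostring(rec, encoding="unicode"), needs_push(3, 2))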

  12. Optogenetics in a transparent animal: circuit function in the larval zebrafish.

    PubMed

    Portugues, Ruben; Severi, Kristen E; Wyart, Claire; Ahrens, Misha B

    2013-02-01

    Optogenetic tools can be used to manipulate neuronal activity in a reversible and specific manner. In recent years, such methods have been applied to uncover causal relationships between activity in specified neuronal circuits and behavior in the larval zebrafish. In this small, transparent, genetic model organism, noninvasive manipulation and monitoring of neuronal activity with light is possible throughout the nervous system. Here we review recent work in which these new tools have been applied to zebrafish, and discuss some of the existing challenges of these approaches. Copyright © 2012. Published by Elsevier Ltd.

  13. Progress towards Rapid Detection of Measles Vaccine Strains: a Tool To Inform Public Health Interventions

    PubMed Central

    2016-01-01

    Rapid differentiation of vaccine from wild-type strains in suspect measles cases is a valuable epidemiological tool that informs the public health response to this highly infectious disease. Few public health laboratories sequence measles virus-positive specimens to determine genotype, and the vaccine-specific real-time reverse transcriptase PCR (rRT-PCR) assay described by F. Roy et al. (J. Clin. Microbiol. 55:735–743, 2017, https://doi.org/10.1128/JCM.01879-16) offers a rapid, easily adoptable method to identify measles vaccine strains in suspect cases. PMID:28003421

  14. Progress towards Rapid Detection of Measles Vaccine Strains: a Tool To Inform Public Health Interventions.

    PubMed

    Hacker, Jill K

    2017-03-01

    Rapid differentiation of vaccine from wild-type strains in suspect measles cases is a valuable epidemiological tool that informs the public health response to this highly infectious disease. Few public health laboratories sequence measles virus-positive specimens to determine genotype, and the vaccine-specific real-time reverse transcriptase PCR (rRT-PCR) assay described by F. Roy et al. (J. Clin. Microbiol. 55:735-743, 2017, https://doi.org/10.1128/JCM.01879-16) offers a rapid, easily adoptable method to identify measles vaccine strains in suspect cases. Copyright © 2017 American Society for Microbiology.

  15. Probability-based estimates of site-specific copper water quality criteria for the Chesapeake Bay, USA.

    PubMed

    Arnold, W Ray; Warren-Hicks, William J

    2007-01-01

    The objective of this study was to estimate site- and region-specific dissolved copper criteria for a large embayment, the Chesapeake Bay, USA. The intent is to show the utility of 2 site-specific copper saltwater quality criteria estimation models and associated region-specific criteria selection methods. The criteria estimation models and selection methods are simple, efficient, and cost-effective tools for resource managers. The methods are proposed as potential substitutes for the US Environmental Protection Agency's water effect ratio methods. Dissolved organic carbon data and the copper criteria models were used to produce probability-based estimates of site-specific copper saltwater quality criteria. Site- and date-specific criteria estimations were made for 88 sites (n = 5,296) in the Chesapeake Bay. The average and range of estimated site-specific chronic dissolved copper criteria for the Chesapeake Bay were 7.5 and 5.3 to 16.9 microg Cu/L. The average and range of estimated site-specific acute dissolved copper criteria for the Chesapeake Bay were 11.7 and 8.3 to 26.4 microg Cu/L. The results suggest that applicable national and state copper criteria could be increased in much of the Chesapeake Bay while remaining protective. Virginia Department of Environmental Quality copper criteria near the mouth of the Chesapeake Bay, however, need to decrease to protect species of equal or greater sensitivity than the marine mussel, Mytilus sp.
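
    The probability-based flavor of the approach can be sketched as follows: estimate a criterion for each site and date from its dissolved organic carbon (DOC) measurement, then select a protective low percentile across the region. The linear coefficients and the percentile below are invented placeholders, not the study's fitted models.

        def site_criterion(doc_mg_per_l, intercept=4.0, slope=1.1):
            """Hypothetical chronic Cu criterion (ug/L) as a linear function of DOC."""
            return intercept + slope * doc_mg_per_l

        def regional_criterion(doc_samples, percentile=0.10):
            # a low percentile of the site-specific estimates is protective
            crits = sorted(site_criterion(d) for d in doc_samples)
            return crits[int(percentile * (len(crits) - 1))]

        print(regional_criterion([1.5, 2.0, 2.8, 3.5, 4.1, 6.2]))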

  16. CRISPR Mediated Genome Engineering and its Application in Industry.

    PubMed

    Kaboli, Saeed; Babazada, Hasan

    2018-01-01

    The CRISPR (clustered regularly interspaced short palindromic repeat)-Cas9 (CRISPR-associated nuclease 9) method has been dramatically changing the field of genome engineering. It is a rapid, highly efficient and versatile tool for precise modification of the genome that uses a guide RNA (gRNA) to target Cas9 to a specific sequence. This novel RNA-guided genome-editing technique has become a revolutionary tool in biomedical science and has many innovative applications in different fields. In this review, we briefly introduce the Cas9-mediated genome-editing tool, summarize the recent advances in CRISPR/Cas9 technology for engineering the genomes of a wide variety of organisms, and discuss their applications to the treatment of fungal and viral disease. We also discuss the advantages of CRISPR/Cas9 technology for drug design, the creation of animal models, and the food, agricultural and energy sciences. Adoption of the CRISPR/Cas9 technology in biomedical and biotechnological research would create innovative applications not only for breeding strains exhibiting desired traits for specific industrial and medical applications, but also for investigating genome function.
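
    Because targeting hinges on the gRNA's 20-nt protospacer sitting immediately 5' of a PAM (NGG for the commonly used SpCas9), candidate target sites can be enumerated with a simple scan. A minimal Python sketch; the function name and example sequence are ours, and only the forward strand is scanned.

        import re

        def find_grna_sites(seq):
            """Forward-strand 20-nt protospacers followed by an NGG PAM."""
            sites = []
            # lookahead keeps overlapping candidate sites
            for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq.upper()):
                sites.append((m.start(), m.group(1), m.group(2)))
            return sites

        demo = "ATGCTGACCGGTTACGATCGATCGGTACGATCGATCGATCGGGATCGT"
        for pos, protospacer, pam in find_grna_sites(demo):
            print(pos, protospacer, pam)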

  17. Understanding the behavior of Giardia and Cryptosporidium in an urban watershed: Explanation and application of techniques to collect and evaluate monitoring data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crockett, C.S.; Haas, C.N.

    1996-11-01

    Due to current proposed regulations requiring monitoring for protozoans and demonstration of adequate protozoan removal depending on source water concentrations detected, many utilities are considering or are engaged in protozoan monitoring activities within their watershed so that proper watershed management and treatment modifications can reduce their impact on drinking water safety and quality. However, due to the difficulties associated with the current analytical methods and sample collection, many sampling efforts collect data that cannot be interpreted or lack the tools to interpret the information obtained. Therefore, it is necessary to determine how to develop an effective sampling program tailored to a utility's specific needs to provide interpretable data and to develop tools for evaluating such data. The following case study describes the process in which a utility learned how to collect and interpret monitoring data for their specific needs and provides concepts and tools which other utilities can use to aid in their own macro and microwatershed management efforts.

  18. Impact of gastrointestinal parasitic nematodes of sheep, and the role of advanced molecular tools for exploring epidemiology and drug resistance - an Australian perspective

    PubMed Central

    2013-01-01

    Parasitic nematodes (roundworms) of small ruminants and other livestock have major economic impacts worldwide. Despite the impact of the diseases caused by these nematodes and the discovery of new therapeutic agents (anthelmintics), there has been relatively limited progress in the development of practical molecular tools to study the epidemiology of these nematodes. Specific diagnosis underpins parasite control, and the detection and monitoring of anthelmintic resistance in livestock parasites, presently a major concern around the world. The purpose of the present article is to provide a concise account of the biology and knowledge of the epidemiology of the gastrointestinal nematodes (order Strongylida), from an Australian perspective, and to emphasize the importance of utilizing advanced molecular tools for the specific diagnosis of nematode infections for refined investigations of parasite epidemiology and drug resistance detection in combination with conventional methods. It also gives a perspective on the possibility of harnessing genetic, genomic and bioinformatic technologies to better understand parasites and control parasitic diseases. PMID:23711194

  19. [Diagnostic tools for canine parvovirus infection].

    PubMed

    Proksch, A L; Hartmann, K

    2015-01-01

    Canine parvovirus (CPV) infection is one of the most important and common infectious diseases in dogs, in particular affecting young puppies when maternal antibodies have waned and vaccine-induced antibodies have not yet developed. The mortality rate remains high. Therefore, a rapid and reliable diagnostic tool is essential to diagnose the disease in order to 1) provide intensive care treatment and 2) identify virus-shedding animals and thus prevent virus spread. Whilst the detection of antibodies against CPV is considered unsuitable for diagnosing the disease, there are several different methods to directly detect complete virus, virus antigen or DNA. In addition to testing in commercial laboratories, rapid in-house tests based on ELISA are available worldwide. The specificity of the ELISA rapid in-house tests is reported to be excellent. However, results on sensitivity vary, and high numbers of false-negative results are commonly reported, which potentially leads to misdiagnosis. Polymerase chain reaction (PCR) is a very sensitive and specific diagnostic tool. It also provides the opportunity to differentiate vaccine strains from natural infection when sequencing is performed after PCR.

  20. Comprehension Tools for Teachers: Reading for Understanding from Prekindergarten through Fourth Grade

    PubMed Central

    Connor, Carol McDonald; Phillips, Beth M.; Kaschak, Michael; Apel, Kenn; Kim, Young-Suk; Al Otaiba, Stephanie; Crowe, Elizabeth C.; Thomas-Tate, Shurita; Johnson, Lakeisha Cooper; Lonigan, Christopher J.

    2015-01-01

    This paper describes the theoretical framework, as well as the development and testing of the intervention, Comprehension Tools for Teachers (CTT), which is composed of eight component interventions targeting malleable language and reading comprehension skills that emerging research indicates contribute to proficient reading for understanding for prekindergarteners through fourth graders. Component interventions target processes considered largely automatic as well as more reflective processes, with interacting and reciprocal effects. Specifically, we present component interventions targeting cognitive, linguistic, and text-specific processes, including morphological awareness, syntax, mental-state verbs, comprehension monitoring, narrative and expository text structure, enacted comprehension, academic knowledge, and reading to learn from informational text. Our aim was to develop a tool set composed of intensive meaningful individualized small group interventions. We improved feasibility in regular classrooms through the use of design-based iterative research methods including careful lesson planning, targeted scripting, pre- and postintervention proximal assessments, and technology. In addition to the overall framework, we discuss seven of the component interventions and general results of design and efficacy studies. PMID:26500420

  1. Benefits of object-oriented models and ModeliChart: modern tools and methods for the interdisciplinary research on smart biomedical technology.

    PubMed

    Gesenhues, Jonas; Hein, Marc; Ketelhut, Maike; Habigt, Moriz; Rüschen, Daniel; Mechelinck, Mare; Albin, Thivaharan; Leonhardt, Steffen; Schmitz-Rode, Thomas; Rossaint, Rolf; Autschbach, Rüdiger; Abel, Dirk

    2017-04-01

    Computational models of biophysical systems generally constitute an essential component in the realization of smart biomedical technological applications. Typically, the development process of such models is characterized by a great extent of collaboration between different interdisciplinary parties. Furthermore, because many underlying mechanisms and the necessary degree of abstraction of biophysical system models are unknown beforehand, the steps of the development process of the application are iteratively repeated as the model is refined. This paper presents some methods and tools to facilitate this development process. First, the principle of object-oriented (OO) modeling is presented and its advantages over classical signal-oriented modeling are emphasized. Second, our self-developed simulation tool ModeliChart is presented. ModeliChart was designed specifically for clinical users and allows in silico studies to be performed independently in real time, including intuitive interaction with the model. Furthermore, ModeliChart is capable of interacting with hardware such as sensors and actuators. Finally, it is presented how optimal control methods in combination with OO models can be used to realize clinically motivated control applications. All methods presented are illustrated with an exemplary clinically oriented use case of the artificial perfusion of the systemic circulation.

  2. An improved method for detecting circulating microRNAs with S-Poly(T) Plus real-time PCR

    PubMed Central

    Niu, Yanqin; Zhang, Limin; Qiu, Huiling; Wu, Yike; Wang, Zhiwei; Zai, Yujia; Liu, Lin; Qu, Junle; Kang, Kang; Gou, Deming

    2015-01-01

    We herein describe a simple, sensitive and specific method for the analysis of circulating microRNAs (miRNAs), termed the S-Poly(T) Plus real-time PCR assay. This new method is based on our previously developed S-Poly(T) method, in which a unique S-Poly(T) primer is used during reverse transcription to increase sensitivity and specificity. Further increased sensitivity and simplicity of S-Poly(T) Plus, in comparison with the S-Poly(T) method, were achieved by a single-step, multiple-stage reaction, in which RNAs were polyadenylated and reverse-transcribed at the same time. The sensitivity of circulating miRNA detection was further improved by a modified method of total RNA isolation from serum/plasma, S/P miRsol, in which glycogen was used to increase the RNA yield. We validated our methods by quantifying miRNA expression profiles in the sera of patients with pulmonary arterial hypertension associated with congenital heart disease. In conclusion, we developed a simple, sensitive, and specific method for detecting circulating miRNAs that allows the measurement of 266 miRNAs from 100 μl of serum or plasma. This method presents a promising tool for basic miRNA research and the clinical diagnosis of human diseases based on miRNA biomarkers. PMID:26459910

  3. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
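
    The pattern the authors describe can be reduced to a few lines in a notebook: wrap a model run in a function and expose only the handful of parameters a given site study needs. The sketch below uses ipywidgets with a toy stand-in for a real solver; the function, scenario, and parameter names are hypothetical, not from the authors' applications.

      # Minimal notebook-interface sketch using ipywidgets (run in Jupyter).
      # The model function is a hypothetical stand-in for a site-specific solver.
      from ipywidgets import interact, FloatSlider
      import numpy as np
      import matplotlib.pyplot as plt

      def run_levee_seepage(conductivity=1e-5, head=4.0):
          # Toy stand-in: exponential head decay through a 20 m cross-section.
          x = np.linspace(0.0, 20.0, 200)
          h = head * np.exp(-x * conductivity * 1e4)
          plt.plot(x, h)
          plt.xlabel("distance [m]")
          plt.ylabel("head [m]")
          plt.show()

      # Exposes only the two parameters this hypothetical site study needs.
      interact(run_levee_seepage,
               conductivity=FloatSlider(min=1e-6, max=1e-4, step=1e-6,
                                        value=1e-5, readout_format=".1e"),
               head=FloatSlider(min=0.5, max=10.0, value=4.0))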

  4. Rapid ABO genotyping by high-speed droplet allele-specific PCR using crude samples.

    PubMed

    Taira, Chiaki; Matsuda, Kazuyuki; Takeichi, Naoya; Furukawa, Satomi; Sugano, Mitsutoshi; Uehara, Takeshi; Okumura, Nobuo; Honda, Takayuki

    2018-01-01

    ABO genotyping is a common tool for personal identification in the forensic and transplantation fields. We developed a new method based on droplet allele-specific PCR (droplet-AS-PCR) that enables rapid PCR amplification. We attempted rapid ABO genotyping using crude DNA isolated from dried blood and buccal cells. We designed allele-specific primers for three SNPs (at nucleotides 261, 526, and 803) in exons 6 and 7 of the ABO gene. We pretreated dried blood and buccal cells with proteinase K, and obtained crude DNA without DNA purification. Droplet-AS-PCR allowed specific amplification of the SNPs at the three loci using crude DNA, with results similar to those for DNA extracted from fresh peripheral blood. The sensitivity of the method was 5%-10%. The genotyping of extracted DNA and crude DNA was completed within 8 and 9 minutes, respectively. The genotypes determined by the droplet-AS-PCR method were always consistent with those obtained by direct sequencing. The droplet-AS-PCR method enabled rapid and specific amplification of three SNPs of the ABO gene from crude DNA treated with proteinase K. ABO genotyping by droplet-AS-PCR has the potential to be applied in various fields, including forensic medicine and transplantation care. © 2017 Wiley Periodicals, Inc.

  5. Computational diagnosis of canine lymphoma

    NASA Astrophysics Data System (ADS)

    Mirkes, E. M.; Alexandrakis, I.; Slater, K.; Tuli, R.; Gorban, A. N.

    2014-03-01

    One out of four dogs will develop cancer in their lifetime, and 20% of those will be lymphoma cases. PetScreen developed a lymphoma blood test using serum samples collected from several veterinary practices. The samples were fractionated and analysed by mass spectrometry. Two protein peaks with the highest diagnostic power were selected and further identified as the acute phase proteins C-Reactive Protein and Haptoglobin. Data mining methods were then applied to the collected data to develop an online computer-assisted veterinary diagnostic tool. The generated software can be used as a diagnostic, monitoring and screening tool. Initially, the diagnosis of lymphoma was formulated as a classification problem and later refined as lymphoma risk estimation. Three methods, decision trees, kNN and probability density evaluation, were used for classification and risk estimation, and several preprocessing approaches were implemented to create the diagnostic system. For the differential diagnosis, the best solution gave a sensitivity and specificity of 83.5% and 77%, respectively (using three input features: CRP, Haptoglobin and a standard clinical symptom). For the screening task, the decision tree method provided the best result, with a sensitivity and specificity of 81.4% and >99%, respectively (using the same input features). Furthermore, the development and application of new techniques for the generation of risk maps allowed their user-friendly visualization.
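
    A sketch of the kind of classifier comparison described above, using scikit-learn on synthetic data that stands in for the serum measurements; the feature distributions and labels are fabricated for illustration and carry no clinical meaning.

      # Classifier-comparison sketch on synthetic stand-ins for the three
      # input features (CRP, haptoglobin, clinical symptom flag).
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      n = 200
      crp = rng.lognormal(2.0, 0.5, n)       # synthetic C-Reactive Protein
      hapto = rng.lognormal(1.0, 0.6, n)     # synthetic haptoglobin
      symptom = rng.integers(0, 2, n)        # synthetic clinical symptom flag
      # Synthetic label loosely tied to the features, for illustration only.
      y = ((crp + 2 * hapto + 3 * symptom + rng.normal(0, 2, n)) > 12).astype(int)
      X = np.column_stack([crp, hapto, symptom])

      for clf in (KNeighborsClassifier(5), DecisionTreeClassifier(max_depth=3)):
          score = cross_val_score(clf, X, y, cv=5).mean()
          print(type(clf).__name__, f"accuracy ~ {score:.2f}")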

  6. Measurement of community empowerment in three community programs in Rapla (Estonia).

    PubMed

    Kasmel, Anu; Andersen, Pernille Tanggaard

    2011-03-01

    Community empowerment approaches have been proven to be powerful tools for solving local health problems. However, the methods for measuring empowerment in the community remain unclear and open to dispute. This study aims to describe how a context-specific community empowerment measurement tool was developed and changes made to three health promotion programs in Rapla, Estonia. An empowerment expansion model was compiled and applied to three existing programs: Safe Community, Drug/HIV Prevention and Elderly Quality of Life. The consensus workshop method was used to create the measurement tool and collect data on the Organizational Domains of Community Empowerment (ODCE). The study demonstrated considerable increases in the ODCE among the community workgroup, which was initiated by community members and the municipality's decision-makers. The increase was within the workgroup, which had strong political and financial support on a national level but was not the community's priority. The program was initiated and implemented by the local community members, and continuous development still occurred, though at a reduced pace. The use of the empowerment expansion model has proven to be an applicable, relevant, simple and inexpensive tool for the evaluation of community empowerment.

  7. The Virtual Learning Commons (VLC): Enabling Co-Innovation Across Disciplines

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gandara, A.; Del Rio, N.

    2014-12-01

    A key challenge for scientists addressing grand-challenge problems is identifying, understanding, and integrating potentially relevant methods, models, and tools that are rapidly evolving in the informatics community. Such tools are essential for effectively integrating data and models in complex research projects, yet it is often difficult to know what tools are available, and it is not easy to understand or evaluate how they might be used in a given research context. The goal of the National Science Foundation-funded Virtual Learning Commons (VLC) is to improve awareness and understanding of emerging methodologies and technologies, facilitate individual and group evaluation of them, and trace the impact of innovations within and across teams, disciplines, and communities. The VLC is a Web-based social bookmarking site designed specifically to support knowledge exchange in research communities. It is founded on well-developed models of technology adoption, diffusion of innovation, and experiential learning. The VLC makes use of Web 2.0 (Social Web) and Web 3.0 (Semantic Web) approaches: Semantic Web approaches enable discovery of potentially relevant methods, models, and tools, while Social Web approaches enable collaborative learning about their function. The VLC is under development and the first release is expected Fall 2014.

  8. Numerical tool development of fluid-structure interactions for investigation of obstructive sleep apnea

    NASA Astrophysics Data System (ADS)

    Huang, Chien-Jung; White, Susan; Huang, Shao-Ching; Mallya, Sanjay; Eldredge, Jeff

    2016-11-01

    Obstructive sleep apnea (OSA) is a medical condition characterized by repetitive partial or complete occlusion of the airway during sleep. The soft tissues in the upper airway of OSA patients are prone to collapse under the low pressure loads incurred during breathing. The ultimate goal of this research is the development of a versatile numerical tool for simulation of air-tissue interactions in the patient-specific upper airway geometry. This tool is expected to capture several phenomena, including flow-induced vibration (snoring) and large deformations during airway collapse of the complex airway geometry in respiratory flow conditions. Here, we present our ongoing progress toward this goal. To avoid mesh regeneration, the flow model uses a sharp-interface embedded boundary method on Cartesian grids to resolve the fluid-structure interface, while the structural model uses a cut-cell finite element method. To properly resolve large displacements, a non-linear elasticity model is used. The fluid and structure solvers are connected through a strongly coupled iterative algorithm, and parallel computation is achieved with the numerical library PETSc. Two- and three-dimensional preliminary results are shown to demonstrate the capabilities of this tool.
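
    The strongly coupled iteration mentioned above is, in generic form, a fixed-point loop over the interface state with under-relaxation. The sketch below shows only that generic scheme; the two placeholder solvers are hypothetical stand-ins, not the embedded-boundary and cut-cell solvers of the paper.

      # Generic strongly coupled partitioned FSI loop (scheme sketch, not the
      # authors' implementation); solve_fluid/solve_structure are placeholders.
      import numpy as np

      def solve_fluid(displacement):
          # Placeholder: returns interface traction for a given wall shape.
          return -0.8 * displacement + 1.0

      def solve_structure(traction):
          # Placeholder: returns interface displacement under a given load.
          return 0.5 * traction

      d = np.zeros(10)       # interface displacement guess
      omega = 0.5            # under-relaxation factor
      for k in range(100):
          t = solve_fluid(d)               # fluid solve on current geometry
          d_new = solve_structure(t)       # structure solve under fluid load
          if np.linalg.norm(d_new - d) < 1e-10:
              break
          d = (1 - omega) * d + omega * d_new   # relaxed fixed-point update
      print(f"converged in {k} subiterations")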

  9. Measurement of Community Empowerment in Three Community Programs in Rapla (Estonia)

    PubMed Central

    Kasmel, Anu; Andersen, Pernille Tanggaard

    2011-01-01

    Community empowerment approaches have been proven to be powerful tools for solving local health problems. However, the methods for measuring empowerment in the community remain unclear and open to dispute. This study aims to describe how a context-specific community empowerment measurement tool was developed and changes made to three health promotion programs in Rapla, Estonia. An empowerment expansion model was compiled and applied to three existing programs: Safe Community, Drug/HIV Prevention and Elderly Quality of Life. The consensus workshop method was used to create the measurement tool and collect data on the Organizational Domains of Community Empowerment (ODCE). The study demonstrated considerable increases in the ODCE among the community workgroup, which was initiated by community members and the municipality’s decision-makers. The increase was within the workgroup, which had strong political and financial support on a national level but was not the community’s priority. The program was initiated and implemented by the local community members, and continuous development still occurred, though at a reduced pace. The use of the empowerment expansion model has proven to be an applicable, relevant, simple and inexpensive tool for the evaluation of community empowerment. PMID:21556179

  10. Towards high-throughput molecular detection of Plasmodium: new approaches and molecular markers

    PubMed Central

    Steenkeste, Nicolas; Incardona, Sandra; Chy, Sophy; Duval, Linda; Ekala, Marie-Thérèse; Lim, Pharath; Hewitt, Sean; Sochantha, Tho; Socheat, Doung; Rogier, Christophe; Mercereau-Puijalon, Odile; Fandeur, Thierry; Ariey, Frédéric

    2009-01-01

    Background Several strategies are currently deployed in many countries in the tropics to strengthen malaria control toward malaria elimination. To measure the impact of any intervention, malaria must be detected reliably. Decisions still rely mostly on microscopic diagnosis, but sensitive diagnostic tools that can handle large numbers of samples are needed. The molecular detection approach offers much higher sensitivity and the flexibility to be automated and upgraded. Methods Two new molecular methods were developed: dot18S, a Plasmodium-specific nested PCR based on the 18S rRNA gene followed by dot-blot detection of species using species-specific probes, and CYTB, a Plasmodium-specific nested PCR based on the cytochrome b gene followed by species detection using SNP analysis. The results were compared to those obtained with microscopic examination and the "standard" 18S rRNA gene-based nested PCR using species-specific primers; 337 samples were diagnosed. Results Compared to microscopy, the three molecular methods were more sensitive, greatly increasing the estimated prevalence of Plasmodium infection, including P. malariae and P. ovale. A high rate of mixed infections was uncovered, with about one third of the villagers infected with more than one malaria parasite species. Dot18S and CYTB sensitivity outranged the "standard" nested PCR method, CYTB being the most sensitive; compared to the "standard" nested PCR method for the detection of Plasmodium spp., the sensitivity of dot18S and CYTB was 95.3% and 97.3%, respectively. Consistent detection of Plasmodium spp. by the three molecular methods was obtained for 83% of tested isolates. Contradictory results were mostly related to detection of Plasmodium malariae and Plasmodium ovale in mixed infections, due to an "all-or-none" detection effect at low-level parasitaemia. Conclusion A large reservoir of asymptomatic infections was uncovered using the molecular methods. Dot18S and CYTB, the new methods reported herein, are highly sensitive, allow parasite DNA extraction as well as genus- and species-specific diagnosis of several hundred samples, and are amenable to high-throughput scaling for larger sample sizes. Such methods provide novel information on malaria prevalence and epidemiology and are suited for active malaria detection. The usefulness of such sensitive malaria diagnosis tools, especially in low-endemicity areas where eradication plans are now ongoing, is discussed in this paper. PMID:19402894

  11. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  12. A GIS-assisted regional screening tool to evaluate the leaching potential of volatile and non-volatile pesticides

    NASA Astrophysics Data System (ADS)

    Ki, Seo Jin; Ray, Chittaranjan

    2015-03-01

    A regional screening tool, which is useful in cases where few site-specific parameters are available for complex vadose zone models, assesses the leaching potential of pollutants to groundwater over large areas. In this study, the previous pesticide leaching tool used in Hawaii was revised to account for the release of volatile organic compounds (VOCs) from the soil surface. The tool introduces expanded terms into the traditional pesticide ranking indices (i.e., the retardation and attenuation factors), so that the leaching fraction of volatile chemicals can be estimated from recharge, soil, and chemical properties. Results showed that the previous tool significantly overestimated the mass fraction of VOCs leached through soils as recharge rates increased above 0.001801 m/d. In contrast, the revised tool successfully delineated areas vulnerable to the selected VOCs based on two reference chemicals, a known leacher and a known non-leacher, determined under local conditions. A sensitivity analysis with the Latin-Hypercube-One-factor-At-a-Time method revealed that the new leaching tool was most sensitive to changes in the soil organic carbon sorption coefficient, fractional organic carbon content, and Henry's law constant, and least sensitive to parameters such as the bulk density, water content at field capacity, and particle density of soils. When the revised tool was compared as a susceptibility measure to the analytical (STANMOD) and numerical (HYDRUS-1D) models, its rankings of particular compounds (e.g., benzene, carbofuran, and toluene) were consistent with those of the other two models under the given conditions. Therefore, the new leaching tool can be widely used to address intrinsic groundwater vulnerability to contamination by pesticides and VOCs, along with the DRASTIC method or similar Tier 1 models such as SCI-GROW and WIN-PST.
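
    For orientation, common Tier-1 formulations of the two indices mentioned above are RF = 1 + (rho_b*foc*Koc + a*KH)/theta_FC and AF = exp(-0.693*d*RF*theta_FC/(q*t_half)), where the a*KH term accounts for volatiles. The exact expressions in the revised tool may differ, and the parameter values below are illustrative only.

      # Common Tier-1 leaching indices (Rao-style retardation and attenuation
      # factors) with a Henry's-law term for volatiles; the revised tool's
      # exact expressions may differ. Parameter values are illustrative.
      import math

      def retardation_factor(bulk_density, foc, koc, theta_fc, air_porosity, kh):
          # RF = 1 + (rho_b*foc*Koc + a*KH) / theta_FC (dimensionless)
          return 1.0 + (bulk_density * foc * koc + air_porosity * kh) / theta_fc

      def attenuation_factor(depth, recharge, theta_fc, half_life, rf):
          # AF = exp(-0.693 * d * RF * theta_FC / (q * t_half)): leached fraction
          return math.exp(-0.693 * depth * rf * theta_fc / (recharge * half_life))

      rf = retardation_factor(bulk_density=1.4,    # g/cm^3
                              foc=0.01, koc=50.0,  # cm^3/g
                              theta_fc=0.25, air_porosity=0.15, kh=0.2)
      af = attenuation_factor(depth=100.0,         # cm to groundwater
                              recharge=0.18,       # cm/d (~0.0018 m/d, cf. above)
                              theta_fc=0.25, half_life=60.0, rf=rf)
      print(f"RF = {rf:.2f}, AF = {af:.3e}")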

  13. Network model of project "Lean Production"

    NASA Astrophysics Data System (ADS)

    Khisamova, E. D.

    2018-05-01

    Lean production implies, above all, new approaches to the culture of management and the organization of production, and it offers a set of tools and techniques that significantly reduce losses and make processes cheaper and faster. Lean production tools are simple solutions that reveal opportunities for improvement across all aspects of the business: they reduce losses, continuously improve the whole spectrum of business processes, increase the transparency and manageability of the organization, tap the potential of each employee, increase competitiveness, and deliver significant economic benefits without large financial expenditures. Each lean production tool solves a specific part of the problem, and only their combined application can solve the problem fully or reduce it to acceptable levels. Research on the governance of the "Lean Production" project permitted a study of lean production methods and tools and the development of measures for their improvement.

  14. Performance and Sizing Tool for Quadrotor Biplane Tailsitter UAS

    NASA Astrophysics Data System (ADS)

    Strom, Eric

    The Quadrotor-Biplane-Tailsitter (QBT) configuration is the basis for a mechanically simple rotorcraft capable of both long-range, high-speed cruise and hovering flight. This work presents the development and validation of a set of preliminary design tools built specifically for this aircraft to enable its further development, including a QBT weight model, a preliminary sizing framework, and vehicle analysis tools. The preliminary sizing tool presented here shows the advantage afforded by QBT designs in missions with aggressive cruise requirements, such as offshore wind turbine inspections, wherein transition from a quadcopter configuration to a QBT allows for a 5:1 trade of battery weight for wing weight. A 3D, unsteady panel method utilizing a nonlinear implementation of the Kutta-Joukowsky condition is also presented as a means of computing aerodynamic interference effects and, through the implementation of rotor, body, and wing geometry generators, is prepared for coupling with a comprehensive rotor analysis package.

  15. NIRS-SPM: statistical parametric mapping for near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul

    2008-02-01

    Even though there exists a powerful statistical parametric mapping (SPM) tool for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM analyzes NIRS data using the general linear model (GLM) and makes inferences based on the excursion probability of a random field interpolated from the sparse measurements. In order to obtain correct inferences, NIRS-SPM offers pre-coloring and pre-whitening methods for estimating temporal correlation. For simultaneous recording of NIRS signals with fMRI, the spatial mapping between the fMRI image and real-world coordinates from a 3-D digitizer is estimated using Horn's algorithm. These tools allow super-resolution localization of brain activation, which is not possible with conventional NIRS analysis tools.
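
    The GLM step in miniature: ordinary least squares of a synthetic oxygenated-hemoglobin time series on a task regressor convolved with a gamma-like hemodynamic response. This is a generic sketch, not NIRS-SPM's implementation (which adds pre-coloring/pre-whitening and random-field inference); all data below are fabricated.

      # GLM sketch: OLS fit of a synthetic HbO time series against a task
      # regressor convolved with a gamma-like hemodynamic response function.
      import numpy as np

      fs, n = 10.0, 3000                       # 10 Hz sampling, 300 s
      t = np.arange(n) / fs
      box = ((t % 60) < 30).astype(float)      # 30 s on / 30 s off task blocks
      hk = t[:int(30 * fs)]                    # 30 s HRF kernel support
      hrf = (hk ** 5) * np.exp(-hk)            # gamma-like HRF, peak near 5 s
      hrf /= hrf.sum()
      reg = np.convolve(box, hrf)[:n]          # expected hemodynamic response

      rng = np.random.default_rng(1)
      y = 0.8 * reg + rng.normal(0, 0.5, n)    # synthetic HbO signal + noise

      X = np.column_stack([reg, np.ones(n)])   # design matrix: task + baseline
      beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
      sigma2 = res[0] / (n - X.shape[1])
      t_stat = beta[0] / np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[0, 0])
      print(f"beta_task = {beta[0]:.2f}, t = {t_stat:.1f}")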

  16. Acoustic Emission Methodology to Evaluate the Fracture Toughness in Heat Treated AISI D2 Tool Steel

    NASA Astrophysics Data System (ADS)

    Mostafavi, Sajad; Fotouhi, Mohamad; Motasemi, Abed; Ahmadi, Mehdi; Sindi, Cevat Teymuri

    2012-10-01

    In this article, the fracture toughness behavior of tool steel was investigated using Acoustic Emission (AE) monitoring. Fracture toughness (KIC) values of a specific tool steel were determined by applying various approaches based on conventional AE parameters, such as the Acoustic Emission Cumulative Count (AECC) and the Acoustic Emission Energy Rate (AEER), and on a combination of mechanical characteristics and AE information called the sentry function. The critical fracture toughness values during crack propagation were obtained by means of the relationship between the integral of the sentry function and the cumulative fracture toughness (KICUM). Specimens were selected from AISI D2 cold-work tool steel and were heat treated under four different tempering conditions (300, 450, 525, and 575 °C). The results achieved through the AE approaches were then compared with those of compact-specimen testing according to ASTM standard E399. It was concluded that AE monitoring is an efficient method for investigating fracture characteristics.
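
    The sentry function is commonly defined as f(x) = ln(E_s/E_a), the logarithm of the ratio of mechanical strain energy to cumulative acoustic emission energy as a function of displacement, with its integral over displacement related to KICUM. The sketch below computes that quantity on synthetic load and AE data; the data and this exact formulation are assumptions for illustration, not the authors' measurements.

      # Sentry-function sketch: f(x) = ln(E_strain / E_acoustic), integrated
      # over displacement (common formulation; data are synthetic).
      import numpy as np

      x = np.linspace(0.01, 2.0, 400)            # displacement [mm]
      force = 5.0 * x * np.exp(-0.5 * x)         # synthetic load curve [kN]
      # Cumulative strain energy by trapezoidal accumulation [kN*mm].
      e_strain = np.concatenate(([0.0], np.cumsum(
          0.5 * (force[1:] + force[:-1]) * np.diff(x))))
      rng = np.random.default_rng(2)
      e_ae = np.cumsum(rng.exponential(0.002, x.size))   # cumulative AE energy

      mask = e_strain > 0
      f_sentry = np.log(e_strain[mask] / e_ae[mask])     # sentry function
      xm = x[mask]
      # Trapezoidal integral of the sentry function over displacement.
      integral = float(np.sum(0.5 * (f_sentry[1:] + f_sentry[:-1]) * np.diff(xm)))
      print(f"integral of sentry function ~ {integral:.2f}")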

  17. A manifold independent approach to understanding transport in stochastic dynamical systems

    NASA Astrophysics Data System (ADS)

    Bollt, Erik M.; Billings, Lora; Schwartz, Ira B.

    2002-12-01

    We develop a new collection of tools aimed at studying stochastically perturbed dynamical systems. Specifically, in the setting of bi-stability, that is, a two-attractor system, it has previously been observed numerically that a small noise volume is sufficient to destroy what would be barriers in the zero-noise case (pseudo-barriers), thus creating chaos-like behavior reminiscent of a pre-heteroclinic tangency. The stochastic dynamical system has a corresponding Frobenius-Perron operator with a stochastic kernel, which describes how densities of initial conditions move under the noisy map. Thus, in studying the action of the Frobenius-Perron operator, we learn about the transport of the map; we have employed a Galerkin-Ulam-like method to project the Frobenius-Perron operator onto a discrete basis set of characteristic functions to highlight this action localized in specified regions of the phase space. Graph theoretic methods allow us to re-order the resulting finite-dimensional Markov operator approximation so as to highlight the regions of the original phase space which are particularly active pseudo-barriers of the stochastic dynamics. Our toolbox allows us to find: (1) regions of high activity of transport, (2) flux across pseudo-barriers, and (3) expected time of escape from pseudo-basins. Some of these quantities are also accessible via the manifold-dependent stochastic Melnikov method, but Melnikov only applies to a very special class of models for which the unperturbed homoclinic orbit is available. Our methods are unique in that they can essentially be considered a “black-box” of tools applicable to a wide range of stochastic dynamical systems in the absence of a priori knowledge of manifold structures. We use a model of childhood diseases to showcase our methods. Our tools allow us to make specific observations of: (1) loss of reducibility between basins with increasing noise, (2) identification in the phase space of active regions of stochastic transport, and (3) stochastic flux which essentially completes the heteroclinic tangle.
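
    The Ulam-Galerkin projection mentioned above can be illustrated in a few lines: partition the phase space into boxes, sample each box, and record where the noisy map sends the samples, yielding a row-stochastic Markov matrix that approximates the stochastic Frobenius-Perron operator. The map below is a toy 1D example, not the childhood-disease model of the paper.

      # Ulam-Galerkin sketch: project the stochastic Frobenius-Perron operator
      # of a noisy 1D map onto indicator functions of a box partition.
      import numpy as np

      def noisy_map(x, rng, sigma=0.05):
          # Deterministic map plus Gaussian noise, folded back into [0, 1).
          y = x + 0.25 * np.sin(2 * np.pi * x) + rng.normal(0, sigma, x.shape)
          return np.mod(y, 1.0)

      rng = np.random.default_rng(3)
      nbox, nsamp = 50, 500
      edges = np.linspace(0.0, 1.0, nbox + 1)
      P = np.zeros((nbox, nbox))
      for i in range(nbox):
          # Sample box i uniformly; see where the noisy map sends the samples.
          x0 = rng.uniform(edges[i], edges[i + 1], nsamp)
          j = np.digitize(noisy_map(x0, rng), edges) - 1
          P[i] = np.bincount(np.clip(j, 0, nbox - 1), minlength=nbox) / nsamp

      # Rows sum to one: P approximates the Markov transition operator.
      print("row-stochastic:", np.allclose(P.sum(axis=1), 1.0))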

  18. Utilization of teledentistry as a tool to screen for dental caries among 12-year-old school children in a rural region of India.

    PubMed

    Purohit, Bharathi M; Singh, Abhinav; Dwivedi, Ashish

    2017-03-01

    The study aims to assess the reliability of a video-graphic method as a tool to screen for dental caries among 12-year-old school children in a rural region of India. A total of 139 school children participated in the study. Visual-tactile examinations were conducted using the Decayed, Missing, and Filled Teeth (DMFT) index. Simultaneously, standardized video recording of the oral cavity was performed. Sensitivity and specificity values were calculated for video-graphic assessment of dental caries. A Bland-Altman plot was used to assess agreement between the two methods of caries assessment. The likelihood ratio (LR) and receiver-operating characteristic (ROC) curve were used to assess the predictive accuracy of the video-graphic method. Mean DMFT for the study population was 2.47 ± 2.01 by visual-tactile and 2.46 ± 1.91 by video-graphic assessment (P = 0.76 > 0.05). Sensitivity and specificity values of 0.86 and 0.58 were established for video-graphic assessment. A fair degree of agreement was noted between the two methods, with an intraclass correlation coefficient (ICC) of 0.56. The LR for video-graphic assessment was 2.05. The Bland-Altman plot confirmed the level of agreement between the two assessment methods. The area under the curve was 0.69 (CI 0.57-0.80, P = 0.001). Teledentistry examination is comparable to clinical examination when screening for dental caries among school children. This study provides evidence that teledentistry may be used as an alternative screening tool for the assessment of dental caries and is viable for remote consultation and treatment planning. Teledentistry promises to change the dynamics of dental care delivery and may effectively bridge the rural-urban oral health divide. © 2016 American Association of Public Health Dentistry.
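
    As a quick consistency check, the reported positive likelihood ratio follows directly from the reported sensitivity and specificity:

      # LR+ = sensitivity / (1 - specificity), using the values above.
      sens, spec = 0.86, 0.58
      print(f"LR+ = {sens / (1 - spec):.2f}")   # -> 2.05, matching the abstract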

  19. Screening and assessment tools for pediatric malnutrition.

    PubMed

    Huysentruyt, Koen; Vandenplas, Yvan; De Schepper, Jean

    2016-06-18

    The ideal measures for screening and assessing undernutrition in children remain a point of discussion in the literature. This review aims to provide an overview of recent advances in nutritional screening and assessment methods for children. It focuses on two major topics that have emerged in the literature since 2015: the practical endorsement of the new definition of pediatric undernutrition, with a focus on anthropometric measurements, and the search for a consensus on pediatric nutritional screening tools in different settings. Few analytical tools exist for the assessment of nutritional status in children. The subjective global nutritional assessment has been validated against anthropometric as well as clinical outcome parameters. Nutritional screening can help in selecting the patients that benefit the most from a full nutritional assessment. Two new screening tools have been developed for use in a general (mixed) hospital population, and one for a population of children with cancer. The value of screening tools in different disease-specific and outpatient pediatric populations remains to be proven.

  20. Evaluation of a computerized aid for creating human behavioral representations of human-computer interaction.

    PubMed

    Williams, Kent E; Voigt, Jeffrey R

    2004-01-01

    The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.

  1. Adaptable gene-specific dye bias correction for two-channel DNA microarrays.

    PubMed

    Margaritis, Thanasis; Lijnzaad, Philip; van Leenen, Dik; Bouwmeester, Diane; Kemmeren, Patrick; van Hooff, Sander R; Holstege, Frank C P

    2009-01-01

    DNA microarray technology is a powerful tool for monitoring gene expression or for finding the location of DNA-bound proteins. DNA microarrays can suffer from gene-specific dye bias (GSDB), causing some probes to be affected more by the dye than by the sample. This results in large measurement errors, which vary considerably for different probes and also across different hybridizations. GSDB is not corrected by conventional normalization and has been difficult to address systematically because of its variance. We show that GSDB is influenced by label incorporation efficiency, explaining the variation of GSDB across different hybridizations. A correction method (Gene- And Slide-Specific Correction, GASSCO) is presented, whereby sequence-specific corrections are modulated by the overall bias of individual hybridizations. GASSCO outperforms earlier methods and works well on a variety of publicly available datasets covering a range of platforms, organisms and applications, including ChIP on chip. A sequence-based model is also presented, which predicts which probes will suffer most from GSDB, useful for microarray probe design and correction of individual hybridizations. Software implementing the method is publicly available.
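
    The gene-and-slide structure of the correction (a gene-specific bias scaled by a per-hybridization factor) is, in spirit, a rank-1 model of the bias across arrays. The sketch below estimates such a rank-1 component by SVD on synthetic log-ratios; this is an assumption-level illustration of that structure, not the published GASSCO procedure.

      # Rank-1 sketch of gene-and-slide-specific bias: model the dye bias as
      # an outer product a_g * b_s and estimate it by SVD (synthetic data).
      import numpy as np

      rng = np.random.default_rng(4)
      genes, slides = 1000, 8
      a = rng.normal(0, 1, genes)          # gene-specific dye bias
      b = rng.uniform(0.5, 2.0, slides)    # per-hybridization bias strength
      # Synthetic log-ratios containing only bias plus noise.
      M = np.outer(a, b) + rng.normal(0, 0.3, (genes, slides))

      U, s, Vt = np.linalg.svd(M, full_matrices=False)
      bias_hat = s[0] * np.outer(U[:, 0], Vt[0])   # leading rank-1 component
      M_corrected = M - bias_hat

      print(f"residual sd before: {M.std():.2f}, after: {M_corrected.std():.2f}")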

  2. Adaptable gene-specific dye bias correction for two-channel DNA microarrays

    PubMed Central

    Margaritis, Thanasis; Lijnzaad, Philip; van Leenen, Dik; Bouwmeester, Diane; Kemmeren, Patrick; van Hooff, Sander R; Holstege, Frank CP

    2009-01-01

    DNA microarray technology is a powerful tool for monitoring gene expression or for finding the location of DNA-bound proteins. DNA microarrays can suffer from gene-specific dye bias (GSDB), causing some probes to be affected more by the dye than by the sample. This results in large measurement errors, which vary considerably for different probes and also across different hybridizations. GSDB is not corrected by conventional normalization and has been difficult to address systematically because of its variance. We show that GSDB is influenced by label incorporation efficiency, explaining the variation of GSDB across different hybridizations. A correction method (Gene- And Slide-Specific Correction, GASSCO) is presented, whereby sequence-specific corrections are modulated by the overall bias of individual hybridizations. GASSCO outperforms earlier methods and works well on a variety of publicly available datasets covering a range of platforms, organisms and applications, including ChIP on chip. A sequence-based model is also presented, which predicts which probes will suffer most from GSDB, useful for microarray probe design and correction of individual hybridizations. Software implementing the method is publicly available. PMID:19401678

  3. BrAD-seq: Breath Adapter Directional sequencing: a streamlined, ultra-simple and fast library preparation protocol for strand specific mRNA library construction.

    PubMed

    Townsley, Brad T; Covington, Michael F; Ichihashi, Yasunori; Zumstein, Kristina; Sinha, Neelima R

    2015-01-01

    Next Generation Sequencing (NGS) is driving rapid advancement in biological understanding, and RNA-sequencing (RNA-seq) has become an indispensable tool for biology and medicine. There is a growing need for access to these technologies, although preparation of NGS libraries remains a bottleneck to wider adoption. Here we report a novel method for the production of strand-specific RNA-seq libraries utilizing the terminal breathing of double-stranded cDNA to capture and incorporate a sequencing adapter. Breath Adapter Directional sequencing (BrAD-seq) reduces sample handling and requires far fewer enzymatic steps than most available methods to produce high-quality strand-specific RNA-seq libraries. The method we present is optimized for 3-prime Digital Gene Expression (DGE) libraries, can easily be extended to full-transcript-coverage shotgun (SHO) type strand-specific libraries, and is modularized to accommodate a diversity of RNA and DNA input materials. BrAD-seq offers a highly streamlined and inexpensive option for RNA-seq libraries.

  4. Education in the workplace for the physician: clinical management states as an organizing framework.

    PubMed

    Greenes, R A

    2000-01-01

    Medical educators are interested in approaches to making selected relevant knowledge available in the context of problem-based care. This is of value both during the process of care and as a means of organizing information for offline self-study. Four trends in health information technology are relevant to achieving the goal and can be expected to play a growing role in the future. First, health care enterprises are developing approaches for access to information resources related to the care of a patient, including clinical data and images but also communication tools, referral and other logistic tools, decision support, and educational materials. Second, information for patients and methods for patient-doctor interaction and decision making are becoming available. Third, computer-based methods for representation of practice guidelines are being developed to support applications that can incorporate their logic. Finally, considering patients as being in particular "clinical management states" (or CMSs) for specific problems, approaches are being developed to use guidelines as a kind of "predictive" framework to enable development of interfaces for problem-based clinical encounters. The guidelines for a CMS can be used to identify the kinds of resources specifically needed for clinical encounters of that type. As the above trends converge to produce problem-specific environments, professional specialty organizations and continuing medical education course designers will need to focus energies on organizing and updating medical knowledge to make it available in CMS-specific contexts.

  5. The sensitivity and specificity of Lassa virus IgM by ELISA as screening tool at early phase of Lassa fever infection

    PubMed Central

    Ibekwe, Titus S.; Nwegbu, Maxwell M.; Asogun, Daniel; Adomeh, Donatus I.; Okokhere, Peter O.

    2012-01-01

    Background: Early diagnosis, prompt treatment, and disease containment are vital measures in the management of Lassa fever (LF), a lethal and contagious arenaviral hemorrhagic disease prevalent in West Africa. The Lassa virus (LAV)-specific reverse transcriptase polymerase chain reaction (RT-PCR) test, the gold standard for diagnosis, is unavailable in most centers. Serologic detection of LAV IgM is a more accessible tool, and this work investigated its adequacy as an early marker of LF. Patients and Methods: A prospective case-control study was conducted from July 2007 to March 2011 in a tertiary referral health center in Nigeria. Test and control blood samples were evaluated for Lassa-specific antigen by RT-PCR (primers S36+ and LVS 339) and for IgM by indirect ELISA (Lassa nucleoprotein (NP) antigen), respectively. The RT-PCR outcome was used as the standard against which the sensitivity and specificity of IgM were tested. Results: Of the 37 cases of LF infection confirmed by RT-PCR, 21 (57%) were IgM positive. Among the 35 confirmed negative cases (control group), eight were IgM positive. The diagnostic sensitivity and specificity of the IgM assay were 57% and 77%, respectively. The negative and positive predictive values of the IgM serological assay were 63% and 72%, respectively, while the efficiency of the test was 67%. Conclusion: The specificity and sensitivity of IgM as a screening tool for early detection of LF appear weak; hence, a reliable LF "rapid screening kit" is needed, since RT-PCR is unavailable in most centers. In the interim, a high clinical index of suspicion, irrespective of IgM status, should prompt urgent referral to confirmatory centers. PMID:23661877
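
    The reported operating characteristics can be reproduced from the raw counts in the abstract (21 of 37 RT-PCR positives and 8 of 35 RT-PCR negatives were IgM positive):

      # Reproducing the reported figures from the counts in the abstract.
      tp, fn = 21, 37 - 21
      fp, tn = 8, 35 - 8
      print(f"sensitivity {tp / (tp + fn):.0%}")              # 57%
      print(f"specificity {tn / (tn + fp):.0%}")              # 77%
      print(f"PPV {tp / (tp + fp):.0%}, NPV {tn / (tn + fn):.0%}")  # 72%, 63%
      print(f"efficiency {(tp + tn) / (tp + tn + fp + fn):.0%}")    # 67%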

  6. A formal MIM specification and tools for the common exchange of MIM diagrams: an XML-Based format, an API, and a validation method

    PubMed Central

    2011-01-01

    Background The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. Results A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. Conclusions The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim. PMID:21586134

  7. URPD: a specific product primer design tool

    PubMed Central

    2012-01-01

    Background Polymerase chain reaction (PCR) plays an important role in molecular biology. Primer design fundamentally determines its results. Here, we present freely available software that is aimed not at analyzing large sequences but at providing a straightforward way of visualizing the primer design process for infrequent users. Findings URPD (yoUR Primer Design), a web-based specific product primer design tool, combines the NCBI Reference Sequences (RefSeq), UCSC In-Silico PCR, and memetic algorithm (MA) and genetic algorithm (GA) primer design methods to obtain specific primer sets. A friendly user interface is accomplished by built-in parameter settings. The incorporated smooth pipeline operations effectively guide both occasional and advanced users. URPD contains an automated process, which produces feasible primer pairs that satisfy the specific needs of the experimental design with practical PCR amplifications. Visual virtual gel electrophoresis and in silico PCR provide a simulated PCR environment. Comparing practical gel electrophoresis with virtual gel electrophoresis facilitates and verifies the PCR experiment. Wet-laboratory validation proved that the system provides feasible primers. Conclusions URPD is a user-friendly tool that provides specific primer design results. The pipeline design path makes it easy to operate for beginners. URPD also provides a high-throughput primer design function. Moreover, the advanced parameter settings assist sophisticated researchers in performing experimental PCR. Several novel functions, such as template sequence input by nucleotide accession number, local and global specificity estimation, primer pair redesign, user-interactive sequence scale selection, and comparison of virtual and practical PCR gel electrophoresis, have been developed and integrated into URPD. The URPD program is implemented in JAVA and freely available at http://bio.kuas.edu.tw/urpd/. PMID:22713312
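
    At its core, a specificity estimate asks how many exact binding sites a candidate primer (and its reverse complement) has across the relevant templates. The toy sketch below shows only that core check; URPD's actual MA/GA search and In-Silico PCR integration are far more involved, and the sequences here are fabricated.

      # Toy specificity check: count exact binding sites of a primer and its
      # reverse complement in a set of templates (not URPD's MA/GA algorithm).
      def revcomp(seq):
          return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

      def count_sites(primer, template):
          hits, start = 0, template.find(primer)
          while start != -1:
              hits += 1
              start = template.find(primer, start + 1)
          return hits

      templates = {"target": "ATGCGTACGTTAGCATCGATCGGCTAAGCTT",
                   "offtarget": "TTGCAATCGGCTAAGGCGTACCATGGATCC"}
      fwd = "CGTACGTTAG"
      for name, t in templates.items():
          n = count_sites(fwd, t) + count_sites(revcomp(fwd), t)
          print(name, "binding sites:", n)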

  8. A formal MIM specification and tools for the common exchange of MIM diagrams: an XML-Based format, an API, and a validation method.

    PubMed

    Luna, Augustin; Karac, Evrim I; Sunshine, Margot; Chang, Lucas; Nussinov, Ruth; Aladjem, Mirit I; Kohn, Kurt W

    2011-05-17

    The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim.

  9. URPD: a specific product primer design tool.

    PubMed

    Chuang, Li-Yeh; Cheng, Yu-Huei; Yang, Cheng-Hong

    2012-06-19

    Polymerase chain reaction (PCR) plays an important role in molecular biology. Primer design fundamentally determines its results. Here, we present freely available software that is aimed not at analyzing large sequences but at providing a straightforward way of visualizing the primer design process for infrequent users. URPD (yoUR Primer Design), a web-based specific product primer design tool, combines the NCBI Reference Sequences (RefSeq), UCSC In-Silico PCR, and memetic algorithm (MA) and genetic algorithm (GA) primer design methods to obtain specific primer sets. A friendly user interface is accomplished by built-in parameter settings. The incorporated smooth pipeline operations effectively guide both occasional and advanced users. URPD contains an automated process, which produces feasible primer pairs that satisfy the specific needs of the experimental design with practical PCR amplifications. Visual virtual gel electrophoresis and in silico PCR provide a simulated PCR environment. Comparing practical gel electrophoresis with virtual gel electrophoresis facilitates and verifies the PCR experiment. Wet-laboratory validation proved that the system provides feasible primers. URPD is a user-friendly tool that provides specific primer design results. The pipeline design path makes it easy to operate for beginners. URPD also provides a high-throughput primer design function. Moreover, the advanced parameter settings assist sophisticated researchers in performing experimental PCR. Several novel functions, such as template sequence input by nucleotide accession number, local and global specificity estimation, primer pair redesign, user-interactive sequence scale selection, and comparison of virtual and practical PCR gel electrophoresis, have been developed and integrated into URPD. The URPD program is implemented in JAVA and freely available at http://bio.kuas.edu.tw/urpd/.

  10. Development and Early Piloting of a CanMEDS Competency-Based Feedback Tool for Surgical Grand Rounds.

    PubMed

    Fahim, Christine; Bhandari, Mohit; Yang, Ilun; Sonnadara, Ranil

    2016-01-01

    Grand rounds offer an excellent opportunity for the evaluation of medical expertise and other competencies, such as communication and professionalism. The purpose of this study was to develop a tool that would facilitate the provision of formative feedback on grand rounds to improve learning. The resulting CanMEDS-based evaluation tool was piloted in an academic surgical department. This study employed a 3-phase, qualitatively focused, embedded mixed-methods approach. In Phase 1, an intrinsic case study was conducted to identify preliminary themes. These findings were crystallized using a quantitative survey. Following interpretation of these data, a grand rounds evaluation tool was developed in Phase 2. The tool was piloted in the Phase 3 focus group. The study was conducted at an academic surgical center among members of the Department of Surgery, McMaster University, Ontario, Canada. Purposive sampling was used: n = 7 individuals participated in the Phase 1 interviews, and n = 24 participants completed the Phase 1 survey. Participants included a representative sample of medical students, residents, fellows, and staff. The tool was piloted among n = 19 participants. The proposed evaluation tool contains 13 Likert-scale questions and 2 open-ended questions. The tool outlines specific questions to assess grand rounds presenters within the structure of the 7 CanMEDS competency domains. "Evaluation fatigue" was identified as a major barrier to the willingness to provide effective feedback. Further, a number of factors regarding the preferred content, structure, and format of surgical grand rounds were identified. This pilot study presents a CanMEDS-specific evaluation tool that can be applied to surgical grand rounds. With the increasing adoption of competency-based medical education, comprehensive evaluation of surgical activities is required. This form provides a template for the development of competency-based evaluation tools for medical and surgical learning activities. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  11. Using Collaborative Simulation Modeling to Develop a Web-Based Tool to Support Policy-Level Decision Making About Breast Cancer Screening Initiation Age

    PubMed Central

    Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.

    2017-01-01

    Background There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135

  12. Application of Pulsed-Field Gel Electrophoresis and Binary Typing as Tools in Veterinary Clinical Microbiology and Molecular Epidemiologic Analysis of Bovine and Human Staphylococcus aureus Isolates

    PubMed Central

    Zadoks, Ruth; van Leeuwen, Willem; Barkema, Herman; Sampimon, Otlis; Verbrugh, Henri; Schukken, Ynte Hein; van Belkum, Alex

    2000-01-01

    Thirty-eight bovine mammary Staphylococcus aureus isolates from diverse clinical, temporal, and geographical origins were genotyped by pulsed-field gel electrophoresis (PFGE) after SmaI digestion of prokaryotic DNA and by means of binary typing using 15 strain-specific DNA probes. Seven pulsed-field types and four subtypes were identified, as were 16 binary types. Concordant delineation of genetic relatedness was documented by both techniques, yet based on practical and epidemiological considerations, binary typing was the preferable method. Genotypes of bovine isolates were compared to 55 previously characterized human S. aureus isolates through cluster analysis of binary types. Genetic clusters containing strains of both human and bovine origin were found, but bacterial genotypes were predominantly associated with a single host species. Binary typing proved an excellent tool for comparison of S. aureus strains, including methicillin-resistant S. aureus, derived from different host species and from different databases. For 28 bovine S. aureus isolates, detailed clinical observations in vivo were compared to strain typing results in vitro. Associations were found between distinct genotypes and severity of disease, suggesting strain-specific bacterial virulence. Circumstantial evidence furthermore supports strain-specific routes of bacterial dissemination. We conclude that PFGE and binary typing can be successfully applied for genetic analysis of S. aureus isolates from bovine mammary secretions. Binary typing in particular is a robust and simple method and promises to become a powerful tool for strain characterization, for resolution of clonal relationships of bacteria within and between host species, and for identification of sources and transmission routes of bovine S. aureus. PMID:10790124
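
    Cluster analysis of binary types of this kind is naturally expressed as hierarchical clustering on Hamming distances between 15-bit probe profiles. The sketch below does this with SciPy on synthetic types built around two host-associated base profiles; the data are fabricated for illustration and are not the study's isolates.

      # Cluster analysis of 15-probe binary types: Hamming distance plus
      # average-linkage hierarchical clustering (synthetic types).
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(5)
      base_bovine = rng.integers(0, 2, 15)
      base_human = rng.integers(0, 2, 15)
      # Synthetic isolates: noisy copies of two host-associated base types.
      types = np.array([np.where(rng.random(15) < 0.1, 1 - b, b)
                        for b in [base_bovine] * 10 + [base_human] * 10])

      Z = linkage(pdist(types, metric="hamming"), method="average")
      clusters = fcluster(Z, t=2, criterion="maxclust")
      print("cluster assignments:", clusters)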

  13. BepiPred-2.0: improving sequence-based B-cell epitope prediction using conformational epitopes

    PubMed Central

    Jespersen, Martin Closter; Peters, Bjoern

    2017-01-01

    Antibodies have become an indispensable tool for many biotechnological and clinical applications. They bind their molecular target (antigen) by recognizing a portion of its structure (epitope) in a highly specific manner. The ability to predict epitopes from antigen sequences alone is a complex task. Despite substantial effort, limited advancement has been achieved over the last decade in the accuracy of epitope prediction methods, especially for those that rely on the sequence of the antigen only. Here, we present BepiPred-2.0 (http://www.cbs.dtu.dk/services/BepiPred/), a web server for predicting B-cell epitopes from antigen sequences. BepiPred-2.0 is based on a random forest algorithm trained on epitopes annotated from antibody-antigen protein structures. This new method was found to outperform other available tools for sequence-based epitope prediction both on epitope data derived from solved 3D structures, and on a large collection of linear epitopes downloaded from the IEDB database. The method displays results in a user-friendly and informative way, both for computer-savvy and non-expert users. We believe that BepiPred-2.0 will be a valuable tool for the bioinformatics and immunology community. PMID:28472356
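
    To convey the flavor of such a predictor, the sketch below trains a random forest on simple per-residue descriptors with synthetic labels. The descriptors, labels, and data are fabricated for illustration; BepiPred-2.0's actual features and training set are derived from antibody-antigen structures.

      # Random-forest sketch over per-residue descriptors (synthetic data;
      # not BepiPred-2.0's features or training set).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(6)
      n = 2000
      hydrophilicity = rng.normal(0, 1, n)   # stand-in per-residue descriptors
      flexibility = rng.normal(0, 1, n)
      surface_access = rng.normal(0, 1, n)
      X = np.column_stack([hydrophilicity, flexibility, surface_access])
      # Synthetic epitope labels loosely tied to the descriptors.
      y = (hydrophilicity + surface_access + rng.normal(0, 1, n) > 1).astype(int)

      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
      print(f"held-out accuracy ~ {clf.score(Xte, yte):.2f}")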

  14. SMARTE: SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (BELFAST, IRELAND)

    EPA Science Inventory

    The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...

  15. Decision support methods for the environmental assessment of contamination at mining sites.

    PubMed

    Jordan, Gyozo; Abdaal, Ahmed

    2013-09-01

    Polluting mine accidents and widespread environmental contamination associated with historic mining in Europe and elsewhere have triggered improvements in the related environmental legislation and in the environmental assessment and management methods for the mining industry. Mining has some unique features, such as natural background pollution associated with natural mineral deposits, industrial activities and contamination located in the three-dimensional sub-surface space, the problem of long-term remediation after mine closure, and the problem of secondary contaminated areas around mine sites and abandoned mines in historic regions like Europe. These mining-specific problems require special tools to address the complexity of the environmental problems of mining-related contamination. The objective of this paper is to review and evaluate some of the decision support methods that have been developed and applied to mining contamination. Only those methods that are both efficient decision support tools and provide a 'holistic' approach to the complex problem are considered. These tools are (1) landscape ecology, (2) industrial ecology, (3) landscape geochemistry, (4) geo-environmental models, (5) environmental impact assessment, (6) environmental risk assessment, (7) material flow analysis and (8) life cycle assessment. This unique interdisciplinary study should enable both the researcher and the practitioner to obtain a broad view of the state of the art of decision support methods for the environmental assessment of contamination at mine sites. Documented examples and abundant references are also provided.

  16. Detection methods and performance criteria for genetically modified organisms.

    PubMed

    Bertheau, Yves; Diolez, Annick; Kobilinsky, André; Magin, Kimberly

    2002-01-01

Detection methods for genetically modified organisms (GMOs) are necessary for many applications, from seed purity assessment to compliance with food labeling requirements in several countries. Numerous analytical methods are currently used or under development to support these needs. The currently used methods are bioassays and protein- and DNA-based detection protocols. To avoid discrepancies in results between such largely different methods and, for instance, the potential resulting legal actions, compatibility of the methods is urgently needed. Performance criteria allow methods to be evaluated against a common standard. The more common performance criteria for detection methods are precision, accuracy, sensitivity, and specificity, which together address other terms used to describe the performance of a method, such as applicability, selectivity, calibration, trueness, recovery, operating range, limit of quantitation, limit of detection, and ruggedness. Performance criteria should provide objective tools to accept or reject specific methods, to validate them, and to ensure compatibility between validated methods, and they should be used on a routine basis to reject data outside an acceptable range of variability. When selecting a method of detection, it is also important to consider its applicability, its field of applications, and its limitations, including factors such as its ability to detect the target analyte in a given matrix, the duration of the analyses, its cost effectiveness, and the sample sizes necessary for testing. Thus, current GMO detection methods should be evaluated against a common set of performance criteria.
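
    A minimal sketch of how two of the headline criteria are computed in practice, assuming a binary (detect/no-detect) assay and replicate quantitative measurements; all counts and values below are invented for illustration:

    ```python
    import statistics

    # Qualitative performance: sensitivity and specificity from toy counts
    # of a binary GMO detection assay (illustrative values only).
    tp, fp, fn, tn = 45, 3, 5, 47
    sensitivity = tp / (tp + fn)   # fraction of true GMO samples detected
    specificity = tn / (tn + fp)   # fraction of non-GMO samples cleared

    # Quantitative performance: precision as relative standard deviation
    # of replicate measurements of one sample (here, % GMO content).
    replicates = [0.92, 0.88, 0.95, 0.90, 0.91]
    precision_rsd = statistics.stdev(replicates) / statistics.mean(replicates)

    print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
          f"precision (RSD)={precision_rsd:.1%}")
    ```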

  17. Assessing and Comparing Physical Environments for Nursing Home Residents: Using New Tools for Greater Research Specificity

    ERIC Educational Resources Information Center

    Cutler, Lois J.; Kane, Rosalie A.; Degenholtz, Howard B.; Miller, Michael J.; Grant, Leslie

    2006-01-01

    Purpose: We developed and tested theoretically derived procedures to observe physical environments experienced by nursing home residents at three nested levels: their rooms, the nursing unit, and the overall facility. Illustrating with selected descriptive results, in this article we discuss the development of the approach. Design and Methods: On…

  18. Learning Theories Applied to Teaching Technology: Constructivism versus Behavioral Theory for Instructing Multimedia Software Programs

    ERIC Educational Resources Information Center

    Reed, Cajah S.

    2012-01-01

    This study sought to find evidence for a beneficial learning theory to teach computer software programs. Additionally, software was analyzed for each learning theory's applicability to resolve whether certain software requires a specific method of education. The results are meant to give educators more effective teaching tools, so students…

  19. Data Analysis Tools and Methods for Improving the Interaction Design in E-Learning

    ERIC Educational Resources Information Center

    Popescu, Paul Stefan

    2015-01-01

    In this digital era, learning from data gathered from different software systems may have a great impact on the quality of the interaction experience. There are two main directions that come to enhance this emerging research domain, Intelligent Data Analysis (IDA) and Human Computer Interaction (HCI). HCI specific research methodologies can be…

  20. Using Communities of Practice as a Tool to Analyse Developing Identity in Online Discussion

    ERIC Educational Resources Information Center

    Pratt, Nick; Back, Jenni

    2013-01-01

    In this article, we address the methodological implications of analysing online discussion boards with a focus on participants' changing identities. More specifically, we propose the use of a Communities of Practice framework as a heuristic method for considering how participants' contributions to online discussion play a role in changing who they…

  1. Trypanosomatidae: Phytomonas detection in plants and phytophagous insects by PCR amplification of a genus-specific sequence of the spliced leader gene.

    PubMed

    Serrano, M G; Nunes, L R; Campaner, M; Buck, G A; Camargo, E P; Teixeira, M M

    1999-03-01

In this paper we describe a method for the detection of Phytomonas spp. in plants and phytophagous insects using the PCR technique, targeting a genus-specific sequence of the spliced leader (SL) gene. PCR amplification of DNA from 48 plant and insect isolates previously classified as Phytomonas by morphological, biochemical, and molecular criteria in all cases yielded a 100-bp fragment that hybridized with the Phytomonas-specific spliced leader-derived probe SL3'. Moreover, this Phytomonas-specific PCR could also detect Phytomonas spp. in crude preparations of naturally infected plants and insects. The method shows no reaction with any other trypanosomatid genera or with plant and insect host DNA, and it detects Phytomonas spp. in fruit, latex, or phloem of various host plants as well as in salivary glands and digestive tubes of several species of insect hosts. The results demonstrate that SL-PCR is a simple, fast, specific, and sensitive method that can be applied to the diagnosis of Phytomonas among cultured trypanosomatids and directly in plants and putative vector insects. The method is therefore a very specific and sensitive tool for the diagnosis of Phytomonas without the need for isolation, culture, and DNA extraction of flagellates, a feature that is very convenient for practical and epidemiological purposes. Copyright 1999 Academic Press.

  2. Eliciting women's cervical screening preferences: a mixed methods systematic review protocol.

    PubMed

    Wood, Brianne; Van Katwyk, Susan Rogers; El-Khatib, Ziad; McFaul, Susan; Taljaard, Monica; Wright, Erica; Graham, Ian D; Little, Julian

    2016-08-11

With the accumulation of evidence regarding potential harms of cancer screening in recent years, researchers, policy-makers, and the public are becoming more critical of population-based cancer screening. Consequently, a high-quality cancer screening program should consider individuals' values and preferences when determining recommendations. In cervical cancer screening, offering women autonomy is considered a "person-centered" approach to health care services; however, it may affect the effectiveness of the program should women choose not to participate. As part of a larger project to investigate women's cervical screening preferences and correlates of these preferences, this systematic review will capture quantitative and qualitative investigations of women's cervical screening preferences and the methods used to elicit them. This mixed methods synthesis will use a thematic analysis approach to synthesize qualitative, quantitative, and mixed methods evidence. This protocol describes the methods that will be used in this investigation. A search strategy has been developed with a health librarian and peer reviewed using PRESS. Based on this strategy, five databases and the gray literature will be searched for studies that meet the inclusion criteria. The quality of the included individual studies will be examined using the Mixed Methods Appraisal Tool. Three reviewers will extract data from the primary studies on the tools or instruments used to elicit women's preferences regarding cervical cancer screening, theoretical frameworks used, outcomes measured, the outstanding themes from quantitative and qualitative evidence, and the identified preferences for cervical cancer screening. We will describe the relationships between study results and the study population, "intervention" (e.g., tool or instrument), and context. We will follow the PRISMA reporting guideline. We will compare findings across studies and between study methods (e.g., qualitative versus quantitative study designs). The strength of the synthesized findings will be assessed using the validated GRADE and CERQual tools. This review will inform the development of a tool to elicit women's cervical screening preferences. Understanding the methods used to elicit women's preferences and what is known about women's cervical screening preferences will be useful for guideline developers who wish to incorporate a woman-centered approach specifically for cervical screening guidelines. PROSPERO CRD42016035737.

  3. Academic health sciences library Website navigation: an analysis of forty-one Websites and their navigation tools

    PubMed Central

    Brower, Stewart M.

    2004-01-01

    Background: The analysis included forty-one academic health sciences library (HSL) Websites as captured in the first two weeks of January 2001. Home pages and persistent navigational tools (PNTs) were analyzed for layout, technology, and links, and other general site metrics were taken. Methods: Websites were selected based on rank in the National Network of Libraries of Medicine, with regional and resource libraries given preference on the basis that these libraries are recognized as leaders in their regions and would be the most reasonable source of standards for best practice. A three-page evaluation tool was developed based on previous similar studies. All forty-one sites were evaluated in four specific areas: library general information, Website aids and tools, library services, and electronic resources. Metrics taken for electronic resources included orientation of bibliographic databases alphabetically by title or by subject area and with links to specifically named databases. Results: Based on the results, a formula for determining obligatory links was developed, listing items that should appear on all academic HSL Web home pages and PNTs. Conclusions: These obligatory links demonstrate a series of best practices that may be followed in the design and construction of academic HSL Websites. PMID:15494756

  4. Clinical application of the basic definition of malnutrition proposed by the European Society for Clinical Nutrition and Metabolism (ESPEN): Comparison with classical tools in geriatric care.

    PubMed

    Sánchez-Rodríguez, Dolores; Annweiler, Cédric; Ronquillo-Moreno, Natalia; Tortosa-Rodríguez, Andrea; Guillén-Solà, Anna; Vázquez-Ibar, Olga; Escalada, Ferran; Muniesa, Josep M; Marco, Ester

Malnutrition is a prevalent condition related to adverse outcomes in older people. Our aim was to compare the diagnostic capacity of the malnutrition criteria of the European Society for Clinical Nutrition and Metabolism (ESPEN) with other classical diagnostic tools. Cohort study of 102 consecutive in-patients ≥70 years admitted for postacute rehabilitation. Patients were considered malnourished if their Mini-Nutritional Assessment-Short Form (MNA-SF) score was ≤11 and serum albumin <3 g/dL, or if MNA-SF ≤11, serum albumin <3 g/dL, and the usual clinical signs and symptoms of malnutrition were present. Sensitivity, specificity, positive and negative predictive values, accuracy, likelihood ratios, and kappa values were calculated for both methods and compared with the ESPEN consensus. Of 102 eligible in-patients, 88 fulfilled the inclusion criteria and were identified as "at risk" by MNA-SF. A malnutrition diagnosis was confirmed in 11.6% and 10.5% of the patients using the classical methods, whereas 19.3% were malnourished according to the ESPEN criteria. Combined with low albumin levels, the diagnosis showed 57.9% sensitivity, 64.5% specificity, an 85.9% negative predictive value, 0.63 accuracy (fair validity, low range), and a kappa index of 0.163 (poor agreement with ESPEN). The combination of MNA-SF, low albumin, and clinical malnutrition showed 52.6% sensitivity, 88.3% specificity, an 88.3% negative predictive value, 0.82 accuracy (fair validity, low range), and a kappa index of 0.43 (fair agreement with ESPEN). Malnutrition was almost twice as prevalent when diagnosed by the ESPEN consensus as when diagnosed by the classical assessment methods. The classical methods showed fair validity and poor agreement with the ESPEN consensus in assessing malnutrition in geriatric postacute care. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Machine learning plus optical flow: a simple and sensitive method to detect cardioactive drugs

    NASA Astrophysics Data System (ADS)

    Lee, Eugene K.; Kurokawa, Yosuke K.; Tu, Robin; George, Steven C.; Khine, Michelle

    2015-07-01

Current preclinical screening methods do not adequately detect cardiotoxicity. Using human induced pluripotent stem cell-derived cardiomyocytes (iPS-CMs), more physiologically relevant preclinical or patient-specific screening to detect potential cardiotoxic effects of drug candidates may be possible. However, one of the persistent challenges in developing a high-throughput drug screening platform using iPS-CMs is the need for a simple and reliable method to measure key electrophysiological and contractile parameters. To address this need, we have developed a platform that combines machine learning with brightfield optical flow as a simple and robust tool that can automate the detection of cardiomyocyte drug effects. Using three cardioactive drugs of different mechanisms, including those with primarily electrophysiological effects, we demonstrate the general applicability of this screening method to detect subtle changes in cardiomyocyte contraction. Requiring only brightfield images of cardiomyocyte contractions, we detect changes in cardiomyocyte contraction comparable to, and even superior to, fluorescence readouts. This automated method serves as a widely applicable screening tool to characterize the effects of drugs on cardiomyocyte function.
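
    As a rough sketch of the optical-flow half of such a pipeline (not the authors' code), the following computes a per-frame motion signal from a brightfield video with OpenCV's dense Farneback flow; the file name and parameters are placeholders:

    ```python
    # Sketch of quantifying cardiomyocyte contraction from brightfield video
    # with dense optical flow. "beating.avi" is a placeholder file name and
    # the Farneback parameters are illustrative, not the paper's settings.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("beating.avi")
    ok, prev = cap.read()
    if not ok:
        raise SystemExit("could not read video")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    motion = []  # mean flow magnitude per frame ~ contraction signal
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        motion.append(np.linalg.norm(flow, axis=2).mean())
        prev_gray = gray

    cap.release()
    # Peaks in `motion` mark contraction events; drug effects can be read
    # off as changes in beat rate, amplitude, or waveform shape.
    print(motion[:10])
    ```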

  6. Hybrid texture generator

    NASA Astrophysics Data System (ADS)

    Miyata, Kazunori; Nakajima, Masayuki

    1995-04-01

    A method is given for synthesizing a texture by using the interface of a conventional drawing tool. The majority of conventional texture generation methods are based on the procedural approach, and can generate a variety of textures that are adequate for generating a realistic image. But it is hard for a user to imagine what kind of texture will be generated simply by looking at its parameters. Furthermore, it is difficult to design a new texture freely without a knowledge of all the procedures for texture generation. Our method offers a solution to these problems, and has the following four merits: First, a variety of textures can be obtained by combining a set of feature lines and attribute functions. Second, data definitions are flexible. Third, the user can preview a texture together with its feature lines. Fourth, people can design their own textures interactively and freely by using the interface of a conventional drawing tool. For users who want to build this texture generation method into their own programs, we also give the language specifications for generating a texture. This method can interactively provide a variety of textures, and can also be used for typographic design.
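
    To make the feature-line-plus-attribute-function idea concrete, here is a hypothetical minimal sketch: each pixel's intensity is an attribute function of its distance to the nearest feature line. The line set and the exponential falloff are invented for illustration, not the paper's definitions:

    ```python
    # Toy texture synthesis: intensity = attribute function of the distance
    # from each pixel to the nearest feature line (segment). The feature
    # lines and the exponential falloff below are illustrative choices.
    import numpy as np

    def dist_to_segment(px, py, a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        ab = b - a
        t = np.clip(((px - a[0]) * ab[0] + (py - a[1]) * ab[1]) / (ab @ ab),
                    0.0, 1.0)
        return np.hypot(px - (a[0] + t * ab[0]), py - (a[1] + t * ab[1]))

    size = 256
    ys, xs = np.mgrid[0:size, 0:size]
    features = [((30, 40), (220, 60)), ((60, 200), (200, 180))]

    d = np.min([dist_to_segment(xs, ys, a, b) for a, b in features], axis=0)
    texture = np.exp(-d / 20.0)  # attribute function: bright near lines
    print(texture.shape, texture.min(), texture.max())
    ```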

  7. Thermal modelling of cooling tool cutting when milling by electrical analogy

    NASA Astrophysics Data System (ADS)

    Benabid, F.; Arrouf, M.; Assas, M.; Benmoussa, H.

    2010-06-01

Temperature measurements by some devices are made immediately after shut-down and may be corrected for the temperature drop that occurs in the interval between shut-down and measurement. This paper presents a new procedure for thermal modelling of the cutting tool just after machining, when the tool has left the chip, in order to extrapolate the cutting temperature from the temperature measured when the tool is at a standstill. A fin approximation is used to account for the enhanced heat loss (by conduction and convection) to the air stream. In the modelling we introduce an equivalent thermal network to estimate the cutting temperature as a function of specific energy. In addition, a locally modified lumped-element conduction equation, with initial and boundary conditions, is used to predict the temperature evolution with time while the tool is cooling. These predictions provide a detailed view of the global heat transfer coefficient as a function of cutting speed, because the heat loss from the tool in an air stream is an order of magnitude larger than in a still environment. Finally, we deduce the cutting temperature by an inverse method.
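
    A minimal sketch of the final inverse step under a plain lumped-capacitance assumption (exponential cooling toward ambient); the time constant and readings are invented, and the paper's equivalent-network model is more detailed than this:

    ```python
    # Back-extrapolate the temperature at shut-down (t = 0) from a reading
    # taken at t = t_meas, assuming lumped-capacitance exponential cooling:
    #   T(t) = T_amb + (T0 - T_amb) * exp(-t / tau),  tau = rho*V*c / (h*A).
    # All numbers below are illustrative, not measured values.
    import math

    T_amb = 25.0      # ambient temperature [C]
    tau = 8.0         # cooling time constant [s] (assumed)
    t_meas = 3.0      # delay between shut-down and measurement [s]
    T_meas = 310.0    # temperature read at t_meas [C]

    T0 = T_amb + (T_meas - T_amb) * math.exp(t_meas / tau)
    print(f"estimated cutting temperature at shut-down: {T0:.0f} C")
    ```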

  8. Application of Photoshop and Scion Image analysis to quantification of signals in histochemistry, immunocytochemistry and hybridocytochemistry.

    PubMed

    Tolivia, Jorge; Navarro, Ana; del Valle, Eva; Perez, Cristina; Ordoñez, Cristina; Martínez, Eva

    2006-02-01

To describe a simple method to achieve the differential selection and subsequent quantification of the signal strength using only one section. Several methods for performing quantitative histochemistry, immunocytochemistry or hybridocytochemistry without specific commercial image analysis systems rely on pixel-counting algorithms, which do not provide information on the amount of chromogen present in the section. Other techniques use complex algorithms to calculate the cumulative signal strength using two consecutive sections. To separate the chromogen signal we used the "Color range" option of the Adobe Photoshop program, which produces a specific file for a particular chromogen selection that can be applied to similar sections. The measurement of the chromogen signal strength of the specific staining is then performed with the Scion Image software program. The method described in this paper can also be applied to the simultaneous detection of different signals on the same section, or to other parameters (area of particles, number of particles, etc.) when the "Analyze particles" tool of the Scion program is used.
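
    A rough numerical analogue of that workflow (not the authors' Photoshop/Scion steps): select pixels within a tolerance of a reference chromogen color, then integrate the signal over the selection. The reference color and tolerance are assumptions:

    ```python
    # Sketch of chromogen quantification analogous to a "Color range"
    # selection followed by signal measurement: keep pixels near a
    # reference chromogen color, then integrate staining intensity.
    import numpy as np

    rng = np.random.default_rng(1)
    image = rng.integers(0, 256, size=(512, 512, 3)).astype(float)  # stand-in

    chromogen_rgb = np.array([120.0, 60.0, 40.0])  # assumed DAB-like brown
    tolerance = 60.0                                # max color distance kept

    dist = np.linalg.norm(image - chromogen_rgb, axis=2)
    mask = dist < tolerance

    stained_area = mask.mean()                       # fraction stained
    signal = (255 - image.mean(axis=2))[mask].sum()  # integrated intensity
    print(f"stained area: {stained_area:.1%}, integrated signal: {signal:.0f}")
    ```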

  9. Immunofluorescence Analysis of Endogenous and Exogenous Centromere-kinetochore Proteins

    PubMed Central

    Niikura, Yohei; Kitagawa, Katsumi

    2016-01-01

    "Centromeres" and "kinetochores" refer to the site where chromosomes associate with the spindle during cell division. Direct visualization of centromere-kinetochore proteins during the cell cycle remains a fundamental tool in investigating the mechanism(s) of these proteins. Advanced imaging methods in fluorescence microscopy provide remarkable resolution of centromere-kinetochore components and allow direct observation of specific molecular components of the centromeres and kinetochores. In addition, methods of indirect immunofluorescent (IIF) staining using specific antibodies are crucial to these observations. However, despite numerous reports about IIF protocols, few discussed in detail problems of specific centromere-kinetochore proteins.1-4 Here we report optimized protocols to stain endogenous centromere-kinetochore proteins in human cells by using paraformaldehyde fixation and IIF staining. Furthermore, we report protocols to detect Flag-tagged exogenous CENP-A proteins in human cells subjected to acetone or methanol fixation. These methods are useful in detecting and quantifying endogenous centromere-kinetochore proteins and Flag-tagged CENP-A proteins, including those in human cells. PMID:26967065

  10. Using the case study teaching method to promote college students' critical thinking skills

    NASA Astrophysics Data System (ADS)

    Terry, David Richard

    2007-12-01

    The purpose of this study was to examine general and domain-specific critical thinking skills in college students, particularly ways in which these skills might be increased through the use of the case study method of teaching. General critical thinking skills were measured using the Watson-Glaser Critical Thinking Appraisal (WGCTA) Short Form, a forty-item paper-and-pencil test designed to measure important abilities involved in critical thinking, including inference, recognition of assumptions, deduction, interpretation, and evaluation of arguments. The ability to identify claims and support those claims with evidence is also an important aspect of critical thinking. I developed a new instrument, the Claim and Evidence Assessment Tool (CEAT), to measure these skills in a domain-specific manner. Forty undergraduate students in a general science course for non-science majors at a small two-year college in the northeastern United States experienced positive changes in general critical thinking according to results obtained using the Watson-Glaser Critical Thinking Appraisal (WGCTA). In addition, the students showed cumulative improvement in their ability to identify claims and evidence, as measured by the Claim and Evidence Assessment Tool (CEAT). Mean score on the WGCTA improved from 22.15 +/- 4.59 to 23.48 +/- 4.24 (out of 40), and the mean CEAT score increased from 14.98 +/- 3.28 to 16.20 +/- 3.08 (out of 24). These increases were modest but statistically and educationally significant. No differences in claim and evidence identification were found between students who learned about specific biology topics using the case study method of instruction and those who were engaged in more traditional instruction, and the students' ability to identify claims and evidence and their factual knowledge showed little if any correlation. The results of this research were inconclusive regarding whether or not the case study teaching method promotes college students' general or domain-specific critical thinking skills, and future research addressing this issue should probably utilize larger sample sizes and a pretest-posttest randomized experimental design.

  11. Chromatin analyses of Zymoseptoria tritici: Methods for chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq).

    PubMed

    Soyer, Jessica L; Möller, Mareike; Schotanus, Klaas; Connolly, Lanelle R; Galazka, Jonathan M; Freitag, Michael; Stukenbrock, Eva H

    2015-06-01

The presence or absence of specific transcription factors, chromatin remodeling machineries, chromatin modification enzymes, post-translational histone modifications and histone variants all play crucial roles in the regulation of pathogenicity genes. Chromatin immunoprecipitation (ChIP) followed by high-throughput sequencing (ChIP-seq) provides an important tool for studying genome-wide protein-DNA interactions to help understand gene regulation in the context of native chromatin. ChIP-seq is a convenient in vivo technique to identify, map and characterize the occupancy of specific DNA fragments by proteins against which specific antibodies exist or which can be epitope-tagged in vivo. We optimized existing ChIP protocols for use in the wheat pathogen Zymoseptoria tritici and closely related sister species. Here, we provide a detailed method, underscoring which aspects of the technique are organism-specific. Library preparation for Illumina sequencing is described, as this is currently the most widely used ChIP-seq method. One approach for the analysis and visualization of representative sequence data is described; improved tools for these analyses are constantly being developed. Using ChIP-seq with antibodies against H3K4me2, which is considered a mark for euchromatin, or against H3K9me3 and H3K27me3, which are considered marks for heterochromatin, the overall distribution of euchromatin and heterochromatin in the genome of Z. tritici can be determined. Our ChIP-seq protocol was also successfully applied to Z. tritici strains with high levels of melanization or aberrant colony morphology, and to different species of the genus (Z. ardabiliae and Z. pseudotritici), suggesting that our technique is robust. The methods described here provide a powerful framework for studying new aspects of chromatin biology and gene regulation in this prominent wheat pathogen. Copyright © 2015 Elsevier Inc. All rights reserved.
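
    A toy sketch of the downstream enrichment analysis such a protocol feeds into (not part of the protocol itself): the log2 ratio of ChIP to input coverage in fixed genomic bins, with simulated counts standing in for mapped reads:

    ```python
    # Basic ChIP-seq enrichment sketch: log2 ratio of ChIP to input read
    # coverage per genomic bin. Coverage arrays are simulated stand-ins.
    import numpy as np

    rng = np.random.default_rng(2)
    n_bins = 1000                        # e.g. 10-kb bins along a chromosome
    input_cov = rng.poisson(50, n_bins)  # input (control) coverage
    chip_cov = rng.poisson(50, n_bins)
    chip_cov[400:420] += 150             # a simulated enriched domain

    pseudo = 1.0                         # pseudocount to avoid division by zero
    log2_enrich = np.log2((chip_cov + pseudo) / (input_cov + pseudo))

    # Call "heterochromatin-like" bins as those above an arbitrary cutoff.
    enriched_bins = np.where(log2_enrich > 1.0)[0]
    print(enriched_bins[:10])
    ```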

  12. Geometrical aspects of patient-specific modelling of the intervertebral disc: collagen fibre orientation and residual stress distribution.

    PubMed

    Marini, Giacomo; Studer, Harald; Huber, Gerd; Püschel, Klaus; Ferguson, Stephen J

    2016-06-01

Patient-specific modelling of the spine is a powerful tool for exploring the prevention and treatment of injuries and pathologies. Although several methods have been proposed for the discretization of the bony structures, the efficient representation of the intervertebral disc's anisotropy remains a challenge, especially with complex geometries. Furthermore, the swelling of the disc's nucleus pulposus is normally added to the model after geometry definition, at the cost of changes to the material properties and an unrealistic description of the prestressed state. The aim of this study was to develop techniques that preserve the patient-specific geometry of the disc and allow the representation of the system's anisotropy and residual stresses, independent of the system discretization. Depending on the modelling features, the developed approaches produced a response of the patient-specific models that was in good agreement with the physiological response observed in corresponding experiments. The proposed methods represent a first step towards the development of patient-specific models of the disc that respect both the geometry and the mechanical properties of the specific disc.

  13. The development of a supportive care needs assessment tool for Indigenous people with cancer

    PubMed Central

    2012-01-01

Background Little is known about the supportive care needs of Indigenous people with cancer and, to date, existing needs assessment tools have not considered cultural issues for this population. We aimed to adapt an existing supportive care needs assessment tool for use with Indigenous Australians with cancer. Methods Face-to-face interviews with Indigenous cancer patients (n = 29) and five focus groups with Indigenous key-informants (n = 23) were conducted to assess the face and content validity, cultural acceptability, utility and relevance of the Supportive Care Needs Survey - Short Form 34 (SCNS-SF34) for use with Indigenous patients with cancer. Results All items from the SCNS-SF34 were shortened and changed to use more appropriate language (e.g. the word 'anxiety' was substituted with 'worry'). Seven questions were omitted (e.g. items on death and future considerations) as they were deemed culturally inappropriate or irrelevant, and 12 items were added (e.g. accessible transport). Optional instructions were added before the sexual items. The design and response format of the SCNS-SF34 were modified to make the tool easier for Indigenous cancer patients to use. Given the extensive modifications to the SCNS-SF34 and the likelihood of a different factor structure, we consider it a new tool rather than a modification. The Supportive Care Needs Assessment Tool for Indigenous People (SCNAT-IP) shows promising face and content validity and will be useful in informing services where they need to direct their attention for these patients. Conclusions Indigenous people with cancer have language, customs and specific needs that are not accommodated within the standard SCNS-SF34. Our SCNAT-IP improves acceptability, relevance and face validity for Indigenous-specific concerns. The SCNAT-IP will allow screening for supportive care needs that are specific to Indigenous cancer patients and will greatly inform targeted policy development and practice. PMID:22817614

  14. Subsidence monitoring system for offshore applications: technology scouting and feasibility studies

    NASA Astrophysics Data System (ADS)

    Miandro, R.; Dacome, C.; Mosconi, A.; Roncari, G.

    2015-11-01

Because of concern about possible impacts of hydrocarbon production activities on coastal-area environments and infrastructures, new hydrocarbon offshore development projects in Italy must submit a monitoring plan to the Italian authorities to measure and analyse real-time subsidence evolution. The general geological context where the main offshore Adriatic fields are located is represented by young unconsolidated terrigenous sediments. In such geological environments, sea floor subsidence caused by hydrocarbon extraction is quite probable. Though many tools are available for subsidence monitoring onshore, few are available for offshore monitoring. To fill the gap, ENI (Ente Nazionale Idrocarburi) started a research program, principally in collaboration with three companies, to develop a monitoring system for measuring seafloor subsidence. According to ENI's technical design specification, the tool would be a robust long pipeline or cable, with a variable or constant outside diameter (less than or equal to 100 mm) and measuring points spaced at intervals. The design specifications for the first prototype were: detect a 1 mm altitude variation, operate in up to 100 m of water depth, and cover an investigation length of 3 km. Advanced feasibility studies have been carried out with Fugro Geoservices B.V. (Netherlands), D'Appolonia (Italy), and Agisco (Italy). Five designs (based on three fundamental measurement concepts and five measurement tools) were explored: cable shape changes measured via cable strain using fiber optics (Fugro); cable inclination measured using tiltmeters (D'Appolonia) or fiber optics (Fugro); and internal cable altitude-dependent pressure changes measured using fiber optics (Fugro) or pressure transducers at discrete intervals along the hydraulic system (Agisco). Each design was analysed and a rank ordering of preferences was performed. The third method (measurement of pressure changes), in the solution proposed by Agisco, was deemed most feasible. Agisco is building the first prototype of the tool, to be installed in an offshore field in the next few years. This paper describes the instrument designs from the three companies developed to satisfy the design specification.
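
    A minimal sketch of the physics behind the pressure-based concept ranked highest here: in a liquid-filled line, a pressure change at a measuring point maps to an elevation change through the hydrostatic relation dp = rho*g*dh. The fluid density is an assumption for illustration:

    ```python
    # Convert a measured pressure change to an elevation (subsidence)
    # change via hydrostatics: dh = dp / (rho * g). Values illustrative.
    RHO = 1025.0   # seawater-like fluid density [kg/m^3] (assumed)
    G = 9.81       # gravitational acceleration [m/s^2]

    def elevation_change_mm(dp_pascal: float) -> float:
        """Convert a pressure change [Pa] to an elevation change [mm]."""
        return dp_pascal / (RHO * G) * 1000.0

    # A ~10 Pa pressure increase at a measuring point corresponds to
    # roughly 1 mm of seafloor subsidence, the prototype's target resolution.
    print(f"{elevation_change_mm(10.0):.2f} mm")
    ```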

  15. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.
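
    A first-order second-moment sketch of the underlying reliability calculation: propagate the means and variances of random inputs through a limit-state function g and estimate the failure probability from the reliability index beta = mu_g / sigma_g. The limit state, geometry factor, and distributions below are illustrative, not the paper's fracture model:

    ```python
    # FOSM sketch: limit state g = K_Ic - K_I, failure when g < 0, with
    # K_I = Y * sigma * sqrt(pi * a). Load sigma and crack length a are
    # independent random variables with assumed moments.
    import math
    from statistics import NormalDist

    Y = 1.12                    # geometry factor (assumed)
    mu_s, sd_s = 100.0, 10.0    # applied stress [MPa]
    mu_a, sd_a = 0.005, 0.001   # crack length [m]
    K_Ic = 60.0                 # fracture toughness [MPa*sqrt(m)]

    g = lambda s, a: K_Ic - Y * s * math.sqrt(math.pi * a)

    # Mean of g and its variance from first-order (gradient) propagation.
    mu_g = g(mu_s, mu_a)
    eps = 1e-8
    dg_ds = (g(mu_s + eps, mu_a) - mu_g) / eps
    dg_da = (g(mu_s, mu_a + eps) - mu_g) / eps
    var_g = (dg_ds * sd_s) ** 2 + (dg_da * sd_a) ** 2

    beta = mu_g / math.sqrt(var_g)   # reliability index
    pf = NormalDist().cdf(-beta)     # first-order failure probability
    print(f"beta={beta:.2f}, Pf={pf:.2e}")
    ```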

  16. Research and Development Project Selection Tools: Probing Wright Laboratory’s Project Selection Methods and Decision Criteria Using the Lateral Airfoil Concept

    DTIC Science & Technology

    1993-09-01

    mismanagement. The broad spectrum of personality types and large sums of money, $43.3 billion in R&D for fiscal year 1993 (FY93) (Goodwin, 1992:57...projects. He used a personal and telephone interview technique to fulfill ten specific objectives. His research provides the first historical data...exploratory nature of the determinant attribute identification process suggests a personal interview format for the data collection method (Emory and Cooper

  17. Alternative stitching method for massively parallel e-beam lithography

    NASA Astrophysics Data System (ADS)

    Brandt, Pieter; Tranquillin, Céline; Wieland, Marco; Bayle, Sébastien; Milléquant, Matthieu; Renault, Guillaume

    2015-07-01

In this study, a stitching method other than soft edge (SE) and smart boundary (SB) is introduced and benchmarked against SE. The method is based on locally enhanced exposure latitude without throughput cost, making use of the fact that the two beams that pass through the stitching region can deposit up to 2× the nominal dose. The method requires a complex proximity effect correction that takes a preset stitching dose profile into account. Although the principle of the presented stitching method can be applied to multibeam (lithography) systems in general, in this study the MAPPER FLX 1200 tool is specifically considered. For the latter tool, at a metal clip at a minimum half-pitch of 32 nm, the stitching method effectively mitigates beam-to-beam (B2B) position errors such that they do not induce an increase in critical dimension uniformity (CDU). In other words, the same CDU can be realized inside the stitching region as outside it. For the SE method, the CDU inside is 0.3 nm higher than outside the stitching region. A 5-nm direct overlay impact from the B2B position errors cannot be reduced by a stitching strategy.
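
    A toy sketch of the general idea of a preset stitching dose profile (the paper's actual profile comes out of the proximity-effect correction, not a fixed ramp): two beams deposit complementary doses across the overlap that sum to the nominal dose, so a beam position error perturbs the dose only locally:

    ```python
    # Toy preset stitching dose profile: two beams share the overlap region
    # with complementary linear ramps summing to the nominal dose. The ramp
    # shape is an illustrative assumption, not the paper's optimized profile.
    import numpy as np

    width = 50                       # stitch-region width in grid units
    x = np.linspace(0.0, 1.0, width)

    beam_a = 1.0 - x                 # beam A ramps down across the overlap
    beam_b = x                       # beam B ramps up
    assert np.allclose(beam_a + beam_b, 1.0)   # nominal dose everywhere

    # A beam-to-beam position error shifts one ramp; the resulting dose
    # error stays small and confined to the overlap region.
    shift = 2
    beam_b_shifted = np.concatenate([np.zeros(shift), beam_b[:-shift]])
    dose_error = beam_a + beam_b_shifted - 1.0
    print(f"max dose error: {np.abs(dose_error).max():.3f}")
    ```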

  18. Pu-239 organ specific dosimetric model applied to non-human biota

    NASA Astrophysics Data System (ADS)

    Kaspar, Matthew Jason

There are few locations in the world, such as the Maralinga nuclear test site in south western Australia, where plutonium contaminant concentrations are sufficient for studies of long-term radionuclide accumulation in non-human biota. The information obtained will be useful for potential human users of the site while also contributing to international efforts to better understand doses to non-human biota. In particular, this study focuses primarily on a rabbit sample set collected from the population located within the site. Our approach is intended to employ the same dose and dose rate methods selected by the International Commission on Radiological Protection and adapted by the scientific community for similar research questions. These models rely on a series of simplifying assumptions about biota and their geometry; in particular, organisms are treated as spherical and ellipsoidal representations of the animal's mass and volume. These simplifications assume homogeneity of all animal tissues. In a collaborative effort between Colorado State University and the Australian Nuclear Science and Technology Organisation (ANSTO), we are expanding current knowledge of radionuclide accumulation in specific organs and the resulting organ-specific dose rates, such as those from Pu-239 accumulating in bone, liver, and lungs. Organ-specific dose models have been developed for humans; however, little has been developed for dose assessment in biota, in particular rabbits. This study will determine whether it is scientifically valid to use standard software, in particular the ERICA Tool, to determine organ-specific dosimetry due to Pu-239 accumulation in organs. The ERICA Tool is normally applied to whole organisms as a means of determining radiological risk to whole ecosystems. We focus on the aquatic model within the ERICA Tool, as animal organs, like aquatic organisms, can be assumed to lie within an infinite uniform medium. This model is scientifically valid for radionuclides emitting short-range radiation, as with Pu-239, where the energy is deposited locally. Two MCNPX models have been created and evaluated against the ERICA Tool's aquatic model. One MCNPX model replicates the ERICA Tool's intrinsic assumptions, while the other uses a more realistic animal model adopted by ICRP Publication 108 and the ERICA Tool for the organ's "infinite" surrounding universe. In addition, the role of model geometry will be analyzed by focusing on four geometry sets for the same organ, including a spherical geometry. The ERICA Tool will be compared to MCNPX results within and between each organ geometry set. In addition, the organ absorbed dose rate will be calculated for six rabbits located on the Maralinga nuclear test site as a preliminary test for further investigation. Data in all cases will be compared using percent differences and Student's t-test with respect to the ERICA Tool's results and the overall average organ mean absorbed dose rate.
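
    A back-of-envelope sketch of the "infinite uniform medium" dose-rate logic for a short-range alpha emitter: if all decay energy is absorbed locally, the organ's absorbed dose rate follows directly from its activity concentration. The activity concentration and comparison numbers below are hypothetical:

    ```python
    # Internal absorbed-dose-rate estimate for an organ containing a
    # short-range alpha emitter such as Pu-239, assuming all decay energy
    # is deposited locally. All numerical inputs are illustrative.
    E_ALPHA_MEV = 5.15       # approximate alpha energy per Pu-239 decay [MeV]
    MEV_TO_J = 1.602e-13     # joules per MeV

    def dose_rate_uGy_per_h(activity_bq_per_kg: float) -> float:
        """Absorbed dose rate [uGy/h] for a uniformly contaminated organ."""
        gy_per_s = activity_bq_per_kg * E_ALPHA_MEV * MEV_TO_J
        return gy_per_s * 3600.0 * 1e6

    # e.g. 200 Bq/kg of Pu-239 in bone (hypothetical concentration):
    print(f"{dose_rate_uGy_per_h(200.0):.3f} uGy/h")

    # Percent difference, as used to compare ERICA Tool and MCNPX results:
    pct_diff = lambda a, b: abs(a - b) / ((a + b) / 2.0) * 100.0
    print(f"{pct_diff(0.59, 0.62):.1f} %")
    ```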

  19. Interactive film scenes for tutor training in problem-based learning (PBL): dealing with difficult situations

    PubMed Central

    2010-01-01

    Background In problem-based learning (PBL), tutors play an essential role in facilitating and efficiently structuring tutorials to enable students to construct individual cognitive networks, and have a significant impact on students' performance in subsequent assessments. The necessity of elaborate training to fulfil this complex role is undeniable. In the plethora of data on PBL however, little attention has been paid to tutor training which promotes competence in the moderation of specific difficult situations commonly encountered in PBL tutorials. Methods Major interactive obstacles arising in PBL tutorials were identified from prior publications. Potential solutions were defined by an expert group. Video clips were produced addressing the tutor's role and providing exemplary solutions. These clips were embedded in a PBL tutor-training course at our medical faculty combining PBL self-experience with a non-medical case. Trainees provided pre- and post-intervention self-efficacy ratings regarding their PBL-related knowledge, skills, and attitudes, as well as their acceptance and the feasibility of integrating the video clips into PBL tutor-training (all items: 100 = completely agree, 0 = don't agree at all). Results An interactive online tool for PBL tutor training was developed comprising 18 video clips highlighting difficult situations in PBL tutorials to encourage trainees to develop and formulate their own intervention strategies. In subsequent sequences, potential interventions are presented for the specific scenario, with a concluding discussion which addresses unresolved issues. The tool was well accepted and considered worth the time spent on it (81.62 ± 16.91; 62.94 ± 16.76). Tutors considered the videos to prepare them well to respond to specific challenges in future tutorials (75.98 ± 19.46). The entire training, which comprised PBL self-experience and video clips as integral elements, improved tutor's self-efficacy with respect to dealing with problematic situations (pre: 36.47 ± 26.25, post: 66.99 ± 21.01; p < .0001) and significantly increased appreciation of PBL as a method (pre: 61.33 ± 24.84, post: 76.20 ± 20.12; p < .0001). Conclusions The interactive tool with instructional video clips is designed to broaden the view of future PBL tutors in terms of recognizing specific obstacles to functional group dynamics and developing individual intervention strategies. We show that this tool is well accepted and can be successfully integrated into PBL tutor-training. Free access is provided to the entire tool at http://www.medizinische-fakultaet-hd.uni-heidelberg.de/fileadmin/PBLTutorTraining/player.swf. PMID:20604927

  20. A tool to evaluate local biophysical effects on temperature due to land cover change transitions

    NASA Astrophysics Data System (ADS)

    Perugini, Lucia; Caporaso, Luca; Duveiller, Gregory; Cescatti, Alessandro; Abad-Viñas, Raul; Grassi, Giacomo; Quesada, Benjamin

    2017-04-01

Land Cover Changes (LCC) affect local, regional and global climate through biophysical variations of the surface energy budget mediated by albedo, evapotranspiration, and roughness. Assessment of the full climate impact of anthropogenic LCC is incomplete without considering biophysical effects, but the high level of uncertainty in quantifying these effects has so far made it impractical to offer clear advice on which policy makers could act. To overcome this barrier, we provide a tool to evaluate the biophysical impact of a matrix of land cover transitions, following a tiered methodological approach similar to the one provided by the IPCC for estimating biogeochemical effects, i.e. through three levels of methodological complexity, from Tier 1 (default method and factors) to Tier 3 (specific methods and factors). In particular, the tool provides guidance for the quantitative assessment of changes in temperature following a land cover transition. The tool focuses on temperature for two main reasons: (i) it is the main variable of interest for policy makers at the local and regional level, and (ii) temperature summarizes the impact of the radiative and non-radiative processes that follow LCC. The potential changes in annual air temperature that can be expected from various land cover transitions are derived from a dedicated dataset constructed by the JRC in the framework of the LUC4C FP7 project. The inputs for the dataset are air temperature values derived from satellite Earth Observation data (MODIS) and a land cover characterization from the ESA Climate Change Initiative product, reclassified into the equivalent IPCC land use categories. These data, originally at 0.05 degrees of spatial resolution, are aggregated and analysed at the regional level to provide guidance on the expected temperature impact of specific LCC transitions.
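
    A minimal sketch of what a Tier 1 lookup in such a tool could reduce to: a transition matrix of default annual temperature changes indexed by land cover category. The categories and dT values are invented placeholders, not numbers from the JRC/LUC4C dataset:

    ```python
    # Tier-1-style lookup: default local annual temperature change (dT, K)
    # for land cover transitions between IPCC-like categories. The values
    # below are illustrative placeholders only.
    DELTA_T = {
        ("forest", "cropland"):  +0.4,
        ("forest", "grassland"): +0.3,
        ("cropland", "forest"):  -0.4,
        ("grassland", "forest"): -0.3,
    }

    def expected_dT(from_cover: str, to_cover: str) -> float:
        """Default (Tier 1) local annual temperature change for a transition."""
        if from_cover == to_cover:
            return 0.0
        return DELTA_T[(from_cover, to_cover)]

    print(expected_dT("forest", "cropland"))   # e.g. local warming of 0.4 K
    ```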
