Ying, William; Levons, Jaquan K; Carney, Andrea; Gandhi, Rajesh; Vydra, Vicky; Rubin, A Erik
2016-06-01
A novel semiautomated buffer exchange process workflow was developed to enable efficient early protein formulation screening. An antibody fragment protein, BMSdab, was used to demonstrate the workflow. The process afforded 60% to 80% savings in cycle time and scientist time and significant material efficiencies. These efficiencies ultimately facilitated execution of this stability work earlier in the drug development process, allowing this tool to inform the developability of potential candidates from a formulation perspective. To overcome the key technical challenges, the protein solution was buffer-exchanged by centrifuge filtration into formulations for stability screening in a 96-well plate with an ultrafiltration membrane, leveraging automated liquid handling and acoustic volume measurements to allow several cycles of exchanges. The formulations were transferred into a vacuum manifold and sterile filtered into a rack holding 96 glass vials. The vials were sealed with a capmat of individual caps and placed in stability stations. The stability of samples prepared by this process and by the standard process was shown to be comparable. This process enabled screening of a number of formulations of a protein at an early pharmaceutical development stage with a short sample preparation time. © 2015 Society for Laboratory Automation and Screening.
Luu, Van; Jona, Janan; Stanton, Mary K; Peterson, Matthew L; Morrison, Henry G; Nagapudi, Karthik; Tan, Helming
2013-01-30
A 96-well high-throughput cocrystal screening workflow has been developed consisting of solvent-mediated sonic blending synthesis and on-plate solid/solution stability characterization by XRPD. A strategy of cocrystallization screening in selected blend solvents, including water mixtures, is proposed not only to manipulate the solubility of the cocrystal components but also to differentiate the physical stability of the cocrystal products. Caffeine-oxalic acid and theophylline-oxalic acid cocrystals were prepared and evaluated in relation to saturation levels of the cocrystal components and stability of the cocrystal products in anhydrous and hydrous solvents. AMG 517 was screened with a number of coformers, and the solid/solution stability of the resulting cocrystals on the 96-well plate was investigated. A stability trend was observed, confirming that cocrystals composed of coformers with lower aqueous solubility tended to be more stable in water. Furthermore, cocrystals that could be isolated under hydrous solvent blending conditions exhibited superior physical stability to those that could only be obtained under anhydrous conditions. This integrated HTS workflow provides an efficient, API-sparing route to screen and identify cocrystal candidates with proper solubility and solid/solution stability properties. Copyright © 2012 Elsevier B.V. All rights reserved.
Analysis of protein stability and ligand interactions by thermal shift assay.
Huynh, Kathy; Partch, Carrie L
2015-02-02
Purification of recombinant proteins for biochemical assays and structural studies is time-consuming and presents inherent difficulties that depend on the optimization of protein stability. The use of dyes to monitor thermal denaturation of proteins with sensitive fluorescence detection enables rapid and inexpensive determination of protein stability using real-time PCR instruments. By screening a wide range of solution conditions and additives in a 96-well format, the thermal shift assay easily identifies conditions that significantly enhance the stability of recombinant proteins. The same approach can be used as an initial low-cost screen to discover new protein-ligand interactions by capitalizing on increases in protein stability that typically occur upon ligand binding. This unit presents a methodological workflow for small-scale, high-throughput thermal denaturation of recombinant proteins in the presence of SYPRO Orange dye. Copyright © 2015 John Wiley & Sons, Inc.
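As an illustration of the readout step described above, the sketch below extracts an apparent melting temperature from a melt curve as the peak of the first derivative of the fluorescence signal (a minimal sketch on synthetic data; function and variable names are illustrative and not part of the published protocol):

```python
import numpy as np

def melting_temperature(temps, fluorescence):
    """Estimate Tm as the temperature of maximal dF/dT along the unfolding transition."""
    temps = np.asarray(temps, dtype=float)
    signal = np.asarray(fluorescence, dtype=float)
    dfdt = np.gradient(signal, temps)   # numerical first derivative of the melt curve
    return temps[np.argmax(dfdt)]       # peak of dF/dT ~ midpoint of unfolding

# Illustrative use: compare a reference buffer against one additive condition.
temps = np.linspace(25, 95, 141)
reference_curve = 1.0 / (1.0 + np.exp(-(temps - 52.0) / 1.5))   # synthetic sigmoid
additive_curve  = 1.0 / (1.0 + np.exp(-(temps - 58.5) / 1.5))

tm_ref = melting_temperature(temps, reference_curve)
tm_add = melting_temperature(temps, additive_curve)
print(f"delta-Tm for additive condition: {tm_add - tm_ref:+.1f} C")
```

In practice, exported real-time PCR melt data would replace the synthetic sigmoids, and per-well delta-Tm values relative to a reference buffer would be used to rank conditions or ligands.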
Current Protocols in Protein Science
Huynh, Kathy
2015-01-01
The purification of recombinant proteins for biochemical assays and structural studies is time-consuming and presents inherent difficulties that depend on the optimization of protein stability. The use of dyes to monitor thermal denaturation of proteins with sensitive fluorescence detection enables the rapid and inexpensive determination of protein stability using real-time PCR instruments. By screening a wide range of solution conditions and additives in 96-well format, the thermal shift assay easily identifies conditions that significantly enhance the stability of recombinant proteins. The same approach can be used as a low-cost initial screen to discover new protein-ligand interactions by capitalizing on increases in protein stability that typically occur upon ligand binding. This unit presents a methodological workflow for the small-scale, high-throughput thermal denaturation of recombinant proteins in the presence of SYPRO Orange dye. PMID:25640896
NASA Astrophysics Data System (ADS)
Clempner, Julio B.
2017-01-01
This paper presents a novel analytical method for soundness verification of workflow nets and reset workflow nets, using the well-known Lyapunov stability results for Petri nets. We also prove that the soundness property is decidable for workflow nets and reset workflow nets. In addition, we provide evidence of several outcomes related to properties such as boundedness, liveness, reversibility, and blocking using stability. Our approach is validated theoretically and by a numerical example related to traffic signal-control synchronisation.
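For context, the Petri-net state equation and a Lyapunov-style boundedness condition that underlie this kind of analysis can be written as follows (standard textbook form, not the paper's exact notation or theorems):

```latex
% Marking evolution under incidence matrix $C$ and firing count vector $u_k$:
\[
  M_{k+1} = M_k + C\,u_k, \qquad M_k \ge 0 .
\]
% A Lyapunov-style sufficient condition for boundedness: if there exists a
% strictly positive weight vector $\Phi$ with $\Phi^{\top} C \le 0$, then
% $V(M) = \Phi^{\top} M$ is non-increasing along every firing sequence,
% so all reachable markings remain bounded.
```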
A reliable computational workflow for the selection of optimal screening libraries.
Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch
2015-01-01
The experimental screening of compound collections is a common starting point in many drug discovery projects. The success of such screening campaigns depends critically on the quality of the screened library. Many libraries are currently available from different vendors, yet selecting the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented and a new blood-brain barrier permeation model was developed and validated (85% and 74% success rates for the training and test sets, respectively). Diversity and similarity descriptors that performed best at selecting either diverse or focused sets of compounds from three databases (DrugBank, CMC, and ChEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library, providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have devised and tested a new workflow for the rational selection of screening libraries under different scenarios. The current workflow was implemented using the Pipeline Pilot software, yet because it uses generic components it can be easily adapted and reproduced by computational groups interested in the rational selection of screening libraries. Furthermore, the workflow could readily be modified to include additional components. This workflow has been routinely used in our laboratory for the selection of libraries in multiple projects and consistently selects libraries that are well balanced across multiple parameters.
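The ADME/T profiling step (2) applies the published Lipinski and Veber rule sets. A minimal sketch of such a rule-based filter using RDKit is shown below; it is not the Pipeline Pilot implementation the authors describe, and the example SMILES are arbitrary:

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

def passes_lipinski(mol):
    """Lipinski's rule of five: MW <= 500, logP <= 5, HBD <= 5, HBA <= 10."""
    return (Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Descriptors.NumHDonors(mol) <= 5
            and Descriptors.NumHAcceptors(mol) <= 10)

def passes_veber(mol):
    """Veber's criteria: rotatable bonds <= 10 and TPSA <= 140 A^2."""
    return (Descriptors.NumRotatableBonds(mol) <= 10
            and Descriptors.TPSA(mol) <= 140)

def admet_prefilter(smiles_list):
    """Keep parseable structures that satisfy both rule sets."""
    kept = []
    for smi in smiles_list:
        mol = Chem.MolFromSmiles(smi)
        if mol is not None and passes_lipinski(mol) and passes_veber(mol):
            kept.append(smi)
    return kept

print(admet_prefilter(["CC(=O)Oc1ccccc1C(=O)O",          # aspirin: passes
                       "CCCCCCCCCCCCCCCCCCCCCCCCCCCC"])) # long alkane: fails logP
```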
Kinyua, Juliet; Negreira, Noelia; Ibáñez, María; Bijlsma, Lubertus; Hernández, Félix; Covaci, Adrian; van Nuijs, Alexander L N
2015-11-01
Identification of new psychoactive substances (NPS) is challenging. Developing targeted methods for their analysis can be difficult and costly due to their impermanence on the drug scene. Accurate-mass mass spectrometry (AMMS) using a quadrupole time-of-flight (QTOF) analyzer can be useful for wide-scope screening since it provides sensitive, full-spectrum MS data. Our article presents a qualitative screening workflow based on data-independent acquisition mode (all-ions MS/MS) on liquid chromatography (LC) coupled to QTOFMS for the detection and identification of NPS in biological matrices. The workflow combines the fundamentals of target and suspect screening data-processing techniques in a structured algorithm, allowing the detection and tentative identification of NPS and their metabolites. We have applied the workflow to two actual case studies involving drug intoxications, in which we detected and confirmed the parent compounds ketamine, 25B-NBOMe, and 25C-NBOMe, as well as several predicted phase I and II metabolites not previously reported in urine and serum samples. The screening workflow demonstrates added value for the detection and identification of NPS in biological matrices.
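One core step in such suspect screening is matching observed accurate masses against expected adduct masses of suspects within a ppm tolerance. The sketch below illustrates only that step (the suspect name, neutral mass, and the 5 ppm tolerance are placeholders; real workflows add retention time, isotope patterns, and fragment-ion evidence):

```python
PROTON = 1.007276  # mass added for an [M+H]+ adduct

def ppm_error(observed, theoretical):
    return (observed - theoretical) / theoretical * 1e6

def suspect_hits(peaks_mz, suspects, tol_ppm=5.0):
    """Match observed m/z values against [M+H]+ of suspect monoisotopic masses.

    peaks_mz : iterable of observed m/z values from a full-scan QTOF run
    suspects : dict mapping suspect name -> neutral monoisotopic mass
    """
    hits = []
    for name, neutral_mass in suspects.items():
        target = neutral_mass + PROTON
        for mz in peaks_mz:
            if abs(ppm_error(mz, target)) <= tol_ppm:
                hits.append((name, mz, round(ppm_error(mz, target), 2)))
    return hits

# Placeholder values only, not data from the case studies:
suspects = {"suspect_A": 300.1212}
print(suspect_hits([300.1212 + PROTON, 412.3301], suspects))
```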
Maserat, Elham; Seied Farajollah, Seiede Sedigheh; Safdari, Reza; Ghazisaeedi, Marjan; Aghdaei, Hamid Asadzadeh; Zali, Mohammad Reza
2015-01-01
Colorectal cancer is a major cause of morbidity and mortality throughout the world. Colorectal cancer screening is an effective way to reduce morbidity and mortality, and a clinical decision support system (CDSS) plays an important role in predicting the success of screening processes. A CDSS is a computer-based information system that improves the delivery of preventive care services. The aim of this article was to detail the engineering of information requirements and the workflow design of a CDSS for a colorectal cancer screening program. In the first stage, a screening minimum data set was determined; developed and developing countries were analyzed to identify this data set. Information deficiencies and gaps were then determined by checklist. The second stage was a qualitative survey using semi-structured interviews; the perspectives of 15 users and stakeholders on the CDSS workflow were studied. Finally, the workflow of the DSS for the control program was designed based on standard clinical practice guidelines and these perspectives. The screening minimum data set of the national colorectal cancer screening program was defined in five sections: colonoscopy, surgery, pathology, genetics, and pedigree. Deficiencies and information gaps were analyzed. We then designed a standard screening work process, and finally the DSS workflow and data entry stage were determined. A CDSS facilitates complex decision making for screening and has key roles in designing optimal interactions between colonoscopy, pathology, and laboratory departments. Workflow analysis is also useful for identifying data reconciliation strategies to address documentation gaps. Following the recommendations of the CDSS should improve the quality of colorectal cancer screening.
PyGOLD: a python based API for docking based virtual screening workflow generation.
Patel, Hitesh; Brinkjost, Tobias; Koch, Oliver
2017-08-15
Molecular docking is one of the successful approaches in structure-based discovery and development of bioactive molecules in chemical biology and medicinal chemistry. Due to the huge amount of computational time that is still required, docking is often the last step in a virtual screening approach. Such screenings are set up as workflows spanning many steps, each aimed at a different filtering task. These workflows can be largely automated using Python-based toolkits, except for docking with the docking software GOLD. However, within an automated virtual screening workflow it is not feasible to use the GUI between every step to change the GOLD configuration file. Thus, a Python module called PyGOLD was developed to parse, edit, and write the GOLD configuration file and to automate docking-based virtual screening workflows. The latest version of PyGOLD, its documentation, and example scripts are available at: http://www.ccb.tu-dortmund.de/koch or http://www.agkoch.de. PyGOLD is implemented in Python and can be imported as a standard Python module without any further dependencies. oliver.koch@agkoch.de, oliver.koch@tu-dortmund.de. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved.
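PyGOLD itself should be obtained from the URLs above; purely to illustrate the underlying idea of rewriting a GOLD configuration file between workflow steps, a generic text-based sketch follows (the key name ligand_data_file and the file layout are assumptions about a typical gold.conf, and nothing here is the PyGOLD API):

```python
# Create a tiny example configuration so the sketch runs standalone.
with open("gold.conf", "w") as fh:
    fh.write("ligand_data_file ligands.sdf 10\n")

def set_conf_value(conf_path, key, new_line, out_path):
    """Rewrite every line whose first token equals `key` in a GOLD-style config.

    A plain-text stand-in for a dedicated parser such as PyGOLD; it assumes
    whitespace-separated key/value lines (an assumption, not the GOLD spec).
    """
    with open(conf_path) as fh:
        lines = fh.readlines()
    edited = [new_line + "\n" if ln.split() and ln.split()[0] == key else ln
              for ln in lines]
    with open(out_path, "w") as fh:
        fh.writelines(edited)

# Hypothetical use inside a screening loop: point each docking run at the
# next chunk of ligands before invoking GOLD on the derived configuration.
for i, chunk in enumerate(["chunk_000.sdf", "chunk_001.sdf"]):
    set_conf_value("gold.conf", "ligand_data_file",
                   f"ligand_data_file {chunk} 10", f"gold_{i:03d}.conf")
```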
Design and implementation of workflow engine for service-oriented architecture
NASA Astrophysics Data System (ADS)
Peng, Shuqing; Duan, Huining; Chen, Deyun
2009-04-01
As computer networks develop rapidly and enterprise applications become increasingly distributed, traditional workflow engines show deficiencies such as complex structure, poor stability, limited portability, little reusability, and difficult maintenance. In this paper, in order to improve the stability, scalability, and flexibility of workflow management systems, a four-layer architecture for a workflow engine based on SOA is put forward according to the XPDL standard of the Workflow Management Coalition; the route control mechanism in the control model is accomplished; the scheduling strategy for cyclic and acyclic routing is designed; and the workflow engine, which adopts technologies such as XML, JSP, and EJB, is implemented.
Gilmour, Matthew W.; DeGagne, Pat; Nichol, Kim; Karlowsky, James A.
2014-01-01
An efficient workflow to screen for and confirm the presence of carbapenemase-producing Gram-negative bacilli was developed by evaluating five chromogenic screening agar media and two confirmatory assays, the Rapid Carb screen test (Rosco Diagnostica A/S, Taastrup, Denmark) and the modified Hodge test. A panel of 150 isolates was used, including 49 carbapenemase-producing isolates representing a variety of β-lactamase enzyme classes. An evaluation of analytical performance, assay cost, and turnaround time indicated that the preferred workflow (screening test followed by confirmatory testing) was the chromID Carba agar medium (bioMérieux, Marcy l'Étoile, France), followed by the Rapid Carb screen test, yielding a combined sensitivity of 89.8% and a specificity of 100%. As an optional component of the workflow, a determination of carbapenemase gene class via molecular means could be performed subsequent to confirmatory testing. PMID:25355764
Spjuth, Ola; Karlsson, Andreas; Clements, Mark; Humphreys, Keith; Ivansson, Emma; Dowling, Jim; Eklund, Martin; Jauhiainen, Alexandra; Czene, Kamila; Grönberg, Henrik; Sparén, Pär; Wiklund, Fredrik; Cheddad, Abbas; Pálsdóttir, Þorgerður; Rantalainen, Mattias; Abrahamsson, Linda; Laure, Erwin; Litton, Jan-Eric; Palmgren, Juni
2017-09-01
We provide an e-Science perspective on the workflow from risk factor discovery and classification of disease to evaluation of personalized intervention programs. As case studies, we use personalized prostate and breast cancer screenings. We describe an e-Science initiative in Sweden, e-Science for Cancer Prevention and Control (eCPC), which supports biomarker discovery and offers decision support for personalized intervention strategies. The generic eCPC contribution is a workflow with 4 nodes applied iteratively, and the concept of e-Science signifies systematic use of tools from the mathematical, statistical, data, and computer sciences. The eCPC workflow is illustrated through 2 case studies. For prostate cancer, an in-house personalized screening tool, the Stockholm-3 model (S3M), is presented as an alternative to prostate-specific antigen testing alone. S3M is evaluated in a trial setting and plans for rollout in the population are discussed. For breast cancer, new biomarkers based on breast density and molecular profiles are developed and the US multicenter Women Informed to Screen Depending on Measures (WISDOM) trial is referred to for evaluation. While current eCPC data management uses a traditional data warehouse model, we discuss eCPC-developed features of a coherent data integration platform. E-Science tools are a key part of an evidence-based process for personalized medicine. This paper provides a structured workflow from data and models to evaluation of new personalized intervention strategies. The importance of multidisciplinary collaboration is emphasized. Importantly, the generic concepts of the suggested eCPC workflow are transferrable to other disease domains, although each disease will require tailored solutions. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
2011-01-01
Background Based on barriers to the use of computerized clinical decision support (CDS) learned in an earlier field study, we prototyped design enhancements to the Veterans Health Administration's (VHA's) colorectal cancer (CRC) screening clinical reminder to compare against the VHA's current CRC reminder. Methods In a controlled simulation experiment, 12 primary care providers (PCPs) used prototypes of the current and redesigned CRC screening reminder in a within-subject comparison. Quantitative measurements were based on a usability survey, workload assessment instrument, and workflow integration survey. We also collected qualitative data on both designs. Results Design enhancements to the VHA's existing CRC screening clinical reminder positively impacted aspects of usability and workflow integration but not workload. The qualitative analysis revealed broad support across participants for the design enhancements with specific suggestions for improving the reminder further. Conclusions This study demonstrates the value of a human-computer interaction evaluation in informing the redesign of information tools to foster uptake, integration into workflow, and use in clinical practice. PMID:22126324
Developing science gateways for drug discovery in a grid environment.
Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra
2016-01-01
Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, it is an increasing challenge to provide the life sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service, applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and the Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating the throughput by several orders of magnitude.
Developments in SPR Fragment Screening.
Chavanieu, Alain; Pugnière, Martine
2016-01-01
Fragment-based approaches have played an increasing role alongside high-throughput screening in drug discovery for 15 years. The label-free biosensor technology based on surface plasmon resonance (SPR) is now sensitive and informative enough to serve during primary screens and validation steps. In this review, the authors discuss the role of SPR in fragment screening. After a brief description of the underlying principles of the technique and the main device developments, they evaluate the advantages and adaptations of SPR for fragment-based drug discovery. SPR can also be applied to challenging targets such as membrane receptors and enzymes. A high level of immobilization of the protein target and its stability are key points for a relevant screening, which can be optimized using oriented immobilized proteins and regenerable sensors. Furthermore, to decrease the rate of false negatives, a selectivity test may be performed in parallel on the main target with the binding site mutated or blocked with a low off-rate ligand. Fragment-based drug design, integrated in a rational workflow led by SPR, will thus have a predominant role in the next wave of drug discovery, which could be greatly enhanced by new improvements in SPR devices.
Lu, Xinyan
2016-01-01
There is a clear requirement for enhancing laboratory information management during early absorption, distribution, metabolism and excretion (ADME) screening. The application of a commercial laboratory information management system (LIMS) is limited by complexity, insufficient flexibility, high costs and extended timelines. An improved custom in-house LIMS for ADME screening was developed using Excel. All Excel templates were generated through macros and formulae, and information flow was streamlined as much as possible. This system has been successfully applied in task generation, process control and data management, with a reduction in both labor time and human error rates. An Excel-based LIMS can provide a simple, flexible and cost/time-saving solution for improving workflow efficiencies in early ADME screening.
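The published system is built directly in Excel with macros and formulae; as a hedged Python analogue of the task-generation step it describes, the sketch below writes a simple task sheet with openpyxl (all column names and identifiers are made up for illustration):

```python
from openpyxl import Workbook

def make_task_sheet(compounds, assay, out_path):
    """Generate a simple ADME screening task sheet (illustrative columns only)."""
    wb = Workbook()
    ws = wb.active
    ws.title = "Tasks"
    ws.append(["Task ID", "Compound ID", "Assay", "Plate", "Well", "Status"])
    for i, cmpd in enumerate(compounds):
        plate, well = divmod(i, 96)           # fill 96-well plates in order
        ws.append([f"T{i+1:04d}", cmpd, assay, plate + 1, well + 1, "queued"])
    wb.save(out_path)

make_task_sheet([f"CMPD-{n:05d}" for n in range(1, 201)],
                "microsomal stability", "adme_tasks.xlsx")
```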
AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, Yingssu; Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080; McPhillips, Scott E.
New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually and the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference Fourier maps.
ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)
US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...
Impact of digital radiography on clinical workflow.
May, G A; Deer, D D; Dackiewicz, D
2000-05-01
It is commonly accepted that digital radiography (DR) improves workflow and patient throughput compared with traditional film radiography or computed radiography (CR). DR eliminates the film development step and the time to acquire the image from a CR reader. In addition, the wide dynamic range of DR is such that the technologist can perform the quality-control (QC) step directly at the modality in a few seconds, rather than having to transport the newly acquired image to a centralized QC station for review. Furthermore, additional workflow efficiencies can be achieved with DR by employing tight radiology information system (RIS) integration. In the DR imaging environment, this provides for patient demographic information to be automatically downloaded from the RIS to populate the DR Digital Imaging and Communications in Medicine (DICOM) image header. To learn more about this workflow efficiency improvement, we performed a comparative study of workflow steps under three different conditions: traditional film/screen x-ray, DR without RIS integration (ie, manual entry of patient demographics), and DR with RIS integration. This study was performed at the Cleveland Clinic Foundation (Cleveland, OH) using a newly acquired amorphous silicon flat-panel DR system from Canon Medical Systems (Irvine, CA). Our data show that DR without RIS results in substantial workflow savings over traditional film/screen practice. There is an additional 30% reduction in total examination time using DR with RIS integration.
Improving data collection, documentation, and workflow in a dementia screening study.
Read, Kevin B; LaPolla, Fred Willie Zametkin; Tolea, Magdalena I; Galvin, James E; Surkis, Alisa
2017-04-01
A clinical study team performing three multicultural dementia screening studies identified the need to improve data management practices and facilitate data sharing. A collaboration was initiated with librarians as part of the National Library of Medicine (NLM) informationist supplement program. The librarians identified areas for improvement in the studies' data collection, entry, and processing workflows. The librarians' role in this project was to meet needs expressed by the study team around improving data collection and processing workflows to increase study efficiency and ensure data quality. The librarians addressed the data collection, entry, and processing weaknesses through standardizing and renaming variables, creating an electronic data capture system using REDCap, and developing well-documented, reproducible data processing workflows. NLM informationist supplements provide librarians with valuable experience in collaborating with study teams to address their data needs. For this project, the librarians gained skills in project management, REDCap, and understanding of the challenges and specifics of a clinical research study. However, the time and effort required to provide targeted and intensive support for one study team was not scalable to the library's broader user community.
NASA Astrophysics Data System (ADS)
Li, Xiayue; Curtis, Farren S.; Rose, Timothy; Schober, Christoph; Vazquez-Mayagoitia, Alvaro; Reuter, Karsten; Oberhofer, Harald; Marom, Noa
2018-06-01
We present Genarris, a Python package that performs configuration space screening for molecular crystals of rigid molecules by random sampling with physical constraints. For fast energy evaluations, Genarris employs a Harris approximation, whereby the total density of a molecular crystal is constructed via superposition of single molecule densities. Dispersion-inclusive density functional theory is then used for the Harris density without performing a self-consistency cycle. Genarris uses machine learning for clustering, based on a relative coordinate descriptor developed specifically for molecular crystals, which is shown to be robust in identifying packing motif similarity. In addition to random structure generation, Genarris offers three workflows based on different sequences of successive clustering and selection steps: the "Rigorous" workflow is an exhaustive exploration of the potential energy landscape, the "Energy" workflow produces a set of low energy structures, and the "Diverse" workflow produces a maximally diverse set of structures. The latter is recommended for generating initial populations for genetic algorithms. Here, the implementation of Genarris is reported and its application is demonstrated for three test cases.
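As a toy illustration of "random sampling with physical constraints" (not Genarris code; it ignores molecular orientation, space-group symmetry, and periodic boundary conditions), the sketch below rejection-samples molecular centers in a cubic cell subject to a minimum separation:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_packing(n_molecules, cell_length, min_separation, max_attempts=10000):
    """Toy rejection sampler: random centers in a cubic cell, enforcing a minimum
    center-to-center distance as a crude stand-in for the physical constraints a
    real generator applies to full rigid molecules."""
    centers = []
    attempts = 0
    while len(centers) < n_molecules and attempts < max_attempts:
        attempts += 1
        candidate = rng.uniform(0.0, cell_length, size=3)
        if all(np.linalg.norm(candidate - c) >= min_separation for c in centers):
            centers.append(candidate)
    return np.array(centers)

structure = random_packing(n_molecules=4, cell_length=12.0, min_separation=4.5)
print(structure.round(2))
```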
FluxCTTX: A LIMS-based tool for management and analysis of cytotoxicity assays data
2015-01-01
Background Cytotoxicity assays have been used by researchers to screen for cytotoxicity in compound libraries. Researchers can either look for cytotoxic compounds or screen "hits" from initial high-throughput drug screens for unwanted cytotoxic effects before investing in their development as a pharmaceutical. These assays may be used as an alternative to animal experimentation and are becoming increasingly important in modern laboratories. However, the execution of these assays in large scale and in different laboratories requires, among other things, the management of protocols, reagents, and cell lines used, as well as the data produced, which can be a challenge. The management of all this information is greatly improved by the utilization of computational tools to save time and guarantee quality. However, a tool that performs this task designed specifically for cytotoxicity assays is not yet available. Results In this work, we have used a workflow-based LIMS (the Flux system) and the Together Workflow Editor as a framework to develop FluxCTTX, a tool for management of data from cytotoxicity assays performed at different laboratories. The main work is the development of a workflow that represents all stages of the assay and has been uploaded into Flux. This workflow models the activities of cytotoxicity assays performed as described in the OECD 129 Guidance Document. Conclusions FluxCTTX presents a solution for the management of the data produced by cytotoxicity assays performed in interlaboratory comparisons. Its adoption will help guarantee the quality of activities in the process of cytotoxicity tests and enforce the use of Good Laboratory Practices (GLP). Furthermore, the workflow developed is complete and can be adapted to other contexts and different tests for management of other types of data. PMID:26696462
Fragment-based screening in tandem with phenotypic screening provides novel antiparasitic hits.
Blaazer, Antoni R; Orrling, Kristina M; Shanmugham, Anitha; Jansen, Chimed; Maes, Louis; Edink, Ewald; Sterk, Geert Jan; Siderius, Marco; England, Paul; Bailey, David; de Esch, Iwan J P; Leurs, Rob
2015-01-01
Methods to discover biologically active small molecules include target-based and phenotypic screening approaches. One of the main difficulties in drug discovery is elucidating and exploiting the relationship between drug activity at the protein target and disease modification, a phenotypic endpoint. Fragment-based drug discovery is a target-based approach that typically involves the screening of a relatively small number of fragment-like (molecular weight <300) molecules that efficiently cover chemical space. Here, we report a fragment screening on TbrPDEB1, an essential cyclic nucleotide phosphodiesterase (PDE) from Trypanosoma brucei, and human PDE4D, an off-target, in a workflow in which fragment hits and a series of close analogs are subsequently screened for antiparasitic activity in a phenotypic panel. The phenotypic panel contained T. brucei, Trypanosoma cruzi, Leishmania infantum, and Plasmodium falciparum, the causative agents of human African trypanosomiasis (sleeping sickness), Chagas disease, leishmaniasis, and malaria, respectively, as well as MRC-5 human lung cells. This hybrid screening workflow has resulted in the discovery of various benzhydryl ethers with antiprotozoal activity and low toxicity, representing interesting starting points for further antiparasitic optimization. © 2014 Society for Laboratory Automation and Screening.
Improving data collection, documentation, and workflow in a dementia screening study
Read, Kevin B.; LaPolla, Fred Willie Zametkin; Tolea, Magdalena I.; Galvin, James E.; Surkis, Alisa
2017-01-01
Background A clinical study team performing three multicultural dementia screening studies identified the need to improve data management practices and facilitate data sharing. A collaboration was initiated with librarians as part of the National Library of Medicine (NLM) informationist supplement program. The librarians identified areas for improvement in the studies’ data collection, entry, and processing workflows. Case Presentation The librarians’ role in this project was to meet needs expressed by the study team around improving data collection and processing workflows to increase study efficiency and ensure data quality. The librarians addressed the data collection, entry, and processing weaknesses through standardizing and renaming variables, creating an electronic data capture system using REDCap, and developing well-documented, reproducible data processing workflows. Conclusions NLM informationist supplements provide librarians with valuable experience in collaborating with study teams to address their data needs. For this project, the librarians gained skills in project management, REDCap, and understanding of the challenges and specifics of a clinical research study. However, the time and effort required to provide targeted and intensive support for one study team was not scalable to the library’s broader user community. PMID:28377680
Berkley, Holly; Barnes, Matthew; Carnahan, David; Hayhurst, Janet; Bockhorst, Archie; Neville, James
2017-03-01
To describe the use of template-based screening for risk of infectious disease exposure of patients presenting to primary care medical facilities during the 2014 West African Ebola virus outbreak. The Military Health System implemented an Ebola risk-screening tool in primary care settings in order to create early notifications and early responses to potentially infected persons. Three time-sensitive, evidence-based screening questions were developed and posted to Tri-Service Workflow (TSWF) AHLTA templates in conjunction with appropriate training. Data were collected in January 2015, to assess the adoption of the TSWF-based Ebola risk-screening tool. Among encounters documented using TSWF templates, 41% of all encounters showed use of the TSWF-based Ebola risk-screening questions by the fourth day. The screening rate increased over the next 3 weeks, and reached a plateau at approximately 50%. This report demonstrates the MHS capability to deploy a standardized, globally applicable decision support aid that could be seen the same day by all primary care clinics across the military health direct care system, potentially improving rapid compliance with screening directives. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.
Human Systems Integration Design Environment (HSIDE)
2012-04-09
...quality of the resulting HSI products. Subject terms: HSI, Manning Estimation and Validation, Risk Assessment, IPOE, PLM, BPMN, Workflow. The tool presents either a business process model in Business Process Modeling Notation (BPMN) or the actual workflow template associated with the specific functional area, again as filtered by the user settings in the high-level interface. Figure 3 shows the initial screen, which allows the user to select either the BPMN or...
Large datasets, logistics, sharing and workflow in screening.
Cook, Tessa S
2018-03-29
Cancer screening initiatives exist around the world for different malignancies, most frequently breast, colorectal, and cervical cancer. A number of cancer registries exist to collect relevant data, but while these data may include imaging findings, they rarely, if ever, include actual images. Additionally, the data submitted to the registry are usually correlated with eventual cancer diagnoses and patient outcomes, rather than used with the individual's future screenings. Developing screening programs that allow images to be submitted to a central location along with patient metadata and used for comparison with future screening exams would be very valuable in increasing access to care and ensuring that individuals are effectively screened at appropriate intervals. It would also change the way imaging results and additional patient data are correlated to eventual outcomes. However, it introduces logistical challenges surrounding secure storage and transmission of data to subsequent screening sites. In addition, in the absence of standardized protocols for screening, comparing current and prior imaging, especially from different equipment, can be challenging. Implementing a large-scale screening program with an image-enriched screening registry (effectively, an image-enriched electronic screening record) also requires that incentives exist for screening sites, physicians, and patients to participate; to maximize coverage, participation may have to be supported by government agencies. Workflows will also have to be adjusted to support registry participation for all screening patients in an effort to create a large, robust data set that can be used for future screening efforts as well as research initiatives.
Performance Studies on Distributed Virtual Screening
Krüger, Jens; de la Garza, Luis; Kohlbacher, Oliver; Nagel, Wolfgang E.
2014-01-01
Virtual high-throughput screening (vHTS) is an invaluable method in modern drug discovery. It permits screening large datasets or databases of chemical structures for those that may bind to a drug target. Virtual screening is typically performed by docking code, which often runs sequentially. Processing of huge vHTS datasets can be parallelized by chunking the data because individual docking runs are independent of each other. The goal of this work is to find an optimal splitting that maximizes the speedup while considering overhead and available cores on Distributed Computing Infrastructures (DCIs). We have conducted thorough performance studies accounting not only for the runtime of the docking itself, but also for structure preparation. Performance studies were conducted via the workflow-enabled science gateway MoSGrid (Molecular Simulation Grid). As input we used benchmark datasets for protein kinases. Our performance studies show that docking workflows can be made to scale almost linearly up to 500 concurrent processes distributed even over large DCIs, thus accelerating vHTS campaigns significantly. PMID:25032219
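Because individual docking runs are independent, the parallelization pattern is essentially chunking plus concurrent execution, with a fixed per-chunk overhead for submission and structure preparation. The sketch below shows that pattern with Python's standard library (the docking function is a placeholder and the timing constants are arbitrary, not measurements from the study):

```python
from concurrent.futures import ProcessPoolExecutor
from math import ceil

def dock_chunk(ligand_ids):
    """Placeholder for a sequential docking run over one chunk of ligands."""
    return {lig: 0.0 for lig in ligand_ids}   # a score of 0.0 stands in for a result

def chunked(seq, n_chunks):
    size = ceil(len(seq) / n_chunks)
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def estimated_speedup(n_ligands, n_chunks, t_per_ligand, t_overhead_per_chunk):
    """Ideal-case speedup when chunks run concurrently and each chunk pays a
    fixed submission/staging overhead (structure preparation, transfer, etc.)."""
    serial = n_ligands * t_per_ligand
    parallel = ceil(n_ligands / n_chunks) * t_per_ligand + t_overhead_per_chunk
    return serial / parallel

if __name__ == "__main__":
    ligands = [f"lig_{i}" for i in range(1000)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = {}
        for part in pool.map(dock_chunk, chunked(ligands, 4)):
            results.update(part)
    print(len(results), "ligands scored")
    print("estimated speedup at 500 chunks:",
          round(estimated_speedup(100000, 500, 30.0, 120.0), 1))
```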
Maximum unbiased validation (MUV) data sets for virtual screening based on PubChem bioactivity data.
Rohrer, Sebastian G; Baumann, Knut
2009-02-01
Refined nearest neighbor analysis was recently introduced for the analysis of virtual screening benchmark data sets. It constitutes a technique from the field of spatial statistics and provides a mathematical framework for the nonparametric analysis of mapped point patterns. Here, refined nearest neighbor analysis is used to design benchmark data sets for virtual screening based on PubChem bioactivity data. A workflow is devised that purges data sets of compounds active against pharmaceutically relevant targets from unselective hits. Topological optimization using experimental design strategies monitored by refined nearest neighbor analysis functions is applied to generate corresponding data sets of actives and decoys that are unbiased with regard to analogue bias and artificial enrichment. These data sets provide a tool for Maximum Unbiased Validation (MUV) of virtual screening methods. The data sets and a software package implementing the MUV design workflow are freely available at http://www.pharmchem.tu-bs.de/lehre/baumann/MUV.html.
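The underlying spatial statistics compare nearest-neighbor distance distributions within the actives and between actives and decoys in descriptor space. The sketch below computes such empirical distributions for random toy data (it is not the MUV design functions themselves, only the kind of quantity they are built from):

```python
import numpy as np
from scipy.spatial import cKDTree

def nn_distances(query, reference, exclude_self=False):
    """Nearest-neighbor distance from each query point to the reference set."""
    tree = cKDTree(reference)
    k = 2 if exclude_self else 1
    dist, _ = tree.query(query, k=k)
    return dist[:, -1] if exclude_self else dist   # drop the self-match when needed

def empirical_cdf(values, thresholds):
    values = np.sort(values)
    return np.searchsorted(values, thresholds, side="right") / len(values)

rng = np.random.default_rng(1)
actives = rng.normal(0.0, 1.0, size=(50, 10))     # toy descriptor vectors
decoys  = rng.normal(0.0, 1.0, size=(500, 10))

thresholds = np.linspace(0.0, 6.0, 25)
G = empirical_cdf(nn_distances(actives, actives, exclude_self=True), thresholds)
F = empirical_cdf(nn_distances(actives, decoys), thresholds)
print("mean(G - F) over thresholds:", round(float(np.mean(G - F)), 3))
```

Roughly speaking, a large gap between the two distributions signals clumped actives or overly separated decoys, the kinds of bias the MUV design procedure is built to remove.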
Development of a High-Throughput Ion-Exchange Resin Characterization Workflow.
Liu, Chun; Dermody, Daniel; Harris, Keith; Boomgaard, Thomas; Sweeney, Jeff; Gisch, Daryl; Goltz, Bob
2017-06-12
A novel high-throughput (HTR) ion-exchange (IEX) resin workflow has been developed for characterizing the ion-exchange equilibrium of commercial and experimental IEX resins against a range of applications in which the water environment differs from site to site. Because of its much higher throughput, design of experiments (DOE) methodology can be easily applied to study the effects of multiple factors on resin performance. Two case studies will be presented to illustrate the efficacy of the combined HTR workflow and DOE method. In case study one, a series of anion exchange resins was screened for selective removal of NO3- and NO2- in water environments containing multiple other anions at varied pH and ionic strength. A response surface model (RSM) is developed to statistically correlate resin performance with water composition and predict the best resin candidate. In case study two, the same HTR workflow and DOE method were applied to screen different cation exchange resins for the selective removal of Mg2+, Ca2+, and Ba2+ from high total dissolved salt (TDS) water. A master DOE model including all of the cation exchange resins is created to predict divalent cation removal by different IEX resins under specific conditions, from which the best resin candidates can be identified. The successful adoption of the HTR workflow and DOE method for studying the ion exchange of IEX resins can significantly reduce the resources and time needed to address industry and application needs.
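The response surface modeling step amounts to fitting a second-order polynomial in the design factors to the measured resin performance. A minimal least-squares sketch is shown below (factor names, levels, and responses are invented for illustration; the study's actual DOE design and software are not reproduced here):

```python
import numpy as np

def quadratic_design_matrix(x1, x2):
    """Full second-order RSM terms for two factors: 1, x1, x2, x1*x2, x1^2, x2^2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Hypothetical screening results: anion uptake vs pH and ionic strength.
ph     = np.array([6.0, 6.0, 8.0, 8.0, 7.0, 7.0, 7.0, 5.6, 8.4])
ionic  = np.array([0.01, 0.10, 0.01, 0.10, 0.055, 0.055, 0.055, 0.055, 0.055])
uptake = np.array([0.82, 0.55, 0.74, 0.49, 0.68, 0.70, 0.67, 0.76, 0.60])

X = quadratic_design_matrix(ph, ionic)
coeffs, *_ = np.linalg.lstsq(X, uptake, rcond=None)   # fit the response surface

def predict(ph_value, ionic_value):
    row = quadratic_design_matrix(np.array([ph_value]), np.array([ionic_value]))
    return (row @ coeffs).item()

print("predicted uptake at pH 6.5, I = 0.03:", round(predict(6.5, 0.03), 3))
```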
The Diabetic Retinopathy Screening Workflow
Bolster, Nigel M.; Giardini, Mario E.; Bastawrous, Andrew
2015-01-01
Complications of diabetes mellitus, namely diabetic retinopathy and diabetic maculopathy, are the leading cause of blindness in working aged people. Sufferers can avoid blindness if identified early via retinal imaging. Systematic screening of the diabetic population has been shown to greatly reduce the prevalence and incidence of blindness within the population. Many national screening programs have digital fundus photography as their basis. In the past 5 years several techniques and adapters have been developed that allow digital fundus photography to be performed using smartphones. We review recent progress in smartphone-based fundus imaging and discuss its potential for integration into national systematic diabetic retinopathy screening programs. Some systems have produced promising initial results with respect to their agreement with reference standards. However further multisite trialling of such systems’ use within implementable screening workflows is required if an evidence base strong enough to affect policy change is to be established. If this were to occur national diabetic retinopathy screening would, for the first time, become possible in low- and middle-income settings where cost and availability of trained eye care personnel are currently key barriers to implementation. As diabetes prevalence and incidence is increasing sharply in these settings, the impact on global blindness could be profound. PMID:26596630
Improved compliance by BPM-driven workflow automation.
Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin
2014-12-01
Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the Business Process Model and Notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. As an automation language controlling every kind of activity or subprocess, BPMN 2.0 is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). With the BPM standard, a method for sharing process knowledge among laboratories is thus also available. © 2014 Society for Laboratory Automation and Screening.
2013-01-01
Background Information is lacking about the capacity of those working in community practice settings to utilize health information technology for colorectal cancer screening. Objective To address this gap we asked those working in community practice settings to share their perspectives about how the implementation of a Web-based patient-led decision aid might affect patient-clinician conversations about colorectal cancer screening and the day-to-day clinical workflow. Methods Five focus groups in five community practice settings were conducted with 8 physicians, 1 physician assistant, and 18 clinic staff. Focus groups were organized using a semistructured discussion guide designed to identify factors that mediate and impede the use of a Web-based decision aid intended to clarify patient preferences for colorectal cancer screening and to trigger shared decision making during the clinical encounter. Results All physicians, the physician assistant, and 8 of the 18 clinic staff were active participants in the focus groups. Clinician and staff participants from each setting reported a belief that the Web-based patient-led decision aid could be an informative and educational tool; in all but one setting participants reported a readiness to recommend the tool to patients. The exception related to clinicians from one clinic who described a preference for patients having fewer screening choices, noting that a colonoscopy was the preferred screening modality for patients in their clinic. Perceived barriers to utilizing the Web-based decision aid included patients’ lack of Internet access or low computer literacy, and potential impediments to the clinics’ daily workflow. Expanding patients’ use of an online decision aid that is both easy to access and understand and that is utilized by patients outside of the office visit was described as a potentially efficient means for soliciting patients’ screening preferences. Participants described that a system to link the online decision aid to a computerized reminder system could promote a better understanding of patients’ screening preferences, though some expressed concern that such a system could be difficult to keep up and running. Conclusions Community practice clinicians and staff perceived the Web-based decision aid technology as promising but raised questions as to how the technology and resultant information would be integrated into their daily practice workflow. Additional research investigating how to best implement online decision aids should be conducted prior to the widespread adoption of such technology so as to maximize the benefits of the technology while minimizing workflow disruptions. PMID:24351420
Applying operations research to optimize a novel population management system for cancer screening.
Zai, Adrian H; Kim, Seokjin; Kamis, Arnold; Hung, Ken; Ronquillo, Jeremiah G; Chueh, Henry C; Atlas, Steven J
2014-02-01
To optimize a new visit-independent, population-based cancer screening system (TopCare) by using operations research techniques to simulate changes in patient outreach staffing levels (delegates, navigators), modifications to user workflow within the information technology (IT) system, and changes in cancer screening recommendations. TopCare was modeled as a multiserver, multiphase queueing system. Simulation experiments implemented the queueing network model following a next-event time-advance mechanism, in which systematic adjustments were made to staffing levels, IT workflow settings, and cancer screening frequency in order to assess their impact on overdue screenings per patient. TopCare reduced the average number of overdue screenings per patient from 1.17 at inception to 0.86 during simulation to 0.23 at steady state. Increases in the workforce improved the effectiveness of TopCare. In particular, increasing the delegate or navigator staff level by one person improved screening completion rates by 1.3% or 12.2%, respectively. In contrast, changes in the amount of time a patient entry stays on delegate and navigator lists had little impact on overdue screenings. Finally, lengthening the screening interval increased efficiency within TopCare by decreasing overdue screenings at the patient level, resulting in a smaller number of overdue patients needing delegates for screening and a higher fraction of screenings completed by delegates. Simulating the impact of changes in staffing, system parameters, and clinical inputs on the effectiveness and efficiency of care can inform the allocation of limited resources in population management.
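The queueing model described here can be prototyped with a next-event time-advance loop. The sketch below simulates a single M/M/c stage (a very reduced stand-in for the multiserver, multiphase TopCare model; rates and staffing levels are arbitrary, not values from the study):

```python
import heapq
import random

def simulate_mmc(arrival_rate, service_rate, servers, horizon, seed=7):
    """Next-event time-advance simulation of an M/M/c queue.

    Returns the time-averaged number of tasks waiting, a rough analogue of
    overdue screenings pending per staff pool.
    """
    random.seed(seed)
    t, busy, queue_len, area_queue = 0.0, 0, 0, 0.0
    events = [(random.expovariate(arrival_rate), "arrival")]   # (time, kind) heap
    while events:
        time, kind = heapq.heappop(events)
        if time > horizon:
            break
        area_queue += queue_len * (time - t)   # integrate queue length over time
        t = time
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
            if busy < servers:
                busy += 1
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
            else:
                queue_len += 1
        else:  # departure: start the next waiting task, if any
            if queue_len > 0:
                queue_len -= 1
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
            else:
                busy -= 1
    return area_queue / t

for staff in (3, 4):
    print(staff, "staff ->", round(simulate_mmc(4.0, 1.5, staff, 10000.0), 2), "waiting on average")
```

In this toy setting, adding one server noticeably reduces the average backlog, which mirrors the qualitative staffing result reported above.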
Scholz, Christoph; Knorr, Sabine; Hamacher, Kay; Schmidt, Boris
2015-02-23
The formation of a covalent bond with the target is essential for a number of successful drugs, yet tools for covalent docking without significant restrictions regarding warhead or receptor classes are rare and limited in use. In this work we present DOCKTITE, a highly versatile workflow for covalent docking in the Molecular Operating Environment (MOE) combining automated warhead screening, nucleophilic side chain attachment, pharmacophore-based docking, and a novel consensus scoring approach. The comprehensive validation study includes pose predictions of 35 protein/ligand complexes, which resulted in a mean RMSD of 1.74 Å and a prediction rate of 71.4% with an RMSD below 2 Å, a virtual screening with an area under the receiver operating characteristic (ROC) curve of 0.81, and a significant correlation between predicted and experimental binding affinities (ρ = 0.806, R² = 0.649, p < 0.005).
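The validation metrics quoted above (ROC AUC for the virtual screen, rank correlation between predicted and experimental affinities) are standard and can be computed as in the sketch below (all numbers are placeholders, not the DOCKTITE validation data):

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

# Placeholder results for an imaginary covalent-docking validation set.
is_active     = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 0])
docking_score = np.array([9.1, 8.4, 7.9, 6.2, 5.8, 7.5, 4.9, 8.8, 5.1, 6.0])

predicted_pAff = np.array([7.2, 6.8, 6.1, 5.9, 5.2])
measured_pAff  = np.array([7.5, 6.5, 6.3, 5.4, 5.6])

auc = roc_auc_score(is_active, docking_score)          # enrichment of actives over decoys
rho, pval = spearmanr(predicted_pAff, measured_pAff)   # rank correlation of affinities

print(f"ROC AUC = {auc:.2f}; Spearman rho = {rho:.2f} (p = {pval:.3f})")
```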
A Workflow for Identifying Metabolically Active Chemicals to Complement in vitro Toxicity Screening
The new paradigm of toxicity testing approaches involves rapid screening of thousands of chemicals across hundreds of biological targets through use of in vitro assays. Such assays may lead to false negatives when the complex metabolic processes that render a chemical bioactive i...
Applying operations research to optimize a novel population management system for cancer screening
Zai, Adrian H; Kim, Seokjin; Kamis, Arnold; Hung, Ken; Ronquillo, Jeremiah G; Chueh, Henry C; Atlas, Steven J
2014-01-01
Objective To optimize a new visit-independent, population-based cancer screening system (TopCare) by using operations research techniques to simulate changes in patient outreach staffing levels (delegates, navigators), modifications to user workflow within the information technology (IT) system, and changes in cancer screening recommendations. Materials and methods TopCare was modeled as a multiserver, multiphase queueing system. Simulation experiments implemented the queueing network model following a next-event time-advance mechanism, in which systematic adjustments were made to staffing levels, IT workflow settings, and cancer screening frequency in order to assess their impact on overdue screenings per patient. Results TopCare reduced the average number of overdue screenings per patient from 1.17 at inception to 0.86 during simulation to 0.23 at steady state. Increases in the workforce improved the effectiveness of TopCare. In particular, increasing the delegate or navigator staff level by one person improved screening completion rates by 1.3% or 12.2%, respectively. In contrast, changes in the amount of time a patient entry stays on delegate and navigator lists had little impact on overdue screenings. Finally, lengthening the screening interval increased efficiency within TopCare by decreasing overdue screenings at the patient level, resulting in a smaller number of overdue patients needing delegates for screening and a higher fraction of screenings completed by delegates. Conclusions Simulating the impact of changes in staffing, system parameters, and clinical inputs on the effectiveness and efficiency of care can inform the allocation of limited resources in population management. PMID:24043318
The Diabetic Retinopathy Screening Workflow: Potential for Smartphone Imaging.
Bolster, Nigel M; Giardini, Mario E; Bastawrous, Andrew
2015-11-23
Complications of diabetes mellitus, namely diabetic retinopathy and diabetic maculopathy, are the leading cause of blindness in working aged people. Sufferers can avoid blindness if identified early via retinal imaging. Systematic screening of the diabetic population has been shown to greatly reduce the prevalence and incidence of blindness within the population. Many national screening programs have digital fundus photography as their basis. In the past 5 years several techniques and adapters have been developed that allow digital fundus photography to be performed using smartphones. We review recent progress in smartphone-based fundus imaging and discuss its potential for integration into national systematic diabetic retinopathy screening programs. Some systems have produced promising initial results with respect to their agreement with reference standards. However further multisite trialling of such systems' use within implementable screening workflows is required if an evidence base strong enough to affect policy change is to be established. If this were to occur national diabetic retinopathy screening would, for the first time, become possible in low- and middle-income settings where cost and availability of trained eye care personnel are currently key barriers to implementation. As diabetes prevalence and incidence is increasing sharply in these settings, the impact on global blindness could be profound. © 2015 Diabetes Technology Society.
NASA Astrophysics Data System (ADS)
Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes
2002-05-01
If new technology is introduced into medical practice, it must prove to make a difference. However, traditional approaches to outcome analysis have failed to show a direct benefit of PACS on patient care, and economic benefits are still in debate. A participatory process analysis was performed to compare workflow in a film-based hospital and a PACS environment. This included direct observation of work processes, interviews of involved staff, structural analysis, and discussion of observations with staff members. After definition of common structures, strong and weak workflow steps were evaluated. With a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting with simultaneous image access for ICU physicians and radiologists, archiving of images, and image and report distribution. However, PACS alone is not able to cover the complete process of 'radiography for intensive care' from ordering of an image to provision of the final product (image plus report). Interference of electronic workflow with analogue process steps such as paper-based ordering reduces the potential benefits of PACS. In this regard, workflow modeling proved to be very helpful for the evaluation of complex work processes linking radiology and the ICU.
NASA Astrophysics Data System (ADS)
Huang, Yining; Salinas, Nichole D.; Chen, Edwin; Tolia, Niraj H.; Gross, Michael L.
2017-09-01
Plasmodium vivax Duffy Binding Protein (PvDBP) is a promising vaccine candidate for P. vivax malaria. Recently, we reported the epitopes on PvDBP region II (PvDBP-II) for three inhibitory monoclonal antibodies (2D10, 2H2, and 2C6). In this communication, we describe the combination of native mass spectrometry and ion mobility (IM) with collision-induced unfolding (CIU) to study the conformation and stabilities of three malarial antigen-antibody complexes. These complexes, when collisionally activated, undergo conformational changes that depend on the location of the epitope. CIU patterns for PvDBP-II in complex with antibodies 2D10 and 2H2 are highly similar, indicating comparable binding topology and stability. A different CIU fingerprint is observed for PvDBP-II/2C6, indicating that 2C6 binds to PvDBP-II at an epitope different from 2D10 and 2H2. This work supports the use of CIU as a means of classifying antigen-antibody complexes by their epitope maps in a high-throughput screening workflow.
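Classifying complexes by CIU behavior comes down to comparing fingerprints, i.e., intensity maps over collision voltage and drift time. A toy comparison by root-mean-square difference of normalized fingerprints is sketched below (illustrative only; dedicated CIU analysis software uses more sophisticated statistics):

```python
import numpy as np

def normalize(fingerprint):
    """Scale each collision-voltage slice to unit maximum intensity."""
    fp = np.asarray(fingerprint, dtype=float)
    maxima = fp.max(axis=1, keepdims=True)
    maxima[maxima == 0] = 1.0                 # avoid division by zero on empty slices
    return fp / maxima

def ciu_difference(fp_a, fp_b):
    """Root-mean-square difference between two normalized CIU fingerprints."""
    a, b = normalize(fp_a), normalize(fp_b)
    return float(np.sqrt(np.mean((a - b) ** 2)))

rng = np.random.default_rng(3)
base = rng.random((20, 50))                   # 20 collision voltages x 50 drift-time bins
similar   = base + rng.normal(0, 0.02, base.shape)
different = rng.random((20, 50))

print("similar pair   :", round(ciu_difference(base, similar), 3))
print("different pair :", round(ciu_difference(base, different), 3))
```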
Flexible End2End Workflow Automation of Hit-Discovery Research.
Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin
2014-08-01
The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses (whether automated or performed manually, and independently of the organizational unit in which they run) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). The approach is based on the recent standardization of the process-modeling notation Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together, with manifold modern methods, technologies, and a wide range of automated instruments, for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.
Le-Thi-Thu, Huong; Casanola-Martín, Gerardo M; Marrero-Ponce, Yovani; Rescigno, Antonio; Abad, Concepcion; Khan, Mahmud Tareq Hassan
2014-01-01
Tyrosinase is a bifunctional, copper-containing enzyme widely distributed across the phylogenetic tree. This enzyme is involved in the production of melanin and some other pigments in humans, animals, and plants, including skin pigmentation in mammals and the browning process in plants and vegetables. Inhibitors of this enzyme have therefore attracted the attention of the scientific community, owing to their broad applications in the food, cosmetic, agricultural, and medicinal fields for avoiding the undesirable effects of abnormal melanin overproduction. However, the search for novel chemicals with antityrosinase activity demands more efficient tools to speed up the tyrosinase inhibitor discovery process. This chapter focuses on the different components of a predictive modeling workflow for the identification and prioritization of potential new compounds with activity against the tyrosinase enzyme. Two chemical structure libraries, the Spectrum Collection and DrugBank, are used in this attempt to combine different virtual screening data mining techniques in a sequential manner, helping to avoid the usually expensive and time-consuming traditional methods. The sequential steps summarized here comprise the use of drug-likeness filters, similarity searching, classification and potency QSAR multiclassifier systems, modeling of molecular interaction systems, and similarity/diversity analysis. Finally, the methodologies presented here provide a rational workflow for virtual screening hit analysis and selection as a promising drug discovery strategy for use in the target identification phase.
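As a concrete illustration of the first step in such a workflow, the sketch below applies a rule-of-five style drug-likeness filter with RDKit. The SMILES entries are placeholder examples, not compounds from the chapter's libraries, and the "at most one violation" policy is an assumption rather than the chapter's exact criterion:

    from rdkit import Chem
    from rdkit.Chem import Descriptors, Lipinski

    def passes_drug_likeness(smiles):
        """Rule-of-five style filter: allow at most one violation."""
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return False
        violations = sum([
            Descriptors.MolWt(mol) > 500,
            Descriptors.MolLogP(mol) > 5,
            Lipinski.NumHDonors(mol) > 5,
            Lipinski.NumHAcceptors(mol) > 10,
        ])
        return violations <= 1

    # Placeholder library entries; a real run would read the Spectrum Collection
    # or DrugBank structure files instead.
    library = {
        "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
        "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
    }
    print([name for name, smi in library.items() if passes_drug_likeness(smi)])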
ToxCast Data Generation: Chemical Workflow
This page describes the process EPA follows to select, procure, register, quality-review, and prepare chemicals for high-throughput screening.
Grant, Richard John; Roberts, Karen; Pointon, Carly; Hodgson, Clare; Womersley, Lynsey; Jones, Darren Craig; Tang, Eric
2009-06-01
Compound handling is a fundamental and critical step in compound screening throughout the drug discovery process. Although most compound-handling processes within compound management facilities use 100% DMSO solvent, conventional manual or robotic liquid-handling systems in screening workflows often perform dilutions in aqueous solutions to maintain the solvent tolerance of the biological assay. However, the use of aqueous media in these applications can lead to suboptimal data quality due to compound carryover or precipitation during the dilution steps. In cell-based assays, this effect is worsened by the unpredictable physical characteristics of compounds and the low DMSO tolerance within the assay. In some cases, the conventional approaches using manual or automated liquid handling resulted in variable IC50 dose responses. This study examines the cause of this variability and evaluates the accuracy of screening data in these case studies. A number of liquid-handling options have been explored to address the issues and establish a generic compound-handling workflow to support cell-based screening across our screening functions. The authors discuss the validation of the Labcyte Echo reformatter as an effective noncontact solution for generic compound-handling applications against diverse compound classes using triple-quadrupole liquid chromatography/mass spectrometry. The successful validation of this technology, and the challenges of implementing it for direct dosing onto cells in cell-based screening, are discussed.
Usability Testing of a National Substance Use Screening Tool Embedded in Electronic Health Records.
Press, Anne; DeStio, Catherine; McCullagh, Lauren; Kapoor, Sandeep; Morley, Jeanne; Conigliaro, Joseph
2016-07-08
Screening, brief intervention, and referral to treatment (SBIRT) is currently being implemented into health systems nationally via paper and electronic methods. The purpose of this study was to evaluate the integration of an electronic SBIRT tool into an existing paper-based SBIRT clinical workflow in a patient-centered medical home. Usability testing was conducted in an academic ambulatory clinic. Two rounds of usability testing were done with medical office assistants (MOAs) using a paper and an electronic version of the SBIRT tool, with two and four participants, respectively. Qualitative and quantitative data were analyzed to determine the impact of both tools on clinical workflow. A second round of usability testing was done with the revised electronic version and compared with the first version. Workflow barriers cited in the first round of testing included that the electronic health record (EHR) tool was disruptive to patients' visits. In Round 2 of testing, MOAs reported favoring the electronic version due to the improved layout and the inclusion of an alert system embedded in the EHR. For example, using the System Usability Scale (SUS), MOAs reported a grade of "1" for the statement "I would like to use this system frequently" during the first round of testing but a "5" during the second round. The findings of this study highlight the importance of testing the usability of the various mediums through which health care screening tools are delivered. In the first round of testing, the electronic tool was reported as less user friendly, difficult to navigate, and time consuming. Many issues faced in the first generation of the tool were improved in the second generation after usability was evaluated. This study demonstrates how usability testing of an electronic SBIRT tool can help to identify challenges that can impact clinical workflow. However, a limitation of this study was the small sample size of MOAs that participated. The results may have been biased toward Northwell Health workers' perceptions of the SBIRT tool and their specific clinical workflow.
Stockwell, Simon R; Mittnacht, Sibylle
2014-12-16
Advances in understanding the control mechanisms governing the behavior of cells in adherent mammalian tissue culture models are becoming increasingly dependent on modes of single-cell analysis. Methods that deliver composite data reflecting the mean values of biomarkers from cell populations risk losing the subpopulation dynamics that reflect the heterogeneity of the studied biological system. In keeping with this, traditional approaches are being replaced by, or supported with, more sophisticated forms of cellular assay developed to allow assessment by high-content microscopy. These assays potentially generate large numbers of images of fluorescent biomarkers, which, with the aid of accompanying proprietary software packages, allow multi-parametric measurements per cell. However, the relatively high capital costs and overspecialization of many of these devices have prevented their accessibility to many investigators. Described here is a universally applicable workflow for the quantification of multiple fluorescent marker intensities from specific subcellular regions of individual cells, suitable for use with images from most fluorescent microscopes. Key to this workflow is the implementation of the freely available CellProfiler software(1) to distinguish individual cells in these images, segment them into defined subcellular regions, and deliver fluorescence marker intensity values specific to these regions. The extraction of individual cell intensity values from image data is the central purpose of this workflow and is illustrated with the analysis of control data from an siRNA screen for G1 checkpoint regulators in adherent human cells. However, the workflow presented here can be applied to the analysis of data from other means of cell perturbation (e.g., compound screens) and other forms of fluorescence-based cellular markers and thus should be useful for a wide range of laboratories.
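The per-cell measurement idea at the heart of this workflow can also be sketched outside CellProfiler. The minimal example below, using scikit-image on a synthetic two-channel field of view, segments nuclei from a DNA-stain channel and reports the mean marker intensity per cell; it illustrates the concept only and is not the published CellProfiler pipeline:

    import numpy as np
    from skimage.filters import gaussian, threshold_otsu
    from skimage.measure import label, regionprops

    def per_cell_intensities(nuclei_img, marker_img, min_area=50):
        """Segment nuclei and return (cell label, mean marker intensity) pairs."""
        smoothed = gaussian(nuclei_img, sigma=2)
        mask = smoothed > threshold_otsu(smoothed)
        labels = label(mask)
        return [(r.label, float(marker_img[labels == r.label].mean()))
                for r in regionprops(labels) if r.area >= min_area]

    # Synthetic two-channel field of view (DNA stain + fluorescent marker)
    rng = np.random.default_rng(0)
    dna = rng.random((512, 512))
    marker = rng.random((512, 512))
    print(per_cell_intensities(dna, marker)[:5])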
Javan Amoli, Amir Hossein; Maserat, Elham; Safdari, Reza; Zali, Mohammad Reza
2015-01-01
Decision-making modalities for screening for many cancer conditions and different stages have become increasingly complex. Computer-based risk assessment systems facilitate scheduling and decision making and support the delivery of cancer screening services. The aim of this article was to survey an electronic risk assessment system as an appropriate tool for the prevention of cancer. A qualitative design was used involving 21 face-to-face interviews. Interviewing involved asking questions of, and getting answers from, managers of cancer screening programs. Of the participants, 6 were female and 15 were male, and ages ranged from 32 to 78 years. The study was based on a grounded theory approach, and the instrument was a semi-structured interview. The researchers studied 5 dimensions: electronic guideline standards for colorectal cancer screening, workflow of clinical and genetic activities, pathways of colorectal cancer screening, functionality of computer-based guidelines, and barriers. Electronic guideline standards for colorectal cancer screening were described in 3 categories: content standards, telecommunications and technical standards, and nomenclature and classification standards. According to the participants' views, workflow and genetic pathways of colorectal cancer screening were identified. The study demonstrated an effective role of computer-guided consultation in screening management. Electronic systems facilitate real-time decision making during a clinical interaction. Electronic pathways have been applied for clinical and genetic decision support, workflow management, update recommendations, and resource estimates. A suitable technical and clinical infrastructure is an integral part of a clinical practice guideline for screening. In conclusion, the necessity of architecture assessment and integration standards should be considered.
PinAPL-Py: A comprehensive web-application for the analysis of CRISPR/Cas9 screens.
Spahn, Philipp N; Bath, Tyler; Weiss, Ryan J; Kim, Jihoon; Esko, Jeffrey D; Lewis, Nathan E; Harismendy, Olivier
2017-11-20
Large-scale genetic screens using CRISPR/Cas9 technology have emerged as a major tool for functional genomics. With its increased popularity, experimental biologists frequently acquire large sequencing datasets for which they often do not have an easy analysis option. While a few bioinformatic tools have been developed for this purpose, their utility is still hindered either by limited functionality or by the requirement for bioinformatic expertise. To make sequencing data analysis of CRISPR/Cas9 screens more accessible to a wide range of scientists, we developed Platform-independent Analysis of Pooled Screens using Python (PinAPL-Py), which is operated as an intuitive web service. PinAPL-Py implements state-of-the-art tools and statistical models, assembled in a comprehensive workflow covering sequence quality control, automated sgRNA sequence extraction, alignment, sgRNA enrichment/depletion analysis, and gene ranking. The workflow is set up to use a variety of popular sgRNA libraries as well as custom libraries that can be easily uploaded. Various analysis options are offered, suitable for analyzing a large variety of CRISPR/Cas9 screening experiments. Analysis output includes ranked lists of sgRNAs and genes, and publication-ready plots. PinAPL-Py helps to advance genome-wide screening efforts by combining comprehensive functionality with user-friendly implementation. PinAPL-Py is freely accessible at http://pinapl-py.ucsd.edu with instructions and test datasets.
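The enrichment/depletion step of such a workflow can be illustrated with a minimal sketch: normalize sgRNA read counts, compute log2 fold changes with a pseudocount, and collapse to a per-gene ranking. The counts below are made up, and the statistics are deliberately simpler than the models PinAPL-Py actually implements:

    import numpy as np
    import pandas as pd

    # Hypothetical read-count table: one row per sgRNA, with control and
    # treatment counts (PinAPL-Py derives these from the FASTQ files).
    counts = pd.DataFrame({
        "sgRNA": ["g1_sg1", "g1_sg2", "g2_sg1", "g2_sg2"],
        "gene":  ["GENE1", "GENE1", "GENE2", "GENE2"],
        "ctrl":  [1200, 950, 400, 380],
        "treat": [150, 110, 420, 400],
    })

    # Normalize to reads-per-million, then rank by log2 fold change.
    for col in ("ctrl", "treat"):
        counts[col + "_rpm"] = counts[col] / counts[col].sum() * 1e6
    counts["log2fc"] = np.log2((counts["treat_rpm"] + 1) / (counts["ctrl_rpm"] + 1))

    # Collapse sgRNA-level scores to a simple per-gene ranking (median log2FC).
    print(counts.groupby("gene")["log2fc"].median().sort_values())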
First-principles data-driven discovery of transition metal oxides for artificial photosynthesis
NASA Astrophysics Data System (ADS)
Yan, Qimin
We develop a first-principles data-driven approach for rapid identification of transition metal oxide (TMO) light absorbers and photocatalysts for artificial photosynthesis using the Materials Project. Initially focusing on Cr-, V-, and Mn-based ternary TMOs in the database, we design a broadly applicable multiple-layer screening workflow automating density functional theory (DFT) and hybrid functional calculations of bulk and surface electronic and magnetic structures. We further assess the electrochemical stability of TMOs in aqueous environments from computed Pourbaix diagrams. Several promising earth-abundant low band-gap TMO compounds with desirable band edge energies and electrochemical stability are identified by our computational efforts and then synergistically evaluated using high-throughput synthesis and photoelectrochemical screening techniques by our experimental collaborators at Caltech. Our joint theory-experiment effort has successfully identified new earth-abundant copper and manganese vanadate complex oxides that meet highly demanding requirements for photoanodes, substantially expanding the known space of such materials. By integrating theory and experiment, we validate our approach and develop important new insights into structure-property relationships for TMO oxygen evolution photocatalysts, paving the way for the use of first-principles data-driven techniques in future applications. This work is supported by the Materials Project Predictive Modeling Center and the Joint Center for Artificial Photosynthesis through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231. Computational resources were also provided by the Department of Energy through the National Energy Supercomputing Center.
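A highly simplified sketch of the screening step described above is given below: candidate oxides are filtered by computed band gap and band-edge positions relative to the water redox levels. All formulas, energies, and thresholds in the sketch are illustrative placeholders rather than Materials Project results:

    # Approximate water redox levels versus vacuum at pH 0 (illustrative constants).
    H2_EVOLUTION = -4.44   # eV
    O2_EVOLUTION = -5.67   # eV

    # Placeholder candidate records; a real workflow would pull DFT/hybrid results
    # from the Materials Project database instead.
    candidates = [
        {"formula": "CuV2O6", "gap": 1.9, "cbm": -4.2, "vbm": -6.1},
        {"formula": "MnV2O6", "gap": 1.7, "cbm": -4.3, "vbm": -6.0},
        {"formula": "Cr2O3",  "gap": 3.4, "cbm": -3.9, "vbm": -7.3},
    ]

    def promising_absorber(c):
        # Visible-light gap whose band edges straddle the water redox levels.
        return (1.2 <= c["gap"] <= 2.5
                and c["cbm"] > H2_EVOLUTION
                and c["vbm"] < O2_EVOLUTION)

    print([c["formula"] for c in candidates if promising_absorber(c)])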
MouseNet database: digital management of a large-scale mutagenesis project.
Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M
2000-07-01
The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet(c), runs on a Sybase platform and will ultimately store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet(c) will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet(c) will consist of three major software components: the Animal Management System (AMS), the Sample Tracking System (STS), and the Result Documentation System (RDS). MouseNet(c) provides the following major advantages: it is accessible from different client platforms via the Internet; it is a full-featured multi-user system (including access restriction and data locking mechanisms); it relies on a professional RDBMS (relational database management system) running on a UNIX server platform; and it supplies workflow functions and a variety of plausibility checks.
2017-01-01
Computational screening is a method to prioritize small-molecule compounds based on the structural and biochemical attributes built from ligand and target information. Previously, we have developed a scalable virtual screening workflow to identify novel multitarget kinase/bromodomain inhibitors. In the current study, we identified several novel N-[3-(2-oxo-pyrrolidinyl)phenyl]-benzenesulfonamide derivatives that scored highly in our ensemble docking protocol. We quantified the binding affinity of these compounds for BRD4(BD1) biochemically and generated cocrystal structures, which were deposited in the Protein Data Bank. As the docking poses obtained in the virtual screening pipeline did not align with the experimental cocrystal structures, we evaluated the predictions of their precise binding modes by performing molecular dynamics (MD) simulations. The MD simulations closely reproduced the experimentally observed protein–ligand cocrystal binding conformations and interactions for all compounds. These results suggest a computational workflow to generate experimental-quality protein–ligand binding models, overcoming limitations of docking results due to receptor flexibility and incomplete sampling, as a useful starting point for the structure-based lead optimization of novel BRD4(BD1) inhibitors. PMID:28884163
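Agreement between a predicted binding pose and the cocrystal ligand, as evaluated above, is commonly summarized as a heavy-atom RMSD. A minimal sketch follows, assuming the two coordinate arrays are already aligned and list the same atoms in the same order; the coordinates shown are placeholders, not data from the study:

    import numpy as np

    def heavy_atom_rmsd(pred_xyz, xtal_xyz):
        """RMSD (same units as the inputs, typically angstroms) between a predicted
        ligand pose and the cocrystal ligand, atom-for-atom."""
        diff = np.asarray(pred_xyz, dtype=float) - np.asarray(xtal_xyz, dtype=float)
        return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

    # Hypothetical 3-atom example; real comparisons would use the full ligand.
    pred = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.4, 0.0]]
    xtal = [[0.1, 0.0, 0.1], [1.4, 0.1, 0.0], [1.6, 1.3, 0.2]]
    print(heavy_atom_rmsd(pred, xtal))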
Anderson, Ericka L.; Li, Weizhong; Klitgord, Niels; Highlander, Sarah K.; Dayrit, Mark; Seguritan, Victor; Yooseph, Shibu; Biggs, William; Venter, J. Craig; Nelson, Karen E.; Jones, Marcus B.
2016-01-01
As reports on possible associations between microbes and the host increase in number, more meaningful interpretations of this information require an ability to compare data sets across studies. This is dependent upon standardization of workflows to ensure comparability both within and between studies. Here we propose the standard use of an alternate collection and stabilization method that would facilitate such comparisons. The DNA Genotek OMNIgene∙Gut Stool Microbiome Kit was compared to the currently accepted community standard of freezing to store human stool samples prior to whole genome sequencing (WGS) for microbiome studies. This stabilization and collection device allows for ambient temperature storage, automation, and ease of shipping/transfer of samples. The device permitted the same data reproducibility as with frozen samples, and yielded higher recovery of nucleic acids. Collection and stabilization of stool microbiome samples with the DNA Genotek collection device, combined with our extraction and WGS, provides a robust, reproducible workflow that enables standardized global collection, storage, and analysis of stool for microbiome studies. PMID:27558918
Grandjean, Geoffrey; Graham, Ryan; Bartholomeusz, Geoffrey
2011-11-01
In recent years high throughput screening operations have become a critical application in functional and translational research. Although a seemingly unmanageable amount of data is generated by these high-throughput, large-scale techniques, through careful planning, an effective Laboratory Information Management System (LIMS) can be developed and implemented in order to streamline all phases of a workflow. Just as important as data mining and analysis procedures at the end of complex processes is the tracking of individual steps of applications that generate such data. Ultimately, the use of a customized LIMS will enable users to extract meaningful results from large datasets while trusting the robustness of their assays. To illustrate the design of a custom LIMS, this practical example is provided to highlight the important aspects of the design of a LIMS to effectively modulate all aspects of an siRNA screening service. This system incorporates inventory management, control of workflow, data handling and interaction with investigators, statisticians and administrators. All these modules are regulated in a synchronous manner within the LIMS. © 2011 Bentham Science Publishers
Cloke, Jonathan; Matheny, Sharon; Swimley, Michelle; Tebbs, Robert; Burrell, Angelia; Flannery, Jonathan; Bastin, Benjamin; Bird, Patrick; Benzinger, M Joseph; Crowley, Erin; Agin, James; Goins, David; Salfinger, Yvonne; Brodsky, Michael; Fernandez, Maria Cristina
2016-11-01
The Applied Biosystems™ RapidFinder™ STEC Detection Workflow (Thermo Fisher Scientific) is a complete protocol for the rapid qualitative detection of Escherichia coli (E. coli) O157:H7 and the "Big 6" non-O157 Shiga-like toxin-producing E. coli (STEC) serotypes (defined as serogroups: O26, O45, O103, O111, O121, and O145). The RapidFinder STEC Detection Workflow makes use of either the automated preparation of PCR-ready DNA using the Applied Biosystems PrepSEQ™ Nucleic Acid Extraction Kit in conjunction with the Applied Biosystems MagMAX™ Express 96-well magnetic particle processor or the Applied Biosystems PrepSEQ Rapid Spin kit for manual preparation of PCR-ready DNA. Two separate assays comprise the RapidFinder STEC Detection Workflow, the Applied Biosystems RapidFinder STEC Screening Assay and the Applied Biosystems RapidFinder STEC Confirmation Assay. The RapidFinder STEC Screening Assay includes primers and probes to detect the presence of stx1 (Shiga toxin 1), stx2 (Shiga toxin 2), eae (intimin), and E. coli O157 gene targets. The RapidFinder STEC Confirmation Assay includes primers and probes for the "Big 6" non-O157 STEC and E. coli O157:H7. The use of these two assays in tandem allows a user to detect accurately the presence of the "Big 6" STECs and E. coli O157:H7. The performance of the RapidFinder STEC Detection Workflow was evaluated in a method comparison study, in inclusivity and exclusivity studies, and in a robustness evaluation. The assays were compared to the U.S. Department of Agriculture (USDA), Food Safety and Inspection Service (FSIS) Microbiology Laboratory Guidebook (MLG) 5.09: Detection, Isolation and Identification of Escherichia coli O157:H7 from Meat Products and Carcass and Environmental Sponges for raw ground beef (73% lean) and USDA/FSIS-MLG 5B.05: Detection, Isolation and Identification of Escherichia coli non-O157:H7 from Meat Products and Carcass and Environmental Sponges for raw beef trim. No statistically significant differences were observed between the reference method and the individual or combined kits forming the candidate assay using either of the DNA preparation kits (manual or automated extraction). For the inclusivity and exclusivity evaluation, the RapidFinder STEC Detection Workflow, comprising both RapidFinder STEC screening and confirmation kits, correctly identified all 50 target organism isolates and correctly excluded all 30 nontarget strains for both of the assays evaluated. The results of these studies demonstrate the sensitivity and selectivity of the RapidFinder STEC Detection Workflow for the detection of E. coli O157:H7 and the "Big 6" STEC serotypes in both raw ground beef and beef trim. The robustness testing demonstrated that minor variations in the method parameters did not impact the accuracy of the assay and highlighted the importance of following the correct incubation temperatures.
Options in virtual 3D, optical-impression-based planning of dental implants.
Reich, Sven; Kern, Thomas; Ritter, Lutz
2014-01-01
If a 3D radiograph, which in today's dentistry often consists of a CBCT dataset, is available for computerized implant planning, the 3D planning should also consider functional prosthetic aspects. In a conventional workflow, the CBCT is acquired with a specially produced radiopaque prosthetic setup that makes the desired prosthetic situation visible during virtual implant planning. If an exclusively digital workflow is chosen, intraoral digital impressions are taken. On these digital models, the desired prosthetic suprastructures are designed. The datasets are virtually superimposed by a "registration" process on the corresponding structures (teeth) in the CBCT. Thus, both the osseous and prosthetic structures are visible in one single 3D application, making it possible to consider surgical and prosthetic aspects together. After the implant positions have been determined on the computer screen, a drilling template is designed digitally. According to this design (CAD), a template is printed or milled in a CAM process. This template is the first physically extant product in the entire workflow. The article discusses the options and limitations of this workflow.
A Workflow to Investigate Exposure and Pharmacokinetic ...
Background: Adverse outcome pathways (AOPs) link adverse effects in individuals or populations to a molecular initiating event (MIE) that can be quantified using in vitro methods. Practical application of AOPs in chemical-specific risk assessment requires incorporation of knowledge on exposure, along with absorption, distribution, metabolism, and excretion (ADME) properties of chemicals.Objectives: We developed a conceptual workflow to examine exposure and ADME properties in relation to an MIE. The utility of this workflow was evaluated using a previously established AOP, acetylcholinesterase (AChE) inhibition.Methods: Thirty chemicals found to inhibit human AChE in the ToxCast™ assay were examined with respect to their exposure, absorption potential, and ability to cross the blood–brain barrier (BBB). Structures of active chemicals were compared against structures of 1,029 inactive chemicals to detect possible parent compounds that might have active metabolites.Results: Application of the workflow screened out 10 “low-priority” chemicals from the 30 active chemicals. Fifty-two of the 1,029 inactive chemicals exhibited a similarity of ≥ 75% with their nearest active neighbors. Of these 52 compounds, 30 were excluded due to poor absorption or distribution. The remaining 22 compounds may inhibit AChE in vivo either directly or as a result of metabolic activation.Conclusions: The incorporation of exposure and ADME properties into the conceptual workflow e
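The structural-similarity step described above can be sketched with open-source cheminformatics tools: compute Morgan fingerprints and flag inactive chemicals whose Tanimoto similarity to the nearest active is at or above the 75% threshold. The structures below are illustrative organophosphate examples (paraoxon as an active AChE inhibitor, parathion as a parent that is metabolically activated), not the ToxCast chemicals from the study:

    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    def nearest_active_similarity(query_smiles, active_smiles):
        """Highest Tanimoto similarity between a query structure and a set of
        active structures, using Morgan (ECFP4-like) bit-vector fingerprints."""
        def fp(s):
            return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), 2, nBits=2048)
        query_fp = fp(query_smiles)
        return max(DataStructs.TanimotoSimilarity(query_fp, fp(s)) for s in active_smiles)

    actives = ["CCOP(=O)(OCC)Oc1ccc(cc1)[N+](=O)[O-]"]     # paraoxon-like active
    inactive_parent = "CCOP(=S)(OCC)Oc1ccc(cc1)[N+](=O)[O-]"  # parathion-like parent
    if nearest_active_similarity(inactive_parent, actives) >= 0.75:
        print("flag for possible metabolic activation")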
Implementation of a fall screening program in a high risk of fracture population.
Ritchey, Katherine; Olney, Amanda; Shofer, Jane; Phelan, Elizabeth A; Matsumoto, Alvin M
2017-10-31
Fall prevention is an important way to prevent fractures in persons with osteoporosis. We developed and implemented a fall screening program in the context of routine osteoporosis care. This program was found to be feasible and showed that a significant proportion of persons with osteoporosis are at risk of falling. Falls are the most common cause of fracture in persons with osteoporosis. However, osteoporosis care rarely includes assessment and prevention of falling. We thus sought to assess the feasibility of a fall screening and management program integrated into routine osteoporosis care. The program was developed and offered to patients with osteoporosis or osteopenia seen at an outpatient clinic between May 2015 and May 2016. Feasibility was measured by the physical therapist time required to conduct screening and the ease of integrating the screening program into the usual clinic workflow. Self-report responses and mobility testing were used to describe the fall and fracture risk profile of the osteoporosis patients screened. Effects on fall-related care processes were assessed via chart abstraction of patient participation in fall prevention exercise. Of the 154 patients who presented for a clinic visit, 68% met screening criteria, and screening was completed in two-thirds of these persons. Screening was completed in a third of the time typically allotted for traditional PT evaluations and did not interfere with clinic workflow. Forty percent of those screened reported falling in the last year, and over half of these had two or more falls in the past year. Over half reported a balance or lower extremity impairment, and over 40% were below norms on one or more performance tests. Most patients who selected a group exercise fall prevention program completed all sessions, while only a quarter completed either supervised or independent home-based programs. Implementation of a fall risk screening program in an outpatient osteoporosis clinic appears feasible. A substantial proportion of people with osteoporosis screened positive for being at risk of falling, justifying the integration of fall prevention into routine osteoporosis care.
Gough, Albert; Shun, Tongying; Taylor, D. Lansing; Schurdak, Mark
2016-01-01
Heterogeneity is well recognized as a common property of cellular systems that impacts biomedical research and the development of therapeutics and diagnostics. Several studies have shown that analysis of heterogeneity gives insight into mechanisms of action of perturbagens, can be used to predict optimal combination therapies, and can quantify heterogeneity in tumors, where heterogeneity is believed to be associated with adaptation and resistance. Cytometry methods, including high content screening (HCS), high throughput microscopy, flow cytometry, mass spectrometry imaging, and digital pathology, capture cell-level data for populations of cells. However, it is often assumed that the population response is normally distributed and therefore that the average adequately describes the results. A deeper understanding of the results of the measurements and more effective comparison of perturbagen effects require analysis that takes into account the distribution of the measurements, i.e., the heterogeneity. However, the reproducibility of heterogeneous data collected on different days, and in different plates/slides, has not previously been evaluated. Here we show that conventional assay quality metrics alone are not adequate for quality control of the heterogeneity in the data. To address this need, we demonstrate the use of the Kolmogorov-Smirnov statistic as a metric for monitoring the reproducibility of heterogeneity in an SAR screen and describe a workflow for quality control in heterogeneity analysis. One major challenge in high throughput biology is the evaluation and interpretation of heterogeneity in thousands of samples, such as compounds in a cell-based screen. In this study we also demonstrate that three previously reported heterogeneity indices capture the shapes of the distributions and provide a means to filter and browse big data sets of cellular distributions in order to compare and identify distributions of interest. These metrics and methods are presented as a workflow for the analysis of heterogeneity in large-scale biology projects. PMID:26476369
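A minimal sketch of the reproducibility check proposed above: compare the full per-cell distributions of the same control condition measured on two different days using the two-sample Kolmogorov-Smirnov statistic. The data are synthetic, and this is not the published heterogeneity indices or screen data:

    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(1)

    # Hypothetical per-cell measurements (e.g., marker intensity) for the same
    # control condition measured on two different days/plates.
    day1 = rng.lognormal(mean=1.00, sigma=0.5, size=2000)
    day2 = rng.lognormal(mean=1.05, sigma=0.5, size=2000)

    # The KS statistic D (maximum distance between the empirical CDFs) summarizes
    # reproducibility of the whole distribution, not just its mean.
    result = ks_2samp(day1, day2)
    print(f"D = {result.statistic:.3f}, p = {result.pvalue:.3g}")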
Petrides, Athena K; Tanasijevic, Milenko J; Goonan, Ellen M; Landman, Adam B; Kantartjis, Michalis; Bates, David W; Melanson, Stacy E F
2017-10-01
Recent U.S. government regulations incentivize implementation of an electronic health record (EHR) with computerized order entry and structured results display. Many institutions have also chosen to interface their EHR with their laboratory information system (LIS). Reported long-term benefits include increased efficiency and improved quality and safety. In order to successfully implement an interfaced EHR-LIS, institutions must plan years in advance and anticipate the impact of an integrated system. It can be challenging to fully understand the technical, workflow, and resource aspects and adequately prepare for a potentially protracted system implementation and the subsequent stabilization. We describe the top ten challenges that we encountered in our clinical laboratories following the implementation of an interfaced EHR-LIS and offer suggestions on how to overcome these challenges. This study was performed at a 777-bed, tertiary care center which recently implemented an interfaced EHR-LIS. Challenges were recorded during EHR-LIS implementation and stabilization, and the authors describe the top ten. Our top ten challenges were selection and harmonization of test codes, detailed training for providers on test ordering, communication with EHR provider champions during the build process, fluid orders and collections, supporting specialized workflows, sufficient reports and metrics, increased volume of inpatient venipunctures, adequate resources during stabilization, unanticipated changes to laboratory workflow, and ordering specimens for anatomic pathology. A few suggestions to overcome these challenges include regular meetings with clinical champions, advance consideration of the reports and metrics that will be needed, adequate training of laboratory staff on new workflows in the EHR, and defining all tests, including anatomic pathology, in the LIS. EHR-LIS implementations have many challenges, requiring institutions to adapt and develop new infrastructures. This article should be helpful to other institutions facing or undergoing a similar endeavor. Copyright © 2017 Elsevier B.V. All rights reserved.
Greco, Todd M.; Guise, Amanda J.; Cristea, Ileana M.
2016-01-01
In biological systems, proteins catalyze the fundamental reactions that underlie all cellular functions, including metabolic processes and cell survival and death pathways. These biochemical reactions are rarely accomplished alone. Rather, they involve a concerted effect from many proteins that may operate in a directed signaling pathway and/or may physically associate in a complex to achieve a specific enzymatic activity. Therefore, defining the composition and regulation of protein complexes is critical for understanding cellular functions. In this chapter, we describe an approach that uses quantitative mass spectrometry (MS) to assess the specificity and the relative stability of protein interactions. Isolation of protein complexes from mammalian cells is performed by rapid immunoaffinity purification, and followed by in-solution digestion and high-resolution mass spectrometry analysis. We employ complementary quantitative MS workflows to assess the specificity of protein interactions using label-free MS and statistical analysis, and the relative stability of the interactions using a metabolic labeling technique. For each candidate protein interaction, scores from the two workflows can be correlated to minimize nonspecific background and profile protein complex composition and relative stability. PMID:26867737
Anti-nuclear antibody screening using HEp-2 cells.
Buchner, Carol; Bryant, Cassandra; Eslami, Anna; Lakos, Gabriella
2014-06-23
The American College of Rheumatology position statement on ANA testing stipulates the use of IIF as the gold standard method for ANA screening(1). Although IIF is an excellent screening test in expert hands, the technical difficulties of processing and reading IIF slides--such as the labor-intensive slide processing, manual reading, the need for experienced, trained technologists, and the use of a darkroom--make the IIF method difficult to fit into the workflow of modern, automated laboratories. The first and crucial step towards high quality ANA screening is careful slide processing. This procedure is labor intensive and requires full understanding of the process, as well as attention to detail and experience. Slide reading is performed by fluorescence microscopy in darkrooms and is done by trained technologists who are familiar with the various patterns, in the context of the cell cycle and the morphology of interphase and dividing cells. Given that IIF is the first-line screening tool for SARD, understanding the steps to correctly perform this technique is critical. Recently, digital imaging systems have been developed for the automated reading of IIF slides. These systems, such as the NOVA View Automated Fluorescent Microscope, are designed to streamline the routine IIF workflow. NOVA View acquires and stores high resolution digital images of the wells, thereby separating image acquisition from interpretation; images are viewed and interpreted on high resolution computer monitors. It stores images for future reference and supports the operator's interpretation by providing fluorescent light intensity data on the images. It also preliminarily categorizes results as positive or negative and provides pattern recognition for positive samples. In summary, it eliminates the need for a darkroom, and automates and streamlines the IIF reading/interpretation workflow. Most importantly, it increases consistency between readers and readings. Moreover, with the use of barcoded slides, transcription errors are eliminated by providing sample traceability and positive patient identification. This results in increased patient data integrity and safety. The overall goal of this video is to demonstrate the IIF procedure, including slide processing, identification of common IIF patterns, and the introduction of new advancements to simplify and harmonize this technique.
Roth, Christopher J; Boll, Daniel T; Wall, Lisa K; Merkle, Elmar M
2010-08-01
The purpose of this investigation was to assess workflow for medical imaging studies, specifically comparing liver and knee MRI examinations by use of the Lean Six Sigma methodologic framework. The hypothesis tested was that the Lean Six Sigma framework can be used to quantify MRI workflow and to identify sources of inefficiency to target for sequence and protocol improvement. Audio-video interleave streams representing individual acquisitions were obtained with graphic user interface screen capture software in the examinations of 10 outpatients undergoing MRI of the liver and 10 outpatients undergoing MRI of the knee. With Lean Six Sigma methods, the audio-video streams were dissected into value-added time (true image data acquisition periods), business value-added time (time spent that provides no direct patient benefit but is requisite in the current system), and non-value-added time (scanner inactivity while awaiting manual input). For overall MRI table time, value-added time was 43.5% (range, 39.7-48.3%) of the time for liver examinations and 89.9% (range, 87.4-93.6%) for knee examinations. Business value-added time was 16.3% of the table time for the liver and 4.3% of the table time for the knee examinations. Non-value-added time was 40.2% of the overall table time for the liver and 5.8% for the knee examinations. Liver MRI examinations consume statistically significantly more non-value-added and business value-added time than do knee examinations, primarily because of respiratory command management and contrast administration. Workflow analyses and accepted inefficiency reduction frameworks can be applied with use of a graphic user interface screen capture program.
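The Lean Six Sigma breakdown above reduces, for each examination, to classifying recorded segments and computing each category's share of table time. A minimal sketch of that arithmetic, with made-up segment durations rather than the study's data:

    # Hypothetical segment log for one MRI examination: (duration in seconds, category),
    # where VA = value-added, BVA = business value-added, NVA = non-value-added.
    segments = [(620, "VA"), (140, "BVA"), (90, "NVA"), (480, "VA"), (210, "NVA")]

    table_time = sum(duration for duration, _ in segments)
    shares = {cat: 100 * sum(d for d, c in segments if c == cat) / table_time
              for cat in ("VA", "BVA", "NVA")}
    print({cat: f"{pct:.1f}%" for cat, pct in shares.items()})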
O'Connor, C; Kiernan, M G; Finnegan, C; O'Hara, M; Power, L; O'Connell, N H; Dunne, C P
2017-05-04
Rapid detection of patients with carbapenemase-producing Enterobacteriaceae (CPE) is essential for the prevention of nosocomial cross-transmission, the allocation of isolation facilities, and the protection of patient safety. Here, we aimed to design a new laboratory workflow, utilizing existing laboratory resources, in order to reduce the time-to-diagnosis of CPE. A review of the current CPE testing processes and of the literature was performed to identify a real-time commercial polymerase chain reaction (PCR) assay that could facilitate batch testing of CPE clinical specimens with adequate CPE gene coverage. Stool specimens (n = 210) were collected: 10 from CPE-positive inpatients and 200 anonymized community stool specimens. Rectal swabs (eSwab™) were inoculated from the collected stool specimens, and a manual DNA extraction method (QIAamp® DNA Stool Mini Kit) was employed. Extracted DNA was then processed on the Check-Direct CPE® assay. The three steps of making the eSwab™, extracting DNA manually, and running the Check-Direct CPE® assay took <5 min, 1 h 30 min, and 1 h 50 min, respectively. The workflow was time efficient, with a result available in under 4 h, comparing favourably with the existing method of CPE screening, which has an average time-to-diagnosis of 48 to 72 h. Utilizing this CPE workflow would allow a 'same-day' result. Antimicrobial susceptibility testing results would, as is current practice, remain a 'next-day' result. In conclusion, the Check-Direct CPE® assay was easily integrated into a local laboratory workflow and could accommodate a large volume of CPE screening specimens in a single batch, making it cost-effective and convenient for daily CPE testing.
Akeroyd, Michiel; Olsthoorn, Maurien; Gerritsma, Jort; Gutker-Vermaas, Diana; Ekkelkamp, Laurens; van Rij, Tjeerd; Klaassen, Paul; Plugge, Wim; Smit, Ed; Strupat, Kerstin; Wenzel, Thibaut; van Tilborg, Marcel; van der Hoeven, Rob
2013-03-10
In the discovery of new enzymes, genomic and cDNA expression libraries containing thousands of differential clones are generated to obtain biodiversity. These libraries need to be screened for the activity of interest. Removing so-called empty and redundant clones significantly reduces the size of these expression libraries and therefore speeds up new enzyme discovery. Here, we present a sensitive, generic workflow, based on mass spectrometry techniques, for high-throughput screening of successful microbial protein over-expression in microtiter plates containing a complex matrix. MALDI-LTQ-Orbitrap screening followed by principal component analysis and peptide mass fingerprinting was developed to obtain a throughput of ∼12,000 samples per week. Alternatively, a UHPLC-MS2 approach including MS2 protein identification was developed for microorganisms with a complex protein secretome, with a throughput of ∼2,000 samples per week. TCA-induced protein precipitation, enhanced by the addition of bovine serum albumin, is used for protein purification prior to MS detection. We show that this generic workflow can effectively reduce large expression libraries from fungi and bacteria to their minimal size by detecting successful protein over-expression using MS. Copyright © 2012 Elsevier B.V. All rights reserved.
Rapid analysis and exploration of fluorescence microscopy images.
Pavie, Benjamin; Rajaram, Satwik; Ouyang, Austin; Altschuler, Jason M; Steininger, Robert J; Wu, Lani F; Altschuler, Steven J
2014-03-19
Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard. Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, ensure response to perturbations and check reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first pass analysis for quality control, but also may be used as an end-to-end solution, especially for screening. The workflow described here scales to large data sets such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image based screens.
Steger, Julia; Arnhard, Kathrin; Haslacher, Sandra; Geiger, Klemens; Singer, Klaus; Schlapp, Michael; Pitterl, Florian; Oberacher, Herbert
2016-04-01
Forensic toxicology and environmental water analysis share a common interest in, and responsibility for, ensuring comprehensive and reliable confirmation of drugs and pharmaceutical compounds in the samples analyzed. Because the two fields deal with similar analytes, detection and identification techniques should be exchangeable between them. Herein, we demonstrate the successful adaptation of a forensic toxicological screening workflow, employing nontargeted LC/MS/MS under data-dependent acquisition control and subsequent database search, to water analysis. The main modification involved processing an increased sample volume with SPE (500 mL vs. 1-10 mL) to reach LODs in the low ng/L range. Tandem mass spectra acquired with a qTOF instrument were submitted to database search. The targeted data mining strategy was found to be sensitive and specific; the automated search produced hardly any false results. To demonstrate the applicability of the adapted workflow to complex samples, 14 wastewater effluent samples collected on seven consecutive days at the local wastewater-treatment plant were analyzed. Of the 88,970 fragment ion mass spectra produced, 8.8% were successfully assigned to one of the 1,040 reference compounds included in the database, enabling the identification of 51 compounds representing important illegal drugs, members of various pharmaceutical compound classes, and metabolites thereof. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
In Situ Target Engagement Studies in Adherent Cells.
Axelsson, Hanna; Almqvist, Helena; Otrocka, Magdalena; Vallin, Michaela; Lundqvist, Sara; Hansson, Pia; Karlsson, Ulla; Lundbäck, Thomas; Seashore-Ludlow, Brinton
2018-04-20
A prerequisite for successful drugs is effective binding of the desired target protein in the complex environment of a living system. Drug-target engagement has typically been difficult to monitor with current methods in physiologically relevant models, especially while maintaining spatial information. One recent technique for quantifying drug-target engagement is the cellular thermal shift assay (CETSA), in which ligand-induced protein stabilization is measured after a heat challenge. Here, we describe a CETSA protocol in live A431 cells for p38α (MAPK14), where the remaining soluble protein is detected in situ using high-content imaging in 384-well microtiter plates. We validate this assay concept using a number of known p38α inhibitors and further demonstrate the potential of this technology for chemical probe and drug discovery purposes by performing a small pilot screen for novel p38α binders. Importantly, this protocol creates a workflow that is amenable to adherent cells in their native state and yields spatially resolved target engagement information measurable at the single-cell level.
NASA Astrophysics Data System (ADS)
Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut
2017-04-01
Scientific communities generate complex simulations through the orchestration of semi-structured analysis pipelines, which involves the execution of large workflows on multiple, distributed, and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case: a workflow that requires the execution of a continuum ice flow model and a discrete element based calving model in an iterative manner. Apart from the model executions, this workflow also contains data format conversion tasks that link the ice flow and calving steps through sequential, nested, and iterative stages. Thus, the management and monitoring of all the processing tasks, including data management and transfer, become more complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. As more scripts or modifications were introduced to meet user requirements, debugging and validation of results became more cumbersome to achieve. To address these problems, we identified the need for a high-level scientific workflow tool through which all the above-mentioned processes can be handled in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for the coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high-level scientific workflow middleware makes the reproducibility of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements: This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
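The coupled iteration that the workflow automates can be sketched as a simple loop alternating the ice flow and calving steps with format conversions in between. The functions below are placeholder stubs standing in for the UNICORE-submitted HPC jobs and the converters; they are not the actual workflow implementation:

    def run_elmer_ice(geometry):          # continuum ice flow step (placeholder stub)
        return {"velocity_field": "...", "geometry": geometry}

    def convert_to_hidem(ice_state):      # mesh-to-particle conversion (placeholder stub)
        return {"particles_from": ice_state["geometry"]}

    def run_hidem(particle_input):        # discrete element calving step (placeholder stub)
        return {"calved_front": particle_input}

    def convert_to_elmer(calving_state):  # particle-to-mesh conversion (placeholder stub)
        return {"front": calving_state["calved_front"]}

    def couple_ice_flow_and_calving(initial_geometry, n_cycles=3):
        """Iteratively alternate ice flow and calving, feeding each result back."""
        geometry = initial_geometry
        for _ in range(n_cycles):
            ice_state = run_elmer_ice(geometry)
            calving_state = run_hidem(convert_to_hidem(ice_state))
            geometry = convert_to_elmer(calving_state)
        return geometry

    print(couple_ice_flow_and_calving({"front": "initial"}))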
Rapid lead discovery through iterative screening of one bead one compound libraries.
Gao, Yu; Amar, Sabrina; Pahwa, Sonia; Fields, Gregg; Kodadek, Thomas
2015-01-12
Primary hits that arise from screening one bead one compound (OBOC) libraries against a target of interest rarely have high potency. However, there has been little work focused on the development of an efficient workflow for primary hit improvement. In this study, we show that by characterizing the binding constants for all of the hits that arise from a screen, structure-activity relationship (SAR) data can be obtained to inform the design of "derivative libraries" of a primary hit that can then be screened under more demanding conditions to obtain improved compounds. Here, we demonstrate the rapid improvement of a primary hit against matrix metalloproteinase-14 using this approach.
Integrated fusion simulation with self-consistent core-pedestal coupling
Meneghini, O.; Snyder, P. B.; Smith, S. P.; ...
2016-04-20
In this study, accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile, and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly-coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium, and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization of the fusion performance of the 15 MA D-T ITER baseline scenario as a function of the pedestal density and ion effective charge Zeff.
Economic and workflow analysis of a blood bank automated system.
Shin, Kyung-Hwa; Kim, Hyung Hoi; Chang, Chulhun L; Lee, Eun Yup
2013-07-01
This study compared the estimated costs and times required for ABO/Rh(D) typing and unexpected antibody screening using an automated system and manual methods. The total cost included direct and labor costs. Labor costs were calculated on the basis of average operator salaries and unit values (in minutes), defined as the hands-on time required to test one sample. To estimate unit values, workflows were recorded on video, and the time required for each process was analyzed separately. The unit values of ABO/Rh(D) typing using the manual method were 5.65 and 8.1 min during regular and unsocial working hours, respectively. The unit value was less than 3.5 min when several samples were tested simultaneously. The unit value for unexpected antibody screening was 2.6 min. The unit values using the automated method for ABO/Rh(D) typing, unexpected antibody screening, and both simultaneously were all 1.5 min. The total cost of ABO/Rh(D) typing of only one sample using the automated analyzer was lower than that of testing only one sample using the manual technique but higher than that of testing several samples simultaneously. The total cost of unexpected antibody screening using an automated analyzer was less than that using the manual method. ABO/Rh(D) typing using an automated analyzer incurs a lower unit value and cost than the manual technique when only one sample is tested at a time. Unexpected antibody screening using an automated analyzer always incurs a lower unit value and cost than the manual technique.
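The cost model described above reduces to a sum of direct cost and labor cost, with labor cost given by the salary rate multiplied by the unit value. A minimal sketch with illustrative numbers only (the salary rates and direct costs below are not taken from the study):

    def total_cost_per_sample(direct_cost, salary_per_minute, unit_value_minutes):
        """Total cost = direct (reagent/consumable) cost + labor cost,
        where labor cost = salary rate x hands-on time (unit value)."""
        return direct_cost + salary_per_minute * unit_value_minutes

    # Manual ABO/Rh(D) typing of a single sample vs. the automated analyzer,
    # using the reported unit values (5.65 min and 1.5 min) and made-up costs.
    print(total_cost_per_sample(direct_cost=2.0, salary_per_minute=0.5, unit_value_minutes=5.65))
    print(total_cost_per_sample(direct_cost=3.5, salary_per_minute=0.5, unit_value_minutes=1.5))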
Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...
Workflow and Proof of Concept for Non-Targeted Analysis of Environmental Samples by LC-MS/MS
Human exposure encompasses thousands of chemicals acquired through various routes such as inhalation, ingestion, dermal contact, and indirect ingestion. Rapid assessment and screening of these chemicals is a difficult challenge facing EPA in its mission to protect pu...
Application of Functional Use Predictions to Aid in Structure ...
Humans are potentially exposed to thousands of anthropogenic chemicals in commerce. Recent work has shown that the bulk of this exposure may occur in near-field indoor environments (e.g., home, school, work, etc.). Advances in suspect screening analyses (SSA) now allow an improved understanding of the chemicals present in these environments. However, due to the nature of suspect screening techniques, investigators are often left with chemical formula predictions, with the possibility of many chemical structures matching to each formula. Here, newly developed quantitative structure-use relationship (QSUR) models are used to identify potential exposure sources for candidate structures. Previously, a suspect screening workflow was introduced and applied to house dust samples collected from the U.S. Department of Housing and Urban Development’s American Healthy Homes Survey (AHHS) [Rager, et al., Env. Int. 88 (2016)]. This workflow utilized the US EPA’s Distributed Structure-Searchable Toxicity (DSSTox) Database to link identified molecular features to molecular formulas, and ultimately chemical structures. Multiple QSUR models were applied to support the evaluation of candidate structures. These QSURs predict the likelihood of a chemical having a functional use commonly associated with consumer products having near-field use. For 3,228 structures identified as possible chemicals in AHHS house dust samples, we were able to obtain the required descriptors to appl
Predictive QSAR modeling workflow, model applicability domains, and virtual screening.
Tropsha, Alexander; Golbraikh, Alexander
2007-01-01
Quantitative Structure-Activity Relationship (QSAR) modeling has traditionally been applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered, if at all, only in a hypothetical sense, in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation, as well as on the need to define model applicability domains in chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as a predictive, as opposed to evaluative, modeling approach.
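One common way to operationalize an applicability domain, sketched below, is a k-nearest-neighbor distance criterion in descriptor space: a screened compound is considered inside the domain if it lies no farther from the training set than typical training compounds do from each other. The descriptors and the k and z parameters are illustrative assumptions, not the review's prescribed settings:

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def applicability_domain(train_X, k=5, z=0.5):
        """Return a predicate that flags query compounds inside the domain,
        defined as mean kNN distance <= mean + z * std of training kNN distances."""
        nn = NearestNeighbors(n_neighbors=k + 1).fit(train_X)
        d_train, _ = nn.kneighbors(train_X)        # first column is the self-distance (0)
        mean_d = d_train[:, 1:].mean(axis=1)
        threshold = mean_d.mean() + z * mean_d.std()

        def inside(query_X):
            d_query, _ = nn.kneighbors(query_X, n_neighbors=k)
            return d_query.mean(axis=1) <= threshold
        return inside

    rng = np.random.default_rng(0)
    train_descriptors = rng.normal(size=(200, 10))       # hypothetical descriptor matrix
    screen_descriptors = rng.normal(size=(5, 10)) * 3.0  # some compounds far outside the domain
    print(applicability_domain(train_descriptors)(screen_descriptors))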
Jensen, Roxanne E.; Rothrock, Nan E.; DeWitt, Esi Morgan; Spiegel, Brennan; Tucker, Carole A.; Crane, Heidi M.; Forrest, Christopher B.; Patrick, Donald L.; Fredericksen, Rob; Shulman, Lisa M.; Cella, David; Crane, Paul K.
2016-01-01
Background Patient-reported outcomes (PROs) are gaining recognition as key measures for improving the quality of patient care in clinical care settings. Three factors have made the implementation of PROs in clinical care more feasible: increased use of modern measurement methods in PRO design and validation, rapid progression of technology (e.g., touch screen tablets, Internet accessibility, and electronic health records (EHRs)), and greater demand for measurement and monitoring of PROs by regulators, payers, accreditors, and professional organizations. As electronic PRO collection and reporting capabilities have improved, the challenges of collecting PRO data have changed. Objectives To update information on PRO adoption considerations in clinical care, highlighting electronic and technical advances with respect to measure selection, clinical workflow, data infrastructure, and outcomes reporting. Methods Five practical case studies across diverse healthcare settings and patient populations are used to explore how implementation barriers were addressed to promote the successful integration of PRO collection into the clinical workflow. The case studies address selecting and reporting of relevant content, workflow integration, pre-visit screening, effective evaluation, and EHR integration. Conclusions These case studies exemplify elements of well-designed electronic systems, including response automation, tailoring of item selection and reporting algorithms, flexibility of collection location, and integration with patient health care data elements. They also highlight emerging logistical barriers in this area, such as the need for specialized technological and methodological expertise, and design limitations of current electronic data capture systems. PMID:25588135
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheu, R; Ghafar, R; Powers, A
Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction and who have not yet had their first chart check. Monitoring similar “real” events with our in-house software creates a safety net, as its propagation does not rely on individual user input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 warning/>5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which gives the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated. Additionally, all physics chart checks were completed on time. Prompt notifications of treatment record inconsistencies and machine overrides have decreased the time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality alone is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks, providing a safer and nearly error-free working environment.
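A minimal sketch of the dashboard's screening idea follows, assuming a read-only ODBC connection and hypothetical table and column names (the real Mosaiq schema differs); it lists patients who have received a first fraction but have no initial physics chart check recorded.

```python
# Sketch only: query a record-and-verify SQL database over ODBC to find patients
# needing an initial physics chart check. Table/column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=mosaiq-db;DATABASE=MOSAIQ;"
    "UID=readonly_user;PWD=secret"
)

QUERY = """
SELECT p.patient_id, p.last_name, MIN(t.tx_datetime) AS first_fraction
FROM treatments AS t
JOIN patients   AS p ON p.patient_id = t.patient_id
LEFT JOIN chart_checks AS c
       ON c.patient_id = p.patient_id AND c.check_type = 'INITIAL'
WHERE c.patient_id IS NULL          -- no initial check recorded yet
GROUP BY p.patient_id, p.last_name
"""

for row in conn.cursor().execute(QUERY):
    print(f"Initial chart check pending: {row.patient_id} ({row.last_name}), "
          f"first fraction {row.first_fraction}")
```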
Application research of rail transit safety protection based on laser detection
NASA Astrophysics Data System (ADS)
Wang, Zhifei
2016-10-01
Platform screen doors not only prevent passengers from falling or jumping onto the track and provide a more comfortable waiting environment, but also help save energy and protect the environment. However, the gap between the platform screen doors and the train is a hidden weakness of the system design: a passenger who, for some reason (e.g., rushing to catch the train), remains in this gap can become trapped between the screen door and the train door, and because the region lacks a dedicated detection and alarm system, such entrapment poses a serious threat to passenger and traffic safety. From a design perspective, this paper presents three protection schemes for this gap: physical, infrared, and laser. In domestic rail transit systems, obstacle detection between the screen doors and the train is a standard configuration of the screen door system and already provides strong protection against passengers stranded in the gap; the laser detection project described here adds further anti-entrapment safety measures for this gap. Rail safety protection methods based on laser detection are studied: from the laser reflection equation for a foreign body, the characteristics of laser foreign-body detection are derived theoretically, and a statistical analysis method is used to establish the workflow of the laser detection system, on which basis protection methods are proposed. Finally, simulation and test results demonstrate the reliability and stability of laser detection for rail transit safety protection, and the future development of laser detection technology in rail transit is discussed.
Background: Accurate prediction of in vivo toxicity from in vitro testing is a challenging problem. Large public–private consortia have been formed with the goal of improving chemical safety assessment by the means of high-throughput screening. Methods and results: A database co...
Usability Testing and Workflow Analysis of the TRADOC Data Visualization Tool
2012-09-01
[Fragmentary report excerpt: the eye-tracking software captured blink data, saccades, and cognitive load based on pupil contraction, and eye-tracking was only one component of the data evaluated; participant feedback noted that line charts were difficult to read and that projecting the charts directly onto the regions increased clutter on the screen.]
Bauer, Matthias R; Ibrahim, Tamer M; Vogel, Simon M; Boeckler, Frank M
2013-06-24
The application of molecular benchmarking sets helps to assess the actual performance of virtual screening (VS) workflows. To improve the efficiency of structure-based VS approaches, the selection and optimization of various parameters can be guided by benchmarking. With the DEKOIS 2.0 library, we aim to further extend and complement the collection of publicly available decoy sets. Based on BindingDB bioactivity data, we provide 81 new and structurally diverse benchmark sets for a wide variety of different target classes. To ensure a meaningful selection of ligands, we address several issues that can be found in bioactivity data. We have improved our previously introduced DEKOIS methodology with enhanced physicochemical matching, now including the consideration of molecular charges, as well as a more sophisticated elimination of latent actives in the decoy set (LADS). We evaluate the docking performance of Glide, GOLD, and AutoDock Vina with our data sets and highlight existing challenges for VS tools. All DEKOIS 2.0 benchmark sets will be made accessible at http://www.dekois.com.
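The physicochemical matching step described above can be illustrated with a hedged RDKit sketch; this is not the published DEKOIS 2.0 procedure, just one plausible way to rank candidate decoys by similarity to a ligand in a small property space that includes formal charge.

```python
# Illustrative decoy matching in the spirit of DEKOIS-style benchmarks (not the
# published algorithm): rank candidate decoys by distance to a ligand in a small
# physicochemical property space.
from rdkit import Chem
from rdkit.Chem import Crippen, Descriptors, rdmolops


def properties(smiles):
    mol = Chem.MolFromSmiles(smiles)
    return (
        Descriptors.MolWt(mol),
        Crippen.MolLogP(mol),
        Descriptors.NumHDonors(mol),
        Descriptors.NumHAcceptors(mol),
        Descriptors.NumRotatableBonds(mol),
        rdmolops.GetFormalCharge(mol),
    )


def match_decoys(ligand_smiles, candidate_smiles, n_decoys=30):
    """Return the candidates closest to the ligand in normalised property space."""
    ref = properties(ligand_smiles)
    scales = (100.0, 1.0, 1.0, 1.0, 1.0, 1.0)  # crude per-property normalisation

    def distance(smi):
        return sum(((a - b) / s) ** 2 for a, b, s in zip(properties(smi), ref, scales))

    return sorted(candidate_smiles, key=distance)[:n_decoys]
```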
Stability and Scalability of the CMS Global Pool: Pushing HTCondor and GlideinWMS to New Limits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balcas, J.; Bockelman, B.; Hufnagel, D.
The CMS Global Pool, based on HTCondor and glideinWMS, is the main computing resource provisioning system for all CMS workflows, including analysis, Monte Carlo production, and detector data reprocessing activities. The total resources at Tier-1 and Tier-2 grid sites pledged to CMS exceed 100,000 CPU cores, while another 50,000 to 100,000 CPU cores are available opportunistically, pushing the needs of the Global Pool to higher scales each year. These resources are becoming more diverse in their accessibility and configuration over time. Furthermore, the challenge of stably running at higher and higher scales while introducing new modes of operation such as multi-core pilots, as well as the chaotic nature of physics analysis workflows, places huge strains on the submission infrastructure. This paper details some of the most important challenges to scalability and stability that the CMS Global Pool has faced since the beginning of the LHC Run II and how they were overcome.
Stability and scalability of the CMS Global Pool: Pushing HTCondor and glideinWMS to new limits
NASA Astrophysics Data System (ADS)
Balcas, J.; Bockelman, B.; Hufnagel, D.; Hurtado Anampa, K.; Aftab Khan, F.; Larson, K.; Letts, J.; Marra da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.
2017-10-01
The CMS Global Pool, based on HTCondor and glideinWMS, is the main computing resource provisioning system for all CMS workflows, including analysis, Monte Carlo production, and detector data reprocessing activities. The total resources at Tier-1 and Tier-2 grid sites pledged to CMS exceed 100,000 CPU cores, while another 50,000 to 100,000 CPU cores are available opportunistically, pushing the needs of the Global Pool to higher scales each year. These resources are becoming more diverse in their accessibility and configuration over time. Furthermore, the challenge of stably running at higher and higher scales while introducing new modes of operation such as multi-core pilots, as well as the chaotic nature of physics analysis workflows, places huge strains on the submission infrastructure. This paper details some of the most important challenges to scalability and stability that the CMS Global Pool has faced since the beginning of the LHC Run II and how they were overcome.
Lessons Learned From A Study Of Genomics-Based Carrier Screening For Reproductive Decision Making.
Wilfond, Benjamin S; Kauffman, Tia L; Jarvik, Gail P; Reiss, Jacob A; Richards, C Sue; McMullen, Carmit; Gilmore, Marian; Himes, Patricia; Kraft, Stephanie A; Porter, Kathryn M; Schneider, Jennifer L; Punj, Sumit; Leo, Michael C; Dickerson, John F; Lynch, Frances L; Clarke, Elizabeth; Rope, Alan F; Lutz, Kevin; Goddard, Katrina A B
2018-05-01
Genomics-based carrier screening is one of many opportunities to use genomic information to inform medical decision making, but clinicians, health care delivery systems, and payers need to determine whether to offer screening and how to do so in an efficient, ethical way. To shed light on this issue, we conducted a study in the period 2014-17 to inform the design of clinical screening programs and guide further health services research. Many of our results have been published elsewhere; this article summarizes the lessons we learned from that study and offers policy insights. Our experience can inform understanding of the potential impact of expanded carrier screening services on health system workflows and workforces-impacts that depend on the details of the screening approach. We found limited patient or health system harms from expanded screening. We also found that some patients valued the information they learned from the process. Future policy discussions should consider the value of offering such expanded carrier screening in health delivery systems with limited resources.
Zdrazil, B.; Neefs, J.-M.; Van Vlijmen, H.; Herhaus, C.; Caracoti, A.; Brea, J.; Roibás, B.; Loza, M. I.; Queralt-Rosinach, N.; Furlong, L. I.; Gaulton, A.; Bartek, L.; Senger, S.; Chichester, C.; Engkvist, O.; Evelo, C. T.; Franklin, N. I.; Marren, D.; Ecker, G. F.
2016-01-01
Phenotypic screening is in a renaissance phase and is expected by many academic and industry leaders to accelerate the discovery of new drugs for new biology. Given that phenotypic screening is per definition target agnostic, the emphasis of in silico and in vitro follow-up work is on the exploration of possible molecular mechanisms and efficacy targets underlying the biological processes interrogated by the phenotypic screening experiments. Herein, we present six exemplar computational protocols for the interpretation of cellular phenotypic screens based on the integration of compound, target, pathway, and disease data established by the IMI Open PHACTS project. The protocols annotate phenotypic hit lists and allow follow-up experiments and mechanistic conclusions. The annotations included are from ChEMBL, ChEBI, GO, WikiPathways and DisGeNET. Also provided are protocols which select from the IUPHAR/BPS Guide to PHARMACOLOGY interaction file selective compounds to probe potential targets and a correlation robot which systematically aims to identify an overlap of active compounds in both the phenotypic as well as any kinase assay. The protocols are applied to a phenotypic pre-lamin A/C splicing assay selected from the ChEMBL database to illustrate the process. The computational protocols make use of the Open PHACTS API and data and are built within the Pipeline Pilot and KNIME workflow tools. PMID:27774140
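As a hedged illustration of the probe-selection protocol mentioned above, the following sketch filters the IUPHAR/BPS Guide to PHARMACOLOGY interaction file with pandas; the file name, column names, and thresholds are assumptions for the example, not the Open PHACTS protocol itself.

```python
# Sketch only: select selective probe compounds for candidate targets from a
# downloaded Guide to PHARMACOLOGY interaction CSV. Column names are assumed;
# check the actual file headers before use.
import pandas as pd

interactions = pd.read_csv("interactions.csv", low_memory=False)

candidate_targets = {"MAPK14", "CDK2"}           # e.g. targets suggested by annotation

probes = interactions[
    interactions["target_gene_symbol"].isin(candidate_targets)
    & (interactions["selectivity"] == "Selective")       # assumed column/value
    & (interactions["affinity_median"] >= 7.0)           # pAffinity >= 7 (<= 100 nM)
]
print(probes[["ligand", "target_gene_symbol", "affinity_median"]].drop_duplicates())
```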
Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin
2014-06-05
In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Time independent summary statistics may aid the understanding of drugs' action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies.
2014-01-01
Background In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves’ dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. Results First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Conclusion Time independent summary statistics may aid the understanding of drugs’ action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies. PMID:24902483
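A compact sketch of the statistical core of this workflow is shown below: an isotonic (monotone) regression fit to a dose-response readout and a bootstrap over wells to estimate the variability of a summary statistic. The data, the AUC-style summary, and the confidence-interval choice are illustrative, not the authors' implementation.

```python
# Sketch of the statistical idea: monotone dose-response fit by isotonic regression
# plus bootstrap resampling for the uncertainty of a summary statistic.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
doses = np.repeat(np.log10([1e-9, 1e-8, 1e-7, 1e-6, 1e-5]), 4)   # log10 molar, 4 wells/dose
viability = np.clip(1.0 / (1 + 10 ** (doses + 7)) + rng.normal(0, 0.05, doses.size), 0, 1)


def summary_auc(x, y):
    """Area under the monotone (decreasing) dose-response curve."""
    iso = IsotonicRegression(increasing=False).fit(x, y)
    grid = np.linspace(x.min(), x.max(), 200)
    return np.trapz(iso.predict(grid), grid)


point_estimate = summary_auc(doses, viability)

# Bootstrap over wells to estimate the variability of the summary statistic.
boot = [
    summary_auc(doses[idx], viability[idx])
    for idx in (rng.choice(doses.size, doses.size, replace=True) for _ in range(500))
]
low, high = np.percentile(boot, [2.5, 97.5])
```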
Implementation and Challenges of Direct Acoustic Dosing into Cell-Based Assays.
Roberts, Karen; Callis, Rowena; Ikeda, Tim; Paunovic, Amalia; Simpson, Carly; Tang, Eric; Turton, Nick; Walker, Graeme
2016-02-01
Since the adoption of Labcyte Echo Acoustic Droplet Ejection (ADE) technology by AstraZeneca in 2005, ADE has become the preferred method for compound dosing into both biochemical and cell-based assays across AstraZeneca research and development globally. The initial implementation of Echos and the direct dosing workflow provided AstraZeneca with a unique set of challenges. In this article, we outline how direct Echo dosing has evolved over the past decade in AstraZeneca. We describe the practical challenges of applying ADE technology to 96-well, 384-well, and 1536-well assays and how AstraZeneca developed and applied software and robotic solutions to generate fully automated and effective cell-based assay workflows. © 2015 Society for Laboratory Automation and Screening.
Uematsu, Takayoshi
2017-01-01
This article discusses possible supplemental breast cancer screening modalities for younger women with dense breasts from a perspective of population-based breast cancer screening program in Japan. Supplemental breast cancer screening modalities have been proposed to increase the sensitivity and detection rates of early stage breast cancer in women with dense breasts; however, there are no global guidelines that recommend the use of supplemental breast cancer screening modalities in such women. Also, no criterion standard exists for breast density assessment. Based on the current situation of breast imaging in Japan, the possible supplemental breast cancer screening modalities are ultrasonography, digital breast tomosynthesis, and breast magnetic resonance imaging. An appropriate population-based breast cancer screening program based on the balance between cost and benefit should be a high priority. Further research based on evidence-based medicine is encouraged. It is very important that the ethnicity, workforce, workflow, and resources for breast cancer screening in each country should be considered when considering supplemental breast cancer screening modalities for women with dense breasts.
2008-08-05
[Fragmentary report excerpt: Research in HLA Typing, Hematopoietic Stem Cell Transplantation and Clinical Studies to Improve Outcomes. A new action item was added to the Workflow Management screen for the SCTOD (Stem Cell Therapeutic Outcomes Data) Data Form, with the information to be passed on; the remainder of the extracted text consists of report form fields and an acronym list.]
Adverse outcome pathways (AOP) link known population outcomes to a molecular initiating event (MIE) that can be quantified using high-throughput in vitro methods. Practical application of AOPs in chemical-specific risk assessment requires consideration of exposure and absorption,...
Birchler, Axel; Berger, Mischa; Jäggin, Verena; Lopes, Telma; Etzrodt, Martin; Misun, Patrick Mark; Pena-Francesch, Maria; Schroeder, Timm; Hierlemann, Andreas; Frey, Olivier
2016-01-19
Open microfluidic cell culturing devices offer new possibilities to simplify loading, culturing, and harvesting of individual cells or microtissues due to the fact that liquids and cells/microtissues are directly accessible. We present a complete workflow for microfluidic handling and culturing of individual cells and microtissue spheroids, which is based on the hanging-drop network concept: The open microfluidic devices are seamlessly combined with fluorescence-activated cell sorting (FACS), so that individual cells, including stem cells, can be directly sorted into specified culturing compartments in a fully automated way and at high accuracy. Moreover, already assembled microtissue spheroids can be loaded into the microfluidic structures by using a conventional pipet. Cell and microtissue culturing is then performed in hanging drops under controlled perfusion. On-chip drop size control measures were applied to stabilize the system. Cells and microtissue spheroids can be retrieved from the chip by using a parallelized transfer method. The presented methodology holds great promise for combinatorial screening of stem-cell and multicellular-spheroid cultures.
Can Untargeted Metabolomics Be Utilized in Drug Discovery/Development?
Caldwell, Gary W; Leo, Gregory C
2017-01-01
Untargeted metabolomics is a promising approach for reducing the significant attrition rate for discovering and developing drugs in the pharmaceutical industry. This review aims to highlight the practical decision-making value of untargeted metabolomics for the advancement of drug candidates in drug discovery/development including potentially identifying and validating novel therapeutic targets, creating alternative screening paradigms, facilitating the selection of specific and translational metabolite biomarkers, identifying metabolite signatures for the drug efficacy mechanism of action, and understanding potential drug-induced toxicity. The review provides an overview of the pharmaceutical process workflow to discover and develop new small molecule drugs followed by the metabolomics process workflow that is involved in conducting metabolomics studies. The pros and cons of the major components of the pharmaceutical and metabolomics workflows are reviewed and discussed. Finally, selected untargeted metabolomics literature examples, from primarily 2010 to 2016, are used to illustrate why, how, and where untargeted metabolomics can be integrated into the drug discovery/preclinical drug development process. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
O’Connor, Anne; Brasher, Christopher J.; Slatter, David A.; Meckelmann, Sven W.; Hawksworth, Jade I.; Allen, Stuart M.; O’Donnell, Valerie B.
2017-01-01
Accurate and high-quality curation of lipidomic datasets generated from plasma, cells, or tissues is becoming essential for cell biology investigations and biomarker discovery for personalized medicine. However, a major challenge lies in removing artifacts otherwise mistakenly interpreted as real lipids from large mass spectrometry files (>60 K features), while retaining genuine ions in the dataset. This requires powerful informatics tools; however, available workflows have not been tailored specifically for lipidomics, particularly discovery research. We designed LipidFinder, an open-source Python workflow. An algorithm is included that optimizes analysis based on users’ own data, and outputs are screened against online databases and categorized into LIPID MAPS classes. LipidFinder outperformed three widely used metabolomics packages using data from human platelets. We show a family of three 12-hydroxyeicosatetraenoic acid phosphoinositides (16:0/, 18:1/, 18:0/12-HETE-PI) generated by thrombin-activated platelets, indicating crosstalk between eicosanoid and phosphoinositide pathways in human cells. The software is available on GitHub (https://github.com/cjbrasher/LipidFinder), with full userguides. PMID:28405621
Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter
2015-01-01
Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
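The authors' actual solutions are provided in their supporting files; the snippet below only illustrates one generic robustness pattern commonly used in shared-HPC pipelines of this kind: retrying a step that can fail transiently (e.g., a file-system hiccup) with bounded back-off, and failing loudly afterwards. The command shown is a hypothetical example.

```python
# Generic robustness pattern for an automated pipeline step (illustration only).
import logging
import subprocess
import time

log = logging.getLogger("exome_pipeline")


def run_step(cmd, retries=3, wait_s=60):
    """Run one pipeline command, retrying transient failures with linear back-off."""
    for attempt in range(1, retries + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        log.warning("step %s failed (attempt %d/%d): %s",
                    cmd[0], attempt, retries, result.stderr.strip())
        time.sleep(wait_s * attempt)
    raise RuntimeError(f"step {cmd[0]} failed after {retries} attempts")


# Hypothetical usage: align one sample, then proceed only if the step succeeded.
# run_step(["bwa", "mem", "ref.fa", "sample_R1.fq.gz", "sample_R2.fq.gz"])
```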
Bernini, Andrea; Galderisi, Silvia; Spiga, Ottavia; Bernardini, Giulia; Niccolai, Neri; Manetti, Fabrizio; Santucci, Annalisa
2017-10-01
Alkaptonuria (AKU) is an inborn error of metabolism in which mutation of the homogentisate 1,2-dioxygenase (HGD) gene leads to a deleterious or misfolded product, with subsequent loss of enzymatic degradation of homogentisic acid (HGA), whose accumulation in tissues causes ochronosis and degeneration. There is no licensed therapy for AKU. Many missense mutations have been identified as responsible for disruption of the quaternary structure of the native hexameric HGD. A new approach to the treatment of AKU is proposed here, aiming to totally or partially rescue enzyme activity by targeting HGD with pharmacological chaperones, i.e., small molecules that aid structural stability. Co-factor pockets of oligomeric proteins have already been successfully exploited as targets for such a strategy, but no similar sites are present on the HGD surface; hence, transient pockets are proposed here as targets for pharmacological chaperones. Transient pockets are detected along the molecular dynamics trajectory of the protein and filtered down to a set of suitable sites for structural stabilization by means of biochemical and pharmacological criteria. The result is a computational workflow relevant to other inborn errors of metabolism requiring rescue of oligomeric, misfolded enzymes. Copyright © 2017 Elsevier Ltd. All rights reserved.
An XML Representation for Crew Procedures
NASA Technical Reports Server (NTRS)
Simpson, Richard C.
2005-01-01
NASA ensures safe operation of complex systems through the use of formally-documented procedures, which encode the operational knowledge of the system as derived from system experts. Crew members use procedure documentation on the ground for training purposes and on-board space shuttle and space station to guide their activities. Investigators at JSC are developing a new representation for procedures that is content-based (as opposed to display-based). Instead of specifying how a procedure should look on the printed page, the content-based representation will identify the components of a procedure and (more importantly) how the components are related (e.g., how the activities within a procedure are sequenced; what resources need to be available for each activity). This approach will allow different sets of rules to be created for displaying procedures on a computer screen, on a hand-held personal digital assistant (PDA), verbally, or on a printed page, and will also allow intelligent reasoning processes to automatically interpret and use procedure definitions. During his NASA fellowship, Dr. Simpson examined how various industries represent procedures (also called business processes or workflows), in areas such as manufacturing, accounting, shipping, or customer service. A useful method for designing and evaluating workflow representation languages is by determining their ability to encode various workflow patterns, which depict abstract relationships between the components of a procedure removed from the context of a specific procedure or industry. Investigators have used this type of analysis to evaluate how well-suited existing workflow representation languages are for various industries based on the workflow patterns that commonly arise across industry-specific procedures. Based on this type of analysis, it is already clear that existing workflow representations capture discrete flow of control (i.e., when one activity should start and stop based on when other activities start and stop), but do not capture the flow of data, materials, resources or priorities. Existing workflow representation languages are also limited to representing sequences of discrete activities, and cannot encode procedures involving continuous flow of information or materials between activities.
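A hypothetical illustration of a content-based procedure encoding is given below (this is not NASA's actual schema): the XML records the steps, their ordering, and the resources each step needs, leaving presentation to separate rendering rules that a screen, PDA, voice, or print renderer could apply to the same content model.

```python
# Hypothetical content-based procedure representation and a minimal "renderer".
import xml.etree.ElementTree as ET

PROCEDURE_XML = """
<procedure id="filter-swap">
  <step id="s1">
    <action>Power down ventilation fan</action>
    <requires resource="fan-power-switch"/>
  </step>
  <step id="s2" after="s1">
    <action>Replace HEPA filter cartridge</action>
    <requires resource="spare-filter"/>
  </step>
  <step id="s3" after="s2">
    <action>Power up fan and verify airflow</action>
    <requires resource="airflow-gauge"/>
  </step>
</procedure>
"""

root = ET.fromstring(PROCEDURE_XML)
# Any renderer (screen, PDA, voice, print) can walk the same content model:
for step in root.findall("step"):
    deps = step.get("after", "none")
    resources = [r.get("resource") for r in step.findall("requires")]
    print(f"{step.get('id')} (after: {deps}): {step.findtext('action')} "
          f"[needs: {', '.join(resources)}]")
```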
Hydrogen storage materials discovery via high throughput ball milling and gas sorption.
Li, Bin; Kaye, Steven S; Riley, Conor; Greenberg, Doron; Galang, Daniel; Bailey, Mark S
2012-06-11
The lack of a high capacity hydrogen storage material is a major barrier to the implementation of the hydrogen economy. To accelerate discovery of such materials, we have developed a high-throughput workflow for screening of hydrogen storage materials in which candidate materials are synthesized and characterized via highly parallel ball mills and volumetric gas sorption instruments, respectively. The workflow was used to identify mixed imides with significantly enhanced absorption rates relative to Li2Mg(NH)2. The most promising material, 2LiNH2:MgH2 + 5 atom % LiBH4 + 0.5 atom % La, exhibits the best balance of absorption rate, capacity, and cycle-life, absorbing >4 wt % H2 in 1 h at 120 °C after 11 absorption-desorption cycles.
Becságh, Péter; Szakács, Orsolya
2014-10-01
In diagnostic workflows for detecting sequence alterations, it is sometimes important to design an algorithm that combines screening and direct tests. Normally the use of direct tests, mainly sequencing, is limited, so there is an increasing need for effective screening tests that remain "closed tube" throughout the whole process, thereby decreasing the risk of PCR product contamination. The aim of this study was to design such a closed-tube, detection-probe-based screening assay to detect different kinds of sequence alterations in exon 11 of the human c-kit gene region, where various deletions and single-nucleotide changes can occur. During assay setup, several probe chemistry formats were screened and tested; after some optimization steps, the TaqMan probe format was selected.
A central aim of EPA’s ToxCast project is to use in vitro high-throughput screening (HTS) profiles to build predictive models of in vivo toxicity. Where assays lack metabolic capability, such efforts may need to anticipate the role of metabolic activation (or deactivation). A wo...
Wegh, Robin S; Berendsen, Bjorn J A; Driessen-Van Lankveld, Wilma D M; Pikkemaat, Mariël G; Zuidema, Tina; Van Ginkel, Leen A
2017-11-01
A non-targeted workflow is reported for the isolation and identification of antimicrobial active compounds using bioassay-directed screening and LC coupled to high-resolution MS. Suspect samples are extracted using a generic protocol and fractionated using two different LC conditions (A and B). The behaviour of the bioactive compound under these different conditions yields information about the physicochemical properties of the compound and introduces variations in co-eluting compounds in the fractions, which is essential for peak picking and identification. The fractions containing the active compound(s) obtained with conditions A and B are selected using a microbiological effect-based bioassay. The selected bioactive fractions from A and B are analysed using LC combined with high-resolution MS. Selection of relevant signals is automatically carried out by selecting all signals present in both bioactive fractions A and B, yielding tremendous data reduction. The method was assessed using two spiked feed samples and subsequently applied to two feed samples containing an unidentified compound showing microbial growth inhibition. In all cases, the identity of the compound causing microbiological inhibition was successfully confirmed.
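The data-reduction step, selecting only signals present in both bioactive fractions, can be sketched as a feature intersection on accurate mass; the feature format and ppm tolerance below are assumptions for illustration, not the authors' software.

```python
# Sketch: keep only features detected in the bioactive fraction from both LC
# conditions, matching on accurate mass within a ppm tolerance (RT differs by design).
def within_ppm(mz_a, mz_b, tol_ppm=5.0):
    return abs(mz_a - mz_b) / mz_a * 1e6 <= tol_ppm


def common_features(features_a, features_b, tol_ppm=5.0):
    """features_* are lists of dicts with an 'mz' key (plus any other metadata)."""
    return [
        fa for fa in features_a
        if any(within_ppm(fa["mz"], fb["mz"], tol_ppm) for fb in features_b)
    ]


fraction_a = [{"mz": 445.1204, "rt": 7.8}, {"mz": 301.0712, "rt": 3.2}]
fraction_b = [{"mz": 445.1209, "rt": 9.1}]      # different LC conditions shift RT
print(common_features(fraction_a, fraction_b))  # -> only the 445.12 candidate remains
```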
Siow, Hwee-Leng; Lim, Theam Soon; Gan, Chee-Yuen
2017-01-01
The main objective of this study was to develop an efficient workflow to discover α-amylase inhibitory peptides from cumin seed. A total of 56 unknown peptides were initially found in the cumin seed protein hydrolysate. They were subjected to 2 different in silico screenings, and 6 peptides were shortlisted. The peptides were then subjected to in vitro selection using the phage display technique, and 3 clones (CSP3, CSP4 and CSP6) showed high affinity for α-amylase. These clones were subjected to the inhibitory test, and only CSP4 and CSP6 exhibited high inhibitory activity; these peptides were therefore chemically synthesized for validation purposes. CSP4 inhibited bacterial and human salivary α-amylases with IC50 values of 0.11 and 0.04 μmol, respectively, whereas the corresponding values for CSP6 were about 0.10 and 0.15 μmol. The results showed that the strengths of the individual protocols were successfully combined to enhance α-amylase inhibitory peptide discovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ortalli, Margherita; Attard, Luciano; Vanino, Elisa; Gaibani, Paolo; Vocale, Caterina; Rossini, Giada; Cagarelli, Roberto; Pierro, Anna; Billi, Patrizia; Mastroianni, Antonio; Di Cesare, Simona; Codeluppi, Mauro; Franceschini, Erica; Melchionda, Fraia; Gramiccia, Marina; Scalone, Aldo; Gentilomi, Giovanna A.; Landini, Maria P.
2017-01-01
The diagnosis of visceral leishmaniasis (VL) remains challenging, due to the limited sensitivity of microscopy, the poor performance of serological methods in immunocompromised patients and the lack of standardization of molecular tests. The aim of this study was to implement a combined diagnostic workflow by integrating serological and molecular tests with standardized clinical criteria. Between July 2013 and June 2015, the proposed workflow was applied to specimens obtained from 94 in-patients with clinical suspicion of VL in the Emilia-Romagna region, Northern Italy. Serological tests and molecular techniques were employed. Twenty-one adult patients (22%) had a confirmed diagnosis of VL by clinical criteria, serology and/or real-time polymerase chain reaction; 4 of these patients were HIV-positive. Molecular tests exhibited higher sensitivity than serological tests for the diagnosis of VL. In our experience, the rK39 immunochromatographic test was insufficiently sensitive for use as a screening test for the diagnosis of VL caused by L. infantum in Italy. However, as molecular tests are yet not standardized, further studies are required to identify an optimal screening test for Mediterranean VL. PMID:28832646
Howard, Barbara J; Sturner, Raymond
2017-12-01
To describe benefits and problems with screening and addressing developmental and behavioral problems in primary care and using an online clinical process support system as a solution. Screening has been found to have various implementation barriers including time costs, accuracy, workflow and knowledge of tools. In addition, training of clinicians in dealing with identified issues is lacking. Patients disclose more to and prefer computerized screening. An online clinical process support system (CHADIS) shows promise in addressing these issues. Use of a comprehensive panel of online pre-visit screens; linked decision support to provide moment-of-care training; and post-visit activities and resources for patient-specific education, monitoring and care coordination is an efficient way to make the entire process of screening and follow up care feasible in primary care. CHADIS fulfills these requirements and provides Maintenance of Certification credit to physicians as well as added income for screening efforts.
An interactive web-based application for Comprehensive Analysis of RNAi-screen Data.
Dutta, Bhaskar; Azhir, Alaleh; Merino, Louis-Henri; Guo, Yongjian; Revanur, Swetha; Madhamshettiwar, Piyush B; Germain, Ronald N; Smith, Jennifer A; Simpson, Kaylene J; Martin, Scott E; Buehler, Eugen; Fraser, Iain D C
2016-02-23
RNAi screens are widely used in functional genomics. Although the screen data can be susceptible to a number of experimental biases, many of these can be corrected by computational analysis. For this purpose, here we have developed a web-based platform for integrated analysis and visualization of RNAi screen data named CARD (for Comprehensive Analysis of RNAi Data; available at https://card.niaid.nih.gov). CARD allows the user to seamlessly carry out sequential steps in a rigorous data analysis workflow, including normalization, off-target analysis, integration of gene expression data, optimal thresholds for hit selection and network/pathway analysis. To evaluate the utility of CARD, we describe analysis of three genome-scale siRNA screens and demonstrate: (i) a significant increase both in selection of subsequently validated hits and in rejection of false positives, (ii) an increased overlap of hits from independent screens of the same biology and (iii) insight to microRNA (miRNA) activity based on siRNA seed enrichment.
An interactive web-based application for Comprehensive Analysis of RNAi-screen Data
Dutta, Bhaskar; Azhir, Alaleh; Merino, Louis-Henri; Guo, Yongjian; Revanur, Swetha; Madhamshettiwar, Piyush B.; Germain, Ronald N.; Smith, Jennifer A.; Simpson, Kaylene J.; Martin, Scott E.; Beuhler, Eugen; Fraser, Iain D. C.
2016-01-01
RNAi screens are widely used in functional genomics. Although the screen data can be susceptible to a number of experimental biases, many of these can be corrected by computational analysis. For this purpose, here we have developed a web-based platform for integrated analysis and visualization of RNAi screen data named CARD (for Comprehensive Analysis of RNAi Data; available at https://card.niaid.nih.gov). CARD allows the user to seamlessly carry out sequential steps in a rigorous data analysis workflow, including normalization, off-target analysis, integration of gene expression data, optimal thresholds for hit selection and network/pathway analysis. To evaluate the utility of CARD, we describe analysis of three genome-scale siRNA screens and demonstrate: (i) a significant increase both in selection of subsequently validated hits and in rejection of false positives, (ii) an increased overlap of hits from independent screens of the same biology and (iii) insight to microRNA (miRNA) activity based on siRNA seed enrichment. PMID:26902267
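CARD's exact methods are described in the paper and web tool; as a hedged illustration of the normalization step, the sketch below applies a per-plate robust z-score (median/MAD), one standard choice for RNAi screen readouts, followed by a simple hit threshold.

```python
# Illustration of one common RNAi-screen normalization: per-plate robust z-scores.
import numpy as np


def robust_z(plate_values):
    """Robust z-score of one plate's readout values (1D array)."""
    x = np.asarray(plate_values, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med)) * 1.4826   # scale MAD to sigma for normal data
    return (x - med) / mad


readout = np.array([1.0, 1.1, 0.9, 1.05, 0.2, 0.95])   # the well at index 4 is a putative hit
z = robust_z(readout)
hits = np.where(z <= -3)[0]        # e.g. strong knockdown phenotype
```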
Graeber, Kai; Linkies, Ada; Wood, Andrew T.A.; Leubner-Metzger, Gerhard
2011-01-01
Comparative biology includes the comparison of transcriptome and quantitative real-time RT-PCR (qRT-PCR) data sets in a range of species to detect evolutionarily conserved and divergent processes. Transcript abundance analysis of target genes by qRT-PCR requires a highly accurate and robust workflow. This includes reference genes with high expression stability (i.e., low intersample transcript abundance variation) for correct target gene normalization. Cross-species qRT-PCR for proper comparative transcript quantification requires reference genes suitable for different species. We addressed this issue using tissue-specific transcriptome data sets of germinating Lepidium sativum seeds to identify new candidate reference genes. We investigated their expression stability in germinating seeds of L. sativum and Arabidopsis thaliana by qRT-PCR, combined with in silico analysis of Arabidopsis and Brassica napus microarray data sets. This revealed that reference gene expression stability is higher for a given developmental process between distinct species than for distinct developmental processes within a given single species. The identified superior cross-species reference genes may be used for family-wide comparative qRT-PCR analysis of Brassicaceae seed germination. Furthermore, using germinating seeds, we exemplify optimization of the qRT-PCR workflow for challenging tissues regarding RNA quality, transcript stability, and tissue abundance. Our work therefore can serve as a guideline for moving beyond Arabidopsis by establishing high-quality cross-species qRT-PCR. PMID:21666000
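A simplified illustration of the expression-stability idea follows: ranking candidate reference genes by the standard deviation of their Cq values across samples. Dedicated tools such as geNorm or NormFinder use more elaborate stability measures; the Cq values here are hypothetical.

```python
# Rank candidate reference genes by cross-sample Cq variability (lower = more stable).
import numpy as np

# Hypothetical Cq values: rows = candidate reference genes, columns = samples.
cq = {
    "GENE_A": [21.1, 21.3, 21.0, 21.2, 21.1],
    "GENE_B": [24.8, 26.9, 23.5, 25.7, 24.1],
    "GENE_C": [18.9, 19.0, 19.1, 18.8, 19.0],
}

stability = {gene: np.std(values, ddof=1) for gene, values in cq.items()}
for gene, sd in sorted(stability.items(), key=lambda kv: kv[1]):
    print(f"{gene}: SD(Cq) = {sd:.2f}")   # lowest SD = most stable candidate
```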
Integrated Modeling of Time Evolving 3D Kinetic MHD Equilibria and NTV Torque
NASA Astrophysics Data System (ADS)
Logan, N. C.; Park, J.-K.; Grierson, B. A.; Haskey, S. R.; Nazikian, R.; Cui, L.; Smith, S. P.; Meneghini, O.
2016-10-01
New analysis tools and integrated modeling of plasma dynamics developed in the OMFIT framework are used to study kinetic MHD equilibria evolution on the transport time scale. The experimentally observed profile dynamics following the application of 3D error fields are described using a new OMFITprofiles workflow that directly addresses the need for rapid and comprehensive analysis of dynamic equilibria for next-step theory validation. The workflow treats all diagnostic data as fundamentally time dependent, provides physics-based manipulations such as ELM phase data selection, and is consistent across multiple machines - including DIII-D and NSTX-U. The seamless integration of tokamak data and simulation is demonstrated by using the self-consistent kinetic EFIT equilibria and profiles as input into 2D particle, momentum and energy transport calculations using TRANSP as well as 3D kinetic MHD equilibrium stability and neoclassical transport modeling using General Perturbed Equilibrium Code (GPEC). The result is a smooth kinetic stability and NTV torque evolution over transport time scales. Work supported by DE-AC02-09CH11466.
Shanahan, C W; Sorensen-Alawad, A; Carney, B L; Persand, I; Cruz, A; Botticelli, M; Pressman, K; Adams, W G; Brolin, M; Alford, D P
2014-01-01
The Massachusetts Screening, Brief Intervention and Referral to Treatment (MASBIRT) Program, a substance use screening program in general medical settings, created a web-based, point-of-care (POC) application--the MASBIRT Portal (the "Portal")--to meet program goals. We report on the development and implementation of the Portal. Five-year program process outcomes recorded by an independent evaluator, and an anonymous survey of Health Educators' (HEs) adoption, perceptions, and Portal use based on a modified version of the Technology Readiness Index [8], are described. Specific management team members, selected based on their roles in program leadership, development and implementation of the Portal, and supervision of HEs, participated in semi-structured, qualitative interviews. At the conclusion of the program, 73% (24/33) of the HEs completed a survey on their experience using the Portal. HEs reported that the Portal made recording screening information easy (96%); improved planning of their workday (83%); facilitated POC data collection (84%); decreased time dedicated to data entry (100%); and improved job satisfaction (59%). The top two barriers to use were "no or limited wireless connectivity" (46%) and "the tablet was too heavy/bulky to carry" (29%). Qualitative management team interviews identified strategies for successful HIT implementation: the importance of engaging HEs in outlining specifications and workflow needs, collaborative testing prior to implementation, and clear agreement on the purpose of data collection, quality requirements, and staff roles. Overall, HEs perceived the Portal favorably with regard to its time-saving ability and improved workflow. Lessons learned included identifying core requirements early during system development and the need for managers to institute and enforce consistent behavioral work norms. Barriers and HEs' views of technology impacted the utilization of the MASBIRT Portal. Further research is needed to determine the best approaches for HIT system implementation in general medical settings.
Xu, Zhenzhen; Li, Jianzhong; Chen, Ailiang; Ma, Xin; Yang, Shuming
2018-05-03
Retrospectivity (the ability to search raw data retrospectively for a previously unknown compound) is very valuable for food safety and risk assessment when facing newly emerging drugs. Screening based only on accurate mass and retention time may lead to false positive and false negative results, so a new retrospective and reliable platform is desirable. Different concentration levels of standards, with and without matrix, were analyzed using ion mobility (IM)-quadrupole-time-of-flight (Q-TOF) to collect retrospective accurate mass, retention time, drift time, and tandem MS evidence for identification in a single experiment. The isomer separation ability of IM and the four-dimensional (4D) feature abundance quantification capability were evaluated for veterinary drugs for the first time. The sensitivity of the IM-Q-TOF workflow was clearly higher than that of the traditional database searching algorithm [find by formula (FbF) function] for Q-TOF. In addition, the IM-Q-TOF workflow reproduced most of the results from FbF while removing the false positive results. Some isomers were separated by IM, and the 4D feature abundance quantitation removed interferences with similar accurate mass and showed good linearity. A new retrospective, multi-evidence platform was thus built for veterinary drug screening in a single experiment; the sensitivity was significantly improved and the data can be used for quantification. The platform shows potential for use in food safety and risk assessment. This article is protected by copyright. All rights reserved.
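The multi-evidence matching idea can be sketched as follows, with an assumed feature layout and illustrative tolerances and drift-time values (this is not the vendor's workflow): a detected feature is assigned to a library compound only if accurate mass, retention time, and drift time all agree within tolerance.

```python
# Sketch of multi-evidence (mass / RT / drift time) library matching.
def matches(feature, entry, tol_ppm=5.0, tol_rt_min=0.2, tol_dt_ms=0.5):
    mass_ok = abs(feature["mz"] - entry["mz"]) / entry["mz"] * 1e6 <= tol_ppm
    rt_ok = abs(feature["rt"] - entry["rt"]) <= tol_rt_min
    dt_ok = abs(feature["dt"] - entry["dt"]) <= tol_dt_ms
    return mass_ok and rt_ok and dt_ok


# Illustrative library entry: [M+H]+ of sulfamethazine; RT and drift time are made up.
library = [{"name": "sulfamethazine", "mz": 279.0910, "rt": 6.4, "dt": 23.1}]
feature = {"mz": 279.0912, "rt": 6.5, "dt": 23.3}
hits = [e["name"] for e in library if matches(feature, e)]
```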
Droplet microfluidic technology for single-cell high-throughput screening.
Brouzes, Eric; Medkova, Martina; Savenelli, Neal; Marran, Dave; Twardowski, Mariusz; Hutchison, J Brian; Rothberg, Jonathan M; Link, Darren R; Perrimon, Norbert; Samuels, Michael L
2009-08-25
We present a droplet-based microfluidic technology that enables high-throughput screening of single mammalian cells. This integrated platform allows for the encapsulation of single cells and reagents in independent aqueous microdroplets (1 pL to 10 nL volumes) dispersed in an immiscible carrier oil and enables the digital manipulation of these reactors at a very high-throughput. Here, we validate a full droplet screening workflow by conducting a droplet-based cytotoxicity screen. To perform this screen, we first developed a droplet viability assay that permits the quantitative scoring of cell viability and growth within intact droplets. Next, we demonstrated the high viability of encapsulated human monocytic U937 cells over a period of 4 days. Finally, we developed an optically-coded droplet library enabling the identification of the droplets composition during the assay read-out. Using the integrated droplet technology, we screened a drug library for its cytotoxic effect against U937 cells. Taken together our droplet microfluidic platform is modular, robust, uses no moving parts, and has a wide range of potential applications including high-throughput single-cell analyses, combinatorial screening, and facilitating small sample analyses.
Study on user interface of pathology picture archiving and communication system.
Kim, Dasueran; Kang, Peter; Yun, Jungmin; Park, Sung-Hye; Seo, Jeong-Wook; Park, Peom
2014-01-01
It is necessary to improve the pathology workflow. A workflow task analysis was performed using a pathology picture archiving and communication system (pathology PACS) in order to propose a user interface for the Pathology PACS considering user experience. An interface analysis of the Pathology PACS in Seoul National University Hospital and a task analysis of the pathology workflow were performed by observing recorded video. Based on obtained results, a user interface for the Pathology PACS was proposed. Hierarchical task analysis of Pathology PACS was classified into 17 tasks including 1) pre-operation, 2) text, 3) images, 4) medical record viewer, 5) screen transition, 6) pathology identification number input, 7) admission date input, 8) diagnosis doctor, 9) diagnosis code, 10) diagnosis, 11) pathology identification number check box, 12) presence or absence of images, 13) search, 14) clear, 15) Excel save, 16) search results, and 17) re-search. And frequently used menu items were identified and schematized. A user interface for the Pathology PACS considering user experience could be proposed as a preliminary step, and this study may contribute to the development of medical information systems based on user experience and usability.
Damoiseaux, Robert
2014-05-01
The Molecular Screening Shared Resource (MSSR) offers a comprehensive range of leading-edge high throughput screening (HTS) services including drug discovery, chemical and functional genomics, and novel methods for nano and environmental toxicology. The MSSR is an open access environment with investigators from UCLA as well as from the entire globe. Industrial clients are equally welcome as are non-profit entities. The MSSR is a fee-for-service entity and does not retain intellectual property. In conjunction with the Center for Environmental Implications of Nanotechnology, the MSSR is unique in its dedicated and ongoing efforts towards high throughput toxicity testing of nanomaterials. In addition, the MSSR engages in technology development eliminating bottlenecks from the HTS workflow and enabling novel assays and readouts currently not available.
Stability and Change of Behavioral and Emotional Screening Scores
ERIC Educational Resources Information Center
Dever, Bridget V.; Dowdy, Erin; Raines, Tara C.; Carnazzo, Katherine
2015-01-01
Universal screening for behavioral and emotional difficulties is integral to the identification of students needing early intervention and prevention efforts. However, unanswered questions regarding the stability of screening scores impede the ability to determine optimal strategies for subsequent screening. This study examined the 2-year…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chase Qishi; Zhu, Michelle Mengxia
The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of the workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best-suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black-box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
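As a generic illustration of the workflow-mapping problem described above (not the SWAMP code), the sketch below represents a scientific workflow as a DAG with per-task runtimes and computes the minimum end-to-end delay as the critical-path length via a topological traversal (Python 3.9+ for graphlib).

```python
# Workflow as a DAG; minimum end-to-end delay = critical-path length.
from graphlib import TopologicalSorter

runtime = {"acquire": 30, "catalog": 5, "transfer": 60, "filter": 20, "visualize": 10}
depends_on = {
    "catalog": {"acquire"},
    "transfer": {"acquire"},
    "filter": {"transfer"},
    "visualize": {"catalog", "filter"},
}

finish = {}
for task in TopologicalSorter(depends_on).static_order():
    earliest_start = max((finish[d] for d in depends_on.get(task, ())), default=0)
    finish[task] = earliest_start + runtime[task]

print(f"minimum end-to-end delay: {max(finish.values())} s")   # critical-path length
```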
Marcano-Belisario, José S; Gupta, Ajay K; O'Donoghue, John; Ramchandani, Paul; Morrison, Cecily; Car, Josip
2017-05-10
Mobile devices may facilitate depression screening in the waiting area of antenatal clinics. This can present implementation challenges; we focused on survey layout and technology deployment. We assessed the feasibility of using tablet computers to administer a socio-demographic survey, the Whooley questions and the Edinburgh Postnatal Depression Scale (EPDS) to 530 pregnant women attending National Health Service (NHS) antenatal clinics across England. We randomised participants to one of two layout versions of these surveys: (i) a scrolling layout where each survey was presented on a single screen; or (ii) a paging layout where only one question appeared on the screen at any given time. Overall, 85.10% of eligible pregnant women agreed to take part. Of these, 90.95% completed the study procedures. Approximately 23% of participants answered Yes to at least one Whooley question, and approximately 13% of them scored 10 points or more on the EPDS. We observed no association between survey layout and the responses given to the Whooley questions, the median EPDS scores, the number of participants at increased risk of self-harm, and the number of participants asking for technical assistance. However, we observed a difference in the number of participants at each EPDS scoring interval (p = 0.008), which provides an indication of a woman's risk of depression. A scrolling layout resulted in faster completion times (median = 4 min 46 s) than a paging layout (median = 5 min 33 s) (p = 0.024). However, the clinical significance of this difference (47.5 s) is yet to be determined. Tablet computers can be used for depression screening in the waiting area of antenatal clinics. This requires the careful consideration of clinical workflows, and technology-related issues such as connectivity and security. An association between survey layout and EPDS scoring intervals needs to be explored further to determine if it corresponds to a survey layout effect. Future research needs to evaluate the effect of this type of antenatal depression screening on clinical outcomes and clinic workflows. This study was registered in ClinicalTrials.gov under the identifier NCT02516982 on 20 July 2015.
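The completion-time comparison between layouts can be illustrated with a nonparametric two-sample test; the authors' exact analysis is not specified in this abstract, so the Mann-Whitney U test and the timing data below are assumptions for the sketch.

```python
# Illustrative comparison of completion times (seconds) between two survey layouts.
from scipy.stats import mannwhitneyu

scrolling_times = [286, 240, 301, 275, 310, 266, 290]   # hypothetical values
paging_times = [333, 360, 325, 342, 318, 355, 348]

stat, p_value = mannwhitneyu(scrolling_times, paging_times, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```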
Letzel, Thomas; Bayer, Anne; Schulz, Wolfgang; Heermann, Alexandra; Lucke, Thomas; Greco, Giorgia; Grosse, Sylvia; Schüssler, Walter; Sengl, Manfred; Letzel, Marion
2015-10-01
A large number of anthropogenic trace contaminants such as pharmaceuticals, their human metabolites and further transformation products (TPs) enter wastewater treatment plants on a daily basis. A mixture of known, expected, and unknown molecules are discharged into the receiving aquatic environment because only partial elimination occurs for many of these chemicals during physical, biological and chemical treatment processes. In this study, an array of LC-MS methods from three collaborating laboratories was applied to detect and identify anthropogenic trace contaminants and their TPs in different waters. Starting with theoretical predictions of TPs, an efficient workflow using the combination of target, suspected-target and non-target strategies for the identification of these TPs in the environment was developed. These techniques and strategies were applied to study anti-hypertensive drugs from the sartan group (i.e., candesartan, eprosartan, irbesartan, olmesartan, and valsartan). Degradation experiments were performed in lab-scale wastewater treatment plants, and a screening workflow including an inter-laboratory approach was used for the identification of transformation products in the effluent samples. Subsequently, newly identified compounds were successfully analyzed in effluents of real wastewater treatment plants and river waters. Copyright © 2015 Elsevier Ltd. All rights reserved.
Implementation of Systematic Review Tools in IRIS
Currently, the number of chemicals present in the environment exceeds the ability of public health scientists to efficiently screen the available data in order to produce well-informed human health risk assessments in a timely manner. For this reason, the US EPA’s Integrated Risk Information System (IRIS) program has started implementing new software tools into the hazard characterization workflow. These automated tools aid in multiple phases of the systematic review process, including scoping and problem formulation, literature search, and identification and screening of available published studies. The increased availability of these tools lays the foundation for automating or semi-automating multiple phases of the systematic review process. Some of these software tools include modules to facilitate a structured approach to study quality evaluation of human and animal data, although approaches are generally lacking for assessing complex mechanistic information, in particular “omics”-based evidence; tools are starting to become available to evaluate these types of studies. We will highlight how new software programs, online tools, and approaches for assessing study quality can be better integrated to allow for a more efficient and transparent workflow of the risk assessment process, as well as identify tool gaps that would benefit future risk assessments. Disclaimer: The views expressed here are those of the authors and do not necessarily represent the view
2017-01-01
Cytochrome P450 aromatase (CYP19A1) plays a key role in the development of estrogen-dependent breast cancer, and aromatase inhibitors have been at the front line of treatment for the past three decades. The development of potent, selective and safer inhibitors is ongoing, with in silico screening methods playing a more prominent role in the search for promising lead compounds in bioactivity-relevant chemical space. Here we present a set of comprehensive binding affinity prediction models for CYP19A1 using our automated Linear Interaction Energy (LIE) based workflow on a set of 132 putative and structurally diverse aromatase inhibitors obtained from a typical industrial screening study. We extended the workflow with machine learning methods to automatically cluster training and test compounds in order to maximize the number of explained compounds in one or more predictive LIE models. The method uses protein–ligand interaction profiles obtained from Molecular Dynamics (MD) trajectories to guide model search and define the applicability domain of the resolved models. Our method was successful in accounting for 86% of the data set in 3 robust models that show high correlation between calculated and observed values for ligand-binding free energies (RMSE < 2.5 kJ/mol), with good cross-validation statistics. PMID:28776988
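The LIE approach referenced above is conventionally written as a linear model of ensemble-averaged van der Waals and electrostatic interaction-energy differences between the bound and free states, ΔG ≈ α⟨ΔV_vdW⟩ + β⟨ΔV_el⟩ + γ. The sketch below is a minimal, generic illustration of fitting such a model by least squares; the variable names and the toy numbers are illustrative assumptions, not data or parameters from the study.

import numpy as np

# Ensemble-averaged interaction-energy differences (bound minus free) for a few
# training ligands, in kJ/mol. These numbers are purely illustrative.
d_vdw  = np.array([-30.1, -25.4, -33.8, -28.9])
d_elec = np.array([-45.2, -38.7, -51.0, -42.3])
dG_obs = np.array([-34.5, -29.8, -38.2, -32.6])   # "experimental" binding free energies

# LIE model: dG_calc = alpha * <dV_vdw> + beta * <dV_elec> + gamma
X = np.column_stack([d_vdw, d_elec, np.ones_like(d_vdw)])
(alpha, beta, gamma), *_ = np.linalg.lstsq(X, dG_obs, rcond=None)

dG_calc = X @ np.array([alpha, beta, gamma])
rmse = np.sqrt(np.mean((dG_calc - dG_obs) ** 2))
print(f"alpha={alpha:.2f}, beta={beta:.2f}, gamma={gamma:.2f}, RMSE={rmse:.2f} kJ/mol")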
Challenges in Small Screening Laboratories: SaaS to the rescue
Lemmon, Vance P.; Jia, Yuanyuan; Shi, Yan; Holbrook, S. Douglas; Bixby, John L; Buchser, William
2012-01-01
The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signalling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA screening of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information, including images, well-based data and cell-based phenotypic measures. Managing experimental workflow and library data, along with the extensive amount of experimental results, is challenging. For academic laboratories generating large data sets from experiments using thousands of perturbagens, a laboratory information management system (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with a Software as a Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses this application in detail, including how the system was selected and integrated into the laboratory. The advantages of SaaS are described. PMID:21631415
Long-Term Stability of Screening for Behavioral and Emotional Risk
ERIC Educational Resources Information Center
Dowdy, Erin; Nylund-Gibson, Karen; Felix, Erika D.; Morovati, Diane; Carnazzo, Katherine W.; Dever, Bridget V.
2014-01-01
The practice of screening students to identify behavioral and emotional risk is gaining momentum, with limited guidance regarding the frequency with which screenings should occur. Screening frequency decisions are influenced by the stability of the constructs assessed and changes in risk status over time. This study investigated the 4-year…
Causanilles, Ana; Kinyua, Juliet; Ruttkies, Christoph; van Nuijs, Alexander L N; Emke, Erik; Covaci, Adrian; de Voogt, Pim
2017-10-01
The inclusion of new psychoactive substances (NPS) in the wastewater-based epidemiology approach presents challenges, such as the reduced number of users, which translates into low concentrations of residues, and the limited pharmacokinetic information available, which renders the choice of target biomarker difficult. Sampling during special social settings, analysis with improved analytical techniques, and data processing with a specific workflow to narrow the search are required for successful monitoring. This work presents the application of a qualitative screening technique to wastewater samples collected during a city festival, where likely users of recreational substances gather and consequently higher residual concentrations of used NPS are expected. The analysis was performed using liquid chromatography coupled to high-resolution mass spectrometry. Data were processed using an algorithm that involves the extraction of accurate masses (calculated based on molecular formula) of expected m/z from an in-house database containing about 2,000 entries, including NPS and transformation products. We positively identified eight NPS belonging to the classes of synthetic cathinones, phenethylamines and opioids. In addition, the presence of benzodiazepine analogues, classical drugs and other licit substances with potential for abuse was confirmed. The screening workflow based on a database search was useful in the identification of NPS biomarkers in wastewater. The findings highlight the specific classical drugs used and the low use of NPS in the Netherlands. Additionally, meta-chlorophenylpiperazine (mCPP), 2,5-dimethoxy-4-bromophenethylamine (2C-B), and 4-fluoroamphetamine (FA) were identified in wastewater for the first time. Copyright © 2017 Elsevier Ltd. All rights reserved.
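As a rough illustration of the database-driven extraction step described above, the sketch below computes a monoisotopic mass from a molecular formula and checks whether a measured m/z matches the expected [M+H]+ ion within a ppm tolerance. The element-mass table, the example formula, the adduct choice and the tolerance are illustrative assumptions, not the authors' in-house database or workflow.

import re

# Monoisotopic masses of a few common elements (rounded to 5 decimals).
MONO = {"C": 12.0, "H": 1.00783, "N": 14.00307, "O": 15.99491, "S": 31.97207, "Cl": 34.96885}
PROTON = 1.00728  # proton mass, used for the [M+H]+ adduct

def monoisotopic_mass(formula):
    """Sum element masses from a simple formula string such as 'C10H13ClN2'."""
    mass = 0.0
    for element, count in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        if element:
            mass += MONO[element] * (int(count) if count else 1)
    return mass

def matches(formula, measured_mz, tol_ppm=5.0):
    """True if the measured m/z matches the expected [M+H]+ ion within tol_ppm."""
    expected = monoisotopic_mass(formula) + PROTON
    return abs(measured_mz - expected) / expected * 1e6 <= tol_ppm

# Example: mCPP (1-(3-chlorophenyl)piperazine), C10H13ClN2; [M+H]+ is roughly 197.084.
print(matches("C10H13ClN2", 197.0840))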
Fan, Jung-Wei; Lussier, Yves A
2017-01-01
Dietary supplements remain a relatively underexplored source for drug repurposing. A systematic approach to soliciting responses from a large consumer population is desirable to speed up innovation. We tested a workflow that mines unexpected benefits of dietary supplements from massive consumer reviews. A (non-exhaustive) list of regular expressions was used to screen over 2 million reviews on health and personal care products. The matched reviews were manually analyzed, and one supplement-disease pair was linked to biological databases for enriching the hypothesized association. The regular expressions found 169 candidate reviews, of which 45.6% described unexpected benefits of certain dietary supplements. The manual analysis showed some of the supplement-disease associations to be novel or in agreement with evidence published later in the literature. The hypothesis enrichment was able to identify meaningful function similarity between the supplement and the disease. The results demonstrated the value of the workflow in identifying candidates for supplement repurposing.
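A minimal sketch of the kind of regular-expression screen described above is shown below; the patterns and the example review text are invented for illustration and are not the (non-exhaustive) list used in the study.

import re

# Hypothetical patterns aimed at phrases reporting an unexpected benefit.
PATTERNS = [
    re.compile(r"\b(unexpected(ly)?|surprising(ly)?)\b.{0,60}\b(helped|improved|relief|cleared)\b", re.I),
    re.compile(r"\b(side benefit|added bonus|didn'?t expect)\b", re.I),
]

def flag_review(text):
    """Return True if any pattern suggests an unexpected benefit worth manual review."""
    return any(p.search(text) for p in PATTERNS)

reviews = [
    "I take this fish oil for my heart, but surprisingly it also helped my dry eyes.",
    "Arrived quickly, great packaging.",
]
candidates = [r for r in reviews if flag_review(r)]
print(candidates)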
Klimas, Jan; Muench, John; Wiest, Katharina; Croff, Raina; Rieckman, Traci; McCarty, Dennis
2015-01-01
Problem alcohol use is associated with adverse health and economic outcomes, especially among people in opioid agonist treatment. Screening, brief intervention, and referral to treatment (SBIRT) are effective in reducing alcohol use; however, issues involved in SBIRT implementation among opioid agonist patients are unknown. To assess identification and treatment of alcohol use disorders, we reviewed clinical records of opioid agonist patients screened for an alcohol use disorder in a primary care clinic (n = 208) and in an opioid treatment program (n = 204) over a two-year period. In the primary care clinic, 193 (93%) buprenorphine patients completed an annual alcohol screening and six (3%) had elevated AUDIT scores. In the opioid treatment program, an alcohol abuse or dependence diagnosis was recorded for 54 (27%) methadone patients. Practitioner focus groups were completed in the primary care clinic (n = 4 physicians) and the opioid treatment program (n = 11 counselors) to assess experience with and attitudes towards screening opioid agonist patients for alcohol use disorders. Focus groups suggested that organizational, structural, provider, patient, and community variables hindered or fostered alcohol screening. Alcohol screening is feasible among opioid agonist patients. Effective implementation, however, requires physician training and systematic changes in workflow.
NASA Astrophysics Data System (ADS)
Cui, Wei; Parker, Laurie L.
2016-07-01
Fluorescent drug screening assays are essential for tyrosine kinase inhibitor discovery. Here we demonstrate a flexible, antibody-free TR-LRET kinase assay strategy that is enabled by the combination of streptavidin-coated quantum dot (QD) acceptors and biotinylated, Tb3+ sensitizing peptide donors. By exploiting the spectral features of Tb3+ and QD, and the high binding affinity of the streptavidin-biotin interaction, we achieved multiplexed detection of kinase activity in a modular fashion without requiring additional covalent labeling of each peptide substrate. This strategy is compatible with high-throughput screening, and should be adaptable to the rapidly changing workflows and targets involved in kinase inhibitor discovery.
NASA Astrophysics Data System (ADS)
Wiemker, Rafael; Sevenster, Merlijn; MacMahon, Heber; Li, Feng; Dalal, Sandeep; Tahmasebi, Amir; Klinder, Tobias
2017-03-01
The imaging biomarkers EmphysemaPresence and NoduleSpiculation are crucial inputs for most models aiming to predict the risk of indeterminate pulmonary nodules detected at CT screening. To increase reproducibility and to accelerate screening workflow it is desirable to assess these biomarkers automatically. Validation on NLST images indicates that standard histogram measures are not sufficient to assess EmphysemaPresence in screenees. However, automatic scoring of bulla-resembling low attenuation areas can achieve agreement with experts with close to 80% sensitivity and specificity. NoduleSpiculation can be automatically assessed with similar accuracy. We find a dedicated spiculi tracing score to slightly outperform generic combinations of texture features with classifiers.
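The abstract above notes that standard histogram measures alone were insufficient for assessing EmphysemaPresence. For reference, the most common such baseline histogram measure is the low-attenuation-area percentage (LAA%); the sketch below computes it from lung-masked Hounsfield-unit values, using the conventional -950 HU cutoff and an assumed 5% decision rule. The threshold, the decision rule and the synthetic voxels are illustrative assumptions, not the authors' scoring method.

import numpy as np

def low_attenuation_area_percent(hu_values, threshold_hu=-950):
    """Percentage of lung voxels with attenuation below the threshold (LAA%)."""
    hu_values = np.asarray(hu_values, dtype=float)
    return 100.0 * np.mean(hu_values < threshold_hu)

def emphysema_present(hu_values, laa_cutoff_percent=5.0):
    """Crude binary call: flag emphysema if LAA% exceeds an assumed cutoff."""
    return low_attenuation_area_percent(hu_values) > laa_cutoff_percent

# Toy example: synthetic lung voxels drawn around a typical parenchymal density.
rng = np.random.default_rng(0)
lung_voxels = rng.normal(loc=-860, scale=60, size=100_000)
print(f"LAA% = {low_attenuation_area_percent(lung_voxels):.1f}, "
      f"flag = {emphysema_present(lung_voxels)}")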
Johnson, Karin E; Kamineni, Aruna; Fuller, Sharon; Olmstead, Danielle; Wernli, Karen J
2014-01-01
The use of electronic health records (EHRs) for research is proceeding rapidly, driven by computational power, analytical techniques, and policy. However, EHR-based research is limited by the complexity of EHR data and a lack of understanding about data provenance, meaning the context under which the data were collected. This paper presents system flow mapping as a method to help researchers more fully understand the provenance of their EHR data as it relates to local workflow. We provide two specific examples of how this method can improve data identification, documentation, and processing. EHRs store clinical and administrative data, often in unstructured fields. Each clinical system has a unique and dynamic workflow, as well as an EHR customized for local use. The EHR customization may be influenced by a broader context such as documentation required for billing. We present a case study with two examples of using system flow mapping to characterize EHR data for a local colorectal cancer screening process. System flow mapping demonstrated that information entered into the EHR during clinical practice required interpretation and transformation before it could be accurately applied to research. We illustrate how system flow mapping shaped our knowledge of the quality and completeness of data in two examples: (1) determining colonoscopy indication as recorded in the EHR, and (2) discovering a specific EHR form that captured family history. Researchers who do not consider data provenance risk compiling data that are systematically incomplete or incorrect. For example, researchers who are not familiar with the clinical workflow under which data were entered might miss or misunderstand patient information or procedure and diagnostic codes. Data provenance is a fundamental characteristic of research data from EHRs. Given the diversity of EHR platforms and system workflows, researchers need tools for evaluating and reporting data availability, quality, and transformations. Our case study illustrates how system mapping can inform researchers about the provenance of their data as it pertains to local workflows.
MO-D-213-01: Workflow Monitoring for a High Volume Radiation Oncology Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laub, S; Dunn, M; Galbreath, G
2015-06-15
Purpose: Implement a center wide communication system that increases interdepartmental transparency and accountability while decreasing redundant work and treatment delays by actively monitoring treatment planning workflow. Methods: Intake Management System (IMS), a program developed by ProCure Treatment Centers Inc., is a multi-function database that stores treatment planning process information. It was devised to work with the oncology information system (Mosaiq) to streamline interdepartmental workflow. Each step in the treatment planning process is visually represented and timelines for completion of individual tasks are established within the software. The currently active step of each patient’s planning process is highlighted either red or green according to whether the initially allocated amount of time has passed for the given process. This information is displayed as a Treatment Planning Process Monitor (TPPM), which is shown on screens in the relevant departments throughout the center. This display also includes the individuals who are responsible for each task. IMS is driven by Mosaiq’s quality checklist (QCL) functionality. Each step in the workflow is initiated by a Mosaiq user sending the responsible party a QCL assignment. IMS is connected to Mosaiq and the sending or completing of a QCL updates the associated field in the TPPM to the appropriate status. Results: Approximately one patient a week is identified during the workflow process as needing to have his/her treatment start date modified or resources re-allocated to address the most urgent cases. Being able to identify a realistic timeline for planning each patient and having multiple departments communicate their limitations and time constraints allows for quality plans to be developed and implemented without overburdening any one department. Conclusion: Monitoring the progression of the treatment planning process has increased transparency between departments, which enables efficient communication. Having built-in timelines allows easy prioritization of tasks and resources and facilitates effective time management.
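The red/green highlighting described above amounts to comparing the elapsed time on the currently active step against its allocated timeline. A minimal sketch of that logic follows, with hypothetical step names and durations; it is not ProCure's actual IMS/Mosaiq integration.

from datetime import datetime, timedelta

# Hypothetical allocated durations for treatment-planning steps.
ALLOCATED = {
    "simulation": timedelta(days=1),
    "contouring": timedelta(days=2),
    "planning": timedelta(days=3),
}

def step_status(step, started_at, now=None):
    """Return 'green' while the step is within its allocated time, else 'red'."""
    now = now or datetime.now()
    return "green" if now - started_at <= ALLOCATED[step] else "red"

started = datetime(2015, 6, 1, 9, 0)
print(step_status("contouring", started, now=datetime(2015, 6, 2, 9, 0)))  # green
print(step_status("contouring", started, now=datetime(2015, 6, 4, 9, 0)))  # red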
NASA Astrophysics Data System (ADS)
van Oosten, Luuk N.; Pieterse, Mervin; Pinkse, Martijn W. H.; Verhaert, Peter D. E. M.
2015-12-01
Animal venoms and toxins are a valuable source of bioactive peptides with pharmacologic relevance as potential drug leads. A large subset of biologically active peptides discovered up till now contain disulfide bridges that enhance stability and activity. To discover new members of this class of peptides, we developed a workflow screening specifically for those peptides that contain inter- and intra-molecular disulfide bonds by means of three-dimensional (3D) mass mapping. Two intrinsic properties of the sulfur atom, (1) its relatively large negative mass defect, and (2) its isotopic composition, allow for differentiation between cysteine-containing peptides and peptides lacking sulfur. High sulfur content in a peptide decreases the normalized nominal mass defect (NMD) and increases the normalized isotopic shift (NIS). Hence in a 3D plot of mass, NIS, and NMD, peptides with sulfur appear in this plot with a distinct spatial localization compared with peptides that lack sulfur. In this study we investigated the skin secretion of two frog species; Odorrana schmackeri and Bombina variegata. Peptides from the crude skin secretions were separated by nanoflow LC, and of all eluting peptides high resolution zoom scans were acquired in order to accurately determine both monoisotopic mass and average mass. Both the NMD and the NIS were calculated from the experimental data using an in-house developed MATLAB script. Candidate peptides exhibiting a low NMD and high NIS values were selected for targeted de novo sequencing, and this resulted in the identification of several novel inter- and intra-molecular disulfide bond containing peptides.
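The NMD and NIS quantities used above for 3D mass mapping can be computed directly from each peptide's monoisotopic and average masses. The sketch below uses plausible working definitions (mass defect relative to nominal mass, and the average-minus-monoisotopic shift, both normalized to mass and scaled to mDa/Da); the exact normalization in the authors' in-house MATLAB script may differ, and the masses in the example are invented.

def nominal_mass(monoisotopic_mass):
    """Nominal (integer) mass, taken here simply as the rounded monoisotopic mass."""
    return round(monoisotopic_mass)

def normalized_mass_defect(monoisotopic_mass):
    """Mass defect per unit mass, scaled to mDa/Da (one plausible definition)."""
    return (monoisotopic_mass - nominal_mass(monoisotopic_mass)) / monoisotopic_mass * 1000.0

def normalized_isotopic_shift(monoisotopic_mass, average_mass):
    """Average-minus-monoisotopic mass shift per unit mass, in mDa/Da."""
    return (average_mass - monoisotopic_mass) / monoisotopic_mass * 1000.0

# Toy comparison: a cysteine-rich peptide should show a lower NMD and a higher NIS
# than a sulfur-free peptide of similar size (the masses below are illustrative).
for name, mono, avg in [("sulfur-rich", 2500.98, 2502.60), ("sulfur-free", 2501.30, 2502.75)]:
    print(name,
          round(normalized_mass_defect(mono), 3),
          round(normalized_isotopic_shift(mono, avg), 3))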
Brockhauser, Sandor; Svensson, Olof; Bowler, Matthew W; Nanao, Max; Gordon, Elspeth; Leal, Ricardo M F; Popov, Alexander; Gerring, Matthew; McCarthy, Andrew A; Gotz, Andy
2012-08-01
The automation of beam delivery, sample handling and data analysis, together with increasing photon flux, diminishing focal spot size and the appearance of fast-readout detectors on synchrotron beamlines, have changed the way that many macromolecular crystallography experiments are planned and executed. Screening for the best diffracting crystal, or even the best diffracting part of a selected crystal, has been enabled by the development of microfocus beams, precise goniometers and fast-readout detectors that all require rapid feedback from the initial processing of images in order to be effective. All of these advances require the coupling of data feedback to the experimental control system and depend on immediate online data-analysis results during the experiment. To facilitate this, a Data Analysis WorkBench (DAWB) for the flexible creation of complex automated protocols has been developed. Here, example workflows designed and implemented using DAWB are presented for enhanced multi-step crystal characterizations, experiments involving crystal reorientation with kappa goniometers, crystal-burning experiments for empirically determining the radiation sensitivity of a crystal system and the application of mesh scans to find the best location of a crystal to obtain the highest diffraction quality. Beamline users interact with the prepared workflows through a specific brick within the beamline-control GUI MXCuBE.
Predict-first experimental analysis using automated and integrated magnetohydrodynamic modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyons, B. C.; Paz-Soldan, C.; Meneghini, O.
An integrated-modeling workflow has been developed in this paper for the purpose of performing predict-first analysis of transient-stability experiments. Starting from an existing equilibrium reconstruction from a past experiment, the workflow couples together the EFIT Grad-Shafranov solver [L. Lao et al., Fusion Sci. Technol. 48, 968 (2005)], the EPED model for the pedestal structure [P. B. Snyder et al., Phys. Plasmas 16, 056118 (2009)], and the NEO drift-kinetic-equation solver [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)] (for bootstrap current calculations) in order to generate equilibria with self-consistent pedestal structures as the plasma shape and various scalar parameters (e.g., normalized β, pedestal density, and edge safety factor [q95]) are changed. These equilibria are then analyzed using automated M3D-C1 extended-magnetohydrodynamic modeling [S. C. Jardin et al., Comput. Sci. Discovery 5, 014002 (2012)] to compute the plasma response to three-dimensional magnetic perturbations. This workflow was created in conjunction with a DIII-D experiment examining the effect of triangularity on the 3D plasma response. Several versions of the workflow were developed, and the initial ones were used to help guide experimental planning (e.g., determining the plasma current necessary to maintain the constant edge safety factor in various shapes). Subsequent validation with the experimental results was then used to revise the workflow, ultimately resulting in the complete model presented here. We show that quantitative agreement was achieved between the M3D-C1 plasma response calculated for equilibria generated by the final workflow and equilibria reconstructed from experimental data. A comparison of results from earlier workflows is used to show the importance of properly matching certain experimental parameters in the generated equilibria, including the normalized β, pedestal density, and q95. On the other hand, the details of the pedestal current did not significantly impact the plasma response in these equilibria. A comparison to the experimentally measured plasma response shows mixed agreement, indicating that while the equilibria are predicted well, additional analysis tools may be needed. In conclusion, we note the implications that these results have for the success of future predict-first studies, particularly the need for scans of uncertain parameters and for close collaboration between experimentalists and theorists.
Predict-first experimental analysis using automated and integrated magnetohydrodynamic modeling
NASA Astrophysics Data System (ADS)
Lyons, B. C.; Paz-Soldan, C.; Meneghini, O.; Lao, L. L.; Weisberg, D. B.; Belli, E. A.; Evans, T. E.; Ferraro, N. M.; Snyder, P. B.
2018-05-01
An integrated-modeling workflow has been developed for the purpose of performing predict-first analysis of transient-stability experiments. Starting from an existing equilibrium reconstruction from a past experiment, the workflow couples together the EFIT Grad-Shafranov solver [L. Lao et al., Fusion Sci. Technol. 48, 968 (2005)], the EPED model for the pedestal structure [P. B. Snyder et al., Phys. Plasmas 16, 056118 (2009)], and the NEO drift-kinetic-equation solver [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)] (for bootstrap current calculations) in order to generate equilibria with self-consistent pedestal structures as the plasma shape and various scalar parameters (e.g., normalized β, pedestal density, and edge safety factor [q95]) are changed. These equilibria are then analyzed using automated M3D-C1 extended-magnetohydrodynamic modeling [S. C. Jardin et al., Comput. Sci. Discovery 5, 014002 (2012)] to compute the plasma response to three-dimensional magnetic perturbations. This workflow was created in conjunction with a DIII-D experiment examining the effect of triangularity on the 3D plasma response. Several versions of the workflow were developed, and the initial ones were used to help guide experimental planning (e.g., determining the plasma current necessary to maintain the constant edge safety factor in various shapes). Subsequent validation with the experimental results was then used to revise the workflow, ultimately resulting in the complete model presented here. We show that quantitative agreement was achieved between the M3D-C1 plasma response calculated for equilibria generated by the final workflow and equilibria reconstructed from experimental data. A comparison of results from earlier workflows is used to show the importance of properly matching certain experimental parameters in the generated equilibria, including the normalized β, pedestal density, and q95. On the other hand, the details of the pedestal current did not significantly impact the plasma response in these equilibria. A comparison to the experimentally measured plasma response shows mixed agreement, indicating that while the equilibria are predicted well, additional analysis tools may be needed. Finally, we note the implications that these results have for the success of future predict-first studies, particularly the need for scans of uncertain parameters and for close collaboration between experimentalists and theorists.
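The coupling logic described above (equilibrium solve, pedestal prediction, bootstrap-current calculation, iterated to self-consistency, followed by 3D response analysis) can be summarized in pseudo-orchestration form. The functions below are toy stand-ins for EFIT, EPED, NEO, and M3D-C1, with invented arithmetic; they illustrate only the control flow, not the codes' real interfaces or physics.

# Hypothetical orchestration sketch of the predict-first loop.
def predict_pedestal(pressure, ped_density):
    # Toy EPED stand-in: pedestal height grows weakly with core pressure and density.
    return 0.1 * pressure + 0.05 * ped_density

def bootstrap_current(pedestal_height):
    # Toy NEO stand-in: edge bootstrap current proportional to pedestal height.
    return 0.5 * pedestal_height

def solve_equilibrium(beta_n, q95, pedestal_height, j_boot):
    # Toy EFIT stand-in: return a "core pressure" consistent with the inputs.
    return beta_n * (1.0 + 0.2 * pedestal_height + 0.1 * j_boot) / q95

def plasma_response(pressure):
    # Toy M3D-C1 stand-in: scalar 3D response amplitude.
    return 2.0 * pressure

def predict_first(beta_n=2.0, ped_density=0.7, q95=4.0, tol=1e-6, max_iters=50):
    pressure = beta_n / q95  # crude initial guess
    for i in range(max_iters):
        ped = predict_pedestal(pressure, ped_density)
        j_boot = bootstrap_current(ped)
        new_pressure = solve_equilibrium(beta_n, q95, ped, j_boot)
        if abs(new_pressure - pressure) < tol:
            pressure = new_pressure
            break
        pressure = new_pressure
    return i + 1, pressure, plasma_response(pressure)

iters, p, resp = predict_first()
print(f"converged in {iters} iterations; response amplitude {resp:.4f}")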
Optimization of geothermal well trajectory in order to minimize borehole failure
NASA Astrophysics Data System (ADS)
Dahrabou, A.; Valley, B.; Ladner, F.; Guinot, F.; Meier, P.
2017-12-01
In projects based on the Enhanced Geothermal System (EGS) principle, deep boreholes are drilled into low-permeability rock masses. As part of the completion operations, the permeability of existing fractures in the rock mass is enhanced by injecting large volumes of water. These stimulation treatments aim at achieving enough water circulation for heat extraction at commercial rates, which makes the stimulation operations critical to project success. The accurate placement of the stimulation treatments requires well completion with effective zonal isolation, and wellbore stability is a prerequisite to all zonal isolation techniques, be it packer sealing or cement placement. In this project, a workflow allowing a fast decision-making process for selecting an optimal well trajectory for EGS projects is developed. The well is first drilled vertically; then, based on logging data that are costly to acquire (100 kCHF/day), the direction in which the strongly deviated borehole section will be drilled needs to be determined in order to optimize borehole stability and to intersect the highest number of fractures that are favorably oriented for stimulation. The workflow applies to crystalline rock and includes an uncertainty and risk assessment framework. An initial sensitivity study was performed to identify the most influential parameters on borehole stability. The main challenge in these analyses is that the strength and stress profiles are not known independently. Calibration of a geomechanical model on the observed borehole failure has been performed using data from the Basel geothermal well BS-1. In a first approximation, a purely elastic-static analytical solution in combination with a purely cohesive failure criterion was used, as it provides the most consistent prediction across failure indicators. A systematic analysis of the uncertainty on all parameters was performed to assess the reliability of the optimal trajectory selection. For each drilling scenario, the failure probability and the associated risks are computed stochastically. In addition, model uncertainty is assessed by confronting various failure modelling approaches with the available failure data from the Basel Project. Together, these results form the basis of an integrated workflow for optimizing geothermal (EGS) well trajectories.
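A stochastic failure-probability estimate of the kind described above can be sketched with a simple Monte Carlo loop: sample uncertain stress and strength parameters, evaluate an elastic (Kirsch-type) hoop-stress estimate at the wall of a vertical well, and count how often it exceeds the rock strength. The distributions, parameter values, and the simplified failure criterion below are illustrative assumptions, not the calibrated Basel BS-1 model.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative uncertain inputs (MPa): horizontal stresses, mud pressure, rock strength.
sigma_H = rng.normal(90.0, 10.0, n)    # maximum horizontal principal stress
sigma_h = rng.normal(55.0, 8.0, n)     # minimum horizontal principal stress
p_mud   = rng.normal(35.0, 3.0, n)     # wellbore fluid pressure
strength = rng.normal(140.0, 25.0, n)  # effective rock strength (cohesive criterion proxy)

# Kirsch elastic solution: maximum hoop stress at the wall of a vertical borehole.
hoop_max = 3.0 * sigma_H - sigma_h - p_mud

failure_probability = np.mean(hoop_max > strength)
print(f"estimated breakout probability: {failure_probability:.2%}")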
Melagraki, G; Afantitis, A
2011-01-01
Virtual Screening (VS) has experienced increased attention in recent years due to the large datasets made available, the development of advanced VS techniques and the encouraging fact that VS has contributed to the discovery of several compounds that have either reached the market or entered clinical trials. Hepatitis C Virus (HCV) nonstructural protein 5B (NS5B) has become an attractive target for the development of antiviral drugs, and many small molecules have been explored as possible HCV NS5B inhibitors. In parallel with experimental practices, VS can serve as a valuable tool in the identification of novel effective inhibitors. Different techniques and workflows have been reported in the literature with the goal of prioritizing possible potent hits. In this context, different virtual screening strategies have been deployed for the identification of novel Hepatitis C Virus (HCV) inhibitors. This work reviews recent applications of virtual screening in an effort to identify novel potent HCV inhibitors.
Customizing G Protein-coupled receptor models for structure-based virtual screening.
de Graaf, Chris; Rognan, Didier
2009-01-01
This review will focus on the construction, refinement, and validation of G Protein-coupled receptor models for the purpose of structure-based virtual screening. Practical tips and tricks derived from concrete modeling and virtual screening exercises to overcome the problems and pitfalls associated with the different steps of the receptor modeling workflow will be presented. These examples will include not only rhodopsin-like (class A), but also secretin-like (class B) and glutamate-like (class C) receptors. In addition, the review will present a careful comparative analysis of current crystal structures and their implications for homology modeling. The following themes will be discussed: i) the use of experimental anchors in guiding the modeling procedure; ii) amino acid sequence alignments; iii) ligand binding mode accommodation and binding cavity expansion; iv) proline-induced kinks in transmembrane helices; v) binding mode prediction and virtual screening by receptor-ligand interaction fingerprint scoring; vi) extracellular loop modeling; vii) virtual filtering schemes. Finally, an overview of several successful structure-based screening campaigns shows that receptor models, despite structural inaccuracies, can be efficiently used to find novel ligands.
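Theme (v) above, receptor-ligand interaction fingerprint scoring, typically reduces each docking pose to a bit vector of observed interactions and compares it to a reference pose. A minimal sketch with a hypothetical bit layout (one bit per residue/interaction-type pair) is shown below; the fingerprints are invented for illustration.

def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two equal-length binary interaction fingerprints."""
    on_a = set(i for i, bit in enumerate(fp_a) if bit)
    on_b = set(i for i, bit in enumerate(fp_b) if bit)
    if not on_a and not on_b:
        return 0.0
    return len(on_a & on_b) / len(on_a | on_b)

# Hypothetical fingerprints: each position encodes e.g. "residue X makes an H-bond".
reference_pose = [1, 0, 1, 1, 0, 0, 1, 0]
docked_pose    = [1, 0, 1, 0, 0, 1, 1, 0]
print(f"IFP Tanimoto = {tanimoto(reference_pose, docked_pose):.2f}")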
Browne, J L; Schielen, P C J I; Belmouden, I; Pennings, J L A; Klipstein-Grobusch, K
2015-06-01
The objectives of this article are to compare pregnancy-associated plasma protein A (PAPP-A) and free β-subunit of human chorionic gonadotropin (β-hCG) concentrations in dried blood spots (DBSs) with those in serum samples obtained in a public hospital in a low-resource setting, and to evaluate their stability. Serum and DBS samples were obtained by venipuncture and finger prick from 50 pregnant participants in a cohort study in a public hospital in Accra, Ghana. PAPP-A and β-hCG concentrations from serum and DBS were measured with an AutoDELFIA® (PerkinElmer, Turku, Finland) automatic immunoassay. Correlation and Passing-Bablok regression analyses were performed to compare marker levels. High correlation (>0.9) was observed for PAPP-A and β-hCG levels between the sampling techniques. The β-hCG concentration was consistent between DBS and serum, whereas the PAPP-A concentration was consistently lower in DBS. Our findings suggest that β-hCG can be reliably collected from DBS in low-resource tropical settings. The exact conditions of the clinical workflow necessary for reliable PAPP-A measurement in these settings need to be developed further in the future. These findings could have implications for the feasibility of prenatal screening programs in low-income and middle-income countries, as DBS provides an alternative, minimally invasive sampling method, with advantages in sampling technique, stability, logistics, and potential application in low-resource settings. © 2015 John Wiley & Sons, Ltd.
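As a rough sketch of the DBS-versus-serum comparison above, the code below computes a correlation coefficient and a simplified Passing-Bablok-style slope taken as the median of pairwise slopes (a Theil-Sen-style estimate that omits the offset correction and confidence intervals of the full Passing-Bablok procedure). The paired values are invented for illustration.

import numpy as np

def median_pairwise_slope(x, y):
    """Simplified slope estimate: median of slopes over all point pairs."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slopes = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if x[j] != x[i]:
                slopes.append((y[j] - y[i]) / (x[j] - x[i]))
    return float(np.median(slopes))

# Invented paired measurements (serum vs dried blood spot) for one analyte.
serum = [2.1, 3.4, 1.8, 4.0, 2.9, 3.7]
dbs   = [1.6, 2.6, 1.4, 3.1, 2.3, 2.8]

r = np.corrcoef(serum, dbs)[0, 1]
slope = median_pairwise_slope(serum, dbs)
intercept = float(np.median(np.asarray(dbs) - slope * np.asarray(serum)))
print(f"r = {r:.3f}, slope = {slope:.2f}, intercept = {intercept:.2f}")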
Goldberg, Deborah S; Lewus, Rachael A; Esfandiary, Reza; Farkas, David C; Mody, Neil; Day, Katrina J; Mallik, Priyanka; Tracka, Malgorzata B; Sealey, Smita K; Samra, Hardeep S
2017-08-01
Selecting optimal formulation conditions for monoclonal antibodies for first time in human clinical trials is challenging due to short timelines and reliance on predictive assays to ensure product quality and adequate long-term stability. Accelerated stability studies are considered to be the gold standard for excipient screening, but they are relatively low throughput and time consuming. High throughput screening (HTS) techniques allow for large amounts of data to be collected quickly and easily, and can be used to screen solution conditions for early formulation development. The utility of using accelerated stability compared to HTS techniques (differential scanning light scattering and differential scanning fluorescence) for early formulation screening was evaluated along with the impact of excipients of various types on aggregation of monoclonal antibodies from multiple IgG subtypes. The excipient rank order using quantitative HTS measures was found to correlate with accelerated stability aggregation rate ranking for only 33% (by differential scanning fluorescence) to 42% (by differential scanning light scattering) of the antibodies tested, due to the high intrinsic stability and minimal impact of excipients on aggregation rates and HTS data. Also explored was a case study of employing a platform formulation instead of broader formulation screening for early formulation development. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
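The rank-order comparison described above (HTS-derived stability metrics versus accelerated-stability aggregation rates) can be quantified with a rank correlation. A minimal sketch with invented excipient rankings, assuming SciPy is available:

from scipy.stats import spearmanr

# Invented rankings of six excipients for one antibody: 1 = most stabilizing.
rank_by_accelerated_stability = [1, 2, 3, 4, 5, 6]
rank_by_dsf_melting_temp      = [2, 1, 4, 3, 6, 5]

rho, p_value = spearmanr(rank_by_accelerated_stability, rank_by_dsf_melting_temp)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")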
Yeast as a tool to identify anti-aging compounds
Zimmermann, Andreas; Hofer, Sebastian; Pendl, Tobias; Kainz, Katharina; Madeo, Frank; Carmona-Gutierrez, Didac
2018-01-01
In the search for interventions against aging and age-related diseases, biological screening platforms are indispensable tools for identifying anti-aging compounds among large substance libraries. The budding yeast, Saccharomyces cerevisiae, has emerged as a powerful chemical and genetic screening platform, as it combines a rapid workflow with experimental amenability and the availability of a wide range of genetic mutant libraries. Given the number of conserved genes and aging mechanisms shared between yeast and humans, testing candidate anti-aging substances in yeast gene-deletion or overexpression collections, or de novo derived mutants, has proven highly successful in finding potential molecular targets. Yeast-based studies, for example, have led to the discovery of the polyphenol resveratrol and the natural polyamine spermidine as potential anti-aging agents. Here, we present strategies for pharmacological anti-aging screens in yeast, discuss common pitfalls and summarize studies that have used yeast for drug discovery and target identification. PMID:29905792
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
Compound management beyond efficiency.
Burr, Ian; Winchester, Toby; Keighley, Wilma; Sewing, Andreas
2009-06-01
Codeveloping alongside chemistry and in vitro screening, compound management was one of the first areas in research recognizing the need for efficient processes and workflows. Material management groups have centralized, automated, miniaturized and, importantly, found out what not to do with compounds. While driving down cost and improving quality in storage and processing, researchers still face the challenge of interfacing optimally with changing business processes, in screening groups, and with external vendors and focusing on biologicals in many companies. Here we review our strategy to provide a seamless link between compound acquisition and screening operations and the impact of material management on quality of the downstream processes. Although this is driven in part by new technologies and improved quality control within material management, redefining team structures and roles also drives job satisfaction and motivation in our teams with a subsequent positive impact on cycle times and customer feedback.
Design and Development of ChemInfoCloud: An Integrated Cloud Enabled Platform for Virtual Screening.
Karthikeyan, Muthukumarasamy; Pandit, Deepak; Bhavasar, Arvind; Vyas, Renu
2015-01-01
The power of cloud computing and distributed computing has been harnessed to handle the vast and heterogeneous data required to be processed in any virtual screening protocol. A cloud computing platform, ChemInfoCloud, was built and integrated with several chemoinformatics and bioinformatics tools. The robust engine performs the core chemoinformatics tasks of lead generation, lead optimisation and property prediction in a fast and efficient manner. It also provides several bioinformatics functionalities, including sequence alignment, active site pose prediction and protein-ligand docking. Text mining, NMR chemical shift (1H, 13C) prediction and reaction fingerprint generation modules for efficient lead discovery are also implemented in this platform. We have developed an integrated problem-solving cloud environment for virtual screening studies that also provides workflow management, better usability and interaction with end users using container-based virtualization, OpenVZ.
Localization-based super-resolution imaging meets high-content screening.
Beghin, Anne; Kechkar, Adel; Butler, Corey; Levet, Florian; Cabillic, Marine; Rossier, Olivier; Giannone, Gregory; Galland, Rémi; Choquet, Daniel; Sibarita, Jean-Baptiste
2017-12-01
Single-molecule localization microscopy techniques have proven to be essential tools for quantitatively monitoring biological processes at unprecedented spatial resolution. However, these techniques are very low throughput and are not yet compatible with fully automated, multiparametric cellular assays. This shortcoming is primarily due to the huge amount of data generated during imaging and the lack of software for automation and dedicated data mining. We describe an automated quantitative single-molecule-based super-resolution methodology that operates in standard multiwell plates and uses analysis based on high-content screening and data-mining software. The workflow is compatible with fixed- and live-cell imaging and allows extraction of quantitative data like fluorophore photophysics, protein clustering or dynamic behavior of biomolecules. We demonstrate that the method is compatible with high-content screening using 3D dSTORM and DNA-PAINT based super-resolution microscopy as well as single-particle tracking.
Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott
2018-05-01
The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.
Wearable Notification via Dissemination Service in a Pervasive Computing Environment
2015-09-01
context, state, and environment in a manner that would be transparent to a Soldier’s common operations. Subject terms: pervasive computing, Android ... of user context shifts, i.e., changes in the user’s position, history, workflow, or resource interests. If the PCE is described as a 2-component ... convenient viewing on the Glass’s screen just above the line of sight. All of the software developed uses Google’s Android open-source software stack
Commercialization of microfluidic devices.
Volpatti, Lisa R; Yetisen, Ali K
2014-07-01
Microfluidic devices offer automation and high-throughput screening, and operate at low volumes of consumables. Although microfluidics has the potential to reduce turnaround times and costs for analytical devices, particularly in medical, veterinary, and environmental sciences, this enabling technology has had limited diffusion into consumer products. This article analyzes the microfluidics market, identifies issues, and highlights successful commercialization strategies. Addressing niche markets and establishing compatibility with existing workflows will accelerate market penetration. Copyright © 2014 Elsevier Ltd. All rights reserved.
Localized structural frustration for evaluating the impact of sequence variants
Kumar, Sushant; Clarke, Declan; Gerstein, Mark
2016-01-01
Population-scale sequencing is increasingly uncovering large numbers of rare single-nucleotide variants (SNVs) in coding regions of the genome. The rarity of these variants makes it challenging to evaluate their deleteriousness with conventional phenotype–genotype associations. Protein structures provide a way of addressing this challenge. Previous efforts have focused on globally quantifying the impact of SNVs on protein stability. However, local perturbations may severely impact protein functionality without strongly disrupting global stability (e.g. in relation to catalysis or allostery). Here, we describe a workflow in which localized frustration, quantifying unfavorable local interactions, is employed as a metric to investigate such effects. Using this workflow on the Protein Data Bank, we find that frustration produces many immediately intuitive results: for instance, disease-related SNVs create stronger changes in localized frustration than non-disease-related variants, and rare SNVs tend to disrupt local interactions to a larger extent than common variants. Less obviously, we observe that somatic SNVs associated with oncogenes and tumor suppressor genes (TSGs) induce very different changes in frustration. In particular, those associated with TSGs change the frustration more in the core than on the surface (by introducing loss-of-function events), whereas those associated with oncogenes manifest the opposite pattern, creating gain-of-function events. PMID:27915290
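The workflow's core comparison, how much a variant changes localized frustration relative to the wild type, reduces to differencing per-residue frustration indices and summarizing the change near the mutated site. The sketch below assumes such per-residue indices have already been computed by an external frustration calculator; the arrays, the sign convention, and the window size are illustrative assumptions rather than the authors' implementation.

import numpy as np

def local_frustration_change(frust_wt, frust_mut, mutated_pos, window=2):
    """Mean change in per-residue frustration index within +/- window of the mutated residue.
    Under the (assumed) convention that a lower index means more frustrated, negative
    values indicate the variant makes local interactions more frustrated."""
    frust_wt, frust_mut = np.asarray(frust_wt, float), np.asarray(frust_mut, float)
    lo = max(0, mutated_pos - window)
    hi = min(len(frust_wt), mutated_pos + window + 1)
    return float(np.mean(frust_mut[lo:hi] - frust_wt[lo:hi]))

# Illustrative per-residue frustration indices for a 10-residue stretch.
wild_type = [0.8, 0.6, 0.7, 0.9, 0.5, 0.6, 0.7, 0.8, 0.6, 0.7]
variant   = [0.8, 0.6, 0.2, 0.1, 0.0, 0.3, 0.7, 0.8, 0.6, 0.7]
print(local_frustration_change(wild_type, variant, mutated_pos=4))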
Droplet microfluidics for single-cell analysis.
Brouzes, Eric
2012-01-01
This book chapter aims at providing an overview of all the aspects and procedures needed to develop a droplet-based workflow for single-cell analysis (see Fig. 10.1). The surfactant system used to stabilize droplets is a critical component of droplet microfluidics; its properties define the type of droplet-based assays and workflows that can be developed. The scope of this book chapter is limited to fluorinated surfactant systems, which have proved to generate extremely stable droplets and allow the encapsulated material to be easily retrieved. The formulation section discusses how the experimental parameters influence the choice of the surfactant system to use. The circuit design section presents recipes to design and integrate different droplet modules into a whole assay. The fabrication section describes the manufacturing of the microfluidic chip, including the surface treatment, which is pivotal in droplet microfluidics. Finally, the last section reviews the experimental setup for fluorescence detection, with an emphasis on cell injection and incubation.
An image analysis system for near-infrared (NIR) fluorescence lymph imaging
NASA Astrophysics Data System (ADS)
Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.
2011-03-01
Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of an NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software [3, 4] is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove the motion artifacts, an image representation named flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
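The automatic computation of propulsion frequency and lymph velocity mentioned above can be illustrated with a simple peak-based analysis of a fluorescence intensity time series at one location, plus a transit-time velocity estimate between two locations. The synthetic signal, frame rate, peak thresholds, and distances below are assumptions for illustration, not the system's actual algorithm.

import numpy as np
from scipy.signal import find_peaks

fps = 10.0                      # assumed frame rate (frames per second)
t = np.arange(0, 120, 1 / fps)  # two minutes of imaging

# Synthetic intensity trace at one lymphangion: periodic propulsion "packets" plus noise.
rng = np.random.default_rng(1)
signal = 1.0 + 0.5 * (np.sin(2 * np.pi * t / 15.0) > 0.95) + 0.05 * rng.standard_normal(t.size)

peaks, _ = find_peaks(signal, height=1.3, distance=int(5 * fps))
propulsion_per_min = len(peaks) / (t[-1] / 60.0)

# Velocity from the delay of a propulsion front between two ROIs a known distance apart.
distance_mm = 8.0        # assumed spacing between regions of interest
delay_frames = 12        # assumed measured frame lag of the same front
velocity_mm_per_s = distance_mm / (delay_frames / fps)

print(f"propulsion frequency ~ {propulsion_per_min:.1f} per minute")
print(f"lymph velocity ~ {velocity_mm_per_s:.1f} mm/s")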
Case study: impact of technology investment on lead discovery at Bristol-Myers Squibb, 1998-2006.
Houston, John G; Banks, Martyn N; Binnie, Alastair; Brenner, Stephen; O'Connell, Jonathan; Petrillo, Edward W
2008-01-01
We review strategic approaches taken over an eight-year period at BMS to implement new high-throughput approaches to lead discovery. Investments in compound management infrastructure and chemistry library production capability allowed significant growth in the size, diversity and quality of the BMS compound collection. Screening platforms were upgraded with robust automated technology to support miniaturized assay formats, while workflows and information handling technologies were streamlined for improved performance. These technology changes drove the need for a supporting organization in which critical engineering, informatics and scientific skills were more strongly represented. Taken together, these investments led to significant improvements in speed and productivity as well as a greater impact of screening campaigns on the initiation of new drug discovery programs.
Automated Protocol for Large-Scale Modeling of Gene Expression Data.
Hall, Michelle Lynn; Calkins, David; Sherman, Woody
2016-11-28
With the continued rise of phenotypic- and genotypic-based screening projects, computational methods to analyze, process, and ultimately make predictions in this field take on growing importance. Here we show how automated machine learning workflows can produce models that are predictive of differential gene expression as a function of compound structure, using data from A673 cells as a proof of principle. In particular, we present predictive models with an average accuracy of greater than 70% across a highly diverse ∼1000-gene expression profile. In contrast to the usual in silico design paradigm, where one interrogates a particular target-based response, this work opens the opportunity for virtual screening and lead optimization for desired multitarget gene expression profiles.
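A schematic version of the modeling task described above, predicting per-gene differential-expression calls from compound structure descriptors, can be framed as many binary classifiers, one per gene. The sketch below uses random descriptors and labels purely to show the shape of such a pipeline, assuming scikit-learn is available; it is not the authors' modeling protocol.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(0)
n_compounds, n_bits, n_genes = 200, 128, 5   # toy sizes; the real profile spans ~1000 genes

X = rng.integers(0, 2, size=(n_compounds, n_bits))   # stand-in structural fingerprints
Y = rng.integers(0, 2, size=(n_compounds, n_genes))  # stand-in differential-expression calls

model = MultiOutputClassifier(RandomForestClassifier(n_estimators=100, random_state=0))
model.fit(X, Y)

# Per-gene accuracy on the training set (random data, so this only exercises the pipeline;
# real use would evaluate held-out compounds or cross-validation folds).
per_gene_accuracy = (model.predict(X) == Y).mean(axis=0)
print("mean per-gene accuracy:", per_gene_accuracy.mean())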
Lee, Christoph I; Lehman, Constance D
2016-11-01
Emerging imaging technologies, including digital breast tomosynthesis, have the potential to transform breast cancer screening. However, the rapid adoption of these new technologies outpaces the evidence of their clinical and cost-effectiveness. The authors describe the forces driving the rapid diffusion of tomosynthesis into clinical practice, comparing it with the rapid diffusion of digital mammography shortly after its introduction. They outline the potential positive and negative effects that adoption can have on imaging workflow and describe the practice management challenges when incorporating tomosynthesis. The authors also provide recommendations for collecting evidence supporting the development of policies and best practices. Copyright © 2013 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Muench, John; Jarvis, Kelly; Boverman, Josh; Hardman, Joseph; Hayes, Meg; Winkle, Jim
2012-01-01
In order to successfully integrate screening, brief intervention, and referral to treatment (SBIRT) into primary care, education of clinicians must be paired with sustainable transformation of the clinical settings in which they practice. The SBIRT Oregon project adopted this strategy in an effort to fully integrate SBIRT into 7 primary care residency clinics. Residents were trained to assess and intervene in their patients' unhealthy substance use, whereas clinic staff personnel were trained to carry out a multistep screening process. Electronic medical record tools were created to further integrate and track SBIRT processes. This article describes how a resident training curriculum complemented and was informed by the transformation of workflow processes within the residents' home clinics.
Clinicians' Perceptions of Screening for Food Insecurity in Suburban Pediatric Practice.
Palakshappa, Deepak; Vasan, Aditi; Khan, Saba; Seifu, Leah; Feudtner, Chris; Fiks, Alexander G
2017-07-01
National organizations recommend pediatricians screen for food insecurity (FI). Although there has been growing research in urban practices, little research has addressed FI screening in suburban practices. We evaluated the feasibility, acceptability, and impact of screening in suburban practices. We conducted a mixed methods study that implemented FI screening in 6 suburban pediatric primary care practices. We included all children presenting for either a 2-, 15-, or 36-month well-child visit (N = 5645). Families who screened positive were eligible to be referred to our community partner that worked to connect families to the Supplemental Nutrition Assistance Program. We conducted focus groups with clinicians to determine their perceptions of screening and suggestions for improvement. Of the 5645 children eligible, 4371 (77.4%) were screened, of which 122 (2.8%) screened positive for FI (range: 0.9%-5.9% across practices). Of the 122 food-insecure families, only 1 received new Supplemental Nutrition Assistance Program benefits. In focus groups, 3 themes emerged: (1) Time and workflow were not barriers to screening, but concerns about embarrassing families and being unable to provide adequate resources were; (2) Clinicians reported that parents felt the screening showed caring, which reinforced clinicians' continued screening; (3) Clinicians suggested implementing screening before the visit. We found it is feasible and acceptable for clinicians to screen for FI in suburban practices, but the referral method used in this study was ineffective in assisting families in obtaining benefits. Better approaches to connect families to local resources may be needed to maximize the effectiveness of screening in suburban settings. Copyright © 2017 by the American Academy of Pediatrics.
Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.
Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor
2016-01-01
In Quantum Chemistry, many tasks reoccur frequently, e.g. geometry optimizations, benchmarking series, etc. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows into the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.
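The atomic-workflow/meta-workflow distinction described above can be captured with a small data structure in which a meta-workflow orchestrates an ordered list of atomic workflows, feeding each step's outputs into the next step's inputs. The classes and the toy steps below are a schematic assumption, not the SHIWA repository's actual schema.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class AtomicWorkflow:
    """A well-defined, domain-specific function (e.g., a geometry optimization)."""
    name: str
    run: Callable[[Dict], Dict]          # input parameters -> output data

@dataclass
class MetaWorkflow:
    """An ordered orchestration of atomic workflows; outputs feed later steps."""
    name: str
    steps: List[AtomicWorkflow] = field(default_factory=list)

    def run(self, inputs: Dict) -> Dict:
        data = dict(inputs)
        for step in self.steps:
            data.update(step.run(data))
        return data

# Toy example: "optimize geometry" then "benchmark energy" as atomic steps.
optimize = AtomicWorkflow("geometry_optimization",
                          lambda d: {"geometry": "optimized:" + d["molecule"]})
benchmark = AtomicWorkflow("benchmark_energy",
                           lambda d: {"energy": len(d["geometry"])})
meta = MetaWorkflow("opt_then_benchmark", [optimize, benchmark])
print(meta.run({"molecule": "Cu2O2_complex"}))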
SHIWA Services for Workflow Creation and Sharing in Hydrometeorology
NASA Astrophysics Data System (ADS)
Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely
2014-05-01
Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services as a virtualisation layer providing a user interface that researchers can use. There are many scientific workflow systems but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Considering these efforts, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept. It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely pre-deployed workflow engines, or submits workflow engines together with the workflow to local or remote resources for execution. The SHIWA Proxy Server manages the certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond (Hydrometeorology and Seismology) the project to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows, and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.
John, Susan D; Moore, Quentin T; Herrmann, Tracy; Don, Steven; Powers, Kevin; Smith, Susan N; Morrison, Greg; Charkot, Ellen; Mills, Thalia T; Rutz, Lois; Goske, Marilyn J
2013-10-01
Transition from film-screen to digital radiography requires changes in radiographic technique and workflow processes to ensure that the minimum radiation exposure is used while maintaining diagnostic image quality. Checklists have been demonstrated to be useful tools for decreasing errors and improving safety in several areas, including commercial aviation and surgical procedures. The Image Gently campaign, through a competitive grant from the FDA, developed a checklist for technologists to use during the performance of digital radiography in pediatric patients. The checklist outlines the critical steps in digital radiography workflow, with an emphasis on steps that affect radiation exposure and image quality. The checklist and its accompanying implementation manual and practice quality improvement project are open source and downloadable at www.imagegently.org. The authors describe the process of developing and testing the checklist and offer suggestions for using the checklist to minimize radiation exposure to children during radiography. Copyright © 2013 American College of Radiology. All rights reserved.
Latimer, Luke N; Dueber, John E
2017-06-01
A common challenge in metabolic engineering is rapidly identifying rate-controlling enzymes in heterologous pathways for subsequent production improvement. We demonstrate a workflow to address this challenge and apply it to improving xylose utilization in Saccharomyces cerevisiae. For eight reactions required for conversion of xylose to ethanol, we screened enzymes for functional expression in S. cerevisiae, followed by a combinatorial expression analysis to achieve pathway flux balancing and identification of limiting enzymatic activities. In the next round of strain engineering, we increased the copy number of these limiting enzymes and again tested the eight-enzyme combinatorial expression library in this new background. This workflow yielded a strain that has a ∼70% increase in biomass yield and ∼240% increase in xylose utilization. Finally, we chromosomally integrated the expression library. This library enriched for strains with multiple integrations of the pathway, which likely were the result of tandem integrations mediated by promoter homology. Biotechnol. Bioeng. 2017;114: 1301-1309. © 2017 Wiley Periodicals, Inc.
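As a hedged illustration of the combinatorial expression analysis described above (not the authors' actual construct design), the sketch below enumerates promoter-strength assignments for a set of pathway enzymes so that each library variant can later be associated with a measured xylose-utilization readout; the enzyme and promoter names are hypothetical placeholders.

```python
from itertools import product

# Hypothetical pathway enzymes and promoter strength categories (illustrative only)
enzymes = ["XYL1", "XYL2", "XKS1", "TAL1", "TKL1", "RPE1", "RKI1", "PGI1"]
promoters = ["weak", "medium", "strong"]  # 3 choices x 8 enzymes = 6561 variants

def enumerate_library(enzymes, promoters, max_variants=None):
    """Yield dicts mapping each enzyme to a promoter choice."""
    for i, combo in enumerate(product(promoters, repeat=len(enzymes))):
        if max_variants is not None and i >= max_variants:
            return
        yield dict(zip(enzymes, combo))

# Example: inspect the first few library members
for variant in enumerate_library(enzymes, promoters, max_variants=3):
    print(variant)
```

In practice only a sampled subset of such a library is screened, and variants with high flux point to the limiting activities worth amplifying in the next engineering round.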
Nanopore sequencing technology: a new route for the fast detection of unauthorized GMO.
Fraiture, Marie-Alice; Saltykova, Assia; Hoffman, Stefan; Winand, Raf; Deforce, Dieter; Vanneste, Kevin; De Keersmaecker, Sigrid C J; Roosens, Nancy H C
2018-05-21
In order to strengthen the current genetically modified organism (GMO) detection system for unauthorized GMO, we have recently developed a new workflow based on DNA walking to amplify unknown sequences surrounding a known DNA region. This DNA walking is performed on transgenic elements, commonly found in GMO, that were earlier detected by real-time PCR (qPCR) screening. Previously, we have demonstrated the ability of this approach to detect unauthorized GMO via the identification of unique transgene flanking regions and the unnatural associations of elements from the transgenic cassette. In the present study, we investigate the feasibility of integrating the described workflow with MinION Next-Generation Sequencing (NGS). The MinION sequencing platform can provide long read lengths and deal with heterogeneous DNA libraries, allowing for rapid and efficient delivery of sequences of interest. In addition, the ability of this NGS platform to characterize unauthorized and unknown GMO without any a priori knowledge has been assessed.
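A minimal sketch of one step such a pipeline implies: pulling out long MinION reads that contain a known transgenic anchor element so that their flanking regions can be inspected. This is not the authors' pipeline; it assumes FASTA-formatted reads, uses exact matching only, and the file name and anchor sequence are hypothetical.

```python
def read_fasta(path):
    """Yield (header, sequence) pairs from a FASTA file."""
    header, seq = None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(seq)
                header, seq = line[1:], []
            else:
                seq.append(line)
        if header is not None:
            yield header, "".join(seq)

def reads_with_anchor(path, anchor):
    """Return reads containing the known transgenic element (exact match only)."""
    anchor = anchor.upper()
    return [(h, s) for h, s in read_fasta(path) if anchor in s.upper()]

# Usage (hypothetical file and anchor; real workflows would tolerate mismatches):
# hits = reads_with_anchor("minion_reads.fasta", "CCATGGTGGAGCACGACAC")
```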
Tang, Hsin-Yao; Beer, Lynn A; Barnhart, Kurt T; Speicher, David W
2011-09-02
Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μL of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers.
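To make the quantitation logic concrete, here is a hedged sketch (not the authors' software) of summing MRM transition intensities into per-peptide, per-fraction profiles and comparing profiles of peptides from the same protein, which is the consistency property the abstract uses to flag problematic peptides.

```python
from collections import defaultdict

# rows: iterable of (protein, peptide, fraction, transition_intensity) -- hypothetical input
def peptide_profiles(rows, fractions):
    """Sum transition intensities into per-(protein, peptide) fraction profiles."""
    totals = defaultdict(lambda: defaultdict(float))
    for protein, peptide, fraction, intensity in rows:
        totals[(protein, peptide)][fraction] += intensity
    return {key: [vals.get(f, 0.0) for f in fractions] for key, vals in totals.items()}

def profile_correlation(a, b):
    """Pearson correlation between two fraction profiles (0 if degenerate)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

# Peptides whose profile correlates poorly with sibling peptides of the same protein
# would be flagged for manual inspection before relative abundances are reported.
```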
MONA – Interactive manipulation of molecule collections
2013-01-01
Working with small-molecule datasets is a routine task for cheminformaticians and chemists. The analysis and comparison of vendor catalogues and the compilation of promising candidates as starting points for screening campaigns are but a few very common applications. The workflows applied for this purpose usually consist of multiple basic cheminformatics tasks such as checking for duplicates or filtering by physico-chemical properties. Pipelining tools allow such workflows to be created and changed without much effort, but usually do not support interventions once the pipeline has been started. In many contexts, however, the best suited workflow is not known in advance, thus making it necessary to take the results of the previous steps into consideration before proceeding. To support intuition-driven processing of compound collections, we developed MONA, an interactive tool that has been designed to prepare and visualize large small-molecule datasets. Using an SQL database, common cheminformatics tasks such as analysis and filtering can be performed interactively, with various methods for visual support. Great care was taken in creating a simple, intuitive user interface which can be instantly used without any setup steps. MONA combines the interactivity of molecule database systems with the simplicity of pipelining tools, thus enabling the case-to-case application of chemistry expert knowledge. The current version is available free of charge for academic use and can be downloaded at http://www.zbh.uni-hamburg.de/mona.
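The basic tasks MONA automates, duplicate removal and property filtering, can be sketched as follows. This is not MONA's API; it is an illustration assuming the RDKit toolkit is available, with thresholds chosen arbitrarily.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

def deduplicate_and_filter(smiles_list, max_mw=500.0, max_logp=5.0):
    """Drop duplicates (by canonical SMILES) and apply simple property filters."""
    seen, kept = set(), []
    for smi in smiles_list:
        mol = Chem.MolFromSmiles(smi)
        if mol is None:
            continue                       # unparsable entry
        canonical = Chem.MolToSmiles(mol)  # canonical form exposes duplicates
        if canonical in seen:
            continue
        if Descriptors.MolWt(mol) > max_mw or Descriptors.MolLogP(mol) > max_logp:
            continue
        seen.add(canonical)
        kept.append(canonical)
    return kept

print(deduplicate_and_filter(["CCO", "OCC", "c1ccccc1O"]))  # "CCO" and "OCC" collapse
```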
Deep Eutectic Salt Formulations Suitable as Advanced Heat Transfer Fluids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raade, Justin; Roark, Thomas; Vaughn, John
2013-07-22
Concentrating solar power (CSP) facilities are comprised of many miles of fluid-filled pipes arranged in large grids with reflective mirrors used to capture radiation from the sun. Solar radiation heats the fluid, which is used to produce the steam necessary to power large electricity generation turbines. Currently, the organic, oil-based fluid in the pipes has a maximum temperature threshold of 400 °C, allowing for the production of electricity at approximately 15 cents per kilowatt hour. The DOE hopes to foster the development of an advanced heat transfer fluid that can operate within higher temperature ranges. The new heat transfer fluid, when used with other advanced technologies, could significantly decrease solar electricity cost. Lower costs would make solar thermal electricity competitive with gas and coal and would offer a clean, renewable source of energy. Molten salts exhibit many desirable heat transfer qualities within the range of the project objectives. Halotechnics developed advanced heat transfer fluids (HTFs) for application in solar thermal power generation. This project focused on complex mixtures of inorganic salts that exhibited high thermal stability, a low melting point, and other favorable characteristics. A high-throughput combinatorial research and development program was conducted in order to achieve the project objective. Over 19,000 candidate formulations were screened. The workflow developed to screen various chemical systems to discover salt formulations led to mixtures suitable for use as HTFs in both parabolic trough and heliostat CSP plants. Furthermore, salt mixtures which will not interfere with fertilizer-based nitrates were discovered. In addition to use in CSP, the discovered salt mixtures can be applied to electricity storage, heat treatment of alloys, and other industrial processes.
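A hedged sketch of the screening logic described above (the thresholds and data are hypothetical, not Halotechnics' actual criteria): each candidate salt formulation is retained only if its measured melting point and decomposition temperature bracket the target operating range.

```python
def screen_formulations(candidates, max_melting_c=250.0, min_stability_c=700.0):
    """Keep mixtures that melt below max_melting_c and stay stable above min_stability_c."""
    hits = []
    for mix in candidates:
        if mix["melting_point_c"] <= max_melting_c and mix["decomposition_c"] >= min_stability_c:
            hits.append(mix["name"])
    return hits

# Hypothetical measurements for three candidate mixtures
candidates = [
    {"name": "mix-0001", "melting_point_c": 220.0, "decomposition_c": 720.0},
    {"name": "mix-0002", "melting_point_c": 310.0, "decomposition_c": 690.0},
    {"name": "mix-0003", "melting_point_c": 140.0, "decomposition_c": 810.0},
]
print(screen_formulations(candidates))  # ['mix-0001', 'mix-0003']
```

With tens of thousands of formulations, such a filter is only the first pass; viscosity, corrosivity and cost would be screened in later stages.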
Agile parallel bioinformatics workflow management using Pwrake.
Mishima, Hiroyuki; Sasaki, Kensaku; Tanaka, Masahiro; Tatebe, Osamu; Yoshiura, Koh-Ichiro
2011-09-08
In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows.
ENCoRE: an efficient software for CRISPR screens identifies new players in extrinsic apoptosis.
Trümbach, Dietrich; Pfeiffer, Susanne; Poppe, Manuel; Scherb, Hagen; Doll, Sebastian; Wurst, Wolfgang; Schick, Joel A
2017-11-25
As CRISPR/Cas9 mediated screens with pooled guide libraries in somatic cells become increasingly established, an unmet need for rapid and accurate companion informatics tools has emerged. We have developed a lightweight and efficient software to easily manipulate large raw next generation sequencing datasets derived from such screens into informative relational context with graphical support. The advantages of the software entitled ENCoRE (Easy NGS-to-Gene CRISPR REsults) include a simple graphical workflow, platform independence, local and fast multithreaded processing, data pre-processing and gene mapping with custom library import. We demonstrate the capabilities of ENCoRE to interrogate results from a pooled CRISPR cellular viability screen following Tumor Necrosis Factor-alpha challenge. The results not only identified stereotypical players in extrinsic apoptotic signaling but two as yet uncharacterized members of the extrinsic apoptotic cascade, Smg7 and Ces2a. We further validated and characterized cell lines containing mutations in these genes against a panel of cell death stimuli and involvement in p53 signaling. In summary, this software enables bench scientists with sensitive data or without access to informatic cores to rapidly interpret results from large scale experiments resulting from pooled CRISPR/Cas9 library screens.
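The core NGS-to-gene step such a tool performs, counting guide sequences in raw reads and rolling counts up to genes, can be sketched as follows. This is an illustration only, not ENCoRE's implementation: it uses exact matching and a hypothetical two-guide library keyed on the gene names mentioned in the abstract.

```python
from collections import Counter

def count_guides(reads, guide_to_gene):
    """Exact-match counting of library guides in read sequences, rolled up to genes."""
    guide_counts = Counter()
    for read in reads:
        for guide in guide_to_gene:
            if guide in read:
                guide_counts[guide] += 1
                break  # count each read at most once
    gene_counts = Counter()
    for guide, n in guide_counts.items():
        gene_counts[guide_to_gene[guide]] += n
    return guide_counts, gene_counts

# Hypothetical 2-guide library and a few reads (sequences are invented)
library = {"ACGTACGTACGTACGTACGT": "Smg7", "TTTTCCCCGGGGAAAATTTT": "Ces2a"}
reads = ["NNACGTACGTACGTACGTACGTNN", "NNTTTTCCCCGGGGAAAATTTTNN"]
print(count_guides(reads, library)[1])  # Counter({'Smg7': 1, 'Ces2a': 1})
```

Enrichment of a gene's guides in the surviving population relative to the plasmid library is what ultimately flags candidates such as Smg7 and Ces2a.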
The essential roles of chemistry in high-throughput screening triage
Dahlin, Jayme L; Walters, Michael A
2015-01-01
It is increasingly clear that academic high-throughput screening (HTS) and virtual HTS triage suffer from a lack of scientists trained in the art and science of early drug discovery chemistry. Many recent publications report the discovery of compounds by screening that are most likely artifacts or promiscuous bioactive compounds, and these results are not placed into the context of previous studies. For HTS to be most successful, it is our contention that there must exist an early partnership between biologists and medicinal chemists. Their combined skill sets are necessary to design robust assays and efficient workflows that will weed out assay artifacts, false positives, promiscuous bioactive compounds and intractable screening hits, efforts that ultimately give projects a better chance at identifying truly useful chemical matter. Expertise in medicinal chemistry, cheminformatics and purification sciences (analytical chemistry) can enhance the post-HTS triage process by quickly removing these problematic chemotypes from consideration, while simultaneously prioritizing the more promising chemical matter for follow-up testing. It is only when biologists and chemists collaborate effectively that HTS can manifest its full promise.
Chin, Jefferson; Wood, Elizabeth; Peters, Grace S; Drexler, Dieter M
2016-02-01
In the early stages of drug discovery, high-throughput screening (HTS) of compound libraries against pharmaceutical targets is a common method to identify potential lead molecules. For these HTS campaigns to be efficient and successful, continuous quality control of the compound collection is necessary and crucial. However, the large number of compound samples and the limited sample amount pose unique challenges. Presented here is a proof-of-concept study for a novel process flow for the quality control screening of small-molecule compound libraries that consumes only minimal amounts of samples and affords compound-specific molecular data. This process employs an acoustic sample deposition (ASD) technique for the offline sample preparation by depositing nanoliter volumes in an array format onto microscope glass slides followed by matrix-assisted laser desorption/ionization mass spectrometric (MALDI-MS) analysis. An initial study of a 384-compound array employing the ASD-MALDI-MS workflow resulted in a 75% first-pass positive identification rate with an analysis time of <1 s per sample. © 2015 Society for Laboratory Automation and Screening.
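A minimal sketch of the identification step implied above: comparing each deposited compound's expected [M+H]+ m/z with the observed MALDI peak within a mass tolerance. The tolerance value and the single-adduct assumption are illustrative, not the published method.

```python
PROTON_MASS = 1.007276  # approximate mass of a proton in Da

def passes_qc(expected_monoisotopic_mass, observed_mz, tol_ppm=20.0):
    """Flag a well as 'identified' if the observed m/z matches [M+H]+ within tol_ppm."""
    expected_mz = expected_monoisotopic_mass + PROTON_MASS
    ppm_error = abs(observed_mz - expected_mz) / expected_mz * 1e6
    return ppm_error <= tol_ppm

# Hypothetical well: caffeine (monoisotopic mass 194.0804 Da), observed peak at 195.0880
print(passes_qc(194.0804, 195.0880))  # True within 20 ppm
```

The first-pass positive identification rate quoted in the abstract corresponds to the fraction of wells passing a check of this kind.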
Development of a Kinetic Assay for Late Endosome Movement.
Esner, Milan; Meyenhofer, Felix; Kuhn, Michael; Thomas, Melissa; Kalaidzidis, Yannis; Bickle, Marc
2014-08-01
Automated imaging screens are performed mostly on fixed and stained samples to simplify the workflow and increase throughput. Some processes, such as the movement of cells and organelles or measuring membrane integrity and potential, can be measured only in living cells. Developing such assays to screen large compound or RNAi collections is challenging in many respects. Here, we develop a live-cell high-content assay for tracking endocytic organelles in medium throughput. We evaluate the added value of measuring kinetic parameters compared with measuring static parameters solely. We screened 2000 compounds in U-2 OS cells expressing Lamp1-GFP to label late endosomes. All hits have phenotypes in both static and kinetic parameters. However, we show that the kinetic parameters enable better discrimination of the mechanisms of action. Most of the compounds cause a decrease of motility of endosomes, but we identify several compounds that increase endosomal motility. In summary, we show that kinetic data help to better discriminate phenotypes and thereby obtain more subtle phenotypic clustering. © 2014 Society for Laboratory Automation and Screening.
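The kinetic parameters discussed above can be illustrated with a small sketch that computes per-track mean speed and net displacement from time-stamped centroid positions; this is a generic illustration, not the assay's actual tracking and analysis pipeline.

```python
from math import hypot

def track_kinetics(track):
    """track: list of (t_seconds, x_um, y_um). Returns (mean speed, net displacement)."""
    if len(track) < 2:
        return 0.0, 0.0
    path_length = 0.0
    duration = track[-1][0] - track[0][0]
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        path_length += hypot(x1 - x0, y1 - y0)
    net = hypot(track[-1][1] - track[0][1], track[-1][2] - track[0][2])
    mean_speed = path_length / duration if duration > 0 else 0.0
    return mean_speed, net

# Hypothetical late-endosome track sampled every 2 s
track = [(0, 0.0, 0.0), (2, 0.5, 0.1), (4, 1.1, 0.3), (6, 1.2, 0.9)]
print(track_kinetics(track))  # (mean speed in um/s, net displacement in um)
```

Compounds shifting the distribution of such per-track statistics up or down correspond to the motility increases and decreases reported above.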
The impact of missing sensor information on surgical workflow management.
Liebmann, Philipp; Meixensberger, Jürgen; Wiedemann, Peter; Neumuth, Thomas
2013-09-01
Sensor systems in the operating room may encounter intermittent data losses that reduce the performance of surgical workflow management systems (SWFMS). Sensor data loss could impact SWFMS-based decision support, device parameterization, and information presentation. The purpose of this study was to understand the robustness of surgical process models when sensor information is partially missing. We tested how an SWFMS responds to wrong or missing data from the sensor system that tracks the progress of a surgical intervention. The individual surgical process models (iSPMs) from 100 cataract procedures performed by 3 ophthalmologic surgeons were used to select a randomized subset and create a generalized surgical process model (gSPM). A disjoint subset was selected from the iSPMs and used to simulate the surgical process against the gSPM. The loss of sensor data was simulated by removing some information from one task in the iSPM. The effect of missing sensor data was measured using several metrics: (a) successful relocation of the path in the gSPM, (b) the number of steps to find the converging point, and (c) the perspective with the highest occurrence of unsuccessful path findings. A gSPM built using 30% of the iSPMs successfully found the correct path in 90% of the cases. The most critical sensor data were the information regarding the instrument used by the surgeon. We found that use of a gSPM to provide input data for an SWFMS is robust and can be accurate despite missing sensor data. A surgical workflow management system can provide the surgeon with workflow guidance in the OR for most cases. Sensor systems for surgical process tracking can be evaluated based on the stability and accuracy of functional and spatial operative results.
High volcanic seismic b-values: Real or artefacts?
NASA Astrophysics Data System (ADS)
Roberts, Nick; Bell, Andrew; Main, Ian G.
2015-04-01
The b-value of the Gutenberg-Richter distribution quantifies the relative proportion of large to small magnitude earthquakes in a catalogue, in turn related to the population of fault rupture areas and the average slip or stress drop. Accordingly the b-value is an important parameter to consider when evaluating seismic catalogues as it has the potential to provide insight into the temporal or spatial evolution of the system, such as fracture development or changes in the local stress regime. The b-value for tectonic seismicity is commonly found to be close to 1, whereas much higher b-values are frequently reported for volcanic and induced seismicity. Understanding these differences is important for understanding the processes controlling earthquake occurrence in different settings. However, it is possible that anomalously high b-values could arise from small sample sizes, under-estimated completeness magnitudes, or other poorly applied methodologies. Therefore, it is important to establish a rigorous workflow for analyzing these datasets. Here we examine the frequency-magnitude distributions of volcanic earthquake catalogues in order to determine the significance of apparently high b-values. We first derive a workflow for computing the completeness magnitude of a seismic catalogue, using synthetic catalogues of varying shape, size, and known b-value. We find the best approach involves a combination of three methods: 'Maximum Curvature', 'b-value stability', and the 'Goodness-of-Fit test'. To calculate a reliable b-value with an error ≤0.25, the maximum curvature method is preferred for a 'sharp-peaked' discrete distribution. For a catalogue with a broader peak the b-value stability method is the most reliable with the Goodness-of-Fit test being an acceptable backup if the b-value stability method fails. We apply this workflow to earthquake catalogues from El Hierro (2011-2013) and Mt Etna (1999-2013) volcanoes. In general, we find the b-value to be equal to or slightly greater than 1. However, reliable high b-values of 1.5-2.4 at El Hierro and 1.5-1.8 at Mt Etna are observed for restricted time periods. We argue that many of the almost axiomatically 'high' b-values reported in the literature for volcanic and induced seismicity may be attributable to biases introduced by the methods of inference used and/or the relatively small sample sizes often available.
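To make the workflow concrete, here is a minimal sketch of the two core computations: the completeness magnitude Mc by the maximum-curvature method (the magnitude bin containing the most events) and the b-value by the standard Aki/Utsu maximum-likelihood estimator, b = log10(e) / (mean(M) - (Mc - ΔM/2)), where ΔM is the binning width. This is a generic illustration, not the authors' full combined workflow with the b-value stability and Goodness-of-Fit steps.

```python
from math import log10, e, sqrt
from collections import Counter

def maximum_curvature_mc(mags, dm=0.1):
    """Completeness magnitude: the magnitude bin containing the most events."""
    bins = Counter(round(m / dm) * dm for m in mags)
    return max(bins, key=bins.get)

def b_value(mags, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for events with M >= mc."""
    complete = [m for m in mags if m >= mc]
    mean_m = sum(complete) / len(complete)
    b = log10(e) / (mean_m - (mc - dm / 2.0))
    b_err = b / sqrt(len(complete))  # rough (Aki) standard error
    return b, b_err

# Usage on a catalogue of magnitudes (list of floats):
# mc = maximum_curvature_mc(catalogue); b, err = b_value(catalogue, mc)
```

The sample-size sensitivity discussed above is visible directly in the standard error term, which shrinks only as the square root of the number of complete events.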
Can high seismic b-values be explained solely by poorly applied methodology?
NASA Astrophysics Data System (ADS)
Roberts, Nick; Bell, Andrew; Main, Ian
2015-04-01
The b-value of the Gutenberg-Richter distribution quantifies the relative proportion of large to small magnitude earthquakes in a catalogue, in turn related to the population of fault rupture areas and the average slip or stress drop. Accordingly the b-value is an important parameter to consider when evaluating seismic catalogues as it has the potential to provide insight into the temporal or spatial evolution of the system, such as fracture development or changes in the local stress regime. The b-value for tectonic seismicity is commonly found to be close to 1, whereas much higher b-values are frequently reported for volcanic and induced seismicity. Understanding these differences is important for understanding the processes controlling earthquake occurrence in different settings. However, it is possible that anomalously high b-values could arise from small sample sizes, under-estimated completeness magnitudes, or other poorly applied methodologies. Therefore, it is important to establish a rigorous workflow for analyzing these datasets. Here we examine the frequency-magnitude distributions of volcanic earthquake catalogues in order to determine the significance of apparently high b-values. We first derive a workflow for computing the completeness magnitude of a seismic catalogue, using synthetic catalogues of varying shape, size, and known b-value. We find the best approach involves a combination of three methods: 'Maximum Curvature', 'b-value stability', and the 'Goodness-of-Fit test'. To calculate a reliable b-value with an error ≤0.25, the maximum curvature method is preferred for a 'sharp-peaked' discrete distribution. For a catalogue with a broader peak the b-value stability method is the most reliable, with the Goodness-of-Fit test being an acceptable backup if the b-value stability method fails. We apply this workflow to earthquake catalogues from El Hierro (2011-2013) and Mt Etna (1999-2013) volcanoes. In general, we find the b-value to be equal to or slightly greater than 1; however, reliably high b-values are reported in both catalogues. We argue that many of the almost axiomatically 'high' b-values reported in the literature for volcanic and induced seismicity may be attributable to biases introduced by the methods of inference used and/or the relatively small sample sizes often available. This new methodology, although focused on volcanic catalogues, is applicable to all seismic catalogues.
NASA Astrophysics Data System (ADS)
Toher, Cormac; Oses, Corey; Plata, Jose J.; Hicks, David; Rose, Frisco; Levy, Ohad; de Jong, Maarten; Asta, Mark; Fornari, Marco; Buongiorno Nardelli, Marco; Curtarolo, Stefano
2017-06-01
Thorough characterization of the thermomechanical properties of materials requires difficult and time-consuming experiments. This severely limits the availability of data and is one of the main obstacles for the development of effective accelerated materials design strategies. The rapid screening of new potential materials requires highly integrated, sophisticated, and robust computational approaches. We tackled the challenge by developing an automated, integrated workflow with robust error-correction within the AFLOW framework which combines the newly developed "Automatic Elasticity Library" with the previously implemented GIBBS method. The former extracts the mechanical properties from automatic self-consistent stress-strain calculations, while the latter employs those mechanical properties to evaluate the thermodynamics within the Debye model. This new thermoelastic workflow is benchmarked against a set of 74 experimentally characterized systems to pinpoint a robust computational methodology for the evaluation of bulk and shear moduli, Poisson ratios, Debye temperatures, Grüneisen parameters, and thermal conductivities of a wide variety of materials. The effect of different choices of equations of state and exchange-correlation functionals is examined and the optimum combination of properties for the Leibfried-Schlömann prediction of thermal conductivity is identified, leading to better agreement with experimental results than the GIBBS-only approach. The framework has been applied to the AFLOW.org data repositories to compute the thermoelastic properties of over 3500 unique materials. The results are now available online via an expanded version of the REST-API described in the Appendix.
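As a hedged illustration of the elastic-to-thermal step in this kind of workflow (standard quasi-harmonic Debye relations, not the AFLOW code itself), the sketch below estimates longitudinal and transverse sound velocities from isotropic bulk and shear moduli and converts them into a Debye temperature; the silicon-like example values in the comment are approximate.

```python
from math import pi, sqrt

HBAR = 1.054571817e-34  # J*s
KB = 1.380649e-23       # J/K

def debye_temperature(bulk_gpa, shear_gpa, density_kg_m3, atoms_per_m3):
    """Debye temperature from isotropic elastic moduli (quasi-harmonic Debye model)."""
    b, g = bulk_gpa * 1e9, shear_gpa * 1e9           # convert GPa to Pa
    v_l = sqrt((b + 4.0 * g / 3.0) / density_kg_m3)  # longitudinal sound velocity
    v_t = sqrt(g / density_kg_m3)                    # transverse sound velocity
    v_m = ((2.0 / v_t**3 + 1.0 / v_l**3) / 3.0) ** (-1.0 / 3.0)  # averaged velocity
    return (HBAR / KB) * (6.0 * pi**2 * atoms_per_m3) ** (1.0 / 3.0) * v_m

# Rough silicon-like inputs (B ~ 98 GPa, G ~ 66 GPa) give a value near 650 K:
# print(debye_temperature(98, 66, 2329, 5.0e28))
```

Within such a workflow, the resulting Debye temperature and the Grüneisen parameter then feed the Leibfried-Schlömann-type estimate of lattice thermal conductivity mentioned above.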
Kokemüller, H; Jehn, P; Spalthoff, S; Essig, H; Tavassol, F; Schumann, P; Andreae, A; Nolte, I; Jagodzinski, M; Gellrich, N-C
2014-02-01
The aim of this pilot study was to determine, in a new experimental model, whether complex bioartificial monoblocs of relevant size and stability can be prefabricated in a defined three-dimensional design, in which the latissimus dorsi muscle serves as a natural bioreactor and the thoracodorsal vessel tree is prepared for axial construct perfusion. Eighteen sheep were included in the study, with six animals in each of three experimental groups. Vitalization of the β-tricalcium phosphate-based constructs was performed by direct application of unmodified osteogenic material from the iliac crest (group A), in vivo application of nucleated cell concentrate (NCC) from bone marrow aspirate (group B), and in vitro cultivation of bone marrow stromal cells (BMSC) in a perfusion bioreactor system (group C). The contours of the constructs were designed digitally and transferred onto the bioartificial bone grafts using a titanium cage, which was bent over a stereolithographic model of the defined subvolume intraoperatively. At the end of the prefabrication process, only the axial vascularized constructs of group A demonstrated vital bone formation with considerable stability. In groups B and C, the applied techniques were not able to induce ectopic bone formation. The presented computer-assisted workflow allows the prefabrication of custom-made bioartificial transplants. Copyright © 2013 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Localized structural frustration for evaluating the impact of sequence variants.
Kumar, Sushant; Clarke, Declan; Gerstein, Mark
2016-12-01
Population-scale sequencing is increasingly uncovering large numbers of rare single-nucleotide variants (SNVs) in coding regions of the genome. The rarity of these variants makes it challenging to evaluate their deleteriousness with conventional phenotype-genotype associations. Protein structures provide a way of addressing this challenge. Previous efforts have focused on globally quantifying the impact of SNVs on protein stability. However, local perturbations may severely impact protein functionality without strongly disrupting global stability (e.g. in relation to catalysis or allostery). Here, we describe a workflow in which localized frustration, quantifying unfavorable local interactions, is employed as a metric to investigate such effects. Using this workflow on the Protein Databank, we find that frustration produces many immediately intuitive results: for instance, disease-related SNVs create stronger changes in localized frustration than non-disease related variants, and rare SNVs tend to disrupt local interactions to a larger extent than common variants. Less obviously, we observe that somatic SNVs associated with oncogenes and tumor suppressor genes (TSGs) induce very different changes in frustration. In particular, those associated with TSGs change the frustration more in the core than the surface (by introducing loss-of-function events), whereas those associated with oncogenes manifest the opposite pattern, creating gain-of-function events. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Exposomics research using suspect screening and non ...
High-resolution mass spectrometry (HRMS) is used for suspect screening (SSA) and non-targeted analysis (NTA) in an attempt to characterize xenobiotic chemicals in various samples broadly and efficiently. These important techniques aid characterization of the exposome, the totality of human exposures, and provide critical information on thousands of chemicals in commerce for which exposure data are lacking. The Environmental Protection Agency (EPA) SSA and NTA capabilities consist of analytical instrumentation [liquid chromatography (LC) with time of flight (TOF) and quadrupole-TOF (Q-TOF) HRMS], workflows (feature extraction, formula generation, structure prediction, spectral matching, chemical confirmation), and tools (databases; models for predicting retention time, functional use, media occurrence, and media concentration; and schemes for ranking features and chemicals).
Thielmann, Yvonne; Koepke, Juergen; Michel, Hartmut
2012-06-01
Structure determination of membrane proteins and membrane protein complexes is still a very challenging field. To facilitate the work on membrane proteins, the Core Centre follows a strategy that comprises four labs for protein analytics and crystal handling, covering mass spectrometry, calorimetry, crystallization and X-ray diffraction. This general workflow is presented, and a capacity of 20% of the operating time of all systems is provided to the European structural biology community within the ESFRI Instruct program. A description of the crystallization service offered at the Core Centre is given, with detailed information on the screening strategy, the screens used and the changes made to adapt high throughput for membrane proteins. Our aim is to continually develop the Core Centre towards the use of more efficient methods. This strategy might also include the ability to automate all steps from crystallization trials to crystal screening; here we look ahead to how this aim might be realized at the Core Centre.
The complete digital workflow in fixed prosthodontics: a systematic review.
Joda, Tim; Zarone, Fernando; Ferrari, Marco
2017-09-19
The continuous development in dental processing ensures new opportunities in the field of fixed prosthodontics in a complete virtual environment without any physical model situations. The aim was to compare fully digitalized workflows to conventional and/or mixed analog-digital workflows for the treatment with tooth-borne or implant-supported fixed reconstructions. A PICO strategy was executed using an electronic (MEDLINE, EMBASE, Google Scholar) plus manual search up to 2016-09-16 focusing on RCTs investigating complete digital workflows in fixed prosthodontics with regard to economics or esthetics or patient-centered outcomes with or without follow-up or survival/success rate analysis as well as complication assessment of at least 1 year under function. The search strategy was assembled from MeSH-Terms and unspecific free-text words: {(("Dental Prosthesis" [MeSH]) OR ("Crowns" [MeSH]) OR ("Dental Prosthesis, Implant-Supported" [MeSH])) OR ((crown) OR (fixed dental prosthesis) OR (fixed reconstruction) OR (dental bridge) OR (implant crown) OR (implant prosthesis) OR (implant restoration) OR (implant reconstruction))} AND {("Computer-Aided Design" [MeSH]) OR ((digital workflow) OR (digital technology) OR (computerized dentistry) OR (intraoral scan) OR (digital impression) OR (scanbody) OR (virtual design) OR (digital design) OR (cad/cam) OR (rapid prototyping) OR (monolithic) OR (full-contour))} AND {("Dental Technology" [MeSH) OR ((conventional workflow) OR (lost-wax-technique) OR (porcelain-fused-to-metal) OR (PFM) OR (implant impression) OR (hand-layering) OR (veneering) OR (framework))} AND {(("Study, Feasibility" [MeSH]) OR ("Survival" [MeSH]) OR ("Success" [MeSH]) OR ("Economics" [MeSH]) OR ("Costs, Cost Analysis" [MeSH]) OR ("Esthetics, Dental" [MeSH]) OR ("Patient Satisfaction" [MeSH])) OR ((feasibility) OR (efficiency) OR (patient-centered outcome))}. Assessment of risk of bias in selected studies was done at a 'trial level' including random sequence generation, allocation concealment, blinding, completeness of outcome data, selective reporting, and other bias using the Cochrane Collaboration tool. A judgment of risk of bias was assigned if one or more key domains had a high or unclear risk of bias. An official registration of the systematic review was not performed. The systematic search identified 67 titles, 32 abstracts thereof were screened, and subsequently, three full-texts included for data extraction. Analysed RCTs were heterogeneous without follow-up. One study demonstrated that fully digitally produced dental crowns revealed the feasibility of the process itself; however, the marginal precision was lower for lithium disilicate (LS2) restorations (113.8 μm) compared to conventional metal-ceramic (92.4 μm) and zirconium dioxide (ZrO2) crowns (68.5 μm) (p < 0.05). Another study showed that leucite-reinforced glass ceramic crowns were esthetically favoured by the patients (8/2 crowns) and clinicians (7/3 crowns) (p < 0.05). The third study investigated implant crowns. The complete digital workflow was more than twofold faster (75.3 min) in comparison to the mixed analog-digital workflow (156.6 min) (p < 0.05). No RCTs could be found investigating multi-unit fixed dental prostheses (FDP). The number of RCTs testing complete digital workflows in fixed prosthodontics is low. Scientifically proven recommendations for clinical routine cannot be given at this time. Research with high-quality trials seems to be slower than the industrial progress of available digital applications. 
Future research with well-designed RCTs including follow-up observation is compellingly necessary in the field of complete digital processing.
Flexible workflow sharing and execution services for e-scientists
NASA Astrophysics Data System (ADS)
Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely
2013-04-01
The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides based on the platform to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: A database where workflows and meta-data about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: A web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: A desktop environment that provides similar access capabilities to the SHIWA Portal; however, it runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already integrated with the execution engine of the SHIWA Portal. Other engines can be added when required. Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. The Portal, via third-party workflow engines, provides support for the most widely used academic workflow engines and can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows even when those are written in different languages and require different interpreters for execution. Through the workflow repository and the portal, individual scientists and scientific collaborations can share and offer workflows for reuse and execution. Given the integrated nature of the SHIWA Simulation Platform, the shared workflows can be executed online, without installing any special client environment or downloading workflows. The FP7 "Building a European Research Community through Interoperable Workflows and Data" (ER-flow) project disseminates the achievements of the SHIWA project and uses these achievements to build workflow user communities across Europe. ER-flow provides application support to research communities within and beyond the project consortium to develop, share and run workflows with the SHIWA Simulation Platform.
Bou, Gerelchimeg; Sun, Mingju; Lv, Ming; Zhu, Jiang; Li, Hui; Wang, Juan; Li, Lu; Liu, Zhongfeng; Zheng, Zhong; He, Wenteng; Kong, Qingran; Liu, Zhonghua
2014-08-01
For efficient transgenic herd expansion, only the transgenic animals that can transmit the transgene to the next generation are considered for breeding. However, because transgenic pig production practically lacks a pre-breeding screening program, time, labor and money are always wasted maintaining non-transgenic pigs, pigs with low or null transgene transmission, and the related fruitless gestations. Developing a pre-breeding screening program would make transgenic herd expansion more economical and efficient. In this technical report, we propose a three-step pre-breeding screening program for transgenic boars that simply combines the fluorescence in situ hybridization (FISH) assay with the common pre-breeding screening workflow. In the first step, combined with general transgenic phenotype analysis, FISH is used to identify transgenic boars. In the second step, combined with a conventional semen test, FISH is used to detect transgenic sperm and thus to identify the individuals producing high-quality semen and transgenic sperm. In the third step, FISH is used to assess in vitro fertilization embryos and thus, finally, to identify the individuals able to produce transgenic embryos. Through this three-step screening, the non-transgenic boars and the boars unable to produce transgenic sperm or transgenic embryos are eliminated; only those boars that can produce transgenic offspring are maintained and used for breeding and herd expansion. This is the first time a systematic pre-breeding screening program has been proposed for transgenic pigs. The program might also be applied to other large transgenic animals, providing an economical and efficient strategy for herd expansion.
Accelerating materials discovery through the development of polymer databases
NASA Astrophysics Data System (ADS)
Audus, Debra
In our line of business we create chemical solutions for a wide range of applications, such as home and personal care, printing and packaging, automotive and structural coatings, and structural plastics and foams. In this environment, stable and highly automated workflows suitable for handling complex systems are a must. By satisfying these prerequisites, efficiency in the development of new materials can be significantly improved by combining modeling and experimental approaches. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US administration. From our experience, we know that valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work closely together. In my presentation I intend to review approaches to build and parameterize soft matter systems. As an example of our standard workflow, I will show a few applications, which include the design of a stabilizer molecule for dispersing polymer particles and the simulation of polystyrene dispersions.
Jensen, Roxanne E; Rothrock, Nan E; DeWitt, Esi M; Spiegel, Brennan; Tucker, Carole A; Crane, Heidi M; Forrest, Christopher B; Patrick, Donald L; Fredericksen, Rob; Shulman, Lisa M; Cella, David; Crane, Paul K
2015-02-01
Patient-reported outcomes (PROs) are gaining recognition as key measures for improving the quality of patient care in clinical care settings. Three factors have made the implementation of PROs in clinical care more feasible: increased use of modern measurement methods in PRO design and validation, rapid progression of technology (eg, touchscreen tablets, Internet accessibility, and electronic health records), and greater demand for measurement and monitoring of PROs by regulators, payers, accreditors, and professional organizations. As electronic PRO collection and reporting capabilities have improved, the challenges of collecting PRO data have changed. To update information on PRO adoption considerations in clinical care, highlighting electronic and technical advances with respect to measure selection, clinical workflow, data infrastructure, and outcomes reporting. Five practical case studies across diverse health care settings and patient populations are used to explore how implementation barriers were addressed to promote the successful integration of PRO collection into the clinical workflow. The case studies address selecting and reporting of relevant content, workflow integration, previsit screening, effective evaluation, and electronic health record integration. These case studies exemplify elements of well-designed electronic systems, including response automation, tailoring of item selection and reporting algorithms, flexibility of collection location, and integration with patient health care data elements. They also highlight emerging logistical barriers in this area, such as the need for specialized technological and methodological expertise, and design limitations of current electronic data capture systems.
Egner, John M; Jensen, Davin R; Olp, Michael D; Kennedy, Nolan W; Volkman, Brian F; Peterson, Francis C; Smith, Brian C; Hill, R Blake
2018-03-02
An academic chemical screening approach was developed by using 2D protein-detected NMR, and a 352-chemical fragment library was screened against three different protein targets. The approach was optimized against two protein targets with known ligands: CXCL12 and BRD4. Principal component analysis reliably identified compounds that induced nonspecific NMR crosspeak broadening but did not unambiguously identify ligands with specific affinity (hits). For improved hit detection, a novel scoring metric, difference intensity analysis (DIA), was devised that sums all positive and negative intensities from 2D difference spectra. Applying DIA quickly discriminated potential ligands from compounds inducing nonspecific NMR crosspeak broadening and other nonspecific effects. Subsequent NMR titrations validated chemotypes important for binding to CXCL12 and BRD4. A novel target, mitochondrial fission protein Fis1, was screened, and six hits were identified by using DIA. Screening these diverse protein targets identified quinones and catechols that induced nonspecific NMR crosspeak broadening, hampering NMR analyses, but that are currently not computationally identified as pan-assay interference compounds. The results established a streamlined screening workflow that can easily be scaled and adapted as part of a larger screening pipeline to identify fragment hits and assess relative binding affinities in the range of 0.3-1.6 mM. DIA could prove useful in library screening and other applications in which NMR chemical shift perturbations are measured. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
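The DIA metric as described, summing all positive and negative intensities of a 2D difference spectrum, can be sketched as below. The published work does not specify the normalization reproduced here, so this is a hedged illustration of one plausible reading (summing the magnitudes of both signs).

```python
def dia_score(reference_spectrum, ligand_spectrum):
    """Difference intensity analysis: sum of positive and negative difference intensities.

    Both inputs are 2D intensity grids (lists of rows) on the same 1H/15N axes;
    treating positive and negative contributions as magnitudes to be added is an
    assumption of this sketch, not the authors' exact definition.
    """
    pos, neg = 0.0, 0.0
    for ref_row, lig_row in zip(reference_spectrum, ligand_spectrum):
        for ref, lig in zip(ref_row, lig_row):
            diff = lig - ref
            if diff > 0:
                pos += diff
            else:
                neg += -diff
    return pos + neg

# Larger scores flag compounds that perturb many crosspeaks, whether genuine hits
# or nonspecific broadeners; titrations are then needed to tell the two apart.
```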
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.
2011-07-04
A drawback of existing scientific workflow systems is the lack of support to domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.
Attitude stabilization of a spacecraft equipped with large electrostatic protection screens
NASA Astrophysics Data System (ADS)
Nikitin, D. Yu.; Tikhonov, A. A.
2018-05-01
A satellite with a system of three electrostatic radiation protection (ERP) screens is under consideration. The screens are constructed as electrostatically charged toroidal shields with a characteristic size of the order of 100 m. The interaction of the electric charge with the Earth's magnetic field (EMF) gives rise to a Lorentz torque acting on the satellite's attitude motion. Because the ERP system is large, we derive the Lorentz torque taking into account the complex shape of the ERP screens and the gradient of the EMF over the screen volume. It is assumed that the satellite's center of charge coincides with its center of mass. The EMF is modeled by a straight magnetic dipole. In this paper we investigate the use of the Lorentz torque for passive attitude stabilization of a satellite in a circular equatorial orbit. A mathematical model for the attitude dynamics of a satellite equipped with ERP screens interacting with the EMF is derived, and a first integral of the corresponding differential equations is constructed. The straight equilibrium position of the satellite in the orbital frame is found. Sufficient conditions for the stability of the satellite's equilibrium position are obtained with the use of the first integral. The gravity gradient torque is taken into account. The stability domain of the satellite equilibrium is constructed.
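For orientation, a minimal sketch of the torque being exploited (standard electrodynamics, not the paper's full model with the screen geometry and field gradient): each charge element dq moving with velocity v relative to the geomagnetic field B experiences a Lorentz force, and integrating its moment about the mass center gives the attitude torque,

\[ \mathrm{d}\mathbf{F}_L = \left(\mathbf{v}\times\mathbf{B}\right)\mathrm{d}q, \qquad \mathbf{M}_L = \int \boldsymbol{\rho}\times\bigl(\mathbf{v}\times\mathbf{B}(\mathbf{r})\bigr)\,\mathrm{d}q, \]

where rho is the position of the charge element relative to the mass center. For screens of this size, B(r) must be evaluated locally over the screen volume (i.e., its gradient retained) rather than at a single center of charge, which is the refinement the abstract describes.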
DEWEY: the DICOM-enabled workflow engine system.
Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L
2014-06-01
Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.
Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.
Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa
2012-05-04
Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high-performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a cloud-enabled web interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org.
Ausar, Salvador F; Chan, Judy; Hoque, Warda; James, Olive; Jayasundara, Kavisha; Harper, Kevin
2011-02-01
High throughput screening (HTS) of excipients for proteins in solution can be achieved by several analytical techniques. The screening of stabilizers for proteins adsorbed onto adjuvants, however, may be difficult due to the limited number of techniques that can measure the stability of adsorbed proteins in high-throughput mode. Here, we demonstrate that extrinsic fluorescence spectroscopy can be successfully applied to study the physical stability of adsorbed antigens at low concentrations in 96-well plates, using a real-time polymerase chain reaction (RT-PCR) instrument. HTS was performed on three adjuvanted pneumococcal proteins as model antigens in the presence of a standard library of stabilizers. Aluminum hydroxide appeared to decrease the stability of all three proteins at relatively high and low pH values, with stability following a bell-shaped curve as the pH was increased from 5 to 9 and a maximum near neutral pH. Nonspecific stabilizers such as mono- and disaccharides could increase the conformational stability of the antigens. In addition, those excipients that increased the melting temperature of adsorbed antigens could improve antigenicity and chemical stability. To the best of our knowledge, this is the first report describing an HTS technology amenable to low concentrations of antigens adsorbed onto aluminum-containing adjuvants. Copyright © 2010 Wiley-Liss, Inc.
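As a rough illustration of how an apparent melting temperature can be extracted from the kind of plate-reader melt curves such screens produce (a minimal sketch, not the authors' analysis pipeline; the smoothing window and the synthetic curve are assumptions), the apparent Tm is often taken as the temperature at which the first derivative of the fluorescence signal peaks:

    import numpy as np

    def apparent_tm(temps, fluorescence, window=5):
        """Estimate apparent Tm as the temperature at the maximum of dF/dT.

        temps, fluorescence: 1-D arrays from one well of a thermal melt.
        window: width of a simple moving-average smooth (an assumption).
        """
        f = np.convolve(fluorescence, np.ones(window) / window, mode="same")
        dfdt = np.gradient(f, temps)
        return temps[np.argmax(dfdt)]

    # Example with synthetic sigmoidal melt data (midpoint placed at ~62 degC)
    t = np.linspace(25, 95, 141)
    f = 1.0 / (1.0 + np.exp(-(t - 62.0) / 1.5))
    print(round(apparent_tm(t, f), 1))   # prints approximately 62.0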
Identifying chromatin readers using a SILAC-based histone peptide pull-down approach.
Vermeulen, Michiel
2012-01-01
Posttranslational modifications (PTMs) on core histones regulate essential processes inside the nucleus such as transcription, replication, and DNA repair. An important function of histone PTMs is the recruitment or stabilization of chromatin-modifying proteins, which are also called chromatin "readers." We have developed a generic SILAC-based peptide pull-down approach to identify such readers for histone PTMs in an unbiased manner. In this chapter, the workflow behind this method will be presented in detail. Copyright © 2012 Elsevier Inc. All rights reserved.
Johnson, Caleb D; Whitehead, Paul N; Pletcher, Erin R; Faherty, Mallory S; Lovalekar, Mita T; Eagle, Shawn R; Keenan, Karen A
2018-04-01
Johnson, CD, Whitehead, PN, Pletcher, ER, Faherty, MS, Lovalekar, MT, Eagle, SR, and Keenan, KA. The relationship of core strength and activation and performance on three functional movement screens. J Strength Cond Res 32(4): 1166-1173, 2018. Current measures of core stability used by clinicians and researchers suffer from several shortcomings. Three functional movement screens appear, at face value, to be dependent on the ability to activate and control core musculature. These three screens may present a viable alternative to current measures of core stability. Thirty-nine subjects completed a deep squat, trunk stability push-up, and rotary stability screen. Scores on the three screens were summed to calculate a composite score (COMP). During the screens, muscle activity was collected to determine the length of time that the bilateral erector spinae, rectus abdominis, external oblique, and gluteus medius muscles were active. Strength was assessed for core muscles (trunk flexion and extension, trunk rotation, and hip abduction and adduction) and accessory muscles (knee flexion and extension and pectoralis major). Two ordinal logistic regression equations were calculated with COMP as the outcome variable and either (a) core strength and accessory strength or (b) only core strength as predictors. The first model was significant in predicting COMP (p = 0.004) (Pearson's chi-square = 149.132, p = 0.435; Nagelkerke's R-squared = 0.369). The second model was significant in predicting COMP (p = 0.001) (Pearson's chi-square = 148.837, p = 0.488; Nagelkerke's R-squared = 0.362). The core muscles were found to be active for most screens, with percentages of "time active" for each muscle ranging from 54% to 86%. In conclusion, performance on the three screens is predicted by core strength, even when accounting for "accessory" strength variables. Furthermore, the screens seem to elicit wide-ranging activation of core muscles. Although more investigation is needed, these screens, collectively, seem to be a good assessment of core strength.
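A minimal sketch of the kind of ordinal logistic regression described above, assuming a hypothetical per-subject CSV and using statsmodels' OrderedModel; the file name, column names and model options are assumptions, not the authors' analysis:

    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    # Hypothetical file: one row per subject with the summed screen score COMP
    # (ordinal outcome) plus core- and accessory-strength measures.
    df = pd.read_csv("movement_screen_scores.csv")

    core = ["trunk_flexion", "trunk_extension", "trunk_rotation",
            "hip_abduction", "hip_adduction"]
    accessory = ["knee_flexion", "knee_extension", "pectoralis_major"]

    # Proportional-odds (ordinal logistic) model of COMP on core + accessory strength.
    model = OrderedModel(df["COMP"], df[core + accessory], distr="logit")
    result = model.fit(method="bfgs")
    print(result.summary())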
Integrating text mining into the MGI biocuration workflow
Dowell, K.G.; McAndrews-Hill, M.S.; Hill, D.P.; Drabkin, H.J.; Blake, J.A.
2009-01-01
A major challenge for functional and comparative genomics resource development is the extraction of data from the biomedical literature. Although text mining for biological data is an active research field, few applications have been integrated into production literature curation systems such as those of the model organism databases (MODs). Not only are most available biological natural language (bioNLP) and information retrieval and extraction solutions difficult to adapt to existing MOD curation workflows, but many also have high error rates or are unable to process documents available in those formats preferred by scientific journals. In September 2008, Mouse Genome Informatics (MGI) at The Jackson Laboratory initiated a search for dictionary-based text mining tools that we could integrate into our biocuration workflow. MGI has rigorous document triage and annotation procedures designed to identify appropriate articles about mouse genetics and genome biology. We currently screen ∼1000 journal articles a month for Gene Ontology terms, gene mapping, gene expression, phenotype data and other key biological information. Although we do not foresee that curation tasks will ever be fully automated, we are eager to implement named entity recognition (NER) tools for gene tagging that can help streamline our curation workflow and simplify gene indexing tasks within the MGI system. Gene indexing is an MGI-specific curation function that involves identifying which mouse genes are being studied in an article, then associating the appropriate gene symbols with the article reference number in the MGI database. Here, we discuss our search process, performance metrics and success criteria, and how we identified a short list of potential text mining tools for further evaluation. We provide an overview of our pilot projects with NCBO's Open Biomedical Annotator and Fraunhofer SCAI's ProMiner. In doing so, we prove the potential for the further incorporation of semi-automated processes into the curation of the biomedical literature. PMID:20157492
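For illustration only, a toy dictionary-based tagger of the sort evaluated for gene indexing; this is not ProMiner or the Open Biomedical Annotator, and the symbol/identifier pairs below are made up for the example:

    import re

    # Hypothetical fragment of a symbol/synonym dictionary keyed to MGI-style IDs.
    gene_dictionary = {
        "Pax6": "MGI:0000001",
        "Trp53": "MGI:0000002",
        "small eye": "MGI:0000001",   # synonym mapping to the same gene
    }

    def tag_genes(text, dictionary):
        """Return (matched term, identifier, character offset) tuples, sorted by offset."""
        hits = []
        for term, ident in dictionary.items():
            for m in re.finditer(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
                hits.append((m.group(0), ident, m.start()))
        return sorted(hits, key=lambda h: h[2])

    abstract = "Mutations in Pax6 (small eye) alter lens development."
    print(tag_genes(abstract, gene_dictionary))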
Lowres, Nicole; Krass, Ines; Neubeck, Lis; Redfern, Julie; McLachlan, Andrew J; Bennett, Alexandra A; Freedman, S Ben
2015-12-01
Atrial fibrillation guidelines advocate screening to identify undiagnosed atrial fibrillation. Community pharmacies may provide an opportunistic venue for such screening. The aim was to explore the experience of implementing an atrial fibrillation screening service from the pharmacist's perspective, including the process of study implementation; the perceived benefits; the barriers and enablers; and the challenges for future sustainability of atrial fibrillation screening within pharmacies. Interviews were conducted face-to-face in the pharmacy or via telephone, according to pharmacist preference. The 'SEARCH-AF study' screened 1000 pharmacy customers aged ≥65 years using an iPhone electrocardiogram, identifying 1.5% with undiagnosed atrial fibrillation. Nine pharmacists took part in semi-structured interviews. Interviews were transcribed in full and thematically analysed to provide a qualitative analysis of the experience of implementing an AF screening service from the pharmacist's perspective. Four broad themes relating to service provision were identified: (1) interest and engagement in atrial fibrillation screening by pharmacists, customers, and doctors, with the novel, easy-to-use electrocardiogram technology serving as an incentive to undergo screening and an education tool for pharmacists to use with customers; (2) perceived benefits to the pharmacist, including increased job satisfaction and improvement in customer relations and pharmacy profile by fostering enhanced customer care and the educational role of pharmacists; (3) implementation barriers, including managing workflow, and enablers, such as personal approaches for recruitment and allocating time to discuss the screening process and fears; and (4) potential for sustainable future implementation, including remuneration linked to government or pharmacy incentives, combined cardiovascular screening, and automating sections of risk assessments using touch-screen technology. Atrial fibrillation screening in pharmacies is well accepted by pharmacists and customers. Many pharmacists combined atrial fibrillation screening with other health screens, reporting improved time-efficiency and greater customer satisfaction. Widespread implementation of atrial fibrillation screening requires long-term funding, which could be provided for a combined cardiovascular screening service. Further research could focus on the feasibility and cost-effectiveness of combined cardiovascular screening in pharmacies.
From days to hours: reporting clinically actionable variants from whole genome sequencing.
Middha, Sumit; Baheti, Saurabh; Hart, Steven N; Kocher, Jean-Pierre A
2014-01-01
As the cost of whole genome sequencing (WGS) decreases, clinical laboratories will be looking at broadly adopting this technology to screen for variants of clinical significance. To fully leverage this technology in a clinical setting, results need to be reported quickly, as the turnaround time could potentially impact patient care. The latest sequencers can sequence a whole human genome in about 24 hours. However, depending on the computing infrastructure available, the processing of data can take several days, with the majority of computing time devoted to aligning reads to genomic regions that are, to date, not clinically interpretable. In an attempt to accelerate the reporting of clinically actionable variants, we have investigated the utility of a multi-step alignment algorithm focused on aligning reads and calling variants in genomic regions of clinical relevance prior to processing the remaining reads on the whole genome. This iterative workflow significantly accelerates the reporting of clinically actionable variants with no loss of accuracy when compared to genotypes obtained with the OMNI SNP platform or to variants detected with a standard workflow that combines Novoalign and GATK.
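The real workflow drives an aligner and variant caller (Novoalign and GATK, as the abstract notes); the sketch below only illustrates the regions-first prioritisation logic on candidate sites, and the BED coordinates, file layout and helper names are assumptions for the example:

    def load_bed(lines):
        """Parse minimal BED text (chrom, start, end) into interval lists per chromosome."""
        regions = {}
        for line in lines:
            chrom, start, end = line.split()[:3]
            regions.setdefault(chrom, []).append((int(start), int(end)))
        return regions

    def in_regions(chrom, pos, regions):
        return any(start <= pos < end for start, end in regions.get(chrom, []))

    def prioritise(sites, regions):
        """Split candidate variant sites so clinically relevant regions are processed first."""
        clinical = [s for s in sites if in_regions(s[0], s[1], regions)]
        rest = [s for s in sites if not in_regions(s[0], s[1], regions)]
        return clinical, rest

    # Toy example: a two-region "clinical" BED and three candidate sites (coordinates are illustrative).
    bed = ["chr17 43044295 43125483", "chr13 32315474 32400266"]
    sites = [("chr17", 43100000), ("chr2", 1500000), ("chr13", 32350000)]
    clinical_first, background = prioritise(sites, load_bed(bed))
    print(clinical_first)   # reported first: [('chr17', 43100000), ('chr13', 32350000)]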
Inferring Clinical Workflow Efficiency via Electronic Medical Record Utilization
Chen, You; Xie, Wei; Gunter, Carl A; Liebovitz, David; Mehrotra, Sanjay; Zhang, He; Malin, Bradley
2015-01-01
Complexity in clinical workflows can lead to inefficiency in making diagnoses, ineffectiveness of treatment plans and uninformed management of healthcare organizations (HCOs). Traditional strategies to manage workflow complexity are based on measuring the gaps between workflows defined by HCO administrators and the actual processes followed by staff in the clinic. However, existing methods tend to neglect the influence of EMR systems on the utilization of workflows, which could be leveraged to optimize workflows facilitated through the EMR. In this paper, we introduce a framework to infer clinical workflows from the utilization of an EMR and show how such workflows roughly partition into four types according to their efficiency. Our framework infers workflows at several levels of granularity through data mining technologies. We study four months of EMR event logs from a large medical center, comprising 16,569 inpatient stays, and illustrate that approximately 95% of workflows are efficient and that 80% of patients are on such workflows. At the same time, we show that the remaining 5% of workflows may be inefficient due to a variety of factors, such as complex patients. PMID:26958173
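A toy sketch of one coarse granularity level of such inference, counting transitions between consecutive EMR actions within each inpatient stay; the event fields and values are assumptions for illustration, not the authors' framework:

    from collections import Counter
    from itertools import pairwise   # Python 3.10+

    # Each event: (stay_id, timestamp, action); values are made up for illustration.
    events = [
        (1, 1, "admit"), (1, 2, "order_labs"), (1, 3, "review_results"), (1, 4, "discharge"),
        (2, 1, "admit"), (2, 2, "order_imaging"), (2, 3, "order_labs"), (2, 4, "discharge"),
    ]

    def transition_counts(events):
        """Count action-to-action transitions per stay, a building block for inferring workflows."""
        by_stay = {}
        for stay, ts, action in sorted(events):          # sort by stay, then timestamp
            by_stay.setdefault(stay, []).append(action)
        counts = Counter()
        for actions in by_stay.values():
            counts.update(pairwise(actions))
        return counts

    print(transition_counts(events).most_common(3))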
Workflow management systems in radiology
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim
1998-07-01
In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely redesign their processes. Although information technology is generally agreed to contribute to cost reduction and efficiency improvement, the real success factors are the redefinition and automation of processes: business process re-engineering and workflow management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to radiology management systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, emphasizing the separation of the workflow management system from the application systems and the consequences that this separation has for the architecture of workflow-oriented information systems. This includes an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited to a workflow management approach. Numerous commercially available workflow management systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
Bioinformatics workflows and web services in systems biology made easy for experimentalists.
Jimenez, Rafael C; Corpas, Manuel
2013-01-01
Workflows are useful for performing data analysis and integration in systems biology. Workflow management systems can help users create workflows without any previous knowledge of programming and web services. However, the computational skills required to build such workflows are usually above the level most biological experimentalists are comfortable with. In this chapter we introduce workflow management systems that reuse existing workflows instead of creating them anew, making it easier for experimentalists to perform computational tasks.
When drug discovery meets web search: Learning to Rank for ligand-based virtual screening.
Zhang, Wei; Ji, Lijuan; Chen, Yanan; Tang, Kailin; Wang, Haiping; Zhu, Ruixin; Jia, Wei; Cao, Zhiwei; Liu, Qi
2015-01-01
The rapid increase in the emergence of novel chemical substances places substantial demands on more sophisticated computational methodologies for drug discovery. In this study, the idea of Learning to Rank from web search was applied to drug virtual screening, which offers two unique capabilities: (1) identifying compounds for novel targets when there is not enough training data available for these targets, and (2) integrating heterogeneous data when compound affinities are measured on different platforms. A standard pipeline was designed to carry out Learning to Rank in virtual screening. Six Learning to Rank algorithms were investigated based on two public datasets collected from the Binding Database and the newly published Community Structure-Activity Resource benchmark dataset. The results demonstrate that Learning to Rank is an efficient computational strategy for drug virtual screening, particularly due to its novel use in cross-target virtual screening and heterogeneous data integration. To the best of our knowledge, this is the first application of Learning to Rank in virtual screening. The experimental workflow and algorithm assessment designed in this study will provide a standard protocol for other similar studies. All the datasets as well as the implementations of the Learning to Rank algorithms are available at http://www.tongji.edu.cn/~qiliu/lor_vs.html. Graphical Abstract: The analogy between web search and ligand-based drug discovery.
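A minimal pairwise (RankNet-style) scorer in numpy illustrates the Learning-to-Rank idea on compound descriptors; it is a generic sketch on synthetic data, not one of the six algorithms benchmarked in the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 16))                                     # compound descriptors (synthetic)
    affinity = X @ rng.normal(size=16) + 0.1 * rng.normal(size=100)    # latent affinities

    w = np.zeros(16)
    lr = 0.05
    for _ in range(200):
        i, j = rng.integers(0, 100, size=2)
        if affinity[i] == affinity[j]:
            continue
        if affinity[j] > affinity[i]:
            i, j = j, i                     # ensure i is the more active compound of the pair
        # Pairwise logistic (RankNet-style) gradient ascent on the score difference.
        diff = X[i] - X[j]
        p = 1.0 / (1.0 + np.exp(-(w @ diff)))
        w += lr * (1.0 - p) * diff

    ranking = np.argsort(-(X @ w))          # virtual screen: rank compounds by learned score
    print(ranking[:5])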
Enzyme reversal to explore the function of yeast E3 ubiquitin-ligases.
MacDonald, Chris; Winistorfer, Stanley; Pope, Robert M; Wright, Michael E; Piper, Robert C
2017-07-01
The covalent attachment of ubiquitin onto proteins can elicit a variety of downstream consequences. Attachment is mediated by a large array of E3 ubiquitin ligases, each thought to be subject to regulatory control and to have a specific repertoire of substrates. Assessing the biological roles of ligases, and in particular identifying their biologically relevant substrates, has been a persistent yet challenging question. In this study, we describe tools that may help achieve both of these goals. We describe a strategy whereby the activity of a ubiquitin ligase is enzymatically reversed, accomplished by fusing it to a catalytic domain of an exogenous deubiquitinating enzyme. We present a library of 72 "anti-ligases" that appear to work in a dominant-negative fashion to stabilize their cognate substrates against ubiquitin-dependent proteasomal and lysosomal degradation. We then used the ligase-deubiquitinating enzyme (DUb) library to screen for E3 ligases involved in post-Golgi/endosomal trafficking. We identify ligases previously implicated in these pathways (Rsp5 and Tul1), in addition to ligases previously localized to endosomes (Pib1 and Vps8). We also document an optimized workflow for isolating and analyzing the "ubiquitome" of yeast, which can be used with mass spectrometry to identify substrates perturbed by expression of particular ligase-DUb fusions. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Kwf-Grid workflow management system for Earth science applications
NASA Astrophysics Data System (ADS)
Tran, V.; Hluchy, L.
2009-04-01
In this paper, we present a workflow management tool for Earth science applications in EGEE. The workflow management tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features, such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the workflow management tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within the "Knowledge-based Workflow System for Grid Applications" project under the 6th Framework Programme. The workflow management system is intended to: semi-automatically compose a workflow of Grid services; execute the composed workflow application in a Grid computing environment; monitor the performance of the Grid infrastructure and the Grid applications; analyze the resulting monitoring information; capture the knowledge contained in that information by means of intelligent agents; and finally reuse the joint knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g., GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, allowing it to manage and execute gLite jobs in the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite allows EGEE users to use the system and benefit from its advanced features. The system has been initially tested and evaluated with applications from the ES clusters.
A mix-and-read drop-based in vitro two-hybrid method for screening high-affinity peptide binders
Cui, Naiwen; Zhang, Huidan; Schneider, Nils; Tao, Ye; Asahara, Haruichi; Sun, Zhiyi; Cai, Yamei; Koehler, Stephan A.; de Greef, Tom F. A.; Abbaspourrad, Alireza; Weitz, David A.; Chong, Shaorong
2016-01-01
Drop-based microfluidics have recently become a novel tool by providing a stable linkage between phenotype and genotype for high throughput screening. However, use of drop-based microfluidics for screening high-affinity peptide binders has not been demonstrated due to the lack of a sensitive functional assay that can detect single DNA molecules in drops. To address this sensitivity issue, we introduced an in vitro two-hybrid system (IVT2H) into microfluidic drops and developed a streamlined mix-and-read drop-IVT2H method to screen a random DNA library. Drop-IVT2H was based on the correlation between the binding affinity of two interacting protein domains and transcriptional activation of a fluorescent reporter. A DNA library encoding potential peptide binders was encapsulated with IVT2H such that single DNA molecules were distributed in individual drops. We validated drop-IVT2H by screening a three-random-residue library derived from a high-affinity MDM2 inhibitor PMI. The current drop-IVT2H platform is ideally suited for affinity screening of small-to-medium-sized libraries (10³–10⁶). It can obtain hits within a single day while consuming minimal amounts of reagents. Drop-IVT2H simplifies and accelerates the drop-based microfluidics workflow for screening random DNA libraries, and represents a novel alternative method for protein engineering and in vitro directed protein evolution. PMID:26940078
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copps, Kevin D.
The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts' use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today's SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to the electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.
Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J
2012-01-01
Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow.The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.
Byrne, Thomas; Fargo, Jamison D; Montgomery, Ann Elizabeth; Roberts, Christopher B; Culhane, Dennis P; Kane, Vincent
2015-01-01
This study examined veterans' responses to the Veterans Health Administration's (VHA's) universal screen for homelessness and risk of homelessness during the first 12 months of implementation. We calculated the baseline annual frequency of homelessness and risk of homelessness among all veterans who completed an initial screen during the study period. We measured changes in housing status among veterans who initially screened positive and then completed a follow-up screen, assessed factors associated with such changes, and identified distinct risk profiles of veterans who completed a follow-up screen. More than 4 million veterans completed an initial screen; 1.8% (n=77,621) screened positive for homelessness or risk of homelessness. Of those who initially screened positive for either homelessness or risk of homelessness and who completed a second screen during the study period, 85.0% (n=15,060) resolved their housing instability prior to their second screen. Age, sex, race, VHA eligibility, and screening location were all associated with changes in housing stability. We identified four distinct risk profiles for veterans with ongoing housing instability. To address homelessness among veterans, efforts should include increased and targeted engagement of veterans experiencing persistent housing instability.
Optimization of protein buffer cocktails using Thermofluor.
Reinhard, Linda; Mayerhofer, Hubert; Geerlof, Arie; Mueller-Dieckmann, Jochen; Weiss, Manfred S
2013-02-01
The stability and homogeneity of a protein sample is strongly influenced by the composition of the buffer that the protein is in. A quick and easy approach to identify a buffer composition which increases the stability and possibly the conformational homogeneity of a protein sample is the fluorescence-based thermal-shift assay (Thermofluor). Here, a novel 96-condition screen for Thermofluor experiments is presented which consists of buffer and additive parts. The buffer screen comprises 23 different buffers and the additive screen includes small-molecule additives such as salts and nucleotide analogues. The utilization of small-molecule components which increase the thermal stability of a protein sample frequently results in a protein preparation of higher quality and quantity and ultimately also increases the chances of the protein crystallizing.
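Once apparent melting temperatures have been read out for each of the 96 conditions, hits are typically ranked by their thermal shift relative to the reference buffer. The condition names and Tm values below are invented to show the arithmetic; this is not the authors' screen data:

    # Apparent Tm (degC) per screen condition; values are invented for illustration.
    tm_by_condition = {
        "reference buffer": 52.4,
        "HEPES pH 7.5 + 150 mM NaCl": 55.1,
        "citrate pH 5.5": 49.8,
        "Tris pH 8.0 + 5 mM MgCl2": 56.3,
    }

    reference = tm_by_condition["reference buffer"]
    shifts = {cond: tm - reference
              for cond, tm in tm_by_condition.items() if cond != "reference buffer"}

    # Rank conditions by stabilisation (largest positive delta-Tm first).
    for cond, delta in sorted(shifts.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{cond:30s} dTm = {delta:+.1f} degC")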
RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service
NASA Astrophysics Data System (ADS)
Yang, Chao; Chen, Nengcheng; Di, Liping
2012-10-01
Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it, respectively. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous geoprocessing workflows.
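A hedged sketch of registering a workflow description as an Atom entry with Python's requests; the endpoint URL, entry ID and payload fields are hypothetical placeholders, not part of the paper's system:

    import requests

    # Hypothetical Atom entry describing a geoprocessing workflow resource.
    entry = """<?xml version="1.0" encoding="UTF-8"?>
    <entry xmlns="http://www.w3.org/2005/Atom">
      <title>NO2 plume analysis workflow</title>
      <id>urn:uuid:00000000-0000-0000-0000-000000000001</id>
      <content type="application/xml" src="http://example.org/workflows/no2-plume.bpel"/>
    </entry>"""

    # POST the entry to a (hypothetical) Atom collection of workflow resources.
    response = requests.post(
        "http://example.org/workflows",
        data=entry.encode("utf-8"),
        headers={"Content-Type": "application/atom+xml;type=entry"},
    )
    print(response.status_code, response.headers.get("Location"))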
Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ludascher, Bertram; Altintas, Ilkay
Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state-of-the-art in scientific workflows have focused on the following areas; progress in each of these areas is described in subsequent sections. Workflow development: the development of a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows. Generic workflow components and templates: the development of generic actors (i.e. workflow components and processes) which can be broadly applied to scientific problems. Provenance collection and analysis: the design of a flexible provenance collection and analysis infrastructure within the workflow environment. Workflow reliability and fault tolerance: the improvement of the reliability and fault-tolerance of workflow environments.
Volani, Chiara; Caprioli, Giulia; Calderisi, Giovanni; Sigurdsson, Baldur B; Rainer, Johannes; Gentilini, Ivo; Hicks, Andrew A; Pramstaller, Peter P; Weiss, Guenter; Smarason, Sigurdur V; Paglia, Giuseppe
2017-10-01
Volumetric absorptive microsampling (VAMS) is a novel approach that allows single-drop (10 μL) blood collection. Integration of VAMS with mass spectrometry (MS)-based untargeted metabolomics is an attractive solution for both human and animal studies. However, to boost the use of VAMS in metabolomics, key pre-analytical questions need to be addressed. Therefore, in this work, we integrated VAMS into an MS-based untargeted metabolomics workflow and investigated pre-analytical strategies such as sample extraction procedures and metabolome stability at different storage conditions. We first evaluated the best extraction procedure for the polar metabolome and found that the highest number and amount of metabolites were recovered upon extraction with acetonitrile/water (70:30). In contrast, basic conditions (pH 9) resulted in divergent metabolite profiles, mainly resulting from the extraction of intracellular metabolites originating from red blood cells. In addition, prolonged storage of blood samples at room temperature caused significant changes in metabolome composition, but once the VAMS devices were stored at -80 °C, the metabolome remained stable for up to 6 months. The time used for drying the sample also affected the metabolome. In fact, some metabolites were rapidly degraded or accumulated in the sample during the first 48 h at room temperature, indicating that a longer drying step will significantly change the concentrations in the sample. Graphical abstract: Volumetric absorptive microsampling (VAMS) is a novel technology that allows single-drop blood collection and, in combination with mass spectrometry (MS)-based untargeted metabolomics, represents an attractive solution for both human and animal studies. In this work, we integrated VAMS into an MS-based untargeted metabolomics workflow and investigated pre-analytical strategies such as sample extraction procedures and metabolome stability at different storage conditions. The latter revealed that prolonged storage of blood samples at room temperature caused significant changes in metabolome composition, but if VAMS devices were stored at -80 °C, the metabolome remained stable for up to 6 months.
Collins, Adam; Huett, Alan
2018-05-15
We present a high-content screen (HCS) for the simultaneous analysis of multiple phenotypes in HeLa cells expressing an autophagy reporter (mcherry-LC3) and one of 224 GFP-fused proteins from the Crohn's Disease (CD)-associated bacterium, Adherent Invasive E. coli (AIEC) strain LF82. Using automated confocal microscopy and image analysis (CellProfiler), we localised GFP fusions within cells, and monitored their effects upon autophagy (an important innate cellular defence mechanism), cellular and nuclear morphology, and the actin cytoskeleton. This data will provide an atlas for the localisation of 224 AIEC proteins within human cells, as well as a dataset to analyse their effects upon many aspects of host cell morphology. We also describe an open-source, automated, image-analysis workflow to identify bacterial effectors and their roles via the perturbations induced in reporter cell lines when candidate effectors are exogenously expressed.
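A small numpy sketch of the kind of per-cell measurement such an image-analysis pipeline produces, namely the fraction of GFP signal falling inside a nuclear mask; the array shapes, intensities and background level are assumptions, and this is not the CellProfiler pipeline itself:

    import numpy as np

    def nuclear_fraction(gfp, nuclei_mask, background=100.0):
        """Fraction of background-subtracted GFP intensity inside the nuclear mask."""
        signal = np.clip(gfp - background, 0, None)
        total = signal.sum()
        return float(signal[nuclei_mask].sum() / total) if total > 0 else 0.0

    # Toy 8x8 "cell": bright GFP pixels concentrated in a 3x3 "nucleus".
    gfp = np.full((8, 8), 100.0)
    gfp[2:5, 2:5] = 400.0
    nuclei_mask = np.zeros((8, 8), dtype=bool)
    nuclei_mask[2:5, 2:5] = True
    print(round(nuclear_fraction(gfp, nuclei_mask), 2))   # 1.0 -> fully nuclear in this toy image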
Broad-Spectrum Molecular Detection of Fungal Nucleic Acids by PCR-Based Amplification Techniques.
Czurda, Stefan; Lion, Thomas
2017-01-01
Over the past decade, the incidence of life-threatening invasive fungal infections has dramatically increased. Infections caused by hitherto rare and emerging fungal pathogens are associated with significant morbidity and mortality among immunocompromised patients. These observations render the coverage of a broad range of clinically relevant fungal pathogens highly important. The so-called panfungal or, perhaps more correctly, broad-range nucleic acid amplification techniques do not only facilitate sensitive detection of all clinically relevant fungal species but are also rapid and can be applied to analyses of any patient specimens. They have therefore become valuable diagnostic tools for sensitive screening of patients at risk of invasive fungal infections. This chapter summarizes the currently available molecular technologies employed in testing of a wide range of fungal pathogens, and provides a detailed workflow for patient screening by broad-spectrum nucleic acid amplification techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, P
Purpose: To determine causal factors related to high frame definition error when treating GK patients using a pre-planning workflow. Methods: 160 cases were retrospectively reviewed. All patients received treatment using a pre-planning workflow whereby stereotactic coordinates are determined from a CT scan acquired after framing using a fiducial box. The planning software automatically detects the fiducials and compares their location to expected values based on the rigid design of the fiducial system. Any difference is reported as mean and maximum frame definition error. The manufacturer recommends these values be less than 1.0 mm and 1.5 mm. In this study, frame definition error was analyzed in comparison with a variety of factors including which neurosurgeon/oncologist/physicist was involved with the procedure, number of post used during framing (3 or 4), type of lesion, and which CT scanner was utilized for acquisition. An analysis of variance (ANOVA) approach was used to statistically evaluate the data and determine causal factors related to instances of high frame definition error. Results: Two factors were identified as significant: number of post (p=0.0003) and CT scanner (p=0.0001). Further analysis showed that one of the four scanners was significantly different than the others. This diagnostic scanner was identified as an older model with localization lasers not tightly calibrated. The average value for maximum frame definition error using this scanner was 1.48 mm (4 posts) and 1.75 mm (3 posts). For the other scanners this value was 1.13 mm (4 posts) and 1.40 mm (3 posts). Conclusion: In utilizing a pre-planning workflow the choice of CT scanner matters. Any scanner utilized for GK should undergo routine QA at a level appropriate for radiation oncology. In terms of 3 vs 4 post, it is hypothesized that three posts provide less stability during CT acquisition. This will be tested in future work.
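For context, the scanner comparison can be sketched as a one-way ANOVA in scipy; the error values below are invented, and the study's actual analysis also included the number of posts and other factors:

    from scipy import stats

    # Maximum frame-definition error (mm) grouped by CT scanner; numbers are invented.
    scanner_a = [1.1, 1.2, 1.0, 1.3, 1.1]
    scanner_b = [1.2, 1.1, 1.3, 1.2, 1.0]
    scanner_c = [1.6, 1.8, 1.7, 1.9, 1.5]   # an older scanner with loosely calibrated lasers

    f_stat, p_value = stats.f_oneway(scanner_a, scanner_b, scanner_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")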
Chatterjee, Arindam; Doerksen, Robert J.; Khan, Ikhlas A.
2014-01-01
Calpain-mediated cleavage of the CDK5 natural precursor p35 causes formation of a stable CDK5/p25 complex, which leads to hyperphosphorylation of tau. Inhibition of this complex is therefore a viable target for numerous acute and chronic neurodegenerative diseases involving tau protein, including Alzheimer's disease. Since CDK5 has the highest sequence homology with its mitotic counterpart CDK2, our primary goal was to design selective CDK5/p25 inhibitors targeting neurodegeneration. A novel structure-based virtual screening protocol comprising e-pharmacophore models and a virtual screening workflow was used to identify nine compounds from a commercial database containing 2.84 million compounds. An ATP non-competitive and selective thieno[3,2-c]quinolin-4(5H)-one inhibitor (10) with a ligand efficiency (LE) of 0.3 was identified as the lead molecule. Further SAR optimization led to the discovery of several low-micromolar inhibitors with good selectivity. This research represents a new class of potent ATP non-competitive CDK5/p25 inhibitors with good CDK2/E selectivity. PMID:25438765
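For context, ligand efficiency is commonly computed as the free energy of binding (with the measured IC50 or Kd as a proxy) normalized by the heavy-atom count; the potency and atom count below are invented to show the arithmetic and are not data from the paper:

    import math

    def ligand_efficiency(ic50_molar, heavy_atoms, temperature_k=298.15):
        """LE = -RT ln(IC50) / heavy atoms, in kcal/mol per heavy atom (IC50 used as a Kd proxy)."""
        rt_kcal = 1.987e-3 * temperature_k        # gas constant in kcal/(mol*K) times T
        return -rt_kcal * math.log(ic50_molar) / heavy_atoms

    # Invented example: a 5 uM hit with 28 heavy atoms gives LE of roughly 0.26 kcal/mol per heavy atom.
    print(round(ligand_efficiency(5e-6, 28), 2))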
Informatics applied to cytology
Hornish, Maryanne; Goulart, Robert A.
2008-01-01
Automation and emerging information technologies are being adopted by cytology laboratories to augment Pap test screening and improve diagnostic accuracy. As a result, informatics, the application of computers and information systems to information management, has become essential for the successful operation of the cytopathology laboratory. This review describes how laboratory information management systems can be used to achieve an automated and seamless workflow process. The utilization of software, electronic databases and spreadsheets to perform necessary quality control measures are discussed, as well as a Lean production system and Six Sigma approach, to reduce errors in the cytopathology laboratory. PMID:19495402
Sengupta, Nandini; Nanavati, Sonal; Cericola, Maria; Simon, Lisa
2017-10-01
We have integrated preventive oral health measures into preventive care visits for children at a federally qualified health center in Boston, Massachusetts. The program, started in 2015, covers 3400 children and has increased universal caries risk screening in primary care to 85%, fluoride varnish application rates to 80%, and referrals to a dental home to 35%. We accomplished this by minimizing pressures on providers' workflow, empowering medical assistants to lead the initiative, and utilizing data-driven improvement strategies, alongside colocated coordinated care.
Disparity between Clinical and Ultrasound Examinations in Neonatal Hip Screening.
Kyung, Bong Soo; Lee, Soon Hyuck; Jeong, Woong Kyo; Park, Si Young
2016-06-01
For early detection of developmental dysplasia of the hip (DDH), neonatal hip screening using clinical examination and/or ultrasound has been recommended. Although there have been many studies on the reliability of both screening techniques, there is still controversy over the screening strategy: clinical examination versus selective or universal ultrasound screening. To inform the choice of screening strategy, we assessed the agreement among three methods: clinical examination by an experienced pediatric orthopedic surgeon, sonographic morphology, and sonographic stability. From January 2004 to June 2009, a single experienced pediatric orthopedic surgeon performed clinical hip screenings for 2,686 infants in the neonatal unit and 43 infants who were referred with suspected hip dysplasia before 3 months of age. Among them, 156 clinically unstable or high-risk babies selectively received bilateral hip ultrasound examinations performed by the same surgeon using the modified Graf method. The results were analyzed statistically to detect any correlations between the clinical and sonographic findings. Although a single experienced orthopedic surgeon conducted all examinations, we detected only a limited relationship between the results of clinical and ultrasound examinations. Ninety-three percent of the clinically subluxatable hips were normal or immature on static ultrasound examination, and 74% of dislocating hips and 67% of hips with limited abduction presented with morphology below Graf IIa. A total of 80% of clinically subluxatable hips, 42% of dislocating hips and 67% of hips with limited abduction appeared stable or exhibited only minor instability on dynamic ultrasound examination. About 7% of clinically normal hips were abnormal on ultrasound examination: 5% showed major instability and 3% showed dysplasia above Graf IIc. Agreement between clinical stability and the ultrasound examinations was low, with coefficients of 0.39 for sonographic stability and 0.37 for sonographic morphology. Between sonographic stability and morphology, although 71% of hips with major instability showed normal or immature morphology on static ultrasound examination, the coefficient was as high as 0.64. Discrepancies between clinical and ultrasound examinations were present even though almost all of the examinations were performed by a single experienced pediatric orthopedic surgeon. For DDH screening, it is recommended that both sonographic morphology and stability be checked in addition to clinical examination.
Kobayashi, Takehiko; Sasaki, Mariko
2017-01-01
The ribosomal RNA gene (rDNA) is the most abundant gene in yeast and other eukaryotic organisms. Due to its heavy transcription, repetitive structure and programmed replication fork pauses, the rDNA is one of the most unstable regions in the genome. Thus, the rDNA is the best region to study the mechanisms responsible for maintaining genome integrity. Recently, we screened a library of ∼4800 budding yeast gene knockout strains to identify mutants defective in the maintenance of rDNA stability. The results of this screen are summarized in the Yeast rDNA Stability (YRS) Database, in which the stability and copy number of rDNA in each mutant are presented. From this screen, we identified ∼700 genes that may contribute to the maintenance of rDNA stability. In addition, ∼50 mutants had abnormally high or low rDNA copy numbers. Moreover, some mutants with unstable rDNA displayed abnormalities in another chromosome. In this review, we introduce the YRS Database and discuss the roles of newly identified genes that contribute to rDNA maintenance and genome integrity. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Building asynchronous geospatial processing workflows with web services
NASA Astrophysics Data System (ADS)
Zhao, Peisheng; Di, Liping; Yu, Genong
2012-02-01
Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to and architecture of workflow code for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
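A generic submit-then-poll client in Python illustrates the asynchronous pattern described above; the endpoint, the Location header usage and the status fields are hypothetical placeholders, not OGC or WS-BPEL interfaces:

    import time
    import requests

    def run_async_workflow(service_url, request_doc, poll_interval=10, timeout=3600):
        """Submit a long-running geoprocessing request and poll its status resource."""
        submit = requests.post(service_url, data=request_doc)       # returns immediately
        status_url = submit.headers["Location"]                     # hypothetical status resource

        deadline = time.time() + timeout
        while time.time() < deadline:
            status = requests.get(status_url).json()                # e.g. {"state": "running"}
            if status["state"] == "succeeded":
                return requests.get(status["result_url"]).content   # fetch the final product
            if status["state"] == "failed":
                raise RuntimeError(status.get("message", "workflow failed"))
            time.sleep(poll_interval)                               # client is free to do other work
        raise TimeoutError("workflow did not finish in time")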
Guest Editor's Introduction
NASA Astrophysics Data System (ADS)
Chrysanthis, Panos K.
1996-12-01
Computer Science Department, University of Pittsburgh, Pittsburgh, PA 15260, USA

This special issue focuses on current efforts to represent and support workflows that integrate information systems and human resources within a business or manufacturing enterprise. Workflows may also be viewed as an emerging computational paradigm for effective structuring of cooperative applications involving human users and access to diverse data types not necessarily maintained by traditional database management systems.

A workflow is an automated organizational process (also called business process) which consists of a set of activities or tasks that need to be executed in a particular controlled order over a combination of heterogeneous database systems and legacy systems. Within workflows, tasks are performed cooperatively by either human or computational agents in accordance with their roles in the organizational hierarchy. The challenge in facilitating the implementation of workflows lies in developing efficient workflow management systems.

A workflow management system (also called workflow server, workflow engine or workflow enactment system) provides the necessary interfaces for coordination and communication among human and computational agents to execute the tasks involved in a workflow and controls the execution orderings of tasks as well as the flow of data that these tasks manipulate. That is, the workflow management system is responsible for correctly and reliably supporting the specification, execution, and monitoring of workflows. The six papers selected (out of the twenty-seven submitted for this special issue of Distributed Systems Engineering) address different aspects of these three functional components of a workflow management system.

In the first paper, `Correctness issues in workflow management', Kamath and Ramamritham discuss the important issue of correctness in workflow management that constitutes a prerequisite for the use of workflows in the automation of the critical organizational/business processes. In particular, this paper examines the issues of execution atomicity and failure atomicity, differentiating between correctness requirements of system failures and logical failures, and surveys techniques that can be used to ensure data consistency in workflow management systems.

While the first paper is concerned with correctness assuming transactional workflows in which selective transactional properties are associated with individual tasks or the entire workflow, the second paper, `Scheduling workflows by enforcing intertask dependencies' by Attie et al, assumes that the tasks can be either transactions or other activities involving legacy systems. This second paper describes the modelling and specification of conditions involving events and dependencies among tasks within a workflow using temporal logic and finite state automata. It also presents a scheduling algorithm that enforces all stated dependencies by executing at any given time only those events that are allowed by all the dependency automata and in an order as specified by the dependencies.

In any system with decentralized control, there is a need to effectively cope with the tension that exists between autonomy and consistency requirements. In `A three-level atomicity model for decentralized workflow management systems', Ben-Shaul and Heineman focus on the specific requirement of enforcing failure atomicity in decentralized, autonomous and interacting workflow management systems. Their paper describes a model in which each workflow manager must be able to specify the sequence of tasks that comprise an atomic unit for the purposes of correctness, and the degrees of local and global atomicity for the purpose of cooperation with other workflow managers. The paper also discusses a realization of this model in which treaties and summits provide an agreement mechanism, while underlying transaction managers are responsible for maintaining failure atomicity.

The fourth and fifth papers are experience papers describing a workflow management system and a large scale workflow application, respectively. Schill and Mittasch, in `Workflow management systems on top of OSF DCE and OMG CORBA', describe a decentralized workflow management system and discuss its implementation using two standardized middleware platforms, namely, OSF DCE and OMG CORBA. The system supports a new approach to workflow management, introducing several new concepts such as data type management for integrating various types of data and quality of service for various services provided by servers.

A problem common to both database applications and workflows is the handling of missing and incomplete information. This is particularly pervasive in an `electronic market' with a huge number of retail outlets producing and exchanging volumes of data, the application discussed in `Information flow in the DAMA project beyond database managers: information flow managers'. Motivated by the need for a method that allows a task to proceed in a timely manner if not all data produced by other tasks are available by its deadline, Russell et al propose an architectural framework and a language that can be used to detect, approximate and, later on, to adjust missing data if necessary.

The final paper, `The evolution towards flexible workflow systems' by Nutt, is complementary to the other papers and is a survey of issues and of work related to both workflow and computer supported collaborative work (CSCW) areas. In particular, the paper provides a model and a categorization of the dimensions which workflow management and CSCW systems share.

Besides summarizing the recent advancements towards efficient workflow management, the papers in this special issue suggest areas open to investigation and it is our hope that they will also provide the stimulus for further research and development in the area of workflow management systems.
Biowep: a workflow enactment portal for bioinformatics applications.
Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano
2007-03-08
The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not accessible to the majority of researchers who lack these skills. A portal enabling such researchers to benefit from these new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is further being developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics - LITBIO.
Biowep: a workflow enactment portal for bioinformatics applications
Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano
2007-01-01
Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not accessible to the majority of researchers who lack these skills. A portal enabling such researchers to benefit from these new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is further being developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics – LITBIO. PMID:17430563
The Early Screening Profiles: A Stability Study.
ERIC Educational Resources Information Center
Smith, Douglas K.; And Others
Stability of the Early Screening Profiles (ESP), developed by P. Harrison, was examined with a sample of 23 non-handicapped preschool children (14 females and 9 males) ranging in age from 3 years 0 months to 6 years 0 months at the time of initial testing. The sample was drawn from a rural/suburban community in the midwest with a predominantly…
Walsh, Kristin E; Chui, Michelle Anne; Kieser, Mara A; Williams, Staci M; Sutter, Susan L; Sutter, John G
2011-01-01
To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign.
Salib, Mina; Hoffmann, Raymond G; Dasgupta, Mahua; Zimmerman, Haydee; Hanson, Sheila
2015-10-01
Studies showing the changes in workflow during the transition from semi to full electronic medical records are lacking. The objective of this study was to identify the changes in workflow in the PICU during the transition from a semi to a full electronic health record. Prospective observational study. The Children's Hospital of Wisconsin Institutional Review Board waived the need for approval, so this study was institutional review board exempt. The study measured clinical workflow variables at a 72-bed PICU during different phases of the transition to a full electronic health record, which occurred on November 4, 2012. Phases of the electronic health record transition were defined as follows: pre-electronic health record (baseline data prior to transition to the full electronic health record), transition phase (3 wk after electronic health record), and stabilization (6 mo after electronic health record). Data were analyzed for the three phases using the Mann-Whitney U test, with a two-sided p value of less than 0.05 considered significant. All patients in the 72-bed PICU were included during the study periods. Five hundred and sixty-four patients with 2,355 patient days were evaluated in the three phases. Duration of rounds decreased from a median of 9 minutes per patient pre-electronic health record to 7 minutes per patient post electronic health record. Time to final note decreased from 2.06 days pre-electronic health record to 0.5 days post electronic health record. Time to first medication administration after admission also decreased, from 33 minutes pre-electronic health record to 7 minutes post electronic health record. Time to medication reconciliation was significantly higher pre-electronic health record than post electronic health record, and the percentage of medication reconciliation completion was significantly lower pre-electronic health record than post electronic health record. There was no significant change in the time between placement of the discharge order and physical transfer from the unit. Transition to a full electronic health record changed clinical workflow in the PICU, with decreased duration of rounds, time to final note, time to medication administration, and time to medication reconciliation. There was no change in the duration from medical to physical transfer.
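A minimal sketch of the kind of phase comparison described above (comparing a workflow metric between pre- and post-electronic health record phases with a two-sided Mann-Whitney U test) is shown below; the timing values are invented placeholders, not study data.

```python
# Hypothetical illustration of comparing a workflow metric (minutes of
# rounds per patient) between two EHR transition phases with a two-sided
# Mann-Whitney U test; the numbers below are made up, not study data.
from scipy.stats import mannwhitneyu

pre_ehr_minutes = [9, 10, 8, 11, 9, 12, 10, 9, 8, 10]
post_ehr_minutes = [7, 6, 8, 7, 7, 6, 8, 7, 6, 7]

stat, p_value = mannwhitneyu(pre_ehr_minutes, post_ehr_minutes,
                             alternative="two-sided")
print(f"U = {stat:.1f}, two-sided p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference between phases is significant at the 0.05 level.")
```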
Generic worklist handler for workflow-enabled products
NASA Astrophysics Data System (ADS)
Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas
1999-07-01
Workflow management (WfM) is an emerging field of medical information technology. It appears to be a promising key technology for modeling, optimizing and automating processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic work list handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded work list handlers will be called 'Workflow Enabled Application Systems'. In this paper we discuss the functional requirements of work list handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards as defined by the Workflow Management Coalition are briefly reviewed.
Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms.
Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel
2014-01-01
With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow executions on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies.
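As a rough illustration of how a heuristic that trades off cost against performance might pick a VM type for an incoming workflow request, consider the sketch below. It is a generic example under assumed VM prices, speeds, and workloads, not one of the four WFaaS algorithms from the paper.

```python
# Hypothetical greedy scheduler: for each incoming workflow request, pick
# the VM type with the best cost-runtime trade-off that still meets the
# request's deadline. VM specs and the workload below are invented.
from dataclasses import dataclass

@dataclass
class VmType:
    name: str
    price_per_hour: float
    speedup: float  # relative execution speed (1.0 = baseline)

@dataclass
class WorkflowRequest:
    name: str
    baseline_hours: float   # runtime on a speedup-1.0 VM
    deadline_hours: float

def schedule(request, vm_types):
    feasible = []
    for vm in vm_types:
        runtime = request.baseline_hours / vm.speedup
        if runtime <= request.deadline_hours:
            cost = runtime * vm.price_per_hour
            score = cost * runtime        # cost-runtime product as trade-off score
            feasible.append((score, vm, runtime, cost))
    if not feasible:
        return None
    _, vm, runtime, cost = min(feasible, key=lambda item: item[0])
    return vm, runtime, cost

if __name__ == "__main__":
    vms = [VmType("small", 0.05, 1.0), VmType("medium", 0.12, 2.0),
           VmType("large", 0.30, 4.0)]
    req = WorkflowRequest("montage-like", baseline_hours=8.0, deadline_hours=3.0)
    choice = schedule(req, vms)
    if choice:
        vm, runtime, cost = choice
        print(f"run on {vm.name}: {runtime:.1f} h, ${cost:.2f}")
    else:
        print("no VM type meets the deadline")
```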
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...
2016-07-21
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc; they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates, i.e., sequence, parallel, split, and merge, that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc; they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates, i.e., sequence, parallel, split, and merge, that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
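The template idea described above (composing pipelines from sequence, parallel, split, and merge building blocks) can be sketched generically as below. The helper functions are hypothetical stand-ins written for illustration; they do not reproduce the actual Tigres API.

```python
# Generic illustration of template-style pipeline composition with
# sequence / parallel / split / merge building blocks. These helpers are
# hypothetical and do not reproduce the real Tigres interface.
from concurrent.futures import ThreadPoolExecutor

def sequence(data, tasks):
    """Run tasks one after another, feeding each output to the next."""
    for task in tasks:
        data = task(data)
    return data

def split(data, n):
    """Split a list into roughly equal chunks."""
    size = max(1, len(data) // n)
    return [data[i:i + size] for i in range(0, len(data), size)]

def parallel(chunks, task, workers=4):
    """Apply the same task to every chunk concurrently."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(task, chunks))

def merge(chunks):
    """Flatten partial results back into a single list."""
    return [item for chunk in chunks for item in chunk]

if __name__ == "__main__":
    clean = lambda xs: [x for x in xs if x is not None]
    square_all = lambda xs: [x * x for x in xs]

    raw = [1, 2, None, 3, 4, None, 5, 6, 7, 8]
    result = sequence(raw, [
        clean,                                   # sequence step
        lambda xs: merge(parallel(split(xs, 4),  # split -> parallel -> merge
                                  square_all)),
    ])
    print(result)  # [1, 4, 9, 16, 25, 36, 49, 64]
```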
Ren, Fuzheng; Sun, Hanjing; Cui, Lin; Si, Yike; Chen, Ning; Ren, Guobin; Jing, Qiufang
2018-06-01
Drugs in amorphous solid dispersions (ASDs) are highly dispersed in hydrophilic polymeric carriers, which also help to restrain recrystallization and stabilize the ASDs. In this study, microscopic observation after antisolvent recrystallization was developed as a rapid screening method to select appropriate polymers for the initial design of filgotinib (FTN) ASDs. Using solvent evaporation, FTN ASDs with the polymers were prepared, and accelerated experimentation validated this screening method. Fourier-transform infrared spectroscopy, Raman scattering, and nuclear magnetic resonance revealed hydrogen-bonding formation in the drug-polymer binary system, which was critical for ASD stabilization. The Flory-Huggins interaction parameter and water sorption isotherms were applied to evaluate the strength of the interaction between FTN and the polymers. The dissolution rate was also significantly improved by ASD formulation, and the presence of the polymers exerted solubilization effects. These results suggested the efficacy of this screening method as a preliminary tool for polymer selection in ASD design. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Dalecki, Alex G; Wolschendorf, Frank
2016-07-01
Facing totally resistant bacteria, traditional drug discovery efforts have proven to be of limited use in replenishing our depleted arsenal of therapeutic antibiotics. Recently, the natural anti-bacterial properties of metal ions in synergy with metal-coordinating ligands have shown potential for generating new molecule candidates with potential therapeutic downstream applications. We recently developed a novel combinatorial screening approach to identify compounds with copper-dependent anti-bacterial properties. Through a parallel screening technique, the assay distinguishes between copper-dependent and independent activities against Mycobacterium tuberculosis with hits being defined as compounds with copper-dependent activities. These activities must then be linked to a compound master list to process and analyze the data and to identify the hit molecules, a labor intensive and mistake-prone analysis. Here, we describe a software program built to automate this analysis in order to streamline our workflow significantly. We conducted a small, 1440 compound screen against M. tuberculosis and used it as an example framework to build and optimize the software. Though specifically adapted to our own needs, it can be readily expanded for any small- to medium-throughput screening effort, parallel or conventional. Further, by virtue of the underlying Linux server, it can be easily adapted for chemoinformatic analysis of screens through packages such as OpenBabel. Overall, this setup represents an easy-to-use solution for streamlining processing and analysis of biological screening data, as well as offering a scaffold for ready functionality expansion. Copyright © 2016 Elsevier B.V. All rights reserved.
Standardizing clinical trials workflow representation in UML for international site comparison.
de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo
2010-11-09
With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows.
Standardizing Clinical Trials Workflow Representation in UML for International Site Comparison
de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M. O.; Rodrigues, Maria J.; Shah, Jatin; Loures, Marco R.; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo
2010-01-01
Background With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Methods Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Results Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. Conclusions This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows. PMID:21085484
An Intelligent Automation Platform for Rapid Bioprocess Design.
Wu, Tianyi; Zhou, Yuhong
2014-08-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
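To make the sequential automation idea above concrete (analyze the latest plate data, design the next set of conditions, and emit a runnable script for the liquid handler without human intervention), here is a deliberately simplified sketch. The data format, condition grid, and script syntax are all invented for illustration and do not describe the platform's actual implementation.

```python
# Simplified, hypothetical loop: analyze results -> pick next conditions ->
# generate a liquid-handler script. Data, scoring and script format are
# invented; this is not the platform described in the abstract.

def analyze(results):
    """Return the condition with the best observed yield."""
    return max(results, key=lambda r: r["yield"])

def design_next(best, step=0.5):
    """Propose a small grid of new conditions around the current best."""
    return [{"salt_M": round(best["salt_M"] + dx, 2),
             "ph": round(best["ph"] + dy, 1)}
            for dx in (-step, 0.0, step)
            for dy in (-0.2, 0.0, 0.2)
            if not (dx == 0.0 and dy == 0.0)]

def write_script(conditions):
    """Emit a toy dispensing script, one line per well."""
    lines = [f"DISPENSE well={i + 1} salt_M={c['salt_M']} ph={c['ph']}"
             for i, c in enumerate(conditions)]
    return "\n".join(lines)

if __name__ == "__main__":
    plate_results = [
        {"salt_M": 1.0, "ph": 6.5, "yield": 0.42},
        {"salt_M": 1.5, "ph": 7.0, "yield": 0.61},
        {"salt_M": 2.0, "ph": 7.5, "yield": 0.55},
    ]
    best = analyze(plate_results)
    next_conditions = design_next(best)
    print(write_script(next_conditions))
```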
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
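A minimal sketch of the two quantitative steps mentioned above, computing the coefficient of variation of simulated per-tablet coating mass as the uniformity measure and ranking failure modes by a conventional risk priority number (severity x occurrence x detection), is given below; the numbers are invented placeholders, not simulation output.

```python
# Hypothetical example: coefficient of variation (CV) of simulated coating
# mass as the uniformity measure, and a conventional FMEA risk priority
# number (RPN = severity * occurrence * detection). All values are invented.
import statistics

def coefficient_of_variation(values):
    return statistics.stdev(values) / statistics.mean(values)

simulated_coating_mass_mg = [10.2, 9.8, 10.5, 9.6, 10.1, 10.4, 9.9, 10.0]
cv = coefficient_of_variation(simulated_coating_mass_mg)
print(f"coating mass CV: {cv:.3%}")

failure_modes = [
    # (description, severity 1-10, occurrence 1-10, detection 1-10)
    ("spray rate too high", 7, 4, 3),
    ("pan speed too low", 5, 6, 4),
    ("inlet air temperature drift", 6, 3, 5),
]
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name:30s} RPN = {s * o * d}")
```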
Unified Software Solution for Efficient SPR Data Analysis in Drug Research
Dahl, Göran; Steigele, Stephan; Hillertz, Per; Tigerström, Anna; Egnéus, Anders; Mehrle, Alexander; Ginkel, Martin; Edfeldt, Fredrik; Holdgate, Geoff; O’Connell, Nichole; Kappler, Bernd; Brodte, Annette; Rawlins, Philip B.; Davies, Gareth; Westberg, Eva-Lotta; Folmer, Rutger H. A.; Heyse, Stephan
2016-01-01
Surface plasmon resonance (SPR) is a powerful method for obtaining detailed molecular interaction parameters. Modern instrumentation with its increased throughput has enabled routine screening by SPR in hit-to-lead and lead optimization programs, and SPR has become a mainstream drug discovery technology. However, the processing and reporting of SPR data in drug discovery are typically performed manually, which is both time-consuming and tedious. Here, we present the workflow concept, design and experiences with a software module relying on a single, browser-based software platform for the processing, analysis, and reporting of SPR data. The efficiency of this concept lies in the immediate availability of end results: data are processed and analyzed upon loading the raw data file, allowing the user to immediately quality control the results. Once completed, the user can automatically report those results to data repositories for corporate access and quickly generate printed reports or documents. The software module has resulted in a very efficient and effective workflow through saved time and improved quality control. We discuss these benefits and show how this process defines a new benchmark in the drug discovery industry for the handling, interpretation, visualization, and sharing of SPR data. PMID:27789754
Quantitative workflow based on NN for weighting criteria in landfill suitability mapping
NASA Astrophysics Data System (ADS)
Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul
2017-10-01
Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows have a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new workflow is presented to produce suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network was employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final learned network was used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training and testing datasets, respectively. The workflow was found to be capable of reducing human interference and generating highly reliable maps. The proposed workflow demonstrates the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
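The weighting step described above (training a feed-forward network on the criteria and then reading criterion weights off the learned model) can be sketched as follows. The sketch uses synthetic data, scikit-learn's MLPClassifier, and permutation importance normalized to sum to one as a simple surrogate for weight extraction; it is a generic illustration, not the authors' procedure.

```python
# Hypothetical sketch: fit a feed-forward NN on (synthetic) suitability
# criteria, then derive normalized criterion weights from permutation
# importance. This is a generic illustration, not the paper's method.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_criteria = 500, 6          # the study used 34 criteria
X = rng.random((n_samples, n_criteria))
# Synthetic "suitable / not suitable" label driven mostly by criteria 0 and 2.
y = (0.6 * X[:, 0] + 0.3 * X[:, 2] + 0.1 * rng.random(n_samples) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print(f"train accuracy: {net.score(X_tr, y_tr):.3f}")
print(f"test accuracy:  {net.score(X_te, y_te):.3f}")

imp = permutation_importance(net, X_te, y_te, n_repeats=20, random_state=0)
weights = np.clip(imp.importances_mean, 0, None)
weights = weights / weights.sum()       # normalize to MCDA-style weights
for i, w in enumerate(weights):
    print(f"criterion {i}: weight = {w:.3f}")
```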
Chen, Chen; Liu, Xiaohui; Zheng, Weimin; Zhang, Lei; Yao, Jun; Yang, Pengyuan
2014-04-04
To completely annotate the human genome, the task of identifying and characterizing proteins that currently lack mass spectrometry (MS) evidence is inevitable and urgent. In this study, as the first effort to screen missing proteins at large scale, we developed an approach based on SDS-PAGE followed by liquid chromatography-multiple reaction monitoring (LC-MRM) for screening those missing proteins with only a single peptide hit in the previous liver proteome data set. Proteins extracted from normal human liver were separated by SDS-PAGE and digested in split gel slices, and the resulting digests were then subjected to LC-scheduled MRM analysis. The MRM assays were developed using synthesized crude peptides corresponding to the target peptides. In total, the expression of 57 target proteins was confirmed from 185 MRM assays in normal human liver tissues. Among the 57 confirmed one-hit wonders, 50 proteins belong to the minimally redundant set in the PeptideAtlas database, and 7 proteins, involved in various biological processes, previously had no MS-based information at all. We conclude that our SDS-PAGE-MRM workflow can be a powerful approach to screen missing or poorly characterized proteins in different samples and to provide their quantity if detected. The MRM raw data have been uploaded to ISB/SRM Atlas/PASSEL (PXD000648).
Biomining active cellulases from a mining bioremediation system.
Mewis, Keith; Armstrong, Zachary; Song, Young C; Baldwin, Susan A; Withers, Stephen G; Hallam, Steven J
2013-09-20
Functional metagenomics has emerged as a powerful method for gene model validation and enzyme discovery from natural and human engineered ecosystems. Here we report development of a high-throughput functional metagenomic screen incorporating bioinformatic and biochemical analyses features. A fosmid library containing 6144 clones sourced from a mining bioremediation system was screened for cellulase activity using 2,4-dinitrophenyl β-cellobioside, a previously proven cellulose model substrate. Fifteen active clones were recovered and fully sequenced revealing 9 unique clones with the ability to hydrolyse 1,4-β-D-glucosidic linkages. Transposon mutagenesis identified genes belonging to glycoside hydrolase (GH) 1, 3, or 5 as necessary for mediating this activity. Reference trees for GH 1, 3, and 5 families were generated from sequences in the CAZy database for automated phylogenetic analysis of fosmid end and active clone sequences revealing known and novel cellulase encoding genes. Active cellulase genes recovered in functional screens were subcloned into inducible high copy plasmids, expressed and purified to determine enzymatic properties including thermostability, pH optima, and substrate specificity. The workflow described here provides a general paradigm for recovery and characterization of microbially derived genes and gene products based on genetic logic and contemporary screening technologies developed for model organismal systems. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
Fu, Yanqing; Zhou, Zhihui; Kong, Hongwei; Lu, Xin; Zhao, Xinjie; Chen, Yihui; Chen, Jia; Wu, Zeming; Xu, Zhiliang; Zhao, Chunxia; Xu, Guowang
2016-09-06
Identification of illegal additives in complex matrixes is important in the food safety field. In this study a nontargeted screening strategy was developed to find illegal additives based on ultrahigh-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HRMS). First, an analytical method for possible illegal additives in complex matrixes was established including fast sample pretreatment, accurate UHPLC separation, and HRMS detection. Second, efficient data processing and differential analysis workflow were suggested and applied to find potential risk compounds. Third, structure elucidation of risk compounds was performed by (1) searching online databases [Metlin and the Human Metabolome Database (HMDB)] and an in-house database which was established at the above-defined conditions of UHPLC-HRMS analysis and contains information on retention time, mass spectra (MS), and tandem mass spectra (MS/MS) of 475 illegal additives, (2) analyzing fragment ions, and (3) referring to fragmentation rules. Fish was taken as an example to show the usefulness of the nontargeted screening strategy, and six additives were found in suspected fish samples. Quantitative analysis was further carried out to determine the contents of these compounds. The satisfactory application of this strategy in fish samples means that it can also be used in the screening of illegal additives in other kinds of food samples.
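The in-house database lookup step described above (matching a detected feature's retention time and accurate mass against a library of known additives within tolerances) can be sketched as follows; the library entries, tolerances, and feature values are invented for illustration.

```python
# Hypothetical matching of detected LC-HRMS features against an in-house
# additive library by accurate mass (ppm tolerance) and retention time.
# Library entries, tolerances and feature values are invented.

LIBRARY = [
    {"name": "additive_A", "mz": 285.0764, "rt_min": 4.82},
    {"name": "additive_B", "mz": 313.1077, "rt_min": 6.10},
    {"name": "additive_C", "mz": 270.0655, "rt_min": 3.45},
]

def match_feature(mz, rt_min, ppm_tol=5.0, rt_tol_min=0.2):
    """Return library entries consistent with the observed m/z and RT."""
    hits = []
    for entry in LIBRARY:
        ppm_error = abs(mz - entry["mz"]) / entry["mz"] * 1e6
        if ppm_error <= ppm_tol and abs(rt_min - entry["rt_min"]) <= rt_tol_min:
            hits.append((entry["name"], round(ppm_error, 2)))
    return hits

if __name__ == "__main__":
    detected_features = [(285.0769, 4.85), (199.1234, 2.10)]
    for mz, rt in detected_features:
        print(f"feature m/z={mz}, RT={rt} min ->", match_feature(mz, rt) or "no hit")
```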
Favretto, M E; Krieg, A; Schubert, S; Schubert, U S; Brock, R
2015-07-10
Polymer-based gene delivery systems have enormous potential in biomedicine, but their efficiency is often limited by poor biocompatibility. Poly(methacrylate)s (PMAs) are an interesting class of polymers which allow exploration of structure-activity relationships of polymer functionalities for polyplex formation in oligonucleotide delivery. Here, we synthesized and tested a library of PMA polymers containing functional groups contributing to the different steps of gene delivery, from oligonucleotide complexation to cellular internalization and endosomal escape. By variation of the molar ratios of the individual building blocks, the physicochemical properties of the polymers and polyplexes were fine-tuned to reduce toxicity as well as to increase activity of the polyplexes. To further enhance transfection efficiency, a cell-penetrating peptide (CPP)-like functionality was introduced on the polymeric backbone. With the ability to synthesize large libraries of polymers in parallel, we also developed a workflow for mid-to-high-throughput screening, focusing first on safety parameters that are accessible by high-throughput approaches, such as blood compatibility and toxicity towards host cells, and only at a later stage on more laborious tests of the ability to deliver oligonucleotides. To arrive at a better understanding of the molecular basis of activity, the effect of the presence of heparan sulfates on the surface of host cells was assessed, and the mechanism of cell entry and intracellular trafficking was investigated for those polymers that showed a suitable pharmacological profile. Following endocytic uptake, rapid endosomal release occurred. Interestingly, the presence of heparan sulfates on the cell surface had a negative impact on the activity of those polyplexes that were sensitive to decomplexation by heparin in solution. In summary, the screening approach identified two polymers that form polyplexes with high stability and a transfection capacity exceeding that of poly(ethylene imine), even in the presence of serum. Copyright © 2015 Elsevier B.V. All rights reserved.
Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms
Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel
2017-01-01
With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow executions on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237
Scientific Data Management (SDM) Center for Enabling Technologies. Final Report, 2007-2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ludascher, Bertram; Altintas, Ilkay
Our contributions to advancing the State of the Art in scientific workflows have focused on the following areas: Workflow development; Generic workflow components and templates; Provenance collection and analysis; and, Workflow reliability and fault tolerance.
Walsh, Kristin E.; Chui, Michelle Anne; Kieser, Mara A.; Williams, Staci M.; Sutter, Susan L.; Sutter, John G.
2012-01-01
Objective To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. Methods At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Results Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Conclusion Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign. PMID:21896459
Emergency department overcrowding: the impact of resource scarcity on physician job satisfaction.
Rondeau, Kent V; Francescutti, Louis H
2005-01-01
Emergency departments in most developed countries have been experiencing significant overcrowding under a regime of severe resource constraints. Physicians in emergency departments increasingly find themselves toiling in workplaces that are characterized by diminished availability of, limited access to, and decreased stability of critical resources. Severe resource constraints have the potential to greatly weaken the overall job satisfaction of emergency physicians. This article examines the impact of hospital resource constraints on the job satisfaction of a large sample of emergency physicians in Canada. After controlling for workflow and patient characteristics and for various institutional and physician characteristics, institutional resource constraints are found to be major contributors to emergency physician job dissatisfaction. Resource factors that have the greatest impact on job satisfaction include availability of emergency room physicians, access to hospital technology and emergency beds, and stability of financial (investment) resources.
Ablinger, Elisabeth; Hellweger, Monika; Leitgeb, Stefan; Zimmer, Andreas
2012-10-15
In this study, we combined a high-throughput screening method, differential scanning fluorimetry (DSF), with design of experiments (DoE) methodology to evaluate the effects of several formulation components on the thermostability of granulocyte colony stimulating factor (G-CSF). First, we performed a primary buffer screening in which we tested the thermal stability of G-CSF in different buffers, pH values and buffer concentrations. The significance of each factor and the two-way interactions between them were studied by multivariable regression analysis. pH was identified as the most critical factor regarding thermal stability. The most stabilizing buffer, sodium glutamate, along with sodium acetate, was selected for further investigation. Second, we tested the effect of 6 naturally occurring extremolytes (trehalose, sucrose, ectoine, hydroxyectoine, sorbitol, mannitol) on the thermal stability of G-CSF, using a central composite circumscribed design. At low pH (3.8) and low buffer concentration (5 mM), all extremolytes led to a significant increase in thermal stability, except for ectoine, which resulted in a strong destabilization of G-CSF. Increasing pH and buffer concentration led to an increase in thermal stability with all investigated extremolytes. The described systematic approach made it possible to create a ranking of stabilizing extremolytes under different buffer conditions. Copyright © 2012. Published by Elsevier B.V.
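A minimal sketch of the regression step described above (fitting main effects and a two-way interaction of pH and buffer concentration to melting temperatures from a small screening design) is given below; the design points and responses are invented, and the model is ordinary least squares via numpy, not the authors' analysis.

```python
# Hypothetical two-factor example: fit Tm ~ pH + buffer_conc + interaction
# by ordinary least squares over a small full-factorial design. The design
# points and melting temperatures below are invented for illustration.
import numpy as np

# Factors: pH and buffer concentration (mM); response: apparent Tm (deg C).
design = np.array([
    [4.0,  5.0], [4.0, 20.0], [5.5,  5.0], [5.5, 20.0],
    [7.0,  5.0], [7.0, 20.0], [5.5, 12.5], [5.5, 12.5],
])
tm = np.array([52.1, 53.0, 55.4, 56.8, 57.9, 59.6, 56.0, 56.2])

ph, conc = design[:, 0], design[:, 1]
X = np.column_stack([np.ones_like(ph), ph, conc, ph * conc])  # design matrix
coef, *_ = np.linalg.lstsq(X, tm, rcond=None)

labels = ["intercept", "pH", "buffer_conc", "pH x buffer_conc"]
for name, b in zip(labels, coef):
    print(f"{name:18s} {b:+.3f}")

# Predict Tm at an untested condition (pH 6.0, 10 mM) from the fitted model.
x_new = np.array([1.0, 6.0, 10.0, 6.0 * 10.0])
print(f"predicted Tm at pH 6.0, 10 mM: {x_new @ coef:.1f} deg C")
```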
NASA Astrophysics Data System (ADS)
Pan, Tianheng
2018-01-01
In recent years, the combination of workflow management systems and multi-agent technology has become a hot research field. The lack of flexibility in workflow management systems can be mitigated by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which overcomes the fragility of the traditional centralized workflow structure. In this paper, the agents of a distributed workflow management system are divided according to their functions, and the execution process of each type of agent is analyzed. Key technologies such as process execution and resource management are also analyzed.
NASA Astrophysics Data System (ADS)
McCarthy, Ann
2006-01-01
The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.
Patel, Ashaben; Erb, Steven M; Strange, Linda; Shukla, Ravi S; Kumru, Ozan S; Smith, Lee; Nelson, Paul; Joshi, Sangeeta B; Livengood, Jill A; Volkin, David B
2018-05-24
A combination experimental approach, utilizing semi-empirical excipient screening followed by statistical modeling using design of experiments (DOE), was undertaken to identify stabilizing candidate formulations for a lyophilized live attenuated Flavivirus vaccine candidate. Various potential pharmaceutical compounds used in either marketed or investigative live attenuated viral vaccine formulations were first identified. The ability of additives from different categories of excipients, either alone or in combination, were then evaluated for their ability to stabilize virus against freeze-thaw, freeze-drying, and accelerated storage (25°C) stresses by measuring infectious virus titer. An exploratory data analysis and predictive DOE modeling approach was subsequently undertaken to gain a better understanding of the interplay between the key excipients and stability of virus as well as to determine which combinations were interacting to improve virus stability. The lead excipient combinations were identified and tested for stabilizing effects using a tetravalent mixture of viruses in accelerated and real time (2-8°C) stability studies. This work demonstrates the utility of combining semi-empirical excipient screening and DOE experimental design strategies in the formulation development of lyophilized live attenuated viral vaccine candidates. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hofmann, Philipp; Sedlmair, Martin; Krauss, Bernhard; Wichmann, Julian L.; Bauer, Ralf W.; Flohr, Thomas G.; Mahnken, Andreas H.
2016-03-01
Osteoporosis is a degenerative bone disease usually diagnosed at the manifestation of fragility fractures, which severely endanger the health of especially the elderly. To ensure timely therapeutic countermeasures, noninvasive and widely applicable diagnostic methods are required. Currently the primary quantifiable indicator for bone stability, bone mineral density (BMD), is obtained either by DEXA (Dual-energy X-ray absorptiometry) or qCT (quantitative CT). Both have respective advantages and disadvantages, with DEXA being considered as gold standard. For timely diagnosis of osteoporosis, another CT-based method is presented. A Dual Energy CT reconstruction workflow is being developed to evaluate BMD by evaluating lumbar spine (L1-L4) DE-CT images. The workflow is ROI-based and automated for practical use. A dual energy 3-material decomposition algorithm is used to differentiate bone from soft tissue and fat attenuation. The algorithm uses material attenuation coefficients on different beam energy levels. The bone fraction of the three different tissues is used to calculate the amount of hydroxylapatite in the trabecular bone of the corpus vertebrae inside a predefined ROI. Calibrations have been performed to obtain volumetric bone mineral density (vBMD) without having to add a calibration phantom or to use special scan protocols or hardware. Accuracy and precision are dependent on image noise and comparable to qCT images. Clinical indications are in accordance with the DEXA gold standard. The decomposition-based workflow shows bone degradation effects normally not visible on standard CT images which would induce errors in normal qCT results.
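The three-material decomposition step above can be illustrated with a toy linear system: at two beam energies, the measured attenuation is modeled as a mixture of bone, soft tissue, and fat, with the volume fractions constrained to sum to one. The attenuation coefficients, measurements, and density constant below are invented placeholders, not calibrated values from the described workflow.

```python
# Toy dual-energy three-material decomposition: solve for volume fractions
# of bone, soft tissue and fat from attenuation at two energies plus the
# constraint that the fractions sum to 1. Coefficients are invented, not
# calibrated values.
import numpy as np

# Rows: low-kV attenuation, high-kV attenuation, volume conservation.
# Columns: bone, soft tissue, fat (arbitrary units for the toy example).
A = np.array([
    [1.20, 0.35, 0.25],   # material attenuation at the low energy
    [0.70, 0.30, 0.22],   # material attenuation at the high energy
    [1.00, 1.00, 1.00],   # fractions sum to one
])
measured = np.array([0.52, 0.376, 1.00])  # voxel measurements + constraint

fractions = np.linalg.solve(A, measured)
bone_fraction = fractions[0]
for name, f in zip(["bone", "soft tissue", "fat"], fractions):
    print(f"{name:12s} fraction = {f:.3f}")

# A (hypothetical) hydroxyapatite density would then scale the bone fraction
# to a volumetric BMD estimate for the ROI.
HA_DENSITY_MG_PER_CM3 = 1200.0
print(f"toy vBMD estimate: {bone_fraction * HA_DENSITY_MG_PER_CM3:.0f} mg/cm^3")
```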
Radiology information system: a workflow-based approach.
Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P
2009-09-01
Introducing workflow management technology in healthcare appears promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study a method of developing a workflow-based information system, using the radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system was designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be treated as special components, which also corresponded to tasks and were integrated by transforming non-workflow-aware interfaces into standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes and could enhance process management in the department. It also provides a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more process management functionality and more workflow-aware integration. The work in this paper is an initial endeavor toward introducing workflow management technology in healthcare.
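The loosely coupled, task-per-component design described above can be sketched generically: a registry maps workflow task names to components, a legacy system is wrapped by an adapter that exposes the same interface, and a tiny engine walks the process definition. This is an illustrative sketch under invented task names, not the deployed hospital system's code.

```python
# Generic sketch of a workflow engine assembling loosely coupled components,
# with a legacy system wrapped by an adapter. Task names and behavior are
# invented for illustration; this is not the deployed RIS implementation.

class Registration:
    def run(self, ctx):
        ctx["accession"] = "ACC-001"
        return ctx

class LegacyModality:
    """Pretend legacy system with a non-workflow-aware interface."""
    def do_scan(self, accession):
        return f"images-for-{accession}"

class LegacyModalityAdapter:
    """Wraps the legacy call behind the standard run(ctx) interface."""
    def __init__(self, legacy):
        self.legacy = legacy
    def run(self, ctx):
        ctx["images"] = self.legacy.do_scan(ctx["accession"])
        return ctx

class Reporting:
    def run(self, ctx):
        ctx["report"] = f"report on {ctx['images']}"
        return ctx

REGISTRY = {
    "register_patient": Registration(),
    "acquire_images": LegacyModalityAdapter(LegacyModality()),
    "write_report": Reporting(),
}

def run_workflow(task_names, ctx=None):
    ctx = ctx or {}
    for name in task_names:          # the process definition drives assembly
        ctx = REGISTRY[name].run(ctx)
    return ctx

if __name__ == "__main__":
    result = run_workflow(["register_patient", "acquire_images", "write_report"])
    print(result["report"])          # report on images-for-ACC-001
```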
Cornett, Alex; Kuziemsky, Craig
2015-01-01
Implementing team based workflows can be complex because of the scope of providers involved and the extent of information exchange and communication that needs to occur. While a workflow may represent the ideal structure of communication that needs to occur, information issues and contextual factors may impact how the workflow is implemented in practice. Understanding these issues will help us better design systems to support team based workflows. In this paper we use a case study of palliative sedation therapy (PST) to model a PST workflow and then use it to identify purposes of communication, information issues and contextual factors that impact them. We then suggest how our findings could inform health information technology (HIT) design to support team based communication workflows.
Farahani, Navid; Liu, Zheng; Jutt, Dylan; Fine, Jeffrey L
2017-10-01
- Pathologists' computer-assisted diagnosis (pCAD) is a proposed framework for alleviating challenges through the automation of their routine sign-out work. Currently, hypothetical pCAD is based on a triad of advanced image analysis, deep integration with heterogeneous information systems, and a concrete understanding of traditional pathology workflow. Prototyping is an established method for designing complex new computer systems such as pCAD. - To describe, in detail, a prototype of pCAD for the sign-out of a breast cancer specimen. - Deidentified glass slides and data from breast cancer specimens were used. Slides were digitized into whole-slide images with an Aperio ScanScope XT, and screen captures were created by using vendor-provided software. The advanced workflow prototype was constructed by using PowerPoint software. - We modeled an interactive, computer-assisted workflow: pCAD previews whole-slide images in the context of integrated, disparate data and predefined diagnostic tasks and subtasks. Relevant regions of interest (ROIs) would be automatically identified and triaged by the computer. A pathologist's sign-out work would consist of an interactive review of important ROIs, driven by required diagnostic tasks. The interactive session would generate a pathology report automatically. - Using animations and real ROIs, the pCAD prototype demonstrates the hypothetical sign-out in a stepwise fashion, illustrating various interactions and explaining how steps can be automated. The file is publicly available and should be widely compatible. This mock-up is intended to spur discussion and to help usher in the next era of digitization for pathologists by providing desperately needed and long-awaited automation.
Ciemins, Elizabeth L; Coon, Patricia J; Fowles, Jinnet Briggs; Min, Sung-joon
2009-05-01
Electronic health records (EHRs) have been implemented throughout the United States with varying degrees of success. Past EHR implementation experiences can inform health systems planning to initiate new or expand existing EHR systems. Key "critical success factors," e.g., use of disease registries, workflow integration, and real-time clinical guideline support, have been identified but not fully tested in practice. A pre/postintervention cohort analysis was conducted on 495 adult patients selected randomly from a diabetes registry and followed for 6 years. Two intervention phases were evaluated: a "low-dose" period targeting primary care provider (PCP) and patient education followed by a "high-dose" EHR diabetes management implementation period, including a diabetes disease registry and office workflow changes, e.g., diabetes patient preidentification to facilitate real-time diabetes preventive care, disease management, and patient education. Across baseline, "low-dose," and "high-dose" postintervention periods, a significantly greater proportion of patients (a) achieved American Diabetes Association (ADA) guidelines for control of blood pressure (26.9 to 33.1 to 43.9%), glycosylated hemoglobin (48.5 to 57.5 to 66.8%), and low-density lipoprotein cholesterol (33.1 to 44.4 to 56.6%) and (b) received recommended preventive eye (26.2 to 36.4 to 58%), foot (23.4 to 40.3 to 66.9%), and renal (38.5 to 53.9 to 71%) examinations or screens. Implementation of a fully functional, specialized EHR combined with tailored office workflow process changes was associated with increased adherence to ADA guidelines, including risk factor control, by PCPs and their patients with diabetes. Incorporation of previously identified "critical success factors" potentially contributed to the success of the program, as did use of a two-phase approach. 2009 Diabetes Technology Society.
Ranasinghe, Asoka; Ramanathan, Ragu; Jemal, Mohammed; D'Arienzo, Celia J; Humphreys, W Griffith; Olah, Timothy V
2012-03-01
UHPLC coupled with orthogonal acceleration hybrid quadrupole-TOF (Q-TOF)-MS is an emerging technique offering new strategies for the efficient screening of new chemical entities and related molecules at the early discovery stage within the pharmaceutical industry. In the first part of this article, we examine the main instrumental parameters that are critical for the integration of UHPLC-Q-TOF technology to existing bioanalytical workflows, in order to provide simultaneous quantitative and qualitative bioanalysis of samples generated following in vivo studies. Three modern Q-TOF mass spectrometers, including Bruker maXis™, Agilent 6540 and Sciex TripleTOF™ 5600, all interfaced with UHPLC systems, are evaluated in the second part of the article. The scope of this work is to demonstrate the potential of Q-TOF for the analysis of typical small molecules, therapeutic peptides (molecular weight <6000 Da), and enzymatically (i.e., trypsin, chymotrypsin and pepsin) cleaved peptides from larger proteins. This work focuses mainly on full-scan TOF data obtained under ESI conditions, the major mode of TOF operation in discovery bioanalytical research, where the compounds are selected based on their pharmacokinetic/pharmacodynamic behaviors using animal models prior to selecting a few desirable candidates for further development. Finally, important emerging TOF technologies that could potentially benefit bioanalytical research in the semi-quantification of metabolites without synthesized standards are discussed. Particularly, the utility of captive spray ionization coupled with TripleTOF 5600 was evaluated for improving sensitivity and providing normalized MS response for drugs and their metabolites. The workflow proposed compromises neither the efficiency, nor the quality of pharmacokinetic data in support of early drug discovery programs.
A Workflow to Improve the Alignment of Prostate Imaging with Whole-mount Histopathology.
Yamamoto, Hidekazu; Nir, Dror; Vyas, Lona; Chang, Richard T; Popert, Rick; Cahill, Declan; Challacombe, Ben; Dasgupta, Prokar; Chandra, Ashish
2014-08-01
Evaluation of prostate imaging tests against whole-mount histology specimens requires accurate alignment between radiologic and histologic data sets. Misalignment results in false-positive and -negative zones as assessed by imaging. We describe a workflow for three-dimensional alignment of prostate imaging data against whole-mount prostatectomy reference specimens and assess its performance against a standard workflow. Ethical approval was granted. Patients underwent motorized transrectal ultrasound (Prostate Histoscanning) to generate a three-dimensional image of the prostate before radical prostatectomy. The test workflow incorporated steps for axial alignment between imaging and histology, size adjustments following formalin fixation, and use of custom-made parallel cutters and digital caliper instruments. The control workflow comprised freehand cutting and assumed homogeneous block thicknesses at the same relative angles between pathology and imaging sections. Thirty radical prostatectomy specimens were histologically and radiologically processed, either by an alignment-optimized workflow (n = 20) or a control workflow (n = 10). The optimized workflow generated tissue blocks of heterogeneous thicknesses but with no significant drifting in the cutting plane. The control workflow resulted in significantly nonparallel blocks, accurately matching only one out of four histology blocks to their respective imaging data. The image-to-histology alignment accuracy was 20% greater in the optimized workflow (P < .0001), with higher sensitivity (85% vs. 69%) and specificity (94% vs. 73%) for margin prediction in a 5 × 5-mm grid analysis. A significantly better alignment was observed in the optimized workflow. Evaluation of prostate imaging biomarkers using whole-mount histology references should include a test-to-reference spatial alignment workflow. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
Conceptual-level workflow modeling of scientific experiments using NMR as a case study
Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R
2007-01-01
Background Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting biomolecular analysis using NMR spectroscopy experiment. PMID:17263870
FAST: A fully asynchronous and status-tracking pattern for geoprocessing services orchestration
NASA Astrophysics Data System (ADS)
Wu, Huayi; You, Lan; Gui, Zhipeng; Gao, Shuang; Li, Zhenqiang; Yu, Jingmin
2014-09-01
Geoprocessing service orchestration (GSO) provides a unified and flexible way to implement cross-application, long-lived, and multi-step geoprocessing service workflows by coordinating geoprocessing services collaboratively. Usually, geoprocessing services and geoprocessing service workflows are data and/or computing intensive. The intensity feature may make the execution process of a workflow time-consuming. Since it initiates an execution request without blocking other interactions on the client side, an asynchronous mechanism is especially appropriate for GSO workflows. Many critical problems remain to be solved in existing asynchronous patterns for GSO, including difficulties in improving performance, tracking status, and clarifying the workflow structure. These problems make it challenging to orchestrate workflows efficiently, make statuses instantly available, and construct clearly structured GSO workflows. A Fully Asynchronous and Status-Tracking (FAST) pattern that adopts asynchronous interactions throughout the whole communication tier of a workflow is proposed for GSO. The proposed FAST pattern includes a mechanism that actively pushes the latest status to clients instantly and economically. An independent proxy was designed to isolate the status-tracking logic from the geoprocessing business logic, which assists the formation of a clear GSO workflow structure. A workflow was implemented in the FAST pattern to simulate the flooding process in the Poyang Lake region. Experimental results show that the proposed FAST pattern can efficiently tackle data- and computing-intensive geoprocessing tasks. The performance of all collaborative partners was improved due to the asynchronous mechanism throughout the communication tier. A status-tracking mechanism helps users retrieve the latest running status of a GSO workflow in an efficient and instant way. The clear structure of the GSO workflow lowers the barriers for geospatial domain experts and model designers to compose asynchronous GSO workflows. Most importantly, it provides better support for locating and diagnosing potential exceptions.
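The following Python asyncio sketch illustrates, in a generic way, the two ideas named above: fully asynchronous submission and a status proxy that actively pushes updates to clients while staying separate from the geoprocessing logic. The class, job, and status names are hypothetical and are not drawn from the FAST implementation.

```python
import asyncio

class StatusProxy:
    """Keeps status tracking separate from the geoprocessing logic and pushes updates to clients."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    async def push(self, job_id, status):
        # actively push the latest status instead of waiting to be polled
        for callback in self.subscribers:
            await callback(job_id, status)

async def run_geoprocess(job_id, proxy):
    """Stand-in for a long-running, data-/compute-intensive geoprocessing service call."""
    await proxy.push(job_id, "ACCEPTED")
    await asyncio.sleep(0.1)
    await proxy.push(job_id, "RUNNING")
    await asyncio.sleep(0.1)
    await proxy.push(job_id, "SUCCEEDED")

async def client_view(job_id, status):
    print(f"client sees {job_id} -> {status}")

async def main():
    proxy = StatusProxy()
    proxy.subscribe(client_view)
    # fire the request asynchronously; the client is free to do other work meanwhile
    job = asyncio.create_task(run_geoprocess("flood-simulation-01", proxy))
    print("client: request submitted, not blocked")
    await job

asyncio.run(main())
```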
Conceptual-level workflow modeling of scientific experiments using NMR as a case study.
Verdi, Kacy K; Ellis, Heidi Jc; Gryk, Michael R
2007-01-30
Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting biomolecular analysis using NMR spectroscopy.
Thermal precipitation fluorescence assay for protein stability screening.
Fan, Junping; Huang, Bo; Wang, Xianping; Zhang, Xuejun C
2011-09-01
A simple and reliable method of protein stability assessment is desirable for high-throughput expression screening of recombinant proteins. Here we describe an assay termed thermal precipitation fluorescence (TPF) which can be used to compare the thermal stabilities of recombinant protein samples directly from cell lysate supernatants. In this assay, target membrane proteins are expressed as recombinant fusions with a green fluorescence protein tag and solubilized with detergent, and the fluorescence signals are used to report the quantity of the fusion proteins in the soluble fraction of the cell lysate. After applying a heat shock, insoluble protein aggregates are removed by centrifugation. Subsequently, the amount of remaining protein in the supernatant is quantified by in-gel fluorescence analysis and compared to samples without a heat shock treatment. Over 60 recombinant membrane proteins from Escherichia coli were subjected to this screening in the presence and absence of a few commonly used detergents, and the results were analyzed. Because no sophisticated protein purification is required, this TPF technique is suitable for high-throughput expression screening of recombinant membrane proteins as well as soluble ones and can be used to prioritize target proteins based on their thermal stabilities for subsequent large-scale expression and structural studies. Copyright © 2011 Elsevier Inc. All rights reserved.
Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2016-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971
Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2017-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.
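As a hedged illustration of what such an executor must do at minimum, the sketch below describes a workflow as data and resolves step dependencies with a topological sort before running anything. It is a generic toy, not the Rabix Executor's or CWL's actual model, and the step names and commands are placeholders.

```python
from collections import deque

# A workflow described as data (CWL-like in spirit, heavily simplified):
# each step names the steps whose outputs it depends on.
workflow = {
    "align":  {"after": [],              "cmd": "bwa mem ..."},
    "sort":   {"after": ["align"],       "cmd": "samtools sort ..."},
    "call":   {"after": ["sort"],        "cmd": "bcftools call ..."},
    "qc":     {"after": ["align"],       "cmd": "fastqc ..."},
    "report": {"after": ["call", "qc"],  "cmd": "multiqc ..."},
}

def schedule(workflow):
    """Topologically order steps so every step runs after its dependencies (Kahn's algorithm)."""
    indegree = {step: len(spec["after"]) for step, spec in workflow.items()}
    dependents = {step: [] for step in workflow}
    for step, spec in workflow.items():
        for dep in spec["after"]:
            dependents[dep].append(step)
    ready = deque(step for step, deg in indegree.items() if deg == 0)
    order = []
    while ready:
        step = ready.popleft()
        order.append(step)
        for nxt in dependents[step]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(workflow):
        raise ValueError("workflow contains a cycle")
    return order

print(schedule(workflow))  # e.g. ['align', 'sort', 'qc', 'call', 'report']
```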
The future of scientific workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Peterka, Tom; Altintas, Ilkay
Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE’s science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, workflow needs and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.
NASA Astrophysics Data System (ADS)
Wang, Ximing; Martinez, Clarisa; Wang, Jing; Liu, Ye; Liu, Brent
2014-03-01
Clinical trials usually need to collect, track, and analyze multimedia data according to the workflow. Currently, clinical trial data management requirements are normally addressed with custom-built systems. Challenges occur in the workflow design within different trials. The traditional pre-defined custom-built system is usually limited to a specific clinical trial and normally requires time-consuming and resource-intensive software development. To provide a solution, we present a user-customizable, imaging informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials. The intelligent workflow engine provides flexibility in building and tailoring the workflow in various stages of clinical trials. By providing a solution to tailor and automate the workflow, the system will save time and reduce errors for clinical trials. Although our system is designed for rehabilitation clinical trials, it may be extended to other imaging-based clinical trials as well.
Zhang, Fang; Wang, Haoyang; Zhang, Li; Zhang, Jing; Fan, Ruojing; Yu, Chongtian; Wang, Wenwen; Guo, Yinlong
2014-10-01
A strategy for suspected-target screening of pesticide residues in complicated matrices was exploited using gas chromatography in combination with hybrid quadrupole time-of-flight mass spectrometry (GC-QTOF MS). The screening workflow followed three key steps: initial detection, preliminary identification, and final confirmation. The initial detection of components in a matrix was done by high-resolution mass spectrum deconvolution; the preliminary identification of suspected pesticides was based on a special retention index/mass spectrum (RI/MS) library that contained both the first-stage mass spectra (MS(1) spectra) and retention indices; and the final confirmation was accomplished by accurate mass measurements of representative ions with their response ratios from the MS(1) spectra or representative product ions from the second-stage mass spectra (MS(2) spectra). To evaluate the applicability of the workflow in real samples, three matrices of apple, spinach, and scallion, each spiked with 165 test pesticides in a set of concentrations, were selected as the models. The results showed that the use of high-resolution TOF enabled effective extraction of spectra from noisy chromatograms, based on a narrow mass window (5 mDa), and suspected-target compounds were identified by similarity matching of the deconvoluted full mass spectra and filtering of linear RIs. On average, over 74% of pesticides at 50 ng/mL could be identified using deconvolution and the RI/MS library. Over 80% of pesticides at 5 ng/mL or lower concentrations could be confirmed in each matrix using at least two representative ions with their response ratios from the MS(1) spectra. In addition, the application of product ion spectra was capable of confirming suspected pesticides with specificity for some pesticides in complicated matrices. In conclusion, GC-QTOF MS combined with the RI/MS library seems to be one of the most efficient tools for the analysis of suspected-target pesticide residues in complicated matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
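A minimal sketch of the two numerical checks described above, assuming the 5 mDa mass window from the abstract and an invented retention-index tolerance; the library entry and candidate values are illustrative only.

```python
def mass_match(measured_mz, library_mz, window_mda=5.0):
    """Accurate-mass check: measured ion within a narrow window (default 5 mDa) of the library value."""
    return abs(measured_mz - library_mz) * 1000.0 <= window_mda

def ri_match(measured_ri, library_ri, tolerance=10.0):
    """Retention-index filter; this tolerance is an assumption, not the paper's value."""
    return abs(measured_ri - library_ri) <= tolerance

def confirm(candidate, library_entry):
    """A suspect is confirmed only if the RI matches and at least two representative ions match."""
    ions_ok = sum(
        mass_match(m, l) for m, l in zip(candidate["ions_mz"], library_entry["ions_mz"])
    ) >= 2
    return ions_ok and ri_match(candidate["ri"], library_entry["ri"])

library_entry = {"ri": 1820.0, "ions_mz": [181.0709, 109.1012]}  # hypothetical pesticide record
candidate     = {"ri": 1823.5, "ions_mz": [181.0723, 109.1041]}  # deconvoluted hit from the matrix
print(confirm(candidate, library_entry))  # True: both ions within 5 mDa, RI within tolerance
```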
The standard-based open workflow system in GeoBrain (Invited)
NASA Astrophysics Data System (ADS)
Di, L.; Yu, G.; Zhao, P.; Deng, M.
2013-12-01
GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflow for the product generation. A provenance capturing service has been implemented to generate the ISO 19115-compliant complete product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, transparent, translucent, and opaque, are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models is conducted. Automated workflow composition has been demonstrated successfully based on ontologies and artificial intelligence technology. The GeoBrain workflow system has been used in multiple Earth science applications, including the monitoring of global agricultural drought, the assessment of flood damage, the derivation of national crop condition and progress information, and the detection of nuclear proliferation facilities and events.
A characterization of workflow management systems for extreme-scale applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia
We present that the automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
A characterization of workflow management systems for extreme-scale applications
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...
2017-02-16
We present that the automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization
Malawski, Maciej; Figiela, Kamil; Bubak, Marian; ...
2015-01-01
This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and the benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from the synthetic workflows and from general-purpose cloud benchmarks, as well as from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
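The following is a simplified, hedged formulation in the spirit of the model described above (levels of identical tasks, hourly billing, per-cloud instance limits, a global deadline); the notation is ours and is not the paper's AMPL/CMPL model.

```latex
\begin{align*}
\text{minimize}\quad
  & \sum_{\ell=1}^{L} \sum_{c \in C} p_c \, n_{\ell,c} \left\lceil \frac{d_\ell}{3600} \right\rceil
      && \text{hourly billing: pay } p_c \text{ per started VM-hour on cloud } c\\
\text{where}\quad
  & d_\ell = \tau_\ell \left\lceil \frac{N_\ell}{\sum_{c \in C} n_{\ell,c}} \right\rceil
      && \text{duration of level } \ell \text{ (} N_\ell \text{ identical tasks of runtime } \tau_\ell)\\
\text{subject to}\quad
  & \sum_{\ell=1}^{L} d_\ell \le D
      && \text{levels run in sequence and must meet the deadline } D\\
  & n_{\ell,c} \le M_c, \qquad n_{\ell,c} \in \mathbb{Z}_{\ge 0}
      && \text{per-cloud instance limit } M_c .
\end{align*}
```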
Profiling Changes in Histone Post-translational Modifications by Top-Down Mass Spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Mowei; Wu, Si; Stenoien, David L.
Top-down mass spectrometry is a valuable tool for characterizing post-translational modifications on histones for understanding gene control and expression. In this protocol, we describe a top-down workflow using liquid chromatography coupled to mass spectrometry for fast global profiling of changes in histone proteoforms between a wild-type and a mutant of a fungal species. The proteoforms exhibiting different abundances can be subjected to further targeted studies by other mass spectrometric or biochemical assays. This method can be generally adapted for preliminary screening for changes in histone modifications between samples such as wild-type vs. mutant, and control vs. disease.
Metaworkflows and Workflow Interoperability for Heliophysics
NASA Astrophysics Data System (ADS)
Pierantoni, Gabriele; Carley, Eoin P.
2014-06-01
Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited for workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows, usually developed in TAVERNA. They implement Science Cases (the definition of a scientific challenge) by composing different Basic Workflows. The third and last layer, Iterative Science Workflows, is developed in WS-PGRADE. It executes sub-workflows (either Basic or Science Workflows) as parameter sweep jobs to investigate Science Cases on large multiple data sets. So far, this approach has proven fruitful for three Science Cases, of which one has been completed and two are still being tested.
Wu, Danny T Y; Smart, Nikolas; Ciemins, Elizabeth L; Lanham, Holly J; Lindberg, Curt; Zheng, Kai
2017-01-01
To develop a workflow-supported clinical documentation system, it is a critical first step to understand clinical workflow. While time-and-motion studies have been regarded as the gold standard of workflow analysis, this method can be resource-consuming and its data may be biased due to the cognitive limitations of human observers. In this study, we aimed to evaluate the feasibility and validity of using EHR audit trail logs to analyze clinical workflow. Specifically, we compared three known workflow changes from our previous study with the corresponding EHR audit trail logs of the study participants. The results showed that EHR audit trail logs can be a valid source for clinical workflow analysis, and can provide an objective view of clinicians' behaviors, multi-dimensional comparisons, and a highly extensible analysis framework.
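A minimal pandas sketch of the kind of analysis described above: deriving per-clinician action sequences and rough task durations from audit trail logs. The column names and events are hypothetical, not the study's actual EHR log schema.

```python
import pandas as pd

# Hypothetical audit-trail extract: one row per logged EHR action.
logs = pd.DataFrame({
    "user":      ["md01", "md01", "md01", "rn02", "rn02"],
    "timestamp": pd.to_datetime([
        "2017-03-01 09:00", "2017-03-01 09:02", "2017-03-01 09:10",
        "2017-03-01 09:01", "2017-03-01 09:03"]),
    "action":    ["open_chart", "review_results", "write_note",
                  "open_chart", "document_vitals"],
})

# Order each user's actions in time and collapse them into a workflow sequence.
sequences = (logs.sort_values("timestamp")
                 .groupby("user")["action"]
                 .agg(list))
print(sequences)

# Time between consecutive actions, a rough proxy for task duration.
logs = logs.sort_values(["user", "timestamp"])
logs["gap_minutes"] = (logs.groupby("user")["timestamp"].diff()
                           .dt.total_seconds() / 60)
print(logs[["user", "action", "gap_minutes"]])
```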
Schweitzer, M; Lasierra, N; Hoerbst, A
2015-01-01
Increasing flexibility from a user perspective and enabling workflow-based interaction facilitate an easy, user-friendly utilization of EHRs in healthcare professionals' daily work. To offer such versatile EHR functionality, our approach is based on the execution of clinical workflows by means of a composition of semantic web services. The backbone of such an architecture is an ontology which enables the representation of clinical workflows and facilitates the selection of suitable services. In this paper we present the methods and results of observations of routine diabetes consultations, which were conducted in order to identify those workflows and the relations among the included tasks. These workflows were first modeled in BPMN and then generalized. As a next step in our study, interviews will be conducted with clinical personnel to validate the modeled workflows.
COMDECOM: predicting the lifetime of screening compounds in DMSO solution.
Zitha-Bovens, Emrin; Maas, Peter; Wife, Dick; Tijhuis, Johan; Hu, Qian-Nan; Kleinöder, Thomas; Gasteiger, Johann
2009-06-01
The technological evolution of the 1990s in both combinatorial chemistry and high-throughput screening created the demand for rapid access to the compound deck to support the screening process. The common strategy within the pharmaceutical industry is to store the screening library in DMSO solution. Several studies have shown that a percentage of these compounds decompose in solution, varying from a few percent of the total to a substantial part of the library. In the COMDECOM (COMpound DECOMposition) project, the compound stability of screening compounds in DMSO solution is monitored in an accelerated thermal, hydrolytic, and oxidative decomposition program. A large database with stability data is collected, and from this database, a predictive model is being developed. The aim of this program is to build an algorithm that can flag compounds that are likely to decompose, information that is considered to be of utmost importance (e.g., in the compound acquisition process and when evaluating screening results of library compounds, as well as in the determination of optimal storage conditions).
A Web-based telemedicine system for diabetic retinopathy screening using digital fundus photography.
Wei, Jack C; Valentino, Daniel J; Bell, Douglas S; Baker, Richard S
2006-02-01
The purpose was to design and implement a Web-based telemedicine system for diabetic retinopathy screening using digital fundus cameras and to make the software publicly available through Open Source release. The process of retinal imaging and case reviewing was modeled to optimize workflow and implement the use of a computer system. The Web-based system was built on Java Servlet and Java Server Pages (JSP) technologies. Apache Tomcat was chosen as the JSP engine, while MySQL was used as the main database and the Laboratory of Neuro Imaging (LONI) Image Storage Architecture, from LONI at UCLA, as the platform for image storage. For security, all data transmissions were carried over encrypted Internet connections such as Secure Socket Layer (SSL) and HyperText Transfer Protocol over SSL (HTTPS). User logins were required and access to patient data was logged for auditing. The system was deployed at Hubert H. Humphrey Comprehensive Health Center and Martin Luther King/Drew Medical Center of Los Angeles County Department of Health Services. Within 4 months, 1500 images of more than 650 patients were taken at Humphrey's Eye Clinic and successfully transferred to King/Drew's Department of Ophthalmology. This study demonstrates an effective architecture for remote diabetic retinopathy screening.
Wang, Ning-li
2013-11-01
Promoting the control of primary angle-closure glaucoma (PACG) and primary open-angle glaucoma (POAG) is the most important blindness prevention program in China. PACG has been incorporated into the blindness prevention program in China based on population-based screening studies. However, clinical screening for POAG should be strengthened. The creation of a series of appropriate technologies suitable for glaucoma prevention and management has been achieved in China, especially for PACG. The technologies have been evaluated in the pilot areas and have obtained very good results in China. It is recommended to develop new technologies suitable for glaucoma management using the following workflow: research, development, and evaluation by large-scale hospitals, and then clinical trials in the pilot areas. After a cost-benefit analysis is made, the new technology can be promoted and applied in clinical practice nationwide. We propose to gradually form a strategic model of "screening in township hospitals, intervention in county hospitals, and technical support and tackling in provincial hospitals" in order to improve the level of prevention and treatment of glaucoma and reduce the incidence of blindness caused by glaucoma.
Evaluating a Modular Decision Support Application for Colorectal Cancer Screening
Diiulio, Julie B.; Borders, Morgan R.; Sushereba, Christen E.; Saleem, Jason J.; Haverkamp, Donald; Imperiale, Thomas F.
2017-01-01
Summary Background There is a need for health information technology evaluation that goes beyond randomized controlled trials to include consideration of usability, cognition, feedback from representative users, and impact on efficiency, data quality, and clinical workflow. This article presents an evaluation illustrating one approach to this need using the Decision-Centered Design framework. Objective To evaluate, through a Decision-Centered Design framework, the ability of the Screening and Surveillance App to support primary care clinicians in tracking and managing colorectal cancer testing. Methods We leveraged two evaluation formats, online and in-person, to obtain feedback from a range of primary care clinicians and obtain comparative data. Both the online and in-person evaluations used mock patient data to simulate challenging patient scenarios. Primary care clinicians responded to a series of colorectal cancer-related questions about each patient and made recommendations for screening. We collected data on performance, perceived workload, and usability. Key elements of Decision-Centered Design include evaluation in the context of realistic, challenging scenarios and measures designed to explore impact on cognitive performance. Results Comparison of means revealed increases in accuracy, efficiency, and usability and decreases in perceived mental effort and workload when using the Screening and Surveillance App. Conclusion The results speak to the benefits of using the Decision-Centered Design approach in the analysis, design, and evaluation of Health Information Technology. Furthermore, the Screening and Surveillance App shows promise for filling decision support gaps in current electronic health records. PMID:28197619
A three-level atomicity model for decentralized workflow management systems
NASA Astrophysics Data System (ADS)
Ben-Shaul, Israel Z.; Heineman, George T.
1996-12-01
A workflow management system (WFMS) employs a workflow manager (WM) to execute and automate the various activities within a workflow. To protect the consistency of data, the WM encapsulates each activity with a transaction; a transaction manager (TM) then guarantees the atomicity of activities. Since workflows often group several activities together, the TM is responsible for guaranteeing the atomicity of these units. There are scalability issues, however, with centralized WFMSs. Decentralized WFMSs provide an architecture for multiple autonomous WFMSs to interoperate, thus accommodating multiple workflows and geographically-dispersed teams. When atomic units are composed of activities spread across multiple WFMSs, however, there is a conflict between global atomicity and local autonomy of each WFMS. This paper describes a decentralized atomicity model that enables workflow administrators to specify the scope of multi-site atomicity based upon the desired semantics of multi-site tasks in the decentralized WFMS. We describe an architecture that realizes our model and execution paradigm.
A Tool Supporting Collaborative Data Analytics Workflow Design and Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Bao, Q.; Lee, T. J.
2016-12-01
Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such a workflow: to name a few gaps, they do not support real-time co-design; they do not track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.
Catomeris, Peter; Baxter, Nancy N; Boss, Sheila C; Paszat, Lawrence F; Rabeneck, Linda; Randell, Edward; Serenity, Mardie L; Sutradhar, Rinku; Tinmouth, Jill
2018-01-01
Although promising for colorectal cancer screening, hemoglobin (Hb) stability remains a concern with fecal immunochemical tests. This study implemented a novel, standardized method to compare Hb stability across various fecal immunochemical tests. The method can be used to inform decisions when selecting a kit for use in colorectal cancer screening. In so doing, this work addressed a critical need for standardization in this field. The objective was to compare the stability of Hb across 5 different immunochemical kits and one guaiac kit. The stability of Hb was analyzed in collection devices inoculated with Hb-spiked feces and (1) stored at various temperatures (frozen, refrigerated, ambient, and elevated) for more than 60 days; (2) after undergoing 3 controlled freeze-thaw cycles; and (3) after being transported by courier or postal services in uncontrolled temperature conditions from 3 locations in Ontario, Canada, to a central testing center. The stability of Hb varied with time and temperature and by kit. Lower Hb recoveries occurred with increasing temperature and increasing time from sample collection to testing. Refrigeration provided the best stability, although results varied across kits (eg, from 4.2 days to >60 days before a prespecified threshold [<70% probability of the test results remaining positive] was reached). Freeze-thaw stability varied across kits and cycles (Hb recoveries: NS Plus [Alfresa Pharma, Chuo-ku, Osaka, Japan], 91.7% to 95.4%; OC Diana [Eiken Chemical, Taito-ku, Tokyo, Japan], 57.6% to 74.9%). Agreement regarding Hb levels before and after transportation varied across kits (from 57% to 100%). Important differences in Hb stability were found across the included fecal immunochemical tests. These findings should inform practice-based and population-based colorectal cancer screening.
Jones, Ryan T; Handsfield, Lydia; Read, Paul W; Wilson, David D; Van Ausdal, Ray; Schlesinger, David J; Siebers, Jeffrey V; Chen, Quan
2015-01-01
The clinical challenge of radiation therapy (RT) for painful bone metastases requires clinicians to consider both treatment efficacy and patient prognosis when selecting a radiation therapy regimen. The traditional RT workflow requires several weeks for common palliative RT schedules of 30 Gy in 10 fractions or 20 Gy in 5 fractions. At our institution, we have created a new RT workflow termed "STAT RAD" that allows clinicians to perform computed tomographic (CT) simulation, planning, and highly conformal single fraction treatment delivery within 2 hours. In this study, we evaluate the safety and feasibility of the STAT RAD workflow. A failure mode and effects analysis (FMEA) was performed on the STAT RAD workflow, including development of a process map, identification of potential failure modes, description of the cause and effect, temporal occurrence, and team member involvement in each failure mode, and examination of existing safety controls. A risk probability number (RPN) was calculated for each failure mode. As necessary, workflow adjustments were then made to safeguard against failure modes with significant RPN values. After workflow alterations, RPN values were recomputed. A total of 72 potential failure modes were identified in the pre-FMEA STAT RAD workflow, of which 22 met the RPN threshold for clinical significance. Workflow adjustments included the addition of a team member checklist, changing simulation from megavoltage CT to kilovoltage CT, alteration of patient-specific quality assurance testing, and allocating increased time for critical workflow steps. After these modifications, only 1 failure mode maintained RPN significance: patient motion after alignment or during treatment. Performing the FMEA for the STAT RAD workflow before clinical implementation has significantly strengthened the safety and feasibility of STAT RAD. The FMEA proved a valuable evaluation tool, identifying potential problem areas so that we could create a safer workflow. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
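For readers unfamiliar with FMEA bookkeeping, the sketch below computes RPNs as the conventional product of severity, occurrence, and detectability scores and flags the modes above a threshold; the failure modes, scores, and threshold are invented examples, not the study's values.

```python
# Minimal FMEA bookkeeping: RPN is conventionally severity x occurrence x detectability,
# each scored on a 1-10 scale. Failure modes, scores, and the threshold are invented.
failure_modes = [
    {"mode": "patient motion after alignment", "severity": 8, "occurrence": 4, "detection": 7},
    {"mode": "wrong CT dataset loaded",         "severity": 9, "occurrence": 2, "detection": 2},
    {"mode": "checklist step skipped",          "severity": 5, "occurrence": 3, "detection": 3},
]

RPN_THRESHOLD = 100  # illustrative cutoff for "clinically significant"

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

significant = [fm for fm in failure_modes if fm["rpn"] >= RPN_THRESHOLD]
for fm in sorted(significant, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["mode"]}: RPN={fm["rpn"]}')  # these modes get workflow adjustments first
```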
μ Opioid receptor: novel antagonists and structural modeling
NASA Astrophysics Data System (ADS)
Kaserer, Teresa; Lantero, Aquilino; Schmidhammer, Helmut; Spetea, Mariana; Schuster, Daniela
2016-02-01
The μ opioid receptor (MOR) is a prominent member of the G protein-coupled receptor family and the molecular target of morphine and other opioid drugs. Despite the long tradition of MOR-targeting drugs, still little is known about the ligand-receptor interactions and structure-function relationships underlying the distinct biological effects upon receptor activation or inhibition. With the resolved crystal structure of the β-funaltrexamine-MOR complex, we aimed at the discovery of novel agonists and antagonists using virtual screening tools, i.e. docking, pharmacophore- and shape-based modeling. We suggest important molecular interactions, which active molecules share and distinguish agonists and antagonists. These results allowed for the generation of theoretically validated in silico workflows that were employed for prospective virtual screening. Out of 18 virtual hits evaluated in in vitro pharmacological assays, three displayed antagonist activity and the most active compound significantly inhibited morphine-induced antinociception. The new identified chemotypes hold promise for further development into neurochemical tools for studying the MOR or as potential therapeutic lead candidates.
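As a much simpler stand-in for the ligand-based portion of such a screen, the hedged RDKit sketch below ranks a tiny library by Morgan-fingerprint Tanimoto similarity to a reference structure; the SMILES strings are placeholders, not actual MOR ligands, and this is not the authors' docking or pharmacophore workflow.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

reference = Chem.MolFromSmiles("c1ccccc1CCN(C)C")  # placeholder "known active", not a real MOR ligand
library = {
    "cand-1": "c1ccccc1CCNC",
    "cand-2": "CCO",
    "cand-3": "c1ccc2ccccc2c1",
}

# Morgan (circular) fingerprints, radius 2, 2048 bits
ref_fp = AllChem.GetMorganFingerprintAsBitVect(reference, radius=2, nBits=2048)

scores = {}
for name, smiles in library.items():
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=2048)
    scores[name] = DataStructs.TanimotoSimilarity(ref_fp, fp)

# The top-ranked candidates would be the "virtual hits" passed on to in vitro assays.
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: Tanimoto = {score:.2f}")
```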
Process-driven information management system at a biotech company: concept and implementation.
Gobbi, Alberto; Funeriu, Sandra; Ioannou, John; Wang, Jinyi; Lee, Man-Ling; Palmer, Chris; Bamford, Bob; Hewitt, Robin
2004-01-01
While established pharmaceutical companies have chemical information systems in place to manage their compounds and the associated data, new startup companies need to implement these systems from scratch. Decisions made early in the design phase usually have long lasting effects on the expandability, maintenance effort, and costs associated with the information management system. Careful analysis of work and data flows, both inter- and intradepartmental, and identification of existing dependencies between activities are important. This knowledge is required to implement an information management system, which enables the research community to work efficiently by avoiding redundant registration and processing of data and by timely provision of the data whenever needed. This paper first presents the workflows existing at Anadys, then ARISE, the research information management system developed in-house at Anadys. ARISE was designed to support the preclinical drug discovery process and covers compound registration, analytical quality control, inventory management, high-throughput screening, lower throughput screening, and data reporting.
Osman, Muhammad-Afiq; Neoh, Hui-Min; Ab Mutalib, Nurul-Syakima; Chin, Siok-Fong; Jamal, Rahman
2018-01-01
The human gut holds the densest microbiome ecosystem essential in maintaining a healthy host physiology, and disruption of this ecosystem has been linked to the development of colorectal cancer (CRC). The advent of next-generation sequencing technologies such as 16S rRNA gene sequencing has enabled characterization of the CRC gut microbiome architecture in an affordable and culture-free approach. Nevertheless, the lack of standardization in the handling and storage of biospecimens, nucleic acid extraction, 16S rRNA gene primer selection, length and depth of sequencing, and bioinformatics analyses has contributed to discrepancies found in various published studies in this field. Accurate characterization of the CRC microbiome found in different stages of CRC has the potential to be developed into a screening tool in the clinical setting. This mini review aims to concisely compile all available CRC microbiome studies performed up to the end of 2016 and to suggest standardized protocols that are crucial in developing a gut microbiome screening panel for CRC.
Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.
List, Markus
2017-06-10
Docker virtualization allows for software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach by example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
Pathology consultation on urine compliance testing and drug abuse screening.
Ward, Michael B; Hackenmueller, Sarah A; Strathmann, Frederick G
2014-11-01
Compliance testing in pain management requires a distinct approach compared with classic clinical toxicology testing. Differences in the patient populations and clinical expectations require modifications to established reporting cutoffs, assay performance expectations, and critical review of how best to apply the available testing methods. Although other approaches to testing are emerging, immunoassay screening followed by mass spectrometry confirmation remains the most common testing workflow for pain management compliance and drug abuse testing. A case-based approach was used to illustrate the complexities inherent to and uniqueness of pain management compliance testing for both clinicians and laboratories. A basic understanding of the inherent strengths and weaknesses of immunoassays and mass spectrometry provides the clinician a better understanding of how best to approach pain management compliance testing. Pain management compliance testing is a textbook example of an emerging field requiring open communication between physician and performing laboratory to fully optimize patient care. Copyright© by the American Society for Clinical Pathology.
Scott, Daniel J; Kummer, Lutz; Egloff, Pascal; Bathgate, Ross A D; Plückthun, Andreas
2014-11-01
The largest single class of drug targets is the G protein-coupled receptor (GPCR) family. Modern high-throughput methods for drug discovery require working with pure protein, but this has been a challenge for GPCRs, and thus the success of screening campaigns targeting soluble, catalytic protein domains has not yet been realized for GPCRs. Therefore, most GPCR drug screening has been cell-based, whereas the strategy of choice for drug discovery against soluble proteins is HTS using purified proteins coupled to structure-based drug design. While recent developments are increasing the chances of obtaining GPCR crystal structures, the feasibility of screening directly against purified GPCRs in the unbound state (apo-state) remains low. GPCRs exhibit low stability in detergent micelles, especially in the apo-state, over the time periods required for performing large screens. Recent methods for generating detergent-stable GPCRs, however, offer the potential for researchers to manipulate GPCRs almost like soluble enzymes, opening up new avenues for drug discovery. Here we apply cellular high-throughput encapsulation, solubilization and screening (CHESS) to the neurotensin receptor 1 (NTS1) to generate a variant that is stable in the apo-state when solubilized in detergents. This high stability facilitated the crystal structure determination of this receptor and also allowed us to probe the pharmacology of detergent-solubilized, apo-state NTS1 using robotic ligand binding assays. NTS1 is a target for the development of novel antipsychotics, and thus CHESS-stabilized receptors represent exciting tools for drug discovery. Copyright © 2014 Elsevier B.V. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-21
... Defense Federal Acquisition Regulation Supplement; Updates to Wide Area WorkFlow (DFARS Case 2011-D027... Wide Area WorkFlow (WAWF) and TRICARE Encounter Data System (TEDS). WAWF, which electronically... civil emergencies, when access to Wide Area WorkFlow by those contractors is not feasible; (4) Purchases...
An Auto-management Thesis Program WebMIS Based on Workflow
NASA Astrophysics Data System (ADS)
Chang, Li; Jie, Shi; Weibo, Zhong
An auto-management WebMIS based on workflow for the bachelor thesis program is given in this paper. A module used for workflow dispatching is designed and realized using MySQL and J2EE according to the working principle of a workflow engine. The module can automatically dispatch the workflow according to the system date, login information, and the work status of the user. The WebMIS shifts management from manual work to computer-based work, which not only standardizes the thesis program but also keeps the data and documents clean and consistent.
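The dispatching logic described above could look roughly like the following language-agnostic sketch (shown here in Python rather than the system's J2EE stack); the thesis-program stages, deadlines, and roles are invented for illustration, and the real module would read them from MySQL.

```python
from datetime import date

# Invented thesis-program stages and deadlines standing in for the database contents.
STAGES = [
    ("topic_selection",   date(2024, 3, 1)),
    ("proposal_review",   date(2024, 4, 15)),
    ("thesis_submission", date(2024, 6, 1)),
    ("defense",           date(2024, 6, 20)),
]

def dispatch(today, role, completed_stages):
    """Pick the next work item from the system date, the login role, and the user's work status."""
    for stage, deadline in STAGES:
        if stage in completed_stages:
            continue
        if today > deadline:
            return f"{stage}: overdue, notify supervisor"
        if role == "student":
            return f"{stage}: due {deadline}"
        if role == "advisor":
            return f"review submissions for {stage}"
    return "all stages complete"

print(dispatch(date(2024, 4, 2), "student", {"topic_selection"}))
# -> "proposal_review: due 2024-04-15"
```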
NASA Astrophysics Data System (ADS)
Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.
Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.
Briston, Thomas; Lewis, Sian; Koglin, Mumta; Mistry, Kavita; Shen, Yongchun; Hartopp, Naomi; Katsumata, Ryosuke; Fukumoto, Hironori; Duchen, Michael R.; Szabadkai, Gyorgy; Staddon, James M.; Roberts, Malcolm; Powney, Ben
2016-01-01
Growing evidence suggests persistent mitochondrial permeability transition pore (mPTP) opening is a key pathophysiological event in cell death underlying a variety of diseases. While it has long been clear the mPTP is a druggable target, current agents are limited by off-target effects and low therapeutic efficacy. Therefore, identification and development of novel inhibitors are necessary. To rapidly screen large compound libraries for novel mPTP modulators, a method was exploited to cryopreserve large batches of functionally active mitochondria from cells and tissues. The cryopreserved mitochondria maintained respiratory coupling and ATP synthesis, Ca2+ uptake and transmembrane potential. A high-throughput screen (HTS), using an assay of Ca2+-induced mitochondrial swelling in the cryopreserved mitochondria, identified ER-000444793, a potent inhibitor of mPTP opening. Further evaluation using assays of Ca2+-induced membrane depolarisation and Ca2+ retention capacity also indicated that ER-000444793 acted as an inhibitor of the mPTP. ER-000444793 neither affected cyclophilin D (CypD) enzymatic activity nor displaced CsA from the CypD protein, suggesting a mechanism independent of CypD inhibition. Here we identified a novel, CypD-independent inhibitor of the mPTP. The screening approach and compound described provide a workflow and an additional tool to aid the search for novel mPTP modulators and to help understand its molecular nature. PMID:27886240
Use of Threshold of Toxicological Concern (TTC) with High ...
Although progress has been made with HTS (high throughput screening) in profiling biological activity (e.g., EPA’s ToxCast™), challenges arise in interpreting HTS results in the context of adversity & in converting HTS assay concentrations to equivalent human doses for the broad domain of commodity chemicals. Here, we propose using TTC as a risk screening method to evaluate exposure ranges derived from NHANES for 7968 chemicals. Because the well-established TTC approach uses hazard values derived from in vivo toxicity data, relevance to adverse effects is robust. We compared the conservative TTC (non-cancer) value of 90 μg/day (1.5 μg/kg/day) (Kroes et al., Fd Chem Toxicol, 2004) to quantitative exposure predictions of the upper 95% credible interval (UCI) of median daily exposures for 7968 chemicals in 10 different demographic groups (Wambaugh et al., Environ Sci Technol. 48:12760-7, 2014). Results indicate: (1) none of the median values of the credible interval of exposure for any chemical in any demographic group was above the TTC; & (2) fewer than 5% of chemicals had a UCI that exceeded the TTC for any group. However, these median exposure predictions do not cover highly exposed (e.g., occupational) populations. Additionally, we propose an expanded risk-based screening workflow that comprises a TTC decision tree that includes screening compounds for structural alerts for DNA reactivity, OPs & carbamates as well as a comparison with bioactivity-based margins of
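The core screening comparison reduces to simple arithmetic, sketched below with the 1.5 μg/kg/day TTC value cited above and made-up exposure numbers standing in for the NHANES-derived predictions.

```python
# Flag chemicals whose predicted exposure exceeds the TTC. Exposure values are illustrative,
# not actual NHANES-derived predictions.
TTC_UG_PER_KG_DAY = 1.5  # conservative non-cancer TTC (Kroes et al., 2004)

# upper 95% credible interval of the median daily exposure, ug/kg/day (made-up numbers)
exposure_uci = {
    "chemical_A": 0.004,
    "chemical_B": 1.9,
    "chemical_C": 0.0007,
}

exceedances = {name: uci for name, uci in exposure_uci.items() if uci > TTC_UG_PER_KG_DAY}
print(f"{len(exceedances)} of {len(exposure_uci)} chemicals exceed the TTC: {sorted(exceedances)}")
```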
Guasch, Laura; Sala, Esther; Castell-Auví, Anna; Cedó, Lidia; Liedl, Klaus R.; Wolber, Gerhard; Muehlbacher, Markus; Mulero, Miquel; Pinent, Montserrat; Ardévol, Anna; Valls, Cristina; Pujadas, Gerard; Garcia-Vallvé, Santiago
2012-01-01
Background Although there are successful examples of the discovery of new PPARγ agonists, it has recently been of great interest to identify new PPARγ partial agonists that do not present the adverse side effects caused by PPARγ full agonists. Consequently, the goal of this work was to design, apply and validate a virtual screening workflow to identify novel PPARγ partial agonists among natural products. Methodology/Principal Findings We have developed a virtual screening procedure based on structure-based pharmacophore construction, protein-ligand docking and electrostatic/shape similarity to discover novel scaffolds of PPARγ partial agonists. From an initial set of 89,165 natural products and natural product derivatives, 135 compounds were identified as potential PPARγ partial agonists with good ADME properties. Ten compounds that represent ten new chemical scaffolds for PPARγ partial agonists were selected for in vitro biological testing, but two of them were not assayed due to solubility problems. Five out of the remaining eight compounds were confirmed as PPARγ partial agonists: they bind to PPARγ, do not or only moderately stimulate the transactivation activity of PPARγ, do not induce adipogenesis of preadipocyte cells and stimulate the insulin-induced glucose uptake of adipocytes. Conclusions/Significance We have demonstrated that our virtual screening protocol was successful in identifying novel scaffolds for PPARγ partial agonists. PMID:23226391
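The overall shape of such a screen is a funnel of successive filters; the sketch below chains placeholder predicates standing in for the pharmacophore, docking, shape/electrostatic, and ADME stages, applied to toy compound records rather than the actual natural-product library.

```python
# Each predicate stands in for a real tool (pharmacophore match, docking score cutoff,
# shape/electrostatic similarity, ADME rules); thresholds and records are placeholders.
def run_cascade(compounds, stages):
    for name, keep in stages:
        compounds = [c for c in compounds if keep(c)]
        print(f"after {name}: {len(compounds)} compounds remain")
    return compounds

# Toy compound records; a real workflow would read these from a screening library.
library = [{"id": i,
            "pharm": i % 3 == 0,
            "dock": -7 - (i % 5),
            "shape": (i % 7) / 7,
            "adme_ok": i % 2 == 0}
           for i in range(1000)]

stages = [
    ("pharmacophore", lambda c: c["pharm"]),
    ("docking",       lambda c: c["dock"] <= -9),   # placeholder score cutoff
    ("shape/ES",      lambda c: c["shape"] >= 0.5),
    ("ADME",          lambda c: c["adme_ok"]),
]

hits = run_cascade(library, stages)
```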
Konerman, Monica A; Thomson, Mary; Gray, Kristen; Moore, Meghan; Choxi, Hetal; Seif, Elizabeth; Lok, Anna S F
2017-12-01
Despite effective treatment for chronic hepatitis C, deficiencies in diagnosis and access to care preclude disease elimination. Screening of baby boomers remains low. The aims of this study were to assess the impact of an electronic health record-based prompt on hepatitis C virus (HCV) screening rates in baby boomers in primary care and access to specialty care and treatment among those newly diagnosed. We implemented an electronic health record-based "best practice advisory" (BPA) that prompted primary care providers to perform HCV screening for patients seen in primary care clinic (1) born between 1945 and 1965, (2) who lacked a prior diagnosis of HCV infection, and (3) who lacked prior documented anti-HCV testing. The BPA had associated educational materials, order set, and streamlined access to specialty care for newly diagnosed patients. Pre-BPA and post-BPA screening rates were compared, and care of newly diagnosed patients was analyzed. In the 3 years prior to BPA implementation, 52,660 baby boomers were seen in primary care clinics and 28% were screened. HCV screening increased from 7.6% for patients with a primary care provider visit in the 6 months prior to BPA to 72% over the 1 year post-BPA. Of 53 newly diagnosed patients, all were referred for specialty care, 11 had advanced fibrosis or cirrhosis, 20 started treatment, and 9 achieved sustained virologic response thus far. Implementation of an electronic health record-based prompt increased HCV screening rates among baby boomers in primary care by 5-fold due to efficiency in determining needs for HCV screening and workflow design. Streamlined access to specialty care enabled patients with previously undiagnosed advanced disease to be cured. This intervention can be easily integrated into electronic health record systems to increase HCV diagnosis and linkage to care. (Hepatology 2017;66:1805-1813). © 2017 by the American Association for the Study of Liver Diseases.
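The eligibility rule behind the best practice advisory, as stated above (birth year 1945-1965, no prior HCV diagnosis, no prior anti-HCV test), can be sketched as a simple predicate; the function name and example calls are illustrative, not the EHR vendor's API.

```python
from datetime import date

def needs_hcv_screening(birth_date, has_hcv_diagnosis, has_prior_anti_hcv_test):
    """Fire the best-practice advisory only for baby boomers with no prior diagnosis or test."""
    born_in_cohort = date(1945, 1, 1) <= birth_date <= date(1965, 12, 31)
    return born_in_cohort and not has_hcv_diagnosis and not has_prior_anti_hcv_test

# A 1950-born patient with no HCV history triggers the advisory; a 1970-born patient does not.
print(needs_hcv_screening(date(1950, 6, 15), False, False))  # True
print(needs_hcv_screening(date(1970, 1, 1), False, False))   # False: outside the birth cohort
```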
Structuring clinical workflows for diabetes care: an overview of the OntoHealth approach.
Schweitzer, M; Lasierra, N; Oberbichler, S; Toma, I; Fensel, A; Hoerbst, A
2014-01-01
Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study lead us to identify basic building blocks, named as actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view.
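A minimal sketch, assuming invented clinical details, of how the three building blocks identified above (actions, decisions, and data elements) might be modeled as plain types and composed into a tiny diabetes workflow fragment.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class DataElement:
    name: str  # e.g. a value fetched from or written to the EHR

@dataclass
class Action:
    name: str
    reads: List[DataElement] = field(default_factory=list)
    writes: List[DataElement] = field(default_factory=list)

@dataclass
class Decision:
    name: str
    predicate: Callable[[Dict[str, float]], bool]
    if_true: Action
    if_false: Action

hba1c = DataElement("HbA1c")
adjust = Action("adjust_therapy", reads=[hba1c])
recall = Action("schedule_routine_recall", reads=[hba1c])

# Invented clinical rule purely for illustration.
check = Decision("HbA1c above target?", lambda record: record["HbA1c"] > 7.0, adjust, recall)

record = {"HbA1c": 8.2}  # toy consultation data
chosen = check.if_true if check.predicate(record) else check.if_false
print(chosen.name)       # -> adjust_therapy
```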
Structuring Clinical Workflows for Diabetes Care
Lasierra, N.; Oberbichler, S.; Toma, I.; Fensel, A.; Hoerbst, A.
2014-01-01
Summary Background Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. Objectives The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. Methods A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. Results This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study lead us to identify basic building blocks, named as actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. Conclusions The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view. PMID:25024765
Modelling and analysis of workflow for lean supply chains
NASA Astrophysics Data System (ADS)
Ma, Jinping; Wang, Kanliang; Xu, Lida
2011-11-01
Cross-organisational workflow systems are a component of enterprise information systems that support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the LSC context. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and a verification approach for the soundness of LTWNs and CLTWNs. Finally, the article illustrates the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for the LSC. This study initiates a new perspective on cross-organisational workflow management research and promotes the operations management of LSCs in real-world settings.
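As an editorial illustration of the kind of object this formalism manipulates, the following minimal Python sketch encodes a labelled time Petri net with places, labelled transitions and firing intervals. The class names, the toy supplier/buyer fragment and the (unenforced) timing semantics are assumptions for illustration, not the paper's exact definitions of LTPNs, LTWNs or CLTWNs.

    # Minimal sketch of a labelled time Petri net (LTPN)-style structure.
    # Names and semantics are illustrative assumptions, not the paper's formalism.
    from dataclasses import dataclass, field

    @dataclass
    class Transition:
        name: str
        label: str        # task label attached to the transition
        pre: dict         # place -> tokens consumed
        post: dict        # place -> tokens produced
        interval: tuple = (0.0, float("inf"))  # earliest/latest firing time
                                               # (carried but not enforced here)

    @dataclass
    class LTPN:
        marking: dict                       # place -> current token count
        transitions: list = field(default_factory=list)

        def enabled(self, t: Transition) -> bool:
            return all(self.marking.get(p, 0) >= n for p, n in t.pre.items())

        def fire(self, t: Transition) -> None:
            if not self.enabled(t):
                raise ValueError(f"{t.name} is not enabled")
            for p, n in t.pre.items():
                self.marking[p] -= n
            for p, n in t.post.items():
                self.marking[p] = self.marking.get(p, 0) + n

    # Toy cross-organisational fragment: supplier confirms, then buyer receives.
    confirm = Transition("t1", "supplier_confirm", {"order_placed": 1}, {"confirmed": 1}, (0, 24))
    receive = Transition("t2", "buyer_receive", {"confirmed": 1}, {"done": 1}, (0, 72))
    net = LTPN({"order_placed": 1}, [confirm, receive])
    net.fire(confirm)
    net.fire(receive)
    print(net.marking)   # {'order_placed': 0, 'confirmed': 0, 'done': 1}

A soundness check of the kind the paper verifies would, on top of such a structure, explore the reachable markings and confirm that every case terminates with a single token in the final place.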
Validation of a coupled core-transport, pedestal-structure, current-profile and equilibrium model
NASA Astrophysics Data System (ADS)
Meneghini, O.
2015-11-01
The first workflow capable of predicting the self-consistent solution to the coupled core-transport, pedestal-structure, and equilibrium problems from first principles, together with its experimental tests, is presented. Validation with DIII-D discharges in high-confinement regimes shows that the workflow is capable of robustly predicting the kinetic profiles from the magnetic axis to the separatrix and matching the experimental measurements to within their uncertainty, with no prior knowledge of the pedestal height and without using any measurement of the temperature or pressure. Self-consistent coupling has proven to be essential to match the experimental results and to capture the non-linear physics that governs the core and pedestal solutions. In particular, stabilization of the pedestal peeling-ballooning instabilities by the global Shafranov shift, destabilization by additional edge bootstrap current, and the subsequent effect on the core plasma profiles have been clearly observed and documented. In our model, self-consistency is achieved by iterating between the TGYRO core transport solver (with NEO and TGLF for the neoclassical and turbulent fluxes) and the pedestal structure predicted by the EPED model. A self-consistent equilibrium is calculated by EFIT, while the ONETWO transport package evolves the current profile and calculates the particle and energy sources. The capabilities of such a workflow are shown to be critical for the design of future experiments such as ITER and FNSF, which operate in a regime where the equilibrium, pedestal, and core transport problems are strongly coupled, and for which none of these quantities can be assumed to be known. Self-consistent core-pedestal predictions for ITER, as well as initial optimizations, will be presented. Supported by the US Department of Energy under DE-FC02-04ER54698 and DE-SC0012652.
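The self-consistent coupling described here is, numerically, a fixed-point iteration between solvers. The following Python sketch illustrates that pattern only; the three stand-in functions are invented placeholders for the roles played by TGYRO/TGLF/NEO, EPED and EFIT/ONETWO, and none of the physics is represented.

    # Hedged sketch of a self-consistent coupling loop between core transport,
    # pedestal and equilibrium solvers. All solver functions are stand-ins.
    import numpy as np

    def core_transport(profile, pedestal_height):
        # placeholder: relax the core profile toward a shape set by the pedestal
        target = pedestal_height * (1.0 + 2.0 * (1.0 - np.linspace(0, 1, profile.size) ** 2))
        return 0.5 * (profile + target)

    def pedestal_model(profile):
        # placeholder: pedestal height responds weakly to the core pressure
        return 1.0 + 0.05 * profile.mean()

    def equilibrium(profile):
        # placeholder: return a scalar "Shafranov shift"-like quantity
        return 0.1 * profile.sum() / profile.size

    profile = np.ones(50)          # initial guess for a kinetic profile
    for it in range(100):
        ped = pedestal_model(profile)
        new_profile = core_transport(profile, ped)
        shift = equilibrium(new_profile)
        if np.max(np.abs(new_profile - profile)) < 1e-6:
            break
        profile = new_profile
    print(f"converged after {it} iterations, pedestal={ped:.3f}, shift={shift:.3f}")

The essential point carried over from the abstract is that the core, pedestal and equilibrium updates sit inside one loop and are repeated until the profiles stop changing, rather than being solved once in sequence.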
Advances in Global Full Waveform Inversion
NASA Astrophysics Data System (ADS)
Tromp, J.; Bozdag, E.; Lei, W.; Ruan, Y.; Lefebvre, M. P.; Modrak, R. T.; Orsvuran, R.; Smith, J. A.; Komatitsch, D.; Peter, D. B.
2017-12-01
Information about Earth's interior comes from seismograms recorded at its surface. Seismic imaging based on spectral-element and adjoint methods has enabled assimilation of this information for the construction of 3D (an)elastic Earth models. These methods account for the physics of wave excitation and propagation by numerically solving the equations of motion, and require the execution of complex computational procedures that challenge the most advanced high-performance computing systems. Current research is petascale; future research will require exascale capabilities. The inverse problem consists of reconstructing the characteristics of the medium from -often noisy- observations. A nonlinear functional is minimized, which involves both the misfit to the measurements and a Tikhonov-type regularization term to tackle inherent ill-posedness. Achieving scalability for the inversion process on tens of thousands of multicore processors is a task that offers many research challenges. We initiated global "adjoint tomography" using 253 earthquakes and produced the first-generation model named GLAD-M15, with a transversely isotropic model parameterization. We are currently running iterations for a second-generation anisotropic model based on the same 253 events. In parallel, we continue iterations for a transversely isotropic model with a larger dataset of 1,040 events to determine higher-resolution plume and slab images. A significant part of our research has focused on eliminating I/O bottlenecks in the adjoint tomography workflow. This has led to the development of a new Adaptable Seismic Data Format based on HDF5, and post-processing tools based on the ADIOS library developed by Oak Ridge National Laboratory. We use the Ensemble Toolkit for workflow stabilization & management to automate the workflow with minimal human interaction.
myExperiment: a repository and social network for the sharing of bioinformatics workflows
Goble, Carole A.; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David
2010-01-01
myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment currently has over 3500 registered users and contains more than 1000 workflows. The social aspect to the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment including its REST web service is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org. PMID:20501605
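For readers who want to query the repository programmatically, the sketch below shows one plausible use of myExperiment's REST interface. The /workflows.xml listing endpoint and its num parameter are assumptions to be checked against the API documentation on the project wiki.

    # Hedged sketch: list public workflows from myExperiment's REST API.
    # Endpoint and parameters are assumptions to verify against
    # http://wiki.myexperiment.org before use.
    import requests
    import xml.etree.ElementTree as ET

    resp = requests.get("http://www.myexperiment.org/workflows.xml",
                        params={"num": 5}, timeout=30)
    resp.raise_for_status()

    root = ET.fromstring(resp.content)
    for wf in root.findall("workflow"):
        # each entry is expected to carry resource URIs plus a title as text
        print(wf.get("uri"), wf.get("resource"), (wf.text or "").strip())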
Novak, Laurie L; Johnson, Kevin B; Lorenzi, Nancy M
2010-01-01
The objective of this review was to describe methods used to study and model workflow. The authors included studies set in a variety of industries using qualitative, quantitative and mixed methods. Of the 6221 matching abstracts, 127 articles were included in the final corpus. The authors collected data from each article on researcher perspective, study type, methods type, specific methods, approaches to evaluating quality of results, definition of workflow and dependent variables. Ethnographic observation and interviews were the most frequently used methods. Long study durations revealed the large time commitment required for descriptive workflow research. The most frequently discussed technique for evaluating quality of study results was triangulation. The definition of the term “workflow” and choice of methods for studying workflow varied widely across research areas and researcher perspectives. The authors developed a conceptual framework of workflow-related terminology for use in future research and present this model for use by other researchers. PMID:20442143
Digitization workflows for flat sheets and packets of plants, algae, and fungi
Nelson, Gil; Sweeney, Patrick; Wallace, Lisa E.; Rabeler, Richard K.; Allard, Dorothy; Brown, Herrick; Carter, J. Richard; Denslow, Michael W.; Ellwood, Elizabeth R.; Germain-Aubrey, Charlotte C.; Gilbert, Ed; Gillespie, Emily; Goertzen, Leslie R.; Legler, Ben; Marchant, D. Blaine; Marsico, Travis D.; Morris, Ashley B.; Murrell, Zack; Nazaire, Mare; Neefus, Chris; Oberreiter, Shanna; Paul, Deborah; Ruhfel, Brad R.; Sasek, Thomas; Shaw, Joey; Soltis, Pamela S.; Watson, Kimberly; Weeks, Andrea; Mast, Austin R.
2015-01-01
Effective workflows are essential components in the digitization of biodiversity specimen collections. To date, no comprehensive, community-vetted workflows have been published for digitizing flat sheets and packets of plants, algae, and fungi, even though latest estimates suggest that only 33% of herbarium specimens have been digitally transcribed, 54% of herbaria use a specimen database, and 24% are imaging specimens. In 2012, iDigBio, the U.S. National Science Foundation’s (NSF) coordinating center and national resource for the digitization of public, nonfederal U.S. collections, launched several working groups to address this deficiency. Here, we report the development of 14 workflow modules with 7–36 tasks each. These workflows represent the combined work of approximately 35 curators, directors, and collections managers representing more than 30 herbaria, including 15 NSF-supported plant-related Thematic Collections Networks and collaboratives. The workflows are provided for download as Portable Document Format (PDF) and Microsoft Word files. Customization of these workflows for specific institutional implementation is encouraged. PMID:26421256
Vij, Rajesh; Lin, Zhonghua; Chiang, Nancy; Vernes, Jean-Michel; Storek, Kelly M; Park, Summer; Chan, Joyce; Meng, Y Gloria; Comps-Agrar, Laetitia; Luan, Peng; Lee, Sophia; Schneider, Kellen; Bevers, Jack; Zilberleyb, Inna; Tam, Christine; Koth, Christopher M; Xu, Min; Gill, Avinash; Auerbach, Marcy R; Smith, Peter A; Rutherford, Steven T; Nakamura, Gerald; Seshasayee, Dhaya; Payandeh, Jian; Koerber, James T
2018-05-08
Outer membrane proteins (OMPs) in Gram-negative bacteria are essential for a number of cellular functions including nutrient transport and drug efflux. Escherichia coli BamA is an essential component of the OMP β-barrel assembly machinery and a potential novel antibacterial target that has been proposed to undergo large (~15 Å) conformational changes. Here, we explored methods to isolate anti-BamA monoclonal antibodies (mAbs) that might alter the function of this OMP and ultimately lead to bacterial growth inhibition. We first optimized traditional immunization approaches but failed to identify mAbs that altered cell growth after screening >3000 hybridomas. We then developed a "targeted boost-and-sort" strategy that combines bacterial cell immunizations, purified BamA protein boosts, and single hybridoma cell sorting using amphipol-reconstituted BamA antigen. This unique workflow improves the discovery efficiency of FACS + mAbs by >600-fold and enabled the identification of rare anti-BamA mAbs with bacterial growth inhibitory activity in the presence of a truncated lipopolysaccharide layer. These mAbs represent novel tools for dissecting the BamA-mediated mechanism of β-barrel folding and our workflow establishes a new template for the efficient discovery of novel mAbs against other highly dynamic membrane proteins.
Rezeli, Melinda; Sjödin, Karin; Lindberg, Henrik; Gidlöf, Olof; Lindahl, Bertil; Jernberg, Tomas; Spaak, Jonas; Erlinge, David; Marko-Varga, György
2017-09-01
A multiple reaction monitoring (MRM) assay was developed for precise quantitation of 87 plasma proteins including the three isoforms of apolipoprotein E (APOE) associated with cardiovascular diseases using nanoscale liquid chromatography separation and stable isotope dilution strategy. The analytical performance of the assay was evaluated and we found an average technical variation of 4.7% in 4-5 orders of magnitude dynamic range (≈0.2 mg/L to 4.5 g/L) from whole plasma digest. Here, we report a complete workflow, including sample processing adapted to 96-well plate format and normalization strategy for large-scale studies. To further investigate the MS-based quantitation the amount of six selected proteins was measured by routinely used clinical chemistry assays as well and the two methods showed excellent correlation with high significance (p-value < 10e-5) for the six proteins, in addition for the cardiovascular predictor factor, APOB: APOA1 ratio (r = 0.969, p-value < 10e-5). Moreover, we utilized the developed assay for screening of biobank samples from patients with myocardial infarction and performed the comparative analysis of patient groups with STEMI (ST- segment elevation myocardial infarction), NSTEMI (non ST- segment elevation myocardial infarction) and type-2 AMI (type-2 myocardial infarction) patients.
SIMPLEX: Cloud-Enabled Pipeline for the Comprehensive Analysis of Exome Sequencing Data
Fischer, Maria; Snajder, Rene; Pabinger, Stephan; Dander, Andreas; Schossig, Anna; Zschocke, Johannes; Trajanoski, Zlatko; Stocker, Gernot
2012-01-01
In recent studies, exome sequencing has proven to be a successful screening tool for the identification of candidate genes causing rare genetic diseases. Although underlying targeted sequencing methods are well established, necessary data handling and focused, structured analysis still remain demanding tasks. Here, we present a cloud-enabled autonomous analysis pipeline, which comprises the complete exome analysis workflow. The pipeline combines several in-house developed and published applications to perform the following steps: (a) initial quality control, (b) intelligent data filtering and pre-processing, (c) sequence alignment to a reference genome, (d) SNP and DIP detection, (e) functional annotation of variants using different approaches, and (f) detailed report generation during various stages of the workflow. The pipeline connects the selected analysis steps, exposes all available parameters for customized usage, performs required data handling, and distributes computationally expensive tasks either on a dedicated high-performance computing infrastructure or on the Amazon cloud environment (EC2). The presented application has already been used in several research projects including studies to elucidate the role of rare genetic diseases. The pipeline is continuously tested and is publicly available under the GPL as a VirtualBox or Cloud image at http://simplex.i-med.ac.at; additional supplementary data is provided at http://www.icbi.at/exome. PMID:22870267
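The staged structure (a)-(f) maps naturally onto a thin orchestration layer around command-line tools. The Python sketch below is a generic illustration of that pattern, not the SIMPLEX implementation; the commands, file names and the variant_caller/annotate_variants tools are placeholders.

    # Hedged sketch of a staged exome-analysis driver. Commands are placeholders;
    # SIMPLEX wraps its own tool chain and parameters.
    import subprocess, shlex, logging

    logging.basicConfig(level=logging.INFO)

    def run_stage(name: str, cmd: str) -> None:
        """Run one pipeline stage as an external command and log it for the report."""
        logging.info("stage %s: %s", name, cmd)
        subprocess.run(shlex.split(cmd), check=True)

    stages = [
        ("quality_control", "fastqc sample_R1.fastq.gz sample_R2.fastq.gz"),
        ("alignment",       "bwa mem -o sample.sam ref.fa sample_R1.fastq.gz sample_R2.fastq.gz"),
        ("variant_calling", "variant_caller --input sample.sam --output sample.vcf"),        # placeholder tool
        ("annotation",      "annotate_variants --vcf sample.vcf --out sample.annotated.tsv"),  # placeholder tool
    ]

    for name, cmd in stages:
        run_stage(name, cmd)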
Martinez-Garcia, Elena; Lesur, Antoine; Devis, Laura; Campos, Alexandre; Cabrera, Silvia; van Oostrum, Jan; Matias-Guiu, Xavier; Gil-Moreno, Antonio; Reventos, Jaume; Colas, Eva; Domon, Bruno
2016-08-16
About 30% of endometrial cancer (EC) patients are diagnosed at an advanced stage of the disease, which is associated with a drastic decrease in the 5-year survival rate. The identification of biomarkers in uterine aspirate samples, which are collected by a minimally invasive procedure, would improve early diagnosis of EC. We present a sequential workflow to select from a list of potential EC biomarkers, those which are the most promising to enter a validation study. After the elimination of confounding contributions by residual blood proteins, 52 potential biomarkers were analyzed in uterine aspirates from 20 EC patients and 18 non-EC controls by a high-resolution accurate mass spectrometer operated in parallel reaction monitoring mode. The differential abundance of 26 biomarkers was observed, and among them ten proteins showed a high sensitivity and specificity (AUC > 0.9). The study demonstrates that uterine aspirates are valuable samples for EC protein biomarkers screening. It also illustrates the importance of a biomarker verification phase to fill the gap between discovery and validation studies and highlights the benefits of high resolution mass spectrometry for this purpose. The proteins verified in this study have an increased likelihood to become a clinical assay after a subsequent validation phase.
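The reported sensitivity/specificity summaries (AUC > 0.9) come from ROC analysis of the measured abundances. A minimal sketch of that calculation with scikit-learn is shown below; the abundance values and case/control labels are simulated for illustration and are not data from the study.

    # Hedged sketch: ROC/AUC for one candidate biomarker from case/control abundances.
    # Values below are simulated, not data from the study.
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # 1 = EC patient, 0 = non-EC control; abundances from a PRM-style quantitation
    labels = np.array([1] * 10 + [0] * 10)
    abundance = np.concatenate([np.random.normal(8.0, 1.0, 10),   # cases, higher on average
                                np.random.normal(5.0, 1.0, 10)])  # controls

    auc = roc_auc_score(labels, abundance)
    fpr, tpr, thresholds = roc_curve(labels, abundance)
    print(f"AUC = {auc:.2f}; {len(thresholds)} candidate decision thresholds")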
Disruption of Radiologist Workflow.
Kansagra, Akash P; Liu, Kevin; Yu, John-Paul J
2016-01-01
The effect of disruptions has been studied extensively in surgery and emergency medicine, and a number of solutions-such as preoperative checklists-have been implemented to enforce the integrity of critical safety-related workflows. Disruptions of the highly complex and cognitively demanding workflow of modern clinical radiology have only recently attracted attention as a potential safety hazard. In this article, we describe the variety of disruptions that arise in the reading room environment, review approaches that other specialties have taken to mitigate workflow disruption, and suggest possible solutions for workflow improvement in radiology. Copyright © 2015 Mosby, Inc. All rights reserved.
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) that facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas such as drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.
NASA Astrophysics Data System (ADS)
Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio
2015-04-01
The work presented here concerns a case study in which a complete multidisciplinary workflow has been applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario: vertical and fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by rock-climbing geologists; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geostatistical and spatial analyses and mapping of the whole dataset; g) 3D rockfall analysis. The main goals of the study were a) to set up an investigation method that achieves a complete and thorough characterization of the slope stability conditions and b) to provide a detailed basis for an accurate definition of the reinforcement and mitigation systems. For these purposes, the most up-to-date methods of field surveying, remote sensing, 3D modelling and geospatial data analysis have been integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach has been applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extent of the studied area (nearly 200,000 m2) without compromising the high accuracy of the results. The deterministic phase, based on field characterization of single instabilities and their further analysis on 3D models, was applied to delineate the peculiarities of each single feature. The statistical approach, based on geostructural field mapping and on point geomechanical data from scan-line surveying, allowed partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data resulting from both approaches have been referenced and filed in a single spatial database and considered in global geostatistical analyses to derive a fully modelled and comprehensive evaluation of the rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological and mechanical features, along with the expected failure modes; b) a high-resolution characterization of the whole slope's rockslide susceptibility, based on partitioning of the area according to stability and mechanical conditions that can be directly related to specific hazard mitigation systems; c) the exact extent of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.
Alekseychyk, Larysa; Su, Cheng; Becker, Gerald W; Treuheit, Michael J; Razinkov, Vladimir I
2014-10-01
Selection of a suitable formulation that provides adequate product stability is an important aspect of the development of biopharmaceutical products. Stability of proteins includes not only resistance to chemical modifications but also conformational and colloidal stabilities. While chemical degradation of antibodies is relatively easy to detect and control, propensity for conformational changes and/or aggregation during manufacturing or long-term storage is difficult to predict. In many cases, the formulation factors that increase one type of stability may significantly decrease another type under the same or different conditions. Often compromise is necessary to minimize the adverse effects of an antibody formulation by careful optimization of multiple factors responsible for overall stability. In this study, high-throughput stress and characterization techniques were applied to 96 formulations of anti-streptavidin antibodies (an IgG1 and an IgG2) to choose optimal formulations. Stress and analytical methods applied in this study were 96-well plate based using an automated liquid handling system to prepare the different formulations and sample plates. Aggregation and clipping propensity were evaluated by temperature and mechanical stresses. Multivariate regression analysis of high-throughput data was performed to find statistically significant formulation factors that alter measured parameters such as monomer percentage or unfolding temperature. The results of the regression models were used to maximize the stabilities of antibodies under different formulations and to find the optimal formulation space for each molecule. Comparison of the IgG1 and IgG2 data indicated an overall greater stability of the IgG1 molecule under the conditions studied. The described method can easily be applied to both initial preformulation screening and late-stage formulation development of biopharmaceutical products. © 2014 Society for Laboratory Automation and Screening.
Workflows for microarray data processing in the Kepler environment.
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
2012-05-17
Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.
WE-D-207-01: Background and Clinical Implementation of a Screening Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aberle, D.
2015-06-15
In the United States, Lung Cancer is responsible for more cancer deaths than the next four cancers combined. In addition, the 5 year survival rate for lung cancer patients has not improved over the past 40 to 50 years. To combat this deadly disease, in 2002 the National Cancer Institute launched a very large Randomized Control Trial called the National Lung Screening Trial (NLST). This trial would randomize subjects who had substantial risk of lung cancer (due to age and smoking history) into either a Chest X-ray arm or a low dose CT arm. In November 2010, the National Cancer Institute announced that the NLST had demonstrated 20% fewer lung cancer deaths among those who were screened with low-dose CT than with chest X-ray. In December 2013, the US Preventive Services Task Force recommended the use of Lung Cancer Screening using low dose CT and a little over a year later (Feb. 2015), CMS announced that Medicare would also cover Lung Cancer Screening using low dose CT. Thus private and public insurers are required to provide Lung Cancer Screening programs using CT to the appropriate population(s). The purpose of this Symposium is to inform medical physicists and prepare them to support the implementation of Lung Screening programs. This Symposium will focus on the clinical aspects of lung cancer screening, requirements of a screening registry for systematically capturing and tracking screening patients and results (such as required Medicare data elements) as well as the role of the medical physicist in screening programs, including the development of low dose CT screening protocols. Learning Objectives: To understand the clinical basis and clinical components of a lung cancer screening program, including eligibility criteria and other requirements. To understand the data collection requirements, workflow, and informatics infrastructure needed to support the tracking and reporting components of a screening program. To understand the role of the medical physicist in implementing Lung Cancer Screening protocols for CT, including utilizing resources such as the AAPM Protocols and the ACR Designated Lung Screening Center program. UCLA Department of Radiology has an Institutional research agreement with Siemens Healthcare; Dr. McNitt-Gray has been a recipient of Research Support from Siemens Healthcare in the past. Dr. Aberle has been a Member of Advisory Boards for the LUNGevity Foundation (2011-present) and Siemens Medical Solutions. (2013)
Dynamic reusable workflows for ocean science
Signell, Richard; Fernandez, Filipe; Wilcox, Kyle
2016-01-01
Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic notebooks across the geoscience domains.
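The catalog-search and skill-assessment steps of such a workflow can be approximated with OWSLib, a Python library for OGC services. The sketch below is a hedged example: the CSW endpoint URL is a placeholder, the search term is illustrative, and the skill metric is a plain RMSE on toy arrays.

    # Hedged sketch: OGC CSW catalog search with OWSLib, then a basic skill metric.
    # The endpoint URL is a placeholder; substitute a real CSW service.
    import numpy as np
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    csw = CatalogueServiceWeb("https://example.org/csw")          # placeholder endpoint
    query = PropertyIsLike("csw:AnyText", "%sea_water_temperature%")
    csw.getrecords2(constraints=[query], maxrecords=10)
    for rec_id, rec in csw.records.items():
        print(rec_id, rec.title)

    # Skill metric: RMSE between a model forecast and observations (toy arrays).
    obs = np.array([14.2, 14.5, 15.1, 15.0])
    model = np.array([14.0, 14.8, 15.3, 14.7])
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    print(f"RMSE = {rmse:.2f} degC")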
Ou-Yang, Si-sheng; Lu, Jun-yan; Kong, Xiang-qian; Liang, Zhong-jie; Luo, Cheng; Jiang, Hualiang
2012-01-01
Computational drug discovery is an effective strategy for accelerating and economizing drug discovery and development process. Because of the dramatic increase in the availability of biological macromolecule and small molecule information, the applicability of computational drug discovery has been extended and broadly applied to nearly every stage in the drug discovery and development workflow, including target identification and validation, lead discovery and optimization and preclinical tests. Over the past decades, computational drug discovery methods such as molecular docking, pharmacophore modeling and mapping, de novo design, molecular similarity calculation and sequence-based virtual screening have been greatly improved. In this review, we present an overview of these important computational methods, platforms and successful applications in this field. PMID:22922346
Deploying and sharing U-Compare workflows as web services.
Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia
2013-02-18
U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.
Deploying and sharing U-Compare workflows as web services
2013-01-01
Background U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare’s components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. Results We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. Conclusions The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform. PMID:23419017
Johnson, Kevin B; Lorenzi, Nancy M
2011-01-01
Objective The goal of this study was to develop an in-depth understanding of how a health information exchange (HIE) fits into clinical workflow at multiple clinical sites. Materials and Methods The ethnographic qualitative study was conducted over a 9-month period in six emergency departments (ED) and eight ambulatory clinics in Memphis, Tennessee, USA. Data were collected using direct observation, informal interviews during observation, and formal semi-structured interviews. The authors observed for over 180 h, during which providers used the exchange 130 times. Results HIE-related workflow was modeled for each ED site and ambulatory clinic group and substantial site-to-site workflow differences were identified. Common patterns in HIE-related workflow were also identified across all sites, leading to the development of two role-based workflow models: nurse based and physician based. The workflow elements framework was applied to the two role-based patterns. An in-depth description was developed of how providers integrated HIE into existing clinical workflow, including prompts for HIE use. Discussion Workflow differed substantially among sites, but two general role-based HIE usage models were identified. Although providers used HIE to improve continuity of patient care, patient–provider trust played a significant role. Types of information retrieved related to roles, with nurses seeking to retrieve recent hospitalization data and more open-ended usage by nurse practitioners and physicians. User and role-specific customization to accommodate differences in workflow and information needs may increase the adoption and use of HIE. Conclusion Understanding end users' perspectives towards HIE technology is crucial to the long-term success of HIE. By applying qualitative methods, an in-depth understanding of HIE usage was developed. PMID:22003156
Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.
Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A
2005-04-07
Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
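Because the whole analysis protocol is serialized as XML, a pipeline definition can be generated and re-read programmatically. The sketch below builds a tiny pipeline document with Python's standard library; the element and method names are invented for illustration and do not follow GPIPE's actual XML schema.

    # Hedged sketch: emit a minimal XML pipeline definition (illustrative schema,
    # not GPIPE's actual format) and read it back.
    import xml.etree.ElementTree as ET

    pipeline = ET.Element("pipeline", name="protein_analysis")
    for step_name, params in [("clustalw", {"gapopen": "10"}),
                              ("protpars", {"bootstrap": "100"})]:
        step = ET.SubElement(pipeline, "step", method=step_name)
        for key, value in params.items():
            ET.SubElement(step, "param", name=key).text = value

    ET.ElementTree(pipeline).write("pipeline.xml", encoding="utf-8", xml_declaration=True)

    # Re-load and list the steps, as a workflow engine or collaborator might.
    for step in ET.parse("pipeline.xml").getroot().findall("step"):
        args = {p.get("name"): p.text for p in step.findall("param")}
        print(step.get("method"), args)

Keeping the full protocol in such a machine-readable file is what makes an experiment (methods, parameters and their order) reproducible and shareable between users.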
Provenance-Powered Automatic Workflow Generation and Composition
NASA Astrophysics Data System (ADS)
Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.
2015-12-01
In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this often conflicts with a domain scientist's daily routine of open-ended research and exploration. We aim to resolve this tension. Imagine this: an Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "Can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototype system to realize this vision in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from recorded user behavior and to support the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure has been established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri net-based verification instrument for provenance-based automatic workflow generation and recommendation.
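The core pattern, recording each analysis call as a provenance event and then replaying the recorded chain as a workflow, can be illustrated with a small decorator. This is a generic sketch of the idea, with invented stand-in services, not the authors' system.

    # Hedged sketch: capture provenance of analysis calls, then replay them as a workflow.
    import functools, json, time

    PROVENANCE = []   # in a real system this would be a provenance repository

    def traced(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            PROVENANCE.append({"step": func.__name__, "args": args,
                               "kwargs": kwargs, "time": time.time()})
            return result
        return wrapper

    @traced
    def fetch_dataset(name):                       # stand-in for a data-access service
        return f"data({name})"

    @traced
    def apply_algorithm(data, threshold=0.5):      # stand-in for a processing service
        return f"processed({data}, t={threshold})"

    # A scientist's exploratory session:
    d = fetch_dataset("sea_surface_temperature")
    apply_algorithm(d, threshold=0.7)

    # Reverse-engineer a reusable workflow description from the recorded activity.
    workflow = [{"step": ev["step"], "kwargs": ev["kwargs"]} for ev in PROVENANCE]
    print(json.dumps(workflow, indent=2))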
Support for Taverna workflows in the VPH-Share cloud platform.
Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F
2017-07-01
To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical & bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. The main outcomes are: 1) seamless integration of VPH-Share with other components and systems; 2) an extended range of different tools for workflows; 3) successful integration of scientific workflows from other VPH projects; and 4) execution speed improvements for medical applications. The presented workflow integration provides VPH-Share users with a wide range of different possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, further improvements are expected and will be described in future work. Copyright © 2017 Elsevier B.V. All rights reserved.
Identifying impact of software dependencies on replicability of biomedical workflows.
Miksa, Tomasz; Rauber, Andreas; Mina, Eleni
2016-12-01
Complex data-driven experiments form the basis of biomedical research. Recent findings warn that the context in which the software is run, that is, the infrastructure and third-party dependencies, can have a crucial impact on the final results delivered by a computational experiment. This implies that in order to replicate the same result, not only must the same data be used, but the experiment must also be run on an equivalent software stack. In this paper we present the VFramework, which enables assessing the replicability of workflows. It identifies whether any differences in software dependencies exist between two executions of the same workflow and whether they affect the produced results. We also conduct a case study in which we investigate the impact of software dependencies on the replicability of Taverna workflows used in biomedical research on Huntington's disease. We re-execute the analysed workflows in environments differing in operating system distribution and configuration. The results show that the VFramework can be used to identify the impact of software dependencies on the replicability of biomedical workflows. Furthermore, we observe that despite the fact that the workflows are executed in a controlled environment, they still depend on specific tools installed in the environment. The context model used by the VFramework addresses the deficiencies of provenance traces and also documents such tools. Based on our findings we define guidelines for workflow owners that enable them to improve the replicability of their workflows. Copyright © 2016 Elsevier Inc. All rights reserved.
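A crude approximation of the dependency comparison is to diff the package inventories of two execution environments. The sketch below does this for pip freeze-style listings with invented version strings; the VFramework's context model captures much more (operating system packages, configuration and tool versions).

    # Hedged sketch: compare Python package versions between two environments
    # to flag dependency differences that may affect replicability.
    def parse_freeze(text: str) -> dict:
        """Parse 'pip freeze'-style lines into {package: version}."""
        deps = {}
        for line in text.strip().splitlines():
            if "==" in line:
                pkg, ver = line.split("==", 1)
                deps[pkg.lower()] = ver
        return deps

    env_a = parse_freeze("numpy==1.24.0\nscipy==1.10.1\nbiopython==1.81")
    env_b = parse_freeze("numpy==1.26.4\nscipy==1.10.1")

    for pkg in sorted(set(env_a) | set(env_b)):
        va, vb = env_a.get(pkg, "missing"), env_b.get(pkg, "missing")
        if va != vb:
            print(f"{pkg}: {va} vs {vb}  <-- potential replicability risk")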
Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat
2016-11-28
At present, coding sequences (CDS) are being discovered continually, and ever larger CDS datasets are being released. Approaches and related tools have been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference that uses public web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in Fasta format. The workflow supports bootstrapping with 1,000 to 20,000 replicates. It performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; SOAP and Java Web Service (JWS) interfaces provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow that will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.
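A local, offline approximation of the distance-based branch of such a workflow can be written with Biopython, which provides neighbor-joining tree construction from a multiple sequence alignment. The sketch below assumes a pre-aligned Fasta file named example_alignment.fasta; it is a simplified stand-in for the web-service-based workflow, not the workflow itself.

    # Hedged sketch: neighbor-joining tree from an existing alignment with Biopython.
    # 'example_alignment.fasta' is an assumed input file of pre-aligned CDS.
    from Bio import AlignIO, Phylo
    from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

    alignment = AlignIO.read("example_alignment.fasta", "fasta")

    calculator = DistanceCalculator("identity")     # simple identity-based distances
    distance_matrix = calculator.get_distance(alignment)

    constructor = DistanceTreeConstructor()
    tree = constructor.nj(distance_matrix)          # neighbor-joining inference

    Phylo.write(tree, "example_tree.nwk", "newick")
    Phylo.draw_ascii(tree)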
A Model of Workflow Composition for Emergency Management
NASA Astrophysics Data System (ADS)
Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu
Commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. This paper proposes a novel model for defining emergency plans in which workflow segments appear as constituent parts. A formal abstraction containing four operations is defined to compose workflow segments under constraint rules. A software system for constructing and composing business process resources is implemented and integrated into an Emergency Plan Management Application System.
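The compositional idea, assembling emergency plans from workflow segments through a small set of operations, can be illustrated with function combinators. The operator names below (sequence, choice, iterate) and the toy response segments are assumptions for illustration; the paper defines its own four formal operations and constraint rules.

    # Hedged sketch: composing workflow segments with generic operators.
    # A segment is a callable taking and returning a 'case' dict.
    def sequence(*segments):
        def run(case):
            for seg in segments:
                case = seg(case)
            return case
        return run

    def choice(predicate, if_true, if_false):
        return lambda case: if_true(case) if predicate(case) else if_false(case)

    def iterate(segment, until):
        def run(case):
            while not until(case):
                case = segment(case)
            return case
        return run

    # Toy emergency-response segments.
    alert = lambda case: {**case, "alerted": True}
    dispatch = lambda case: {**case, "teams": case.get("teams", 0) + 1}
    escalate = lambda case: {**case, "level": "major"}

    plan = sequence(
        alert,
        choice(lambda c: c["severity"] > 3, escalate, lambda c: c),
        iterate(dispatch, until=lambda c: c.get("teams", 0) >= 2),
    )
    print(plan({"severity": 5}))   # {'severity': 5, 'alerted': True, 'level': 'major', 'teams': 2}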
A Framework for Modeling Workflow Execution by an Interdisciplinary Healthcare Team.
Kezadri-Hamiaz, Mounira; Rosu, Daniela; Wilk, Szymon; Kuziemsky, Craig; Michalowski, Wojtek; Carrier, Marc
2015-01-01
The use of business workflow models in healthcare is limited because of insufficient capture of complexities associated with behavior of interdisciplinary healthcare teams that execute healthcare workflows. In this paper we present a novel framework that builds on the well-founded business workflow model formalism and related infrastructures and introduces a formal semantic layer that describes selected aspects of team dynamics and supports their real-time operationalization.
Sava, M Gabriela; Dolan, James G; May, Jerrold H; Vargas, Luis G
2018-07-01
Current colorectal cancer screening guidelines by the US Preventive Services Task Force endorse multiple options for average-risk patients and recommend that screening choices should be guided by individual patient preferences. Implementing these recommendations in practice is challenging because they depend on accurate and efficient elicitation and assessment of preferences from patients who are facing a novel task. To present a methodology for analyzing the sensitivity and stability of a patient's preferences regarding colorectal cancer screening options and to provide a starting point for a personalized discussion between the patient and the health care provider about the selection of the appropriate screening option. This research is a secondary analysis of patient preference data collected as part of a previous study. We propose new measures of preference sensitivity and stability that can be used to determine if additional information provided would result in a change to the initially most preferred colorectal cancer screening option. Illustrative results of applying the methodology to the preferences of 2 patients, of different ages, are provided. The results show that different combinations of screening options are viable for each patient and that the health care provider should emphasize different information during the medical decision-making process. Sensitivity and stability analysis can supply health care providers with key topics to focus on when communicating with a patient and the degree of emphasis to place on each of them to accomplish specific goals. The insights provided by the analysis can be used by health care providers to approach communication with patients in a more personalized way, by taking into consideration patients' preferences before adding their own expertise to the discussion.
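The notion of preference sensitivity, how far a criterion weight can shift before the top-ranked screening option changes, can be illustrated with a simple weighted-score model. The options, criteria, scores and weights below are invented for illustration and are not the study's elicited preference data.

    # Hedged sketch: weighted-sum preference model with a simple sensitivity check.
    # Scores and weights are invented; rows = screening options, columns = criteria.
    import numpy as np

    options = ["colonoscopy", "FIT", "CT colonography"]
    criteria = ["accuracy", "convenience", "risk"]
    scores = np.array([[0.95, 0.30, 0.60],
                       [0.55, 0.90, 0.95],
                       [0.75, 0.60, 0.85]])
    weights = np.array([0.60, 0.25, 0.15])

    def best(w):
        return options[int(np.argmax(scores @ (w / w.sum())))]

    print("preferred option:", best(weights))   # colonoscopy with these toy numbers

    # Sensitivity: how far can the 'accuracy' weight drop before the ranking flips?
    for delta in np.arange(0.0, 0.5, 0.05):
        w = weights.copy()
        w[0] -= delta
        if w[0] <= 0:
            break
        if best(w) != best(weights):
            print(f"ranking flips once the accuracy weight falls by about {delta:.2f}")
            break

In this spirit, a highly sensitive preference profile signals topics the provider should revisit with the patient, whereas a stable profile suggests the initial choice is robust to new information.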
Design and implementation of a secure workflow system based on PKI/PMI
NASA Astrophysics Data System (ADS)
Yan, Kai; Jiang, Chao-hui
2013-03-01
Traditional workflow systems have several weaknesses in privilege management: low privilege management efficiency, a heavy burden on the administrator, and the lack of a trusted authority. After an in-depth study of the security requirements of workflow systems, a secure workflow model based on PKI/PMI is proposed. The model achieves static and dynamic authorization by verifying a user's identity through a public key certificate (PKC) and validating the user's privilege information using an attribute certificate (AC) in the workflow system. Practice shows that this system can meet the security requirements of a workflow management system (WfMS). Moreover, it not only improves system security but also ensures the integrity, confidentiality, availability and non-repudiation of the data in the system.
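The identity-verification step, checking that a user's public key certificate was issued by a trusted CA, can be sketched with the widely used cryptography package. The example assumes RSA-signed PEM files named ca.pem and user.pem; the PMI side (attribute-certificate validation) is not shown.

    # Hedged sketch: verify that a user's X.509 certificate was signed by a known CA.
    # Assumes RSA-signed PEM files 'ca.pem' and 'user.pem' exist; attribute-certificate
    # (PMI) validation would be a separate step.
    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import padding

    with open("ca.pem", "rb") as f:
        ca_cert = x509.load_pem_x509_certificate(f.read())
    with open("user.pem", "rb") as f:
        user_cert = x509.load_pem_x509_certificate(f.read())

    try:
        ca_cert.public_key().verify(
            user_cert.signature,
            user_cert.tbs_certificate_bytes,
            padding.PKCS1v15(),                       # RSA assumption
            user_cert.signature_hash_algorithm,
        )
        print("certificate chain OK:", user_cert.subject.rfc4514_string())
    except Exception as exc:
        print("verification failed:", exc)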
NASA Astrophysics Data System (ADS)
Whittaker, Kara A.; McShane, Dan
2013-02-01
A large storm event in southwest Washington State triggered over 2500 landslides and provided an opportunity to assess two slope stability screening tools. The statistical analysis conducted demonstrated that both screening tools are effective at predicting where landslides were likely to take place (Whittaker and McShane, 2012). Here we reply to two discussions of this article related to the development of the slope stability screening tools and the accuracy and scale of the spatial data used. Neither of the discussions addresses our statistical analysis or results. We provide greater detail on our sampling criteria and also elaborate on the policy and management implications of our findings and how they complement those of a separate investigation of landslides resulting from the same storm. The conclusions made in Whittaker and McShane (2012) stand as originally published unless future analysis indicates otherwise.
Simple and fast screening of G-quadruplex ligands with electrochemical detection system.
Fan, Qiongxuan; Li, Chao; Tao, Yaqin; Mao, Xiaoxia; Li, Genxi
2016-11-01
Small molecules that may facilitate and stabilize the formation of G-quadruplexes can be used for cancer treatment, because the G-quadruplex structure can inhibit the activity of telomerase, an enzyme over-expressed in many cancer cells. There is therefore considerable interest in developing a simple, high-performance method for screening small molecules that bind to G-quadruplexes. Here, we have designed a simple electrochemical approach to screen such ligands, based on the fact that the formation and stabilization of a G-quadruplex by a ligand may inhibit electron transfer of redox species to the electrode surface. As a proof-of-concept study, two types of classical G-quadruplex ligands, TMPyP4 and BRACO-19, are studied in this work, which demonstrates that the method is fast and robust and may be applied in the future to screen G-quadruplex ligands for anticancer drug testing and design. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Leibovici, D. G.; Pourabdollah, A.; Jackson, M.
2011-12-01
Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of the outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Modellers therefore require management tools that handle qualitative and quantitative metadata measures of the quality associated with a workflow. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing, for example, metadata profiles to be defined and retrieved via web service interfaces. However, these standards need a few extensions for workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the decision-making that the workflow outputs will support is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with a visualization on the workflow graph itself that points out where the workflow needs better data or better processes. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK
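As a toy illustration of the meta-propagation idea, the sketch below pushes quantitative quality scores through a small workflow graph without executing it. The combination rule (product of input qualities and a per-process quality factor) and all node names are assumptions made for this example; the cited framework stores and combines quality information via XPDL and standardized metadata profiles.

```python
# Toy sketch of "meta-propagation" of quality scores through a workflow graph,
# evaluated without running the workflow itself. The combination rule is an
# illustrative assumption, not the rule used in the cited framework.
workflow = {
    # node: (list of upstream nodes, intrinsic process quality in [0, 1])
    "ndvi":      ([],                    1.00),   # data source: quality comes from metadata
    "landcover": ([],                    1.00),
    "overlay":   (["ndvi", "landcover"], 0.95),
    "model":     (["overlay"],           0.90),
}
source_quality = {"ndvi": 0.9, "landcover": 0.8}

def propagated_quality(node, wf, src, cache=None):
    cache = {} if cache is None else cache
    if node in cache:
        return cache[node]
    inputs, process_q = wf[node]
    if not inputs:                       # a data source
        q = src[node]
    else:
        q = process_q
        for upstream in inputs:
            q *= propagated_quality(upstream, wf, src, cache)
    cache[node] = q
    return q

print(round(propagated_quality("model", workflow, source_quality), 3))  # 0.616
```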
Li, Minghui; Goncearenco, Alexander; Panchenko, Anna R
2017-01-01
In this review we describe a protocol to annotate the effects of missense mutations on proteins, their functions, stability, and binding. For this purpose we present a collection of the most comprehensive databases that store different types of sequencing data on missense mutations, and we discuss their relationships, possible intersections, and unique features. Next, we suggest an annotation workflow using state-of-the-art methods and highlight their usability, advantages, and limitations for different cases. Finally, we address the particularly difficult problem of deciphering the molecular mechanisms by which mutations affect proteins and protein complexes, in order to understand the origins and mechanisms of diseases.
A Computable Definition of Sepsis Facilitates Screening and Performance Improvement Tracking.
Alessi, Lauren J; Warmus, Holly R; Schaffner, Erin K; Kantawala, Sajel; Carcillo, Joseph; Rosen, Johanna; Horvat, Christopher M
2018-03-01
Sepsis kills almost 5,000 children annually, accounting for 16% of pediatric health care spending in the United States. We sought to identify sepsis within the Electronic Health Record (EHR) of a quaternary children's hospital to characterize disease incidence, improve recognition and response, and track performance metrics. Methods are organized in a plan-do-study-act cycle. During the "plan" phase, electronic definitions of sepsis (blood culture and antibiotic within 24 hours) and septic shock (sepsis plus vasoactive medication) were created to establish benchmark data and track progress with statistical process control. The performance of a screening tool was evaluated in the emergency department. During the "do" phase, a novel inpatient workflow is being piloted, which involves regular sepsis screening by nurses using the tool and a regimented response to high-risk patients. Screening tool use in the emergency department reduced time to antibiotics (Fig. 1). EHR definitions identified 1,433 (23.3%) of the 6,159 admissions between July and December 2016 as having sepsis, of which 159 (11.1%) had septic shock. Hospital mortality was 2.2% for all sepsis patients and 15.7% for septic shock (Table 1). These findings approximate epidemiologic studies of sepsis and severe sepsis, which report a prevalence range of 0.45-8.2% and a mortality range of 8.2-25% (Table 2).1-5 Implementation of a sepsis screening tool is associated with improved performance. The prevalence of sepsis conditions identified with electronic definitions approximates the epidemiologic landscape characterized by other point-prevalence and administrative studies, providing face validity to this approach and proving useful for tracking performance improvement.
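The computable definitions quoted above (blood culture plus antibiotic within 24 hours for sepsis; an added vasoactive medication for septic shock) translate directly into a simple classifier over admission events. The sketch below is purely illustrative: the event types and record layout are assumptions, not the hospital's actual EHR data model.

```python
# Illustrative sketch of the computable definitions described above applied to
# simplified EHR event records; field names and the record layout are
# hypothetical, not the hospital's actual data model.
from datetime import datetime, timedelta

def classify_admission(events):
    """events: list of dicts with 'type' and 'time' (datetime)."""
    cultures    = [e["time"] for e in events if e["type"] == "blood_culture"]
    antibiotics = [e["time"] for e in events if e["type"] == "antibiotic"]
    vasoactives = any(e["type"] == "vasoactive" for e in events)

    # Sepsis flag: any blood culture and antibiotic within 24 hours of each other.
    sepsis = any(abs(c - a) <= timedelta(hours=24)
                 for c in cultures for a in antibiotics)
    if sepsis and vasoactives:
        return "septic shock"
    return "sepsis" if sepsis else "no sepsis flag"

admission = [
    {"type": "blood_culture", "time": datetime(2016, 8, 1, 10, 0)},
    {"type": "antibiotic",    "time": datetime(2016, 8, 1, 13, 30)},
    {"type": "vasoactive",    "time": datetime(2016, 8, 2, 2, 0)},
]
print(classify_admission(admission))   # septic shock
```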
Asad, Sedigheh; Dastgheib, Seyed Mohammad Mehdi; Khajeh, Khosro
2016-11-01
Horseradish peroxidase (HRP), which has a variety of potential biotechnological applications, is still isolated from the horseradish root as a mixture of isoenzymes with different biochemical properties. There is an increasing demand for preparations of large amounts of pure enzyme, but recombinant production is limited by the lack of glycosylation in Escherichia coli and by different glycosylation patterns in yeasts, which affect the enzyme's stability parameters. The goal of this study was to increase, via mutagenesis, the stability toward hydrogen peroxide of the non-glycosylated enzyme produced in E. coli. Asparagine 268, one of the N-glycosylation sites of the enzyme, was mutated via saturation mutagenesis using the megaprimer method. Modification and miniaturization of previously described protocols enabled screening of a library propagated in E. coli XJb (DE3). The library of mutants was screened for stability toward hydrogen peroxide with azinobis (ethylbenzthiazoline sulfonate) as a reducing substrate. The Asn268Gly mutant, the top variant from the screening, exhibited 18-fold increased stability toward hydrogen peroxide and a twofold improvement in thermal stability compared with the recombinant HRP. Moreover, the substitution led to a 2.5-fold improvement in catalytic efficiency with phenol/4-aminoantipyrine. The constructed mutant represents a stable biocatalyst that may find use in medical diagnostics, biosensing, and bioprocesses. © 2015 International Union of Biochemistry and Molecular Biology, Inc.
A targeted metabolomics approach for clinical diagnosis of inborn errors of metabolism.
Jacob, Minnie; Malkawi, Abeer; Albast, Nour; Al Bougha, Salam; Lopata, Andreas; Dasouki, Majed; Abdel Rahman, Anas M
2018-09-26
The metabolome, the ultimate functional product of the genome, can be studied through identification and quantification of small molecules. The global metabolome influences the individual phenotype through clinical and environmental interventions. Metabolomics has become an integral part of clinical research and has added another dimension to the understanding of disease pathophysiology and mechanism. More than 95% of the routine clinical biochemistry laboratory workload is based on small-molecule identification, which can potentially be analyzed through metabolomics. However, multiple challenges in clinical metabolomics affect the entire workflow and data quality, so the biological interpretation needs to be standardized for a reproducible outcome. Herein, we introduce the establishment of a comprehensive targeted metabolomics method for a panel of 220 clinically relevant metabolites using liquid chromatography-tandem mass spectrometry (LC-MS/MS), standardized for clinical research. The sensitivity, reproducibility, and molecular stability of each targeted metabolite (amino acids, organic acids, acylcarnitines, sugars, bile acids, neurotransmitters, polyamines, and hormones) were assessed under multiple experimental conditions. The metabolic tissue distribution was determined in various rat organs. Furthermore, the method was validated in dried blood spot (DBS) samples collected from patients known to have various inborn errors of metabolism (IEMs). Using this approach, our panel appears to be sensitive and robust, as it demonstrated differential and unique metabolic profiles in various rat tissues. Also, as a prospective screening method, this panel of diverse metabolites has the ability to identify patients with a wide range of IEMs who otherwise may need multiple, time-consuming, and expensive biochemical assays, causing a delay in clinical management. Copyright © 2018 Elsevier B.V. All rights reserved.
Worklist handling in workflow-enabled radiological application systems
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens
2000-05-01
For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency, and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation, and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists, if present at all, are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
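A toy sketch of the process-oriented approach described above is given below: work items are derived from an explicit process model plus an organizational (role) model rather than from filtered database views. The task names, roles, and data structures are invented for illustration.

```python
# Toy sketch of process-oriented worklist generation: work items are derived
# from an explicit process model plus an organizational (role) model, rather
# than from filtered database views. Entirely illustrative.
process_model = {             # task -> (required role, prerequisite tasks)
    "register_patient": ("clerk",        []),
    "acquire_images":   ("technologist", ["register_patient"]),
    "read_study":       ("radiologist",  ["acquire_images"]),
}

def worklist_for(role, completed_tasks):
    """Return tasks that are enabled (all prerequisites done) for a given role."""
    return [task for task, (req_role, prereqs) in process_model.items()
            if req_role == role
            and task not in completed_tasks
            and all(p in completed_tasks for p in prereqs)]

print(worklist_for("technologist", {"register_patient"}))  # ['acquire_images']
print(worklist_for("radiologist",  {"register_patient"}))  # [] - not yet enabled
```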
Powers, Christina M; Mills, Karmann A; Morris, Stephanie A; Klaessig, Fred; Gaheen, Sharon; Lewinski, Nastassja
2015-01-01
There is a critical opportunity in the field of nanoscience to compare and integrate information across diverse fields of study through informatics (i.e., nanoinformatics). This paper is one in a series of articles on the data curation process in nanoinformatics (nanocuration). Other articles in this series discuss key aspects of nanocuration (temporal metadata, data completeness, database integration), while the focus of this article is on the nanocuration workflow, or the process of identifying, inputting, and reviewing nanomaterial data in a data repository. In particular, the article discusses: 1) the rationale and importance of a defined workflow in nanocuration, 2) the influence of organizational goals or purpose on the workflow, 3) established workflow practices in other fields, 4) current workflow practices in nanocuration, 5) key challenges for workflows in emerging fields like nanomaterials, 6) examples to make these challenges more tangible, and 7) recommendations to address the identified challenges. Throughout the article, there is an emphasis on illustrating key concepts and current practices in the field. Data on current practices in the field are from a group of stakeholders active in nanocuration. In general, the development of workflows for nanocuration is nascent, with few individuals formally trained in data curation or utilizing available nanocuration resources (e.g., ISA-TAB-Nano). Additional emphasis on the potential benefits of cultivating nanomaterial data via nanocuration processes (e.g., capability to analyze data from across research groups) and providing nanocuration resources (e.g., training) will likely prove crucial for the wider application of nanocuration workflows in the scientific community. PMID:26425437
Implementing bioinformatic workflows within the bioextract server
USDA-ARS?s Scientific Manuscript database
Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...
O'Connor, S; McCaffrey, N; Whyte, E; Moran, K
2016-07-01
To adapt the trunk stability test to facilitate further sub-classification of higher levels of core stability in athletes for use as a screening tool. To establish the inter-tester and intra-tester reliability of this adapted core stability test. Reliability study. Collegiate athletic therapy facilities. Fifteen physically active male subjects (age 19.46 ± 0.63 years) free from any orthopaedic or neurological disorders were recruited from a convenience sample of collegiate students. The intraclass correlation coefficients (ICC) and 95% confidence intervals (CI) were computed to establish inter-tester and intra-tester reliability. Excellent ICC values were observed in the adapted core stability test for inter-tester reliability (0.97) and good to excellent intra-tester reliability (0.73-0.90). While the 95% CIs were narrow for inter-tester reliability, the 95% CIs for Testers A and C were widely distributed compared with that for Tester B. The adapted core stability test developed in this study is a quick and simple field-based test to administer that can further subdivide athletes with high levels of core stability. The test demonstrated high inter-tester and intra-tester reliability. Copyright © 2015 Elsevier Ltd. All rights reserved.
Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel
2012-04-01
High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.
Hartman, Douglas J
2015-06-01
Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to enhance/improve pathologist workflow. These include voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in the diagnostic coding and the more prominent role of centralized electronic health records represent potential areas for increased ways to enhance/improve the workflow for surgical pathologists. Additional unforeseen changes to the pathologist workflow may accompany the introduction of whole-slide imaging technology to the routine diagnostic work. Copyright © 2015 Elsevier Inc. All rights reserved.
Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.
Hartman, Douglas J
2016-03-01
Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to enhance/improve pathologist workflow. These include voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in the diagnostic coding and the more prominent role of centralized electronic health records represent potential areas for increased ways to enhance/improve the workflow for surgical pathologists. Additional unforeseen changes to the pathologist workflow may accompany the introduction of whole-slide imaging technology to the routine diagnostic work. Copyright © 2016 Elsevier Inc. All rights reserved.
Comparison of Computer-based Clinical Decision Support Systems and Content for Diabetes Mellitus.
Kantor, M; Wright, A; Burton, M; Fraser, G; Krall, M; Maviglia, S; Mohammed-Rajput, N; Simonaitis, L; Sonnenberg, F; Middleton, B
2011-01-01
Computer-based clinical decision support (CDS) systems have been shown to improve quality of care and workflow efficiency, and health care reform legislation relies on electronic health records and CDS systems to improve the cost and quality of health care in the United States; however, the heterogeneity of CDS content and infrastructure of CDS systems across sites is not well known. We aimed to determine the scope of CDS content in diabetes care at six sites, assess the capabilities of CDS in use at these sites, characterize the scope of CDS infrastructure at these sites, and determine how the sites use CDS beyond individual patient care in order to identify characteristics of CDS systems and content that have been successfully implemented in diabetes care. We compared CDS systems in six collaborating sites of the Clinical Decision Support Consortium. We gathered CDS content on care for patients with diabetes mellitus and surveyed institutions on characteristics of their site, the infrastructure of CDS at these sites, and the capabilities of CDS at these sites. The approach to CDS and the characteristics of CDS content varied among sites. Some commonalities included providing customizability by role or user, applying sophisticated exclusion criteria, and using CDS automatically at the time of decision-making. Many messages were actionable recommendations. Most sites had monitoring rules (e.g. assessing hemoglobin A1c), but few had rules to diagnose diabetes or suggest specific treatments. All sites had numerous prevention rules including reminders for providing eye examinations, influenza vaccines, lipid screenings, nephropathy screenings, and pneumococcal vaccines. Computer-based CDS systems vary widely across sites in content and scope, but both institution-created and purchased systems had many similar features and functionality, such as integration of alerts and reminders into the decision-making workflow of the provider and providing messages that are actionable recommendations.
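As an illustration of the two rule families mentioned above (monitoring rules such as assessing hemoglobin A1c, and prevention reminders such as influenza vaccination), the sketch below evaluates both at the point of care. The intervals, thresholds, and field names are assumptions chosen for the example, not any site's actual CDS content.

```python
# Illustrative sketch of the two rule families mentioned above (monitoring and
# prevention) evaluated at the point of care; intervals, thresholds, and field
# names are assumptions for the example, not any site's actual rule content.
from datetime import date, timedelta

def diabetes_reminders(patient, today=None):
    today = today or date.today()
    reminders = []

    # Monitoring rule: hemoglobin A1c at least every 6 months (assumed interval).
    last_a1c = patient.get("last_hba1c_date")
    if last_a1c is None or (today - last_a1c) > timedelta(days=182):
        reminders.append("Order hemoglobin A1c (none on record in last 6 months).")

    # Prevention rule: annual influenza vaccine (assumed interval).
    last_flu = patient.get("last_flu_vaccine_date")
    if last_flu is None or (today - last_flu) > timedelta(days=365):
        reminders.append("Offer influenza vaccine (not documented in last year).")

    return reminders

patient = {"last_hba1c_date": date(2010, 1, 5), "last_flu_vaccine_date": None}
for msg in diabetes_reminders(patient, today=date(2010, 9, 1)):
    print(msg)
```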
Shachak, Aviv; Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R; Lemieux-Charles, Louise
2013-06-04
Tutorials and user manuals are important forms of impersonal support for using software applications, including electronic medical records (EMRs). Differences between user- and vendor documentation may indicate support needs that are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action- and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research has compared these elements between formal and informal documents. We conducted a mixed methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements. These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avonto, Cristina; Chittiboyina, Amar G.; Rua, Diego
2015-12-01
Skin sensitization is an important toxicological end-point in the risk assessment of chemical allergens. Because of the complexity of the biological mechanisms associated with skin sensitization, integrated approaches combining different chemical, biological and in silico methods are recommended to replace conventional animal tests. Chemical methods are intended to characterize the potential of a sensitizer to induce earlier molecular initiating events. The presence of an electrophilic mechanistic domain is considered one of the essential chemical features to covalently bind to the biological target and induce further haptenation processes. Current in chemico assays rely on the quantification of unreacted model nucleophiles after incubation with the candidate sensitizer. In the current study, a new fluorescence-based method, ‘HTS-DCYA assay’, is proposed. The assay aims at the identification of reactive electrophiles based on their chemical reactivity toward a model fluorescent thiol. The reaction workflow enabled the development of a High Throughput Screening (HTS) method to directly quantify the reaction adducts. The reaction conditions have been optimized to minimize solubility issues, oxidative side reactions and increase the throughput of the assay while minimizing the reaction time, which are common issues with existing methods. Thirty-six chemicals previously classified with LLNA, DPRA or KeratinoSens™ were tested as a proof of concept. Preliminary results gave an estimated 82% accuracy, 78% sensitivity, 90% specificity, comparable to other in chemico methods such as Cys-DPRA. In addition to validated chemicals, six natural products were analyzed and a prediction of their sensitization potential is presented for the first time. - Highlights: • A novel fluorescence-based method to detect electrophilic sensitizers is proposed. • A model fluorescent thiol was used to directly quantify the reaction products. • A discussion of the reaction workflow and critical parameters is presented. • The method could provide a useful tool to complement existing chemical assays.
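For readers unfamiliar with the reported performance figures, the short sketch below shows how accuracy, sensitivity, and specificity follow from a confusion matrix. The counts are hypothetical stand-ins chosen only to reproduce percentages close to those reported; they are not the study's raw classification results.

```python
# How the reported performance figures relate to a confusion matrix. The
# counts below are hypothetical stand-ins; the abstract only reports the
# resulting percentages (~82% accuracy, 78% sensitivity, 90% specificity).
def screening_metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)          # fraction of true sensitizers flagged
    specificity = tn / (tn + fp)          # fraction of non-sensitizers cleared
    accuracy    = (tp + tn) / (tp + fn + tn + fp)
    return accuracy, sensitivity, specificity

acc, sens, spec = screening_metrics(tp=18, fn=5, tn=9, fp=1)
print(f"accuracy={acc:.0%} sensitivity={sens:.0%} specificity={spec:.0%}")
# accuracy=82% sensitivity=78% specificity=90%
```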
Wong, Shui Ling; Barner, Jamie C; Sucic, Kristina; Nguyen, Michelle; Rascati, Karen L
To describe the integration and implementation of pharmacy services in patient-centered medical homes (PCMHs) as adopted by federally qualified health centers (FQHCs) and compare them with usual care (UC). Four FQHCs (3 PCMHs, 1 UC) in Austin, TX, that provide care to the underserved populations. Pharmacists have worked under a collaborative practice agreement with internal medicine physicians since 2005. All 4 FQHCs have pharmacists as an integral part of the health care team. Pharmacists have prescriptive authority to initiate and adjust diabetes medications. The PCMH FQHCs instituted co-visits, where patients see both the physician and the pharmacist on the same day. PCMH pharmacists are routinely proactive in collaborating with physicians regarding medication management, compared with UC in which pharmacists see patients only when referred by a physician. Four face-to-face, one-on-one semistructured interviews were conducted with pharmacists working in 3 PCMH FQHCs and 1 UC FQHC to compare the implementation of PCMH with emphasis on 1) structure and workflow, 2) pharmacists' roles, and 3) benefits and challenges. On co-visit days, the pharmacist may see the patient before or after physician consultation. Pharmacists in 2 of the PCMH facilities proactively screen to identify diabetes patients who may benefit from pharmacist services, although the UC clinic pharmacists see only referred patients. Strengths of the co-visit model include more collaboration with physicians and more patient convenience. Payment that recognizes the value of PCMH is one PCMH principle that is not fully implemented. PCMH pharmacists in FQHCs were integrated into the workflow to address specific patient needs. Specifically, full-time in-house pharmacists, flexible referral criteria, proactive screening, well defined collaborative practice agreement, and open scheduling were successful strategies for the underserved populations in this study. However, reimbursement plans and provider status for pharmacists should be established to sustain this model of care. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
SU-E-T-151: Breathing Synchronized Delivery (BSD) Planning for RapidArc Treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, W; Chen, M; Jiang, S
2015-06-15
Purpose: To propose a workflow for breathing synchronized delivery (BSD) planning for RapidArc treatment. Methods: The workflow includes three stages: screening/simulation, planning, and delivery. In the screening/simulation stage, a 4D CT with the corresponding breathing pattern is acquired for each of the selected patients, who are able to follow their own breathing pattern. In the planning stage, one breathing phase is chosen as the reference, and contours are delineated on the reference image. Deformation maps to other phases are computed along with contour propagation. Based on the control points of the initial 3D plan for the reference phase and the respiration trace, the correlation of the leaf sequence and gantry angles with the respiration phases is determined. The beamlet matrices are calculated for the corresponding breathing phase and deformed to the reference phase. Using the 4D dose evaluation tool and the DVH criteria of the original 3D plan, the leaf sequence is further optimized to meet the planning objectives and the machine constraints. In the delivery stage, the patients are instructed to follow their own programmed breathing patterns, and all other parts are the same as conventional RapidArc delivery. Results: Our plan analysis is based on comparison of the 3D plan with a static target (SD), the 3D plan with motion delivery (MD), and the BSD plan. Cyclic motion with a range of 0 cm to 3 cm was simulated for phantoms and lung CT. The gain of the BSD plan over MD is significant and concordant for both the simulation and the lung 4DCT, indicating the benefits of 4D planning. Conclusion: Our study shows that the BSD plan can approach the SD plan quality. However, such a BSD scheme relies on the patient being able to follow, during radiation delivery, the same breathing curve that is used in the planning stage. Funded by Varian Medical Systems.
Integrating prediction, provenance, and optimization into high energy workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schram, M.; Bansal, V.; Friese, R. D.
We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.
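A toy version of the 'optimal subset of resources' component might look like the sketch below, where the cheapest combination of sites whose predicted throughput covers the demand is selected. The greedy cost-effectiveness heuristic, site names, and numbers are illustrative assumptions, not the framework's actual optimizer.

```python
# Toy illustration of the "optimal subset of resources" idea: pick the cheapest
# combination of sites whose predicted throughput covers the demand. A greedy
# cost-effectiveness heuristic stands in for the real optimizer.
resources = [   # (name, predicted jobs/hour, cost per hour) -- made-up numbers
    ("site_a", 120, 10.0),
    ("site_b",  80,  5.0),
    ("site_c", 200, 22.0),
    ("site_d",  60,  3.0),
]

def choose_resources(demand_jobs_per_hour):
    chosen, capacity, cost = [], 0, 0.0
    # Greedy by cost per unit of predicted throughput.
    for name, rate, price in sorted(resources, key=lambda r: r[2] / r[1]):
        if capacity >= demand_jobs_per_hour:
            break
        chosen.append(name)
        capacity += rate
        cost += price
    return chosen, capacity, cost

print(choose_resources(250))  # (['site_d', 'site_b', 'site_a'], 260, 18.0)
```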
Patten, Shunmoogum A.; Aggad, Dina; Martinez, Jose; Tremblay, Elsa; Petrillo, Janet; Armstrong, Gary A.B.; Maios, Claudia; Liao, Meijiang; Ciura, Sorana; Wen, Xiao-Yan; Rafuse, Victor; Ichida, Justin; Zinman, Lorne; Julien, Jean-Pierre; Kabashi, Edor; Robitaille, Richard; Korngut, Lawrence; Parker, J. Alexander
2017-01-01
Amyotrophic lateral sclerosis (ALS) is a rapidly progressing, fatal disorder with no effective treatment. We used simple genetic models of ALS to screen phenotypically for potential therapeutic compounds. We screened libraries of compounds in C. elegans, validated hits in zebrafish, and tested the most potent molecule in mice and in a small clinical trial. We identified a class of neuroleptics that restored motility in C. elegans and in zebrafish, and the most potent was pimozide, which blocked T-type Ca2+ channels in these simple models and stabilized neuromuscular transmission in zebrafish and enhanced it in mice. Finally, a short randomized controlled trial of sporadic ALS subjects demonstrated stabilization of motility and evidence of target engagement at the neuromuscular junction. Simple genetic models are, thus, useful in identifying promising compounds for the treatment of ALS, such as neuroleptics, which may stabilize neuromuscular transmission and prolong survival in this disease. PMID:29202456
Patten, Shunmoogum A; Aggad, Dina; Martinez, Jose; Tremblay, Elsa; Petrillo, Janet; Armstrong, Gary Ab; La Fontaine, Alexandre; Maios, Claudia; Liao, Meijiang; Ciura, Sorana; Wen, Xiao-Yan; Rafuse, Victor; Ichida, Justin; Zinman, Lorne; Julien, Jean-Pierre; Kabashi, Edor; Robitaille, Richard; Korngut, Lawrence; Parker, J Alexander; Drapeau, Pierre
2017-11-16
Amyotrophic lateral sclerosis (ALS) is a rapidly progressing, fatal disorder with no effective treatment. We used simple genetic models of ALS to screen phenotypically for potential therapeutic compounds. We screened libraries of compounds in C. elegans, validated hits in zebrafish, and tested the most potent molecule in mice and in a small clinical trial. We identified a class of neuroleptics that restored motility in C. elegans and in zebrafish, and the most potent was pimozide, which blocked T-type Ca2+ channels in these simple models and stabilized neuromuscular transmission in zebrafish and enhanced it in mice. Finally, a short randomized controlled trial of sporadic ALS subjects demonstrated stabilization of motility and evidence of target engagement at the neuromuscular junction. Simple genetic models are, thus, useful in identifying promising compounds for the treatment of ALS, such as neuroleptics, which may stabilize neuromuscular transmission and prolong survival in this disease.
The Efficiency of Different Salts to Screen Charge Interactions in Proteins: A Hofmeister Effect?
Perez-Jimenez, Raul; Godoy-Ruiz, Raquel; Ibarra-Molero, Beatriz; Sanchez-Ruiz, Jose M.
2004-01-01
Understanding the screening by salts of charge-charge interactions in proteins is important for at least two reasons: a), screening by intracellular salt concentration may modulate the stability and interactions of proteins in vivo; and b), the in vitro experimental estimation of the contributions from charge-charge interactions to molecular processes involving proteins is generally carried out on the basis of the salt effect on process energetics, under the assumption that these interactions are screened out by moderate salt concentrations. Here, we explore experimentally the extent to which the screening efficiency depends on the nature of the salt. To this end, we have carried out an energetic characterization of the effect of NaCl (a nondenaturing salt), guanidinium chloride (a denaturing salt), and guanidinium thiocyanate (a stronger denaturant) on the stability of the wild-type form and a T14K variant of Escherichia coli thioredoxin. Our results suggest that the efficiency of different salts to screen charge-charge interactions correlates with their denaturing strength and with the position of the constituent ions in the Hofmeister rankings. This result appears consistent with the plausible relation of the Hofmeister rankings with the extent of solute accumulation/exclusion from protein surfaces. PMID:15041679
Big Data Challenges in Global Seismic 'Adjoint Tomography' (Invited)
NASA Astrophysics Data System (ADS)
Tromp, J.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Smith, J.
2013-12-01
The challenge of imaging Earth's interior on a global scale is closely linked to the challenge of handling large data sets. The related iterative workflow involves five distinct phases, namely, 1) data gathering and culling, 2) synthetic seismogram calculations, 3) pre-processing (time-series analysis and time-window selection), 4) data assimilation and adjoint calculations, 5) post-processing (pre-conditioning, regularization, model update). In order to implement this workflow on modern high-performance computing systems, a new seismic data format is being developed. The Adaptable Seismic Data Format (ASDF) is designed to replace currently used data formats with a more flexible format that allows for fast parallel I/O. The metadata is divided into abstract categories, such as "source" and "receiver", along with provenance information for complete reproducibility. The structure of ASDF is designed keeping in mind three distinct applications: earthquake seismology, seismic interferometry, and exploration seismology. Existing time-series analysis tool kits, such as SAC and ObsPy, can be easily interfaced with ASDF so that seismologists can use robust, previously developed software packages. ASDF accommodates an automated, efficient workflow for global adjoint tomography. Manually managing the large number of simulations associated with the workflow can rapidly become a burden, especially with increasing numbers of earthquakes and stations. Therefore, it is of importance to investigate the possibility of automating the entire workflow. Scientific Workflow Management Software (SWfMS) allows users to execute workflows almost routinely. SWfMS provides additional advantages. In particular, it is possible to group independent simulations in a single job to fit the available computational resources. They also give a basic level of fault resilience as the workflow can be resumed at the correct state preceding a failure. Some of the best candidates for our particular workflow are Kepler and Swift, and the latter appears to be the most serious candidate for a large-scale workflow on a single supercomputer, remaining sufficiently simple to accommodate further modifications and improvements.
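One practical point above, grouping independent simulations into a fixed number of batch jobs to fit the available allocation, can be illustrated with a simple balanced-packing sketch. This is a generic illustration under assumed runtimes, not the production Kepler/Swift workflow or the ASDF tooling itself.

```python
# Sketch of grouping independent earthquake simulations so a fixed job
# allocation is used efficiently: longest-runtime-first packing into the
# currently least-loaded batch. Purely illustrative.
import heapq

def pack_simulations(event_runtimes, n_jobs):
    """Assign events to n_jobs batches, balancing total estimated runtime."""
    # Min-heap of (accumulated_runtime, job_index).
    heap = [(0.0, j) for j in range(n_jobs)]
    heapq.heapify(heap)
    batches = [[] for _ in range(n_jobs)]
    for event, runtime in sorted(event_runtimes.items(),
                                 key=lambda kv: kv[1], reverse=True):
        total, j = heapq.heappop(heap)
        batches[j].append(event)
        heapq.heappush(heap, (total + runtime, j))
    return batches

# Hypothetical per-event runtime estimates (hours).
events = {"eq_001": 3.5, "eq_002": 1.0, "eq_003": 2.5, "eq_004": 2.0, "eq_005": 1.5}
print(pack_simulations(events, n_jobs=2))
# [['eq_001', 'eq_005'], ['eq_003', 'eq_004', 'eq_002']]
```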
Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco
2014-01-01
One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide the necessary meta-data for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study in which we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?" and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows, and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428. The Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.
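A much-simplified, in-memory stand-in for the aggregation-and-query idea is sketched below: resources are aggregated, their relations annotated, and the question "which data was input to the workflow that addresses a given hypothesis?" is answered by filtering the annotations. The real RO model uses RDF and standard vocabularies; the predicate names here are invented for illustration.

```python
# A much-simplified stand-in for the workflow-centric Research Object idea:
# aggregate resources, annotate their relations, then query which data fed
# which workflow under which hypothesis. The real model uses RDF and standard
# vocabularies; this sketch only mirrors the structure of the query.
research_object = {
    "resources": ["hypothesis.txt", "metabolites.csv", "wf_variation_analysis",
                  "results_table.csv", "conclusions.txt"],
    "annotations": [
        ("wf_variation_analysis", "hasInput",            "metabolites.csv"),
        ("wf_variation_analysis", "addressesHypothesis", "hypothesis.txt"),
        ("results_table.csv",     "wasGeneratedBy",      "wf_variation_analysis"),
        ("conclusions.txt",       "derivedFrom",         "results_table.csv"),
    ],
}

def query(ro, predicate, obj=None, subject=None):
    """Return all annotation triples matching the given pattern."""
    return [(s, p, o) for s, p, o in ro["annotations"]
            if p == predicate
            and (obj is None or o == obj)
            and (subject is None or s == subject)]

# "Which data was input to the workflow that addresses this hypothesis?"
wfs = [s for s, _, _ in query(research_object, "addressesHypothesis", obj="hypothesis.txt")]
for wf in wfs:
    print(wf, "inputs:", [o for _, _, o in query(research_object, "hasInput", subject=wf)])
```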
van der Heijden, Suzanne; de Oliveira, Susanne Juel; Kampmann, Marie-Louise; Børsting, Claus; Morling, Niels
2017-11-01
The Precision ID Identity Panel was used to type 109 Somali individuals in order to obtain allele frequencies for the Somali population. These frequencies were used to establish a Somali HID-SNP database, which will be used for the biostatistical calculations in family and immigration cases. Genotypes obtained with the Precision ID Identity Panel were found to be in almost complete concordance with genotypes obtained with the SNPforID PCR-SBE-CE assay. In seven SNP loci, silent alleles were identified, of which most were previously described in the literature. The project also set out to compare different AmpliSeq™ workflows to investigate the possibility of using automated library building in forensic genetic case work. In order to do so, the SNP typing of the Somalis was performed using three different workflows: 1) manual library building and sequencing on the Ion PGM™, 2) automated library building using the Biomek ® 3000 and sequencing on the Ion PGM™, and 3) automated library building using the Ion Chef™ and sequencing on the Ion S5™. AmpliSeq™ workflows were compared based on coverage, locus balance, noise, and heterozygote balance. Overall, the Ion Chef™/Ion S5™ workflow was found to give the best results and required the least hands-on time in the laboratory. However, the Ion Chef™/Ion S5™ workflow was also the most expensive. The number of libraries that may be constructed in one Ion Chef™ library building run was limited to eight, which is too few for high throughput workflows. The Biomek ® 3000/Ion PGM™ workflow was found to perform similarly to the manual/Ion PGM™ workflow. This argues for the use of automated library building in forensic genetic case work. Automated library building decreases the workload of the laboratory staff, decreases the risk of pipetting errors, and simplifies the daily workflow in forensic genetic laboratories. Copyright © 2017 Elsevier B.V. All rights reserved.
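For orientation, the sketch below shows how coverage-based comparison metrics of the kind listed above (locus balance and heterozygote balance) could be computed from per-allele read counts. The formulas are common conventions stated here as assumptions; the study's exact definitions may differ, and the read counts are invented.

```python
# Sketch of coverage-based comparison metrics of the kind listed above,
# computed from per-allele read counts. The exact definitions used in the
# study may differ; these are common conventions, stated as assumptions.
locus_reads = {                 # locus -> (reads allele 1, reads allele 2)
    "rs1": (480, 520),          # heterozygote
    "rs2": (950, 0),            # homozygote
    "rs3": (300, 260),          # heterozygote
}

coverages = {locus: a + b for locus, (a, b) in locus_reads.items()}
mean_cov = sum(coverages.values()) / len(coverages)

# Locus balance: each locus' coverage relative to the mean locus coverage.
locus_balance = {locus: cov / mean_cov for locus, cov in coverages.items()}

# Heterozygote balance: minor/major allele read ratio at heterozygous loci.
het_balance = {locus: min(a, b) / max(a, b)
               for locus, (a, b) in locus_reads.items() if min(a, b) > 0}

print({k: round(v, 2) for k, v in locus_balance.items()})
print({k: round(v, 2) for k, v in het_balance.items()})
```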
From the desktop to the grid: scalable bioinformatics via workflow conversion.
de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver
2016-03-12
Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well defined tasks, each with well defined inputs, parameters, and outputs, offers the immediate benefit of identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of parameters, inputs, outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
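The idea behind a platform-independent tool descriptor can be sketched as follows: declare a command-line tool's parameters once, then generate a concrete invocation from supplied values. The field names below are invented for illustration and do not follow the actual Common Tool Descriptor schema.

```python
# Minimal sketch of the idea behind a platform-independent tool descriptor:
# declare a command-line tool's inputs/outputs/parameters once, then generate
# a concrete invocation. Field names are invented for illustration and do not
# follow the actual Common Tool Descriptor schema.
descriptor = {
    "tool": "peptide_search",
    "parameters": [
        {"name": "input",     "flag": "-i", "type": "file",  "required": True},
        {"name": "database",  "flag": "-d", "type": "file",  "required": True},
        {"name": "tolerance", "flag": "-t", "type": "float", "required": False, "default": 10.0},
        {"name": "output",    "flag": "-o", "type": "file",  "required": True},
    ],
}

def build_command(desc, values):
    """Turn a descriptor plus user-supplied values into an argument list."""
    cmd = [desc["tool"]]
    for p in desc["parameters"]:
        if p["name"] in values:
            cmd += [p["flag"], str(values[p["name"]])]
        elif p["required"]:
            raise ValueError(f"missing required parameter: {p['name']}")
        elif "default" in p:
            cmd += [p["flag"], str(p["default"])]
    return cmd

print(build_command(descriptor, {"input": "spectra.mzML",
                                 "database": "human.fasta",
                                 "output": "hits.idXML"}))
```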
A scientific workflow framework for (13)C metabolic flux analysis.
Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina
2016-08-20
Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi
2017-01-01
The workflow for remote sensing quantitative retrieval is the 'bridge' between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. The workflow hides low-level implementation details of the Grid and hence enables users to focus on higher levels of the application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflows is important in order to support large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.
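A toy version of the semantic checks described above, restricted to verifying that each chained model's inputs are matched by a declared data source or an upstream model's output, is sketched below. The model signatures and type names are invented placeholders, not terms from the Protégé ontology used in the study.

```python
# Toy version of the validation checks described above: chained model inputs
# must be matched either by a declared data source or by an upstream model's
# output. The model signatures here are invented placeholders, not terms from
# the Protégé ontology used in the study.
models = {  # model -> (required inputs, produced output)
    "aod_retrieval":   ({"MODIS_L1B_radiance"},    "aerosol_optical_depth"),
    "pm25_estimation": ({"aerosol_optical_depth"}, "pm25_concentration"),
}

def validate_chain(chain, available_sources):
    """Return a list of semantic errors for a proposed model chain."""
    errors, produced = [], set(available_sources)
    for model in chain:
        inputs, output = models[model]
        missing = inputs - produced
        if missing:
            errors.append(f"{model}: unmatched inputs {sorted(missing)}")
        produced.add(output)
    return errors

print(validate_chain(["aod_retrieval", "pm25_estimation"], {"MODIS_L1B_radiance"}))  # []
print(validate_chain(["pm25_estimation"], {"MODIS_L1B_radiance"}))                   # one error
```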
BPELPower—A BPEL execution engine for geospatial web services
NASA Astrophysics Data System (ADS)
Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi
2012-10-01
The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements are especially in its capabilities in handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high performance parallel processing and broad Web paradigms.
Performance of an Automated Versus a Manual Whole-Body Magnetic Resonance Imaging Workflow.
Stocker, Daniel; Finkenstaedt, Tim; Kuehn, Bernd; Nanz, Daniel; Klarhoefer, Markus; Guggenberger, Roman; Andreisek, Gustav; Kiefer, Berthold; Reiner, Caecilia S
2018-04-24
The aim of this study was to evaluate the performance of an automated workflow for whole-body magnetic resonance imaging (WB-MRI), which reduces user interaction compared with the manual WB-MRI workflow. This prospective study was approved by the local ethics committee. Twenty patients underwent WB-MRI for myopathy evaluation on a 3 T MRI scanner. Ten patients (7 women; age, 52 ± 13 years; body weight, 69.9 ± 13.3 kg; height, 173 ± 9.3 cm; body mass index, 23.2 ± 3.0) were examined with a prototypical automated WB-MRI workflow, which automatically segments the whole body, and 10 patients (6 women; age, 35.9 ± 12.4 years; body weight, 72 ± 21 kg; height, 169.2 ± 10.4 cm; body mass index, 24.9 ± 5.6) with a manual scan. Overall image quality (IQ; 5-point scale: 5, excellent; 1, poor) and coverage of the study volume were assessed by 2 readers for each sequence (coronal T2-weighted turbo inversion recovery magnitude [TIRM] and axial contrast-enhanced T1-weighted [ce-T1w] gradient dual-echo sequence). Interreader agreement was evaluated with intraclass correlation coefficients. Examination time, number of user interactions, and MR technicians' acceptance rating (1, highest; 10, lowest) were compared between both groups. Total examination time was significantly shorter for automated WB-MRI workflow versus manual WB-MRI workflow (30.0 ± 4.2 vs 41.5 ± 3.4 minutes, P < 0.0001) with significantly shorter planning time (2.5 ± 0.8 vs 14.0 ± 7.0 minutes, P < 0.0001). Planning took 8% of the total examination time with automated versus 34% with manual WB-MRI workflow (P < 0.0001). The number of user interactions with automated WB-MRI workflow was significantly lower compared with manual WB-MRI workflow (10.2 ± 4.4 vs 48.2 ± 17.2, P < 0.0001). Planning efforts were rated significantly lower by the MR technicians for the automated WB-MRI workflow than for the manual WB-MRI workflow (2.20 ± 0.92 vs 4.80 ± 2.39, respectively; P = 0.005). Overall IQ was similar between automated and manual WB-MRI workflow (TIRM: 4.00 ± 0.94 vs 3.45 ± 1.19, P = 0.264; ce-T1w: 4.20 ± 0.88 vs 4.55 ± 0.55, P = 0.423). Interreader agreement for overall IQ was excellent for TIRM and ce-T1w with an intraclass correlation coefficient of 0.95 (95% confidence interval, 0.86-0.98) and 0.88 (95% confidence interval, 0.70-0.95). Incomplete coverage of the thoracic compartment in the ce-T1w sequence occurred more often in the automated WB-MRI workflow (P = 0.008) for reader 2. No other significant differences in the study volume coverage were found. In conclusion, the automated WB-MRI scanner workflow showed a significant reduction of the examination time and the user interaction compared with the manual WB-MRI workflow. Image quality and the coverage of the study volume were comparable in both groups.
An Abbreviated Protocol for High-Risk Screening Breast MRI Saves Time and Resources.
Harvey, Susan C; Di Carlo, Phillip A; Lee, Bonmyong; Obadina, Eniola; Sippo, Dorothy; Mullen, Lisa
2016-04-01
To review the ability of an abbreviated, high-risk, screening, breast MRI protocol to detect cancer and save resources. High-risk screening breast MR images were reviewed, from both an abbreviated protocol and a full diagnostic protocol. Differences in cancer detection, scanner utilization, interpretation times, and need for additional imaging were recorded in an integrated data form, and reviewed and compared. A total of 568 MRI cases were reviewed, with the abbreviated and full protocols. No difference was found in the number of cancers detected. Scan times were decreased by 18.8 minutes per case, for a total of 10,678 minutes (178 hours). Interpretation time, on average, was 1.55 minutes for the abbreviated protocol, compared with 6.43 minutes for the full protocol. Review of the full protocol led to a significant change in the final BI-RADS(®) assessment in 12 of 568 (2.1%) cases. Abbreviated MRI is as effective as full-protocol MRI for demonstration of cancers in the high-risk screening setting, with only 12 (2.1%) cases recommended for additional MRI evaluation. The efficiency and resource savings of an abbreviated protocol would be significant, and would allow for opportunities to provide MRI for additional patients, as well as improved radiologist time management and workflow, with the potential to add real-time MRI interpretation or double reading. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
An Abbreviated Protocol for High-Risk Screening Breast MRI Saves Time and Resources.
Harvey, Susan C; Di Carlo, Phillip A; Lee, Bonmyong; Obadina, Eniola; Sippo, Dorothy; Mullen, Lisa
2016-11-01
To review the ability of an abbreviated, high-risk, screening, breast MRI protocol to detect cancer and save resources. High-risk screening breast MR images were reviewed, from both an abbreviated protocol and a full diagnostic protocol. Differences in cancer detection, scanner utilization, interpretation times, and need for additional imaging were recorded in an integrated data form, and reviewed and compared. A total of 568 MRI cases were reviewed, with the abbreviated and full protocols. No difference was found in the number of cancers detected. Scan times were decreased by 18.8 minutes per case, for a total of 10,678 minutes (178 hours). Interpretation time, on average, was 1.55 minutes for the abbreviated protocol, compared with 6.43 minutes for the full protocol. Review of the full protocol led to a significant change in the final BI-RADS ® assessment in 12 of 568 (2.1%) cases. Abbreviated MRI is as effective as full-protocol MRI for demonstration of cancers in the high-risk screening setting, with only 12 (2.1 %) cases recommended for additional MRI evaluation. The efficiency and resource savings of an abbreviated protocol would be significant, and would allow for opportunities to provide MRI for additional patients, as well as improved radiologist time management and workflow, with the potential to add real-time MRI interpretation or double reading. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Normal Impingement of a Circular Liquid Jet onto a Screen in a Weightless Environment
NASA Technical Reports Server (NTRS)
Symons, E. P.
1976-01-01
The normal impingement of a circular liquid jet onto a fine-mesh screen in a weightless environment was investigated. Equations were developed to predict the velocity of the emerging jet on the downstream side of the screen as a function of screen and liquid parameters and of the velocity of the impinging jet. Additionally, the stability of the emerging jet was found to be Weber number dependent. In general, except at high velocities, the screen behaved much like a baffle, deflecting the major portion of the impinging flow.
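The report's transmission equations are not reproduced above, but the stability criterion it refers to is expressed through the Weber number; a minimal sketch of that standard dimensionless group follows, with illustrative fluid properties rather than values from the report.

```python
# Standard Weber number: ratio of inertial to surface-tension forces for a jet.
def weber_number(density_kg_m3: float, velocity_m_s: float,
                 diameter_m: float, surface_tension_N_m: float) -> float:
    """We = rho * v**2 * d / sigma."""
    return density_kg_m3 * velocity_m_s**2 * diameter_m / surface_tension_N_m

# Illustrative example: a 2 mm water jet emerging at 0.5 m/s.
print(weber_number(1000.0, 0.5, 0.002, 0.072))   # ~6.9 (dimensionless)
```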
Neumann, Martin Horst Dieter; Schneck, Helen; Decker, Yvonne; Schömer, Susanne; Franken, André; Endris, Volker; Pfarr, Nicole; Weichert, Wilko; Niederacher, Dieter; Fehm, Tanja; Neubauer, Hans
2017-01-01
Circulating tumor cells (CTCs) are rare cells that have left the primary tumor to enter the blood stream. Although only a small CTC subgroup is capable of extravasating, the presence of CTCs is associated with an increased risk of metastasis and a shorter overall survival. Understanding the heterogeneous CTC biology will optimize treatment decisions and will thereby improve patient outcome. For this, robust workflows for detection and isolation of CTCs are urgently required. Here, we present a workflow to characterize CTCs by combining the advantages of both the CellSearch® and the CellCelector™ micromanipulation systems. CTCs were isolated from CellSearch® cartridges using the CellCelector™ system and were deposited into PCR tubes for subsequent molecular analysis (whole genome amplification (WGA) and massively parallel multigene sequencing). In a CellCelector™ screen we reidentified 97% of CellSearch® SKBR-3 cells. Furthermore, we isolated 97% of CellSearch®-proven patient CTCs using the CellCelector™ system. We found an almost perfect correlation of R² = 0.98 (Spearman's rho correlation, n = 20, p < 0.00001) between the CellSearch® CTC count (n = 271) and the CellCelector™-detected CTCs (n = 252). Isolated CTCs were analyzed by WGA and massively parallel multigene sequencing. In total, single nucleotide polymorphisms (SNPs) could be detected in 50 genes in seven CTCs, 12 MCF-7 cells, and 3 T47D cells, respectively. Taken together, CTC quantification via the CellCelector™ system ensures comprehensive detection of CTCs preidentified by the CellSearch® system. Moreover, the isolation of CTCs after CellSearch® using the CellCelector™ system guarantees CTC enrichment without contaminants, enabling subsequent high-throughput genomic analyses at the single-cell level. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 33:125-132, 2017. © 2016 American Institute of Chemical Engineers.
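The concordance reported above is a Spearman rank correlation; a minimal sketch with made-up paired counts (n = 20, illustrative only, not study data) is:

```python
# Rank correlation between the two CTC counting methods (illustrative values).
from scipy import stats

cellsearch_counts   = [1, 2, 2, 3, 4, 5, 5, 6, 8, 9, 10, 12, 14, 15, 18, 20, 25, 30, 40, 52]
cellcelector_counts = [1, 2, 2, 3, 4, 4, 5, 6, 7, 9, 10, 11, 13, 14, 17, 19, 24, 28, 38, 48]

rho, p = stats.spearmanr(cellsearch_counts, cellcelector_counts)
print(f"Spearman rho = {rho:.3f}, p = {p:.1e}")
```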
Integrated workflows for spiking neuronal network simulations
Antolík, Ján; Davison, Andrew P.
2013-01-01
The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902
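The hierarchical configuration idea mentioned above can be illustrated with a generic nested-parameter merge; this sketch is not Mozaik's actual configuration format or API, and all names in it are placeholders.

```python
# Generic illustration of hierarchically organized configuration: a base
# parameter tree is selectively overridden by an experiment-specific tree.
import json

base = {
    "sheets": {"V1": {"density": 1000, "neuron_model": "IF_cond_exp"}},
    "recording": {"variables": ["spikes"]},
}
experiment = {
    "sheets": {"V1": {"density": 2000}},          # override a single leaf value
    "recording": {"variables": ["spikes", "v"]},  # extend the recorded variables
}

def merge(base_cfg: dict, override: dict) -> dict:
    out = dict(base_cfg)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = merge(out[key], value)
        else:
            out[key] = value
    return out

print(json.dumps(merge(base, experiment), indent=2))
```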
Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A
2017-12-01
Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high-value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.
Text mining meets workflow: linking U-Compare with Taverna
Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia
2010-01-01
Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690
wft4galaxy: a workflow testing tool for galaxy.
Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi
2017-12-01
Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container-the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.
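The core idea of workflow testing described above, running a workflow on fixed inputs and comparing every output against a reference, can be sketched generically; this is not the wft4galaxy API, and `run_workflow` plus the file names are placeholders.

```python
# Tool-agnostic sketch of workflow regression testing: compare produced outputs
# with expected reference files by content hash.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def check_outputs(produced: dict[str, Path], expected: dict[str, Path]) -> list[str]:
    """Return the names of outputs that are missing or differ from the reference."""
    return [name for name, ref in expected.items()
            if name not in produced or sha256(produced[name]) != sha256(ref)]

# Hypothetical usage (run_workflow is a placeholder for the workflow engine call):
# produced = run_workflow("workflow.ga", inputs="test-data/")
# failures = check_outputs(produced, {"table": Path("test-data/expected_table.tsv")})
# assert not failures, f"workflow regression detected: {failures}"
```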
Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics
Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A.; Caron, Christophe
2015-01-01
Summary: The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated whatever the language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. Availability and implementation: http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). Contact: contact@workflow4metabolomics.org PMID:25527831
A practical workflow for making anatomical atlases for biological research.
Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles
2012-01-01
The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
de Bruin, Jeroen S; Adlassnig, Klaus-Peter; Leitich, Harald; Rappelsberger, Andrea
2018-01-01
Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. Implementation of a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. We used the Business Process Model and Notation language system Activiti for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B, where institution-specific medical decision-making processes could be adapted without modifying the workflow business logic. Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.
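The separation described above, business logic that only routes the case while a pluggable rule engine makes the medical decision, can be sketched generically; the Python stand-in below is not the Activiti or ARDENSUITE interface, and the step names and recommendation codes are illustrative.

```python
# Generic illustration: the workflow is fixed, the decision logic is swappable.
from typing import Callable

def hbv_rule_engine(case: dict) -> str:
    """Stand-in for an institution-specific rule module (e.g., Arden Syntax)."""
    if case.get("mother_hbsag_positive"):
        return "PROPHYLAXIS_RECOMMENDED"
    return "ROUTINE_VACCINATION_SCHEDULE"

def newborn_workflow(case: dict, decide: Callable[[dict], str]) -> list[str]:
    steps = ["register_birth", "collect_maternal_serology"]
    steps.append(decide(case))        # only this step changes between institutions
    steps.append("document_in_ehr")
    return steps

print(newborn_workflow({"mother_hbsag_positive": True}, hbv_rule_engine))
```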
CamBAfx: Workflow Design, Implementation and Application for Neuroimaging
Ooi, Cinly; Bullmore, Edward T.; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John
2009-01-01
CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs. PMID:19826470
WE-D-207-03: CT Protocols for Screening and the ACR Designated Lung Screening Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNitt-Gray, M.
2015-06-15
In the United States, Lung Cancer is responsible for more cancer deaths than the next four cancers combined. In addition, the 5-year survival rate for lung cancer patients has not improved over the past 40 to 50 years. To combat this deadly disease, in 2002 the National Cancer Institute launched a very large Randomized Control Trial called the National Lung Screening Trial (NLST). This trial would randomize subjects who had substantial risk of lung cancer (due to age and smoking history) into either a Chest X-ray arm or a low dose CT arm. In November 2010, the National Cancer Institute announced that the NLST had demonstrated 20% fewer lung cancer deaths among those who were screened with low-dose CT than with chest X-ray. In December 2013, the US Preventive Services Task Force recommended the use of Lung Cancer Screening using low dose CT and a little over a year later (Feb. 2015), CMS announced that Medicare would also cover Lung Cancer Screening using low dose CT. Thus private and public insurers are required to provide Lung Cancer Screening programs using CT to the appropriate population(s). The purpose of this Symposium is to inform medical physicists and prepare them to support the implementation of Lung Screening programs. This Symposium will focus on the clinical aspects of lung cancer screening, requirements of a screening registry for systematically capturing and tracking screening patients and results (such as required Medicare data elements) as well as the role of the medical physicist in screening programs, including the development of low dose CT screening protocols. Learning Objectives: To understand the clinical basis and clinical components of a lung cancer screening program, including eligibility criteria and other requirements. To understand the data collection requirements, workflow, and informatics infrastructure needed to support the tracking and reporting components of a screening program. To understand the role of the medical physicist in implementing Lung Cancer Screening protocols for CT, including utilizing resources such as the AAPM Protocols and the ACR Designated Lung Screening Center program. UCLA Department of Radiology has an Institutional research agreement with Siemens Healthcare; Dr. McNitt-Gray has been a recipient of Research Support from Siemens Healthcare in the past. Dr. Aberle has been a Member of Advisory Boards for the LUNGevity Foundation (2011-present) and Siemens Medical Solutions. (2013)
Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Ramachandran, R.; Lynnes, C.
2009-05-01
A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using "service casts" and "interest casts" (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH's Mining Workflow Composer and the open-source Active BPEL engine, and JPL's SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicate ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the "sociological" problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).
Choudhari, Shyamal P.; Pendleton, Kirk P.; Ramsey, Joshua D.; Blanchard, Thomas G.; Picking, William D.
2013-01-01
An important consideration in the development of subunit vaccines is loss of activity caused by physical instability of the protein. Such instability often results from suboptimal solution conditions related to pH and temperature. Excipients can help to stabilize vaccines, but it is important to screen and identify excipients that adequately contribute to stabilization of a given formulation. CagL is a protein present in strains of Helicobacter pylori that possess type IV secretion systems. It contributes to bacterial adherence via α5β1 integrin, thereby making it an attractive subunit vaccine candidate. We characterized the stability of CagL in different pH and temperature conditions using a variety of spectroscopic techniques. Stability was assessed in terms of transition temperature (Tm) with the accumulated data then incorporated into an empirical phase diagram (EPD) that provided an overview of CagL physical stability. These analyses indicated maximum CagL stability at pH 4–6 up to 40 °C in the absence of excipient. Using this EPD analysis, aggregation assays were developed to screen a panel of excipients with some found to inhibit CagL aggregation. Candidate stabilizers were selected to confirm their enhanced stabilizing effect. These analyses will help in the formulation of a stable vaccine against H. pylori. PMID:23794457
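Transition temperatures of the kind reported above are typically extracted by fitting a two-state sigmoid to a thermal melt signal; a minimal sketch with synthetic data (not CagL measurements) is:

```python
# Fit a two-state Boltzmann sigmoid to a synthetic unfolding curve to estimate Tm.
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(T, bottom, top, Tm, slope):
    return bottom + (top - bottom) / (1.0 + np.exp((Tm - T) / slope))

T = np.linspace(25, 90, 66)
signal = boltzmann(T, 0.1, 1.0, 55.0, 2.5) + np.random.default_rng(0).normal(0, 0.01, T.size)

popt, _ = curve_fit(boltzmann, T, signal, p0=[0.0, 1.0, 50.0, 2.0])
print(f"fitted Tm ~ {popt[2]:.1f} degC")   # recovers ~55 degC for the synthetic curve
```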
Takeuchi, Shoko; Kojima, Takashi; Hashimoto, Kentaro; Saito, Bunnai; Sumi, Hiroyuki; Ishikawa, Tomoyasu; Ikeda, Yukihiro
2015-01-01
Different crystal packing of hydrates from anhydrate crystals leads to different physical properties, such as solubility and stability. Investigation of the potential of varied hydrate formation, and understanding the stability in an anhydrous/hydrate system, are crucial to prevent an undesired transition during the manufacturing process and storage. Only one anhydrous form of T-3256336, a novel inhibitor of apoptosis (IAP) protein antagonist, was discovered during synthesis, and no hydrate form has been identified. In this study, we conducted hydrate screening such as dynamic water vapor sorption/desorption (DVS), and the slurry experiment, and characterized the solid-state properties of anhydrous/hydrate forms to determine the most desirable crystalline form for development. New hydrate forms, both mono-hydrate and hemi-hydrate forms, were discovered as a result of this hydrate screening. The characterization of two new hydrate forms was conducted, and the anhydrous form was determined to be the most desirable development form of T-3256336 in terms of solid-state stability. In addition, the stability of the anhydrous form was investigated using the water content and temperature controlled slurry experiment to obtain the desirable crystal form in the crystallization process. The water content regions of the stable phase of the desired form, the anhydrous form, were identified for the cooling crystallization process.
Improving adherence to the Epic Beacon ambulatory workflow.
Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana
2017-06-01
Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System of computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance to the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve the compliance to this workflow in the ambulatory oncology setting. This study was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate to the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, which was education of infusion center staff, was initiated by ambulatory-based, oncology pharmacists and implemented by a multi-disciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the determined intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% ( p < 0.001). This study supports a pharmacist-initiated educational intervention can improve compliance to an ambulatory, oncology infusion workflow.
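The before/after compliance comparison above can be illustrated with a 2x2 chi-square test; the encounter counts below are hypothetical, chosen only to reproduce the 38% and 83% rates, since the abstract does not report the denominators.

```python
# Chi-square test on hypothetical counts matching the reported compliance rates.
from scipy.stats import chi2_contingency

pre_compliant, pre_total = 38, 100     # hypothetical: 38% pre-intervention
post_compliant, post_total = 83, 100   # hypothetical: 83% post-intervention

table = [[pre_compliant, pre_total - pre_compliant],
         [post_compliant, post_total - post_compliant]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.1e}")   # p << 0.001 for these counts
```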
Exploring Dental Providers’ Workflow in an Electronic Dental Record Environment
Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N.; Ye, Zhan
2016-01-01
Background: A workflow is defined as a predefined set of work steps and partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at the point of care to assess breakdowns in the workflow, which could contribute to better technology designs. Objective: The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. Methods: A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Results: Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). Conclusions: On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR. PMID:27437058
Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.
Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher
2012-12-01
As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around the clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution by 94 % from 2008 to 2011. The impact of unplanned downtimes was reduced by 72 % while the impact of planned downtimes was reduced by 99.66 % over the same period. Additionally more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 utilizing the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized if not eliminated entirely.
Fishbine, H.L.; Sewell, C. Jr.
1957-08-01
Negative feedback amplifiers, and particularly a negative feedback circuit that is economical in anode power consumption, are described. Basically, the disclosed circuit comprises two tetrode tubes in which the output of the first tube is capacitance coupled to the grid of the second tube, which in turn has its plate coupled to the cathode of the first tube to form a degenerative feedback circuit. Operating potential for the screen of the second tube is supplied by connecting the cathode resistor of the first tube to the screen, while the screen is bypassed to the cathode of its tube for the amplified frequencies. The amplifier also incorporates a circuit to stabilize the transconductance of the tubes by making the grid potential of each tube dependent on the anode currents of both tubes through voltage divider circuitry.
Mensah, Mavis; Borzi, Cristina; Verri, Carla; Suatoni, Paola; Conte, Davide; Pastorino, Ugo; Orazio, Fortunato; Sozzi, Gabriella; Boeri, Mattia
2017-10-26
The development of a minimally invasive test, such as liquid biopsy, for early lung cancer detection in its preclinical phase is crucial to improve the outcome of this deadly disease. MicroRNAs (miRNAs) are tissue specific, small, non-coding RNAs regulating gene expression, which may act as extracellular messengers of biological signals derived from the cross-talk between the tumor and its surrounding microenvironment. They could thus represent ideal candidates for early detection of lung cancer. In this work, a methodological workflow for the prospective validation of a circulating miRNA test using custom made microfluidic cards and quantitative Real-Time PCR in plasma samples of volunteers enrolled in a lung cancer screening trial is proposed. In addition, since the release of hemolysis-related miRNAs and more general technical issues may affect the analysis, the quality control steps included in the standard operating procedures are also presented. The protocol is reproducible and gives reliable quantitative results; however, when using large clinical series, both pre-analytical and analytical features should be cautiously evaluated.
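One widely used hemolysis check of the kind the quality-control steps refer to compares a red-cell-enriched miRNA with a stable reference by delta-Cq; the marker pair and the 7-cycle threshold below are common heuristics assumed for illustration, not values taken from this protocol.

```python
# Heuristic hemolysis flag: a large Cq difference between a stable reference
# miRNA (e.g., miR-23a-3p) and an RBC-derived miRNA (e.g., miR-451a) suggests
# hemolysis, because hemolysis lowers the Cq of miR-451a.
def hemolysis_flag(cq_mir23a: float, cq_mir451a: float, threshold: float = 7.0) -> bool:
    """Return True if the sample is likely hemolyzed (threshold is an assumption)."""
    return (cq_mir23a - cq_mir451a) > threshold

print(hemolysis_flag(cq_mir23a=22.5, cq_mir451a=14.0))   # True: delta-Cq = 8.5
```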
VSDMIP: virtual screening data management on an integrated platform
NASA Astrophysics Data System (ADS)
Gil-Redondo, Rubén; Estrada, Jorge; Morreale, Antonio; Herranz, Fernando; Sancho, Javier; Ortiz, Ángel R.
2009-03-01
A novel software (VSDMIP) for the virtual screening (VS) of chemical libraries integrated within a MySQL relational database is presented. Two main features make VSDMIP clearly distinguishable from other existing computational tools: (i) its database, which stores not only ligand information but also the results from every step in the VS process, and (ii) its modular and pluggable architecture, which allows customization of the VS stages (such as the programs used for conformer generation or docking), through the definition of a detailed workflow employing user-configurable XML files. VSDMIP, therefore, facilitates the storage and retrieval of VS results, easily adapts to the specific requirements of each method and tool used in the experiments, and allows the comparison of different VS methodologies. To validate the usefulness of VSDMIP as an automated tool for carrying out VS several experiments were run on six protein targets (acetylcholinesterase, cyclin-dependent kinase 2, coagulation factor Xa, estrogen receptor alpha, p38 MAP kinase, and neuraminidase) using nine binary (actives/inactive) test sets. The performance of several VS configurations was evaluated by means of enrichment factors and receiver operating characteristic plots.
Grazzini, Grazia; Ventura, Leonardo; Rubeca, Tiziana; Rapi, Stefano; Cellai, Filippo; Di Dia, Pietro P; Mallardi, Beatrice; Mantellini, Paola; Zappa, Marco; Castiglione, Guido
2017-07-01
Haemoglobin (Hb) stability in faecal samples is an important issue in colorectal cancer screening by the faecal immunochemical test (FIT) for Hb. This study evaluated the performance of the FIT-Hb (OC-Sensor Eiken) used in the Florence screening programme by comparing two different formulations of the buffer, both in an analytical and in a clinical setting. In the laboratory simulation, six faecal pools (three in each buffer type) were stored at different temperatures and analysed eight times in 10 replicates over 21 days. In the clinical setting, 7695 screenees returned two samples, using both the old and the new specimen collection device (SCD). In the laboratory simulation, 5 days from sample preparation with the buffer of the old SCD, the Hb concentration decreased by 40% at room temperature (25°C, range 22-28°C) and up to 60% at outside temperature (29°C, range 16-39°C), whereas with the new one, Hb concentration decreased by 10%. In the clinical setting, a higher mean Hb concentration with the new SCD compared with the old one was found (6.3 vs. 5.0 µg Hb/g faeces, respectively, P<0.001); no statistically significant difference was found in the probability of having a positive result in the two SCDs. Better Hb stability was observed with the new buffer under laboratory conditions, but no difference was found in the clinical performance. In our study, only marginal advantages arise from the new buffer. Improvements in sample stability represent a significant target in the screening setting.
Computational databases, pathway and cheminformatics tools for tuberculosis drug discovery
Ekins, Sean; Freundlich, Joel S.; Choi, Inhee; Sarker, Malabika; Talcott, Carolyn
2010-01-01
We are witnessing the growing menace of both increasing cases of drug-sensitive and drug-resistant Mycobacterium tuberculosis strains and the challenge to produce the first new tuberculosis (TB) drug in well over 40 years. The TB community, having invested in extensive high-throughput screening efforts, is faced with the question of how to optimally leverage this data in order to move from a hit to a lead to a clinical candidate and potentially a new drug. Complementing this approach, yet conducted on a much smaller scale, cheminformatic techniques have been leveraged and are herein reviewed. We suggest these computational approaches should be more optimally integrated in a workflow with experimental approaches to accelerate TB drug discovery. PMID:21129975
Profiling Changes in Histone Post-translational Modifications by Top-Down Mass Spectrometry.
Zhou, Mowei; Wu, Si; Stenoien, David L; Zhang, Zhaorui; Connolly, Lanelle; Freitag, Michael; Paša-Tolić, Ljiljana
2017-01-01
Top-down mass spectrometry is a valuable tool for understanding gene expression through characterization of combinatorial histone post-translational modifications (i.e., the histone code). In this protocol, we describe a top-down workflow that employs liquid chromatography (LC) coupled to mass spectrometry (MS) for fast global profiling of changes in histone proteoforms, and apply this LC-MS top-down approach to a comparative analysis of a wild-type and a mutant fungal species. The proteoforms exhibiting differential abundances can be subjected to further targeted studies by other MS or orthogonal (e.g., biochemical) assays. This method can be generally adapted for screening of changes in histone modifications between samples, such as wild type vs. mutant or healthy vs. diseased.
Evangelista, Cláudia Carolina Silva; Guidelli, Giovanna Vieira; Borges, Gustavo; Araujo, Thais Fenz; de Souza, Tiago Alves Jorge; Neves, Ubiraci Pereira da Costa; Tunnacliffe, Alan; Pereira, Tiago Campos
2017-01-01
The molecular basis of anhydrobiosis, the state of suspended animation entered by some species during extreme desiccation, is still poorly understood despite a number of transcriptome and proteome studies. We therefore conducted functional screening by RNA interference (RNAi) for genes involved in anhydrobiosis in the holo-anhydrobiotic nematode Panagrolaimus superbus. A new method of survival analysis, based on staining, and proof-of-principle RNAi experiments confirmed a role for genes involved in oxidative stress tolerance, while a novel medium-scale RNAi workflow identified a further 40 anhydrobiosis-associated genes, including several involved in proteostasis, DNA repair and signal transduction pathways. This suggests that multiple genes contribute to anhydrobiosis in P. superbus. PMID:29111563
78 FR 22880 - Agency Information Collection Activities; Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-17
... between Health IT and Ambulatory Care Workflow Redesign.'' In accordance with the Paperwork Reduction Act... Understand the Relationship between Health IT and Ambulatory Care Workflow Redesign. The Agency for... Methods to Better Understand the Relationship between Health IT and Ambulatory Care Workflow Redesign...
Automatic segmentation and supervised learning-based selection of nuclei in cancer tissue images.
Nandy, Kaustav; Gudla, Prabhakar R; Amundsen, Ryan; Meaburn, Karen J; Misteli, Tom; Lockett, Stephen J
2012-09-01
Analysis of preferential localization of certain genes within the cell nuclei is emerging as a new technique for the diagnosis of breast cancer. Quantitation requires accurate segmentation of 100-200 cell nuclei in each tissue section to draw a statistically significant result. Thus, for large-scale analysis, manual processing is too time consuming and subjective. Fortuitously, acquired images generally contain many more nuclei than are needed for analysis. Therefore, we developed an integrated workflow that selects, following automatic segmentation, a subpopulation of accurately delineated nuclei for positioning of fluorescence in situ hybridization-labeled genes of interest. Segmentation was performed by a multistage watershed-based algorithm and screening by an artificial neural network-based pattern recognition engine. The performance of the workflow was quantified in terms of the fraction of automatically selected nuclei that were visually confirmed as well segmented and by the boundary accuracy of the well-segmented nuclei relative to a 2D dynamic programming-based reference segmentation method. Application of the method was demonstrated for discriminating normal and cancerous breast tissue sections based on the differential positioning of the HES5 gene. Automatic results agreed with manual analysis in 11 out of 14 cancers, all four normal cases, and all five noncancerous breast disease cases, thus showing the accuracy and robustness of the proposed approach. Published 2012 Wiley Periodicals, Inc.
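The two-stage approach described above, watershed segmentation followed by automatic screening of well-segmented nuclei, can be sketched as follows; a simple rule-based filter on region properties stands in for the paper's neural-network pattern-recognition step, and the thresholds are illustrative.

```python
# Watershed-based nucleus segmentation followed by a simple quality screen.
import numpy as np
from scipy import ndimage as ndi
from skimage import feature, filters, measure, segmentation

def segment_and_screen(image: np.ndarray, min_area: int = 200, min_solidity: float = 0.9):
    mask = image > filters.threshold_otsu(image)                 # foreground nuclei
    distance = ndi.distance_transform_edt(mask)
    coords = feature.peak_local_max(distance, min_distance=10, labels=mask)
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)     # one seed per nucleus
    labels = segmentation.watershed(-distance, markers, mask=mask)
    keep = [r.label for r in measure.regionprops(labels)
            if r.area >= min_area and r.solidity >= min_solidity]
    return labels, keep   # label image plus the subpopulation passing the screen
```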
De Paris, Renata; Frantz, Fábio A.; Norberto de Souza, Osmar; Ruiz, Duncan D. A.
2013-01-01
Molecular docking simulations of fully flexible protein receptor (FFR) models are coming of age. In our studies, an FFR model is represented by a series of different conformations derived from a molecular dynamics simulation trajectory of the receptor. For each conformation in the FFR model, a docking simulation is executed and analyzed. An important challenge is to perform virtual screening of millions of ligands using an FFR model in a sequential mode, since it can become computationally very demanding. In this paper, we propose a cloud-based web environment, called web Flexible Receptor Docking Workflow (wFReDoW), which reduces the CPU time of molecular docking simulations of small molecules against FFR models. It is based on the new workflow data pattern called self-adaptive multiple instances (P-SaMI) and on a middleware built on Amazon EC2 instances. P-SaMI reduces the number of molecular docking simulations while the middleware speeds up the docking experiments using a High Performance Computing (HPC) environment on the cloud. The experimental results show a reduction in the total elapsed time of docking experiments and demonstrate the quality of the reduced receptor models produced by discarding the non-promising conformations from an FFR model as ruled by the P-SaMI data pattern. PMID:23691504
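The pruning idea behind the P-SaMI pattern, discarding receptor conformations whose interim docking scores look non-promising so that fewer simulations are run, can be sketched generically; the scores, the median ranking, and the keep fraction below are assumptions, not the published policy.

```python
# Generic pruning of non-promising receptor conformations by interim docking score.
from statistics import median

def prune_conformations(scores_by_conf: dict[str, list[float]], keep_fraction: float = 0.5):
    """Keep roughly the best-scoring fraction of conformations (lower score = better)."""
    ranked = sorted(scores_by_conf, key=lambda c: median(scores_by_conf[c]))
    n_keep = max(1, round(len(ranked) * keep_fraction))
    return ranked[:n_keep]

partial_scores = {"conf_001": [-7.2, -6.9], "conf_002": [-4.1, -3.8], "conf_003": [-6.5, -6.7]}
print(prune_conformations(partial_scores))   # ['conf_001', 'conf_003']
```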
ACToR Chemical Structure processing using Open Source ...
ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from over 1,950 public sources. ACToR contains chemical structure information and toxicological data for over 558,000 unique chemicals. The database primarily includes data from NCCT research programs, in vivo toxicity data from ToxRef, human exposure data from ExpoCast, high-throughput screening data from ToxCast and high quality chemical structure information from the EPA DSSTox program. The DSSTox database is a chemical structure inventory for the NCCT programs and currently has about 16,000 unique structures. Included are also data from PubChem, ChemSpider, USDA, FDA, NIH and several other public data sources. ACToR has been a resource to various international and national research groups. Most of our recent efforts on ACToR are focused on improving the structural identifiers and Physico-Chemical properties of the chemicals in the database. Organizing this huge collection of data and improving the chemical structure quality of the database has posed some major challenges. Workflows have been developed to process structures, calculate chemical properties and identify relationships between CAS numbers. The Structure processing workflow integrates web services (PubChem and NIH NCI Cactus) to d
Lemmon, Vance P; Jia, Yuanyuan; Shi, Yan; Holbrook, S Douglas; Bixby, John L; Buchser, William
2011-11-01
The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signaling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Documenting and integrating the experimental workflows, library data and extensive experimental results is challenging. For academic laboratories generating large data sets from experiments involving thousands of perturbagens, a Laboratory Information Management System (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with an On Demand or Software As A Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses how the system was selected and integrated into the laboratory. The advantages of a SaaS based LIMS over a client-server based system are described. © 2011 Bentham Science Publishers
Vandecruys, Roger; Peeters, Jef; Verreck, Geert; Brewster, Marcus E
2007-09-05
Assessing the effect of excipients on the ability to attain and maintain supersaturation of drug solutions may provide useful information for the design of solid formulations. Judicious selection of materials that affect either the extent or the stability of supersaturating drug delivery systems may be enabling for poorly soluble drug candidates or other difficult-to-formulate compounds. The technique suggested herein is aimed at providing a screening protocol to allow preliminary assessment of these factors based on small to moderate amounts of drug substance. A series of excipients were selected that may, by various mechanisms, affect supersaturation, including pharmaceutical polymers such as HPMC and PVP, surfactants such as Polysorbate 20, Cremophor RH40 and TPGS, and hydrophilic cyclodextrins such as HPbetaCD. Using a co-solvent-based method and 25 drug candidates, the data suggested, on the whole, that the surfactants and the selected cyclodextrin seemed to best augment the extent of supersaturation but had variable benefits as stabilizers, while the pharmaceutical polymers had a useful effect on supersaturation stability but were less helpful in increasing the extent of supersaturation. Using these data, a group of simple solid dosage forms were prepared and tested in the dog for one of the drug candidates. Excipients that gave the best extent and stability of the formed supersaturated solution in the screening assay also gave the highest oral bioavailability in the dog.
Dauner, Allison L.; Gilliland, Theron C.; Mitra, Indrani; Pal, Subhamoy; Morrison, Amy C.; Hontz, Robert D.; Wu, Shuenn-Jue L.
2015-01-01
Loss of sample integrity during specimen transport can lead to false-negative diagnostic results. In an effort to improve upon the status quo, we used dengue as a model RNA virus to evaluate the stabilization of RNA and antibodies in three commercially available sample stabilization products: Whatman FTA Micro Cards (GE Healthcare Life Sciences, Pittsburgh, PA), DNAstāble Blood tubes (Biomātrica, San Diego, CA), and ViveST tubes (ViveBio, Alpharetta, GA). Both contrived and clinical dengue-positive specimens were stored on these products at ambient temperature or 37°C for up to 1 month. Antibody and viral RNA levels were measured by enzyme-linked immunosorbent assay (ELISA) and quantitative reverse transcription polymerase chain reaction (qRT-PCR) assays, respectively, and compared with frozen unloaded controls. We observed reduced RNA and antibody levels between stabilized contrived samples and frozen controls at our earliest time point, and this was particularly pronounced for the FTA cards. However, despite some time and temperature dependent loss, a 94.6–97.3% agreement was observed between stabilized clinical specimens and their frozen controls for all products. Additional considerations such as cost, sample volume, matrix, and ease of use should inform any decision to incorporate sample stabilization products into a diagnostic testing workflow. We conclude that DNAstāble Blood and ViveST tubes are useful alternatives to traditional filter paper for ambient temperature shipment of clinical specimens for downstream molecular and serological testing. PMID:25940193
Integrated modeling applications for tokamak experiments with OMFIT
NASA Astrophysics Data System (ADS)
Meneghini, O.; Smith, S. P.; Lao, L. L.; Izacard, O.; Ren, Q.; Park, J. M.; Candy, J.; Wang, Z.; Luna, C. J.; Izzo, V. A.; Grierson, B. A.; Snyder, P. B.; Holland, C.; Penna, J.; Lu, G.; Raum, P.; McCubbin, A.; Orlov, D. M.; Belli, E. A.; Ferraro, N. M.; Prater, R.; Osborne, T. H.; Turnbull, A. D.; Staebler, G. M.
2015-08-01
One Modeling Framework for Integrated Tasks (OMFIT) is a comprehensive integrated modeling framework that has been developed to enable physics codes to interact in complicated workflows and support scientists at all stages of the modeling cycle. The OMFIT development follows a unique bottom-up approach, where the framework design and capabilities organically evolve to support progressive integration of the components that are required to accomplish physics goals of increasing complexity. OMFIT provides a workflow for easily generating full kinetic equilibrium reconstructions that are constrained by magnetic and motional Stark effect measurements, and kinetic profile information that includes fast-ion pressure modeled by a transport code. It was found that magnetic measurements can be used to quantify the amount of anomalous fast-ion diffusion that is present in DIII-D discharges, and provide an estimate that is consistent with what would be needed for transport simulations to match the measured neutron rates. OMFIT was used to streamline edge-stability analyses and evaluate the effect of resonant magnetic perturbations (RMPs) on pedestal stability, which was found to be consistent with the experimental observations. The development of a five-dimensional numerical fluid model for estimating the effects of the interaction between magnetohydrodynamics (MHD) and microturbulence, and its systematic verification against analytic models, was also supported by the framework. OMFIT was used for optimizing an innovative high-harmonic fast wave system proposed for DIII-D. For a parallel refractive index n∥ > 3, the conditions for strong electron-Landau damping were found to be independent of the launched n∥ and poloidal angle. OMFIT has been the platform of choice for developing a neural-network based approach to efficiently perform a non-linear multivariate regression of local transport fluxes as a function of local dimensionless parameters. Transport predictions for thousands of DIII-D discharges showed excellent agreement with the power balance calculations across the whole plasma radius and over a broad range of operating regimes. Concerning predictive transport simulations, the framework made possible the design and automation of a workflow that enables self-consistent predictions of kinetic profiles and the plasma equilibrium. It is found that the feedback between the transport fluxes and the plasma equilibrium can significantly affect the kinetic profile predictions. Such a rich set of results provides tangible evidence of how bottom-up approaches can potentially provide a fast track to integrated modeling solutions that are functional, cost-effective, and in sync with the research effort of the community.
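The neural-network regression of transport fluxes mentioned above can be illustrated with a small surrogate model on synthetic data; this is not the OMFIT implementation, and the input parameters are placeholders.

```python
# Surrogate regression of a "flux" on local dimensionless parameters (synthetic data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 3.0, size=(2000, 4))                       # placeholder local parameters
y = X[:, 0] ** 2 + 0.5 * X[:, 1] * X[:, 2] - X[:, 3] + rng.normal(0, 0.05, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print(f"held-out R^2: {model.score(X_te, y_te):.3f}")
```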
NASA Astrophysics Data System (ADS)
Zhao, Shujiang; Li, Shuping; Liu, Huihui; Zhao, Qian; Wang, Jieyou; Yan, Maocang
2012-09-01
Seventy-eight marine fungal strains were isolated from sediment samples collected off the coast of Nanji Island, Wenzhou, Zhejiang Province, China. Antibacterial screening using the agar disc method showed that 19 of the isolated strains could inhibit at least one pathogenic Vibrio from Pseudosciaena crocea. Subsequent screening confirmed that nine strains produced antibacterial metabolites that had activity against one or several types of pathogenic Vibrio. Strain NJ0104 had the widest antimicrobial spectrum and strong activity, particularly against Vibrio parahaemolyticus-MM0810072. A preliminary study of NJ0104 antibacterial metabolites demonstrated that they had thermal stability up to 80°C, ultraviolet stability up to 40 min and pH stability between 4.0-7.0. In addition, the antibacterial metabolites were readily soluble in butanol. To identify the specific strain, the ITS-5.8S rDNA regions of NJ0104 were PCR amplified and sequenced. Based on the combination of phenotypic and genotypic data, the strain was identified as Arthrinium sp.
NASA Astrophysics Data System (ADS)
Lee, Joohwi; Ikeda, Yuji; Tanaka, Isao
2017-11-01
Martensitic transformations with good structural compatibility between the parent and martensitic phases are required for shape memory alloys (SMAs) in terms of functional stability. In this study, first-principles-based materials screening is systematically performed to investigate intermetallic compounds with martensitic phases, focusing on energetic and dynamical stabilities as well as structural compatibility with the parent phase. The B2, D03, and L21 crystal structures are considered as the parent phases, and the 2H and 6M structures are considered as the martensitic phases. In total, 3384 binary and 3243 ternary alloys with stoichiometric composition ratios are investigated. It is found that 187 alloys survive the screening. Some of the surviving alloys are constituted of chemical elements already widely used in SMAs, but various other metallic elements are also found among the surviving alloys. The energetic stability of the surviving alloys is further analyzed by comparison with data in the Materials Project Database (MPD) to examine which alloys' martensitic structures may undergo further phase separation or transition to other structures.
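A minimal sketch of the kind of sequential screening filter described above is given below; the field names, thresholds, and candidate entries are invented for illustration and are not the criteria or data used in the study.

```python
# Minimal sketch of a sequential screening filter for SMA candidates.
# Field names and thresholds are illustrative assumptions only.
candidates = [
    {"formula": "TiNi",   "e_hull_eV": 0.00, "dyn_stable": True,  "compat_lambda2": 1.003},
    {"formula": "CuAlX3", "e_hull_eV": 0.12, "dyn_stable": True,  "compat_lambda2": 1.050},
    {"formula": "FeXY2",  "e_hull_eV": 0.01, "dyn_stable": False, "compat_lambda2": 0.998},
]

def survives(alloy, e_hull_max=0.05, lambda2_tol=0.02):
    """Keep alloys that are energetically and dynamically stable and whose
    martensite/parent transformation stretch (lambda_2) is close to 1."""
    return (alloy["e_hull_eV"] <= e_hull_max
            and alloy["dyn_stable"]
            and abs(alloy["compat_lambda2"] - 1.0) <= lambda2_tol)

surviving = [a["formula"] for a in candidates if survives(a)]
print(surviving)   # -> ['TiNi']
```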
Sport participation, screen time, and personality trait development during childhood.
Allen, Mark S; Vella, Stewart A; Laborde, Sylvain
2015-09-01
This investigation explored the contribution of extracurricular sport and screen time viewing (television viewing and electronic gaming) to personality trait stability and change during childhood. Two independent samples of 3,956 young children (age 6) and 3,862 older children (age 10) were taken from the Longitudinal Study of Australian Children. Parent-reported child sport participation, screen time, and personality traits were measured at baseline and again 24 months later. Young children who were more active recorded more of a decrease in introversion, less of a decrease in persistence, and less of an increase in reactivity, than those who were less active. Older children who were more active recorded less of an increase in introversion and more of an increase in persistence than those who were less active. In addition, young children who continued participation in extracurricular sport had greater intra-individual stability of personality for introversion. These findings suggest that an active lifestyle might help to facilitate desirable personality trait stability and change during childhood. © 2015 The British Psychological Society.
Lead-free Halide Perovskites via Functionality-directed Materials Screening
NASA Astrophysics Data System (ADS)
Zhang, Lijun; Yang, Dongwen; Lv, Jian; Zhao, Xingang; Yang, Ji-Hui; Yu, Liping; Wei, Su-Huai; Zunger, Alex
Hybrid organic-inorganic halide perovskites, with the prototype material CH3NH3PbI3, have recently attracted much interest as low-cost and high-performance photovoltaic absorbers, but one would like to improve their stability and get rid of toxic Pb. We used a photovoltaic-functionality-directed materials screening approach to rationally design Pb-free halide perovskites via first-principles DFT calculations. Screening criteria involve thermodynamic and crystallographic stability, as well as solar band gaps, light carrier effective masses, exciton binding, etc. We considered both single atomic substitutions in AMX3 normal perovskites (altering the chemical constituents A, M and X individually) as well as double substitution of 2M into B+C in A2BCX6 double perovskites. Chemical trends in phase stabilities and optoelectronic properties are discussed, with some promising cases exhibiting solar cell efficiencies comparable to that of CH3NH3PbI3. L.Z. is funded by the Recruitment Program of Global Youth Experts and the National Key Research and Development Program of China, and A.Z. by the DOE EERE SunShot program (USA).
Military Interoperable Digital Hospital Testbed (MIDHT)
2015-12-01
Project tasks include 1.1.1 Pharmacy Robotics Implementation and 1.2 Research and analysis of the resulting technological impact on medication errors, pharmacist productivity, nurse satisfaction/workflow, and patient satisfaction.
Provenance Storage, Querying, and Visualization in PBase
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kianmajd, Parisa; Ludascher, Bertram; Missier, Paolo
2015-01-01
We present PBase, a repository for scientific workflows and their corresponding provenance information that facilitates the sharing of experiments among the scientific community. PBase is interoperable since it uses ProvONE, a standard provenance model for scientific workflows. Workflows and traces are stored in RDF, and with the support of SPARQL and the tree cover encoding, the repository provides a scalable infrastructure for querying the provenance data. Furthermore, through its user interface, it is possible to: visualize workflows and execution traces; visualize reachability relations within these traces; issue SPARQL queries; and visualize query results.
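A small, hedged sketch of the kind of SPARQL query over an RDF provenance trace that such a repository supports; the rdflib toolkit, the example namespace, and the hand-built triples are illustrative assumptions, not the PBase implementation.

```python
# Sketch: query a tiny, hand-built workflow trace with SPARQL.
from rdflib import Graph, Namespace

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/trace/")    # hypothetical trace namespace

g = Graph()
# One workflow task that used one data artifact and generated another.
g.add((EX.align_task, PROV.used, EX.input_reads))
g.add((EX.align_task, PROV.generated, EX.aligned_bam))

query = """
PREFIX prov: <http://www.w3.org/ns/prov#>
SELECT ?activity ?entity
WHERE { ?activity prov:used ?entity . }
"""
for activity, entity in g.query(query):
    print(activity, "used", entity)
```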
Context-aware workflow management of mobile health applications.
Salden, Alfons; Poortinga, Remco
2006-01-01
We propose a medical application management architecture that allows medical (IT) experts to readily design, develop and deploy context-aware mobile health (m-health) applications or services. In particular, we elaborate on how our application workflow management architecture enables chaining, coordinating, composing, and adapting context-sensitive medical application components such that critical Quality of Service (QoS) and Quality of Context (QoC) requirements typical for m-health applications or services can be met. This functional architectural support requires learning modules for distilling application-critical selection of attention and anticipation models. These models help medical experts construct and adjust on-the-fly m-health application workflows and workflow strategies. We illustrate our context-aware workflow management paradigm for an m-health data delivery problem, in which optimal communication network configurations have to be determined.
Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin; ...
2016-10-06
The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules, an in-memory data store, with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.
Prototype of Kepler Processing Workflows For Microscopy And Neuroinformatics
Astakhov, V.; Bandrowski, A.; Gupta, A.; Kulungowski, A.W.; Grethe, J.S.; Bouwer, J.; Molina, T.; Rowley, V.; Penticoff, S.; Terada, M.; Wong, W.; Hakozaki, H.; Kwon, O.; Martone, M.E.; Ellisman, M.
2016-01-01
We report on progress of employing the Kepler workflow engine to prototype “end-to-end” application integration workflows that concern data coming from microscopes deployed at the National Center for Microscopy Imaging Research (NCMIR). This system is built upon the mature code base of the Cell Centered Database (CCDB) and integrated rule-oriented data system (IRODS) for distributed storage. It provides integration with external projects such as the Whole Brain Catalog (WBC) and Neuroscience Information Framework (NIF), which benefit from NCMIR data. We also report on specific workflows which spawn from main workflows and perform data fusion and orchestration of Web services specific for the NIF project. This “Brain data flow” presents a user with categorized information about sources that have information on various brain regions. PMID:28479932
Workflow technology: the new frontier. How to overcome the barriers and join the future.
Shefter, Susan M
2006-01-01
Hospitals are catching up to the business world in the introduction of technology systems that support professional practice and workflow. The field of case management is highly complex and interrelates with diverse groups in diverse locations. The last few years have seen the introduction of Workflow Technology Tools, which can improve the quality and efficiency of discharge planning by the case manager. Despite the availability of these wonderful new programs, many case managers are hesitant to adopt the new technology and workflow. For a myriad of reasons, a computer-based workflow system can seem like a brick wall. This article discusses, from a practitioner's point of view, how professionals can gain confidence and skill to get around the brick wall and join the future.
An Integrated Framework for Parameter-based Optimization of Scientific Workflows.
Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel
2009-01-01
Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
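The search over a performance/quality parameter space can be sketched as a simple exhaustive enumeration; the parameters, cost model, and quality floor below are hypothetical and stand in for the framework's real optimization machinery.

```python
# Illustrative sketch: search a small workflow parameter space where some
# parameters only affect runtime (chunk size) and others trade output quality
# for speed (data resolution). Values and the cost model are made up.
from itertools import product

chunk_sizes = [64, 128, 256]
resolutions = [0.25, 0.5, 1.0]        # fraction of full resolution
quality_floor = 0.8                   # minimum acceptable output quality

def estimate(chunk, resolution):
    """Toy models of runtime and output quality for one configuration."""
    runtime = 100.0 / chunk + 50.0 * resolution
    quality = resolution ** 0.5       # quality degrades with lower resolution
    return runtime, quality

best = None
for chunk, res in product(chunk_sizes, resolutions):
    runtime, quality = estimate(chunk, res)
    if quality >= quality_floor and (best is None or runtime < best[0]):
        best = (runtime, chunk, res)

print("fastest acceptable (runtime, chunk, resolution):", best)
```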
Workflow and Electronic Health Records in Small Medical Practices
Ramaiah, Mala; Subrahmanian, Eswaran; Sriram, Ram D; Lide, Bettijoyce B
2012-01-01
This paper analyzes the workflow and implementation of electronic health record (EHR) systems across different functions in small physician offices. We characterize the differences in the offices based on the levels of computerization in terms of workflow, sources of time delay, and barriers to using EHR systems to support the entire workflow. The study was based on a combination of questionnaires, interviews, in situ observations, and data collection efforts. This study was not intended to be a full-scale time-and-motion study with precise measurements but was intended to provide an overview of the potential sources of delays while performing office tasks. The study follows an interpretive model of case studies rather than a large-sample statistical survey of practices. To identify time-consuming tasks, workflow maps were created based on the aggregated data from the offices. The results from the study show that specialty physicians are more favorable toward adopting EHR systems than primary care physicians are. The barriers to adoption of EHR systems by primary care physicians can be attributed to the complex workflows that exist in primary care physician offices, leading to nonstandardized workflow structures and practices. Also, primary care physicians would benefit more from EHR systems if the systems could interact with external entities. PMID:22737096
Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M
2016-01-01
Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.
Corrosion testing of candidates for the alkaline fuel cell cathode
NASA Technical Reports Server (NTRS)
Singer, Joseph; Fielder, William L.
1989-01-01
Current/voltage data was obtained for specially made corrosion electrodes of some oxides and of gold materials for the purpose of developing a screening test of catalysts and supports for use at the cathode of the alkaline fuel cell. The data consists of measurements of current at fixed potentials and cyclic voltammograms. These data will have to be correlated with longtime performance data in order to fully evaluate this approach to corrosion screening. Corrosion test screening of candidates for the oxygen reduction electrode of the alkaline fuel cell was applied to two substances, the pyrochlore Pb2Ru2O6.5 and the spinel NiCo2O4. The substrate gold screen and a sample of the IFC Orbiter Pt-Au performance electrode were included as blanks. The pyrochlore data indicate relative stability, although nothing yet can be said about long term stability. The spinel was plainly unstable. For this type of testing to be validated, comparisons will have to be made with long term performance tests.
Krist, Alex H; Woolf, Steven H; Hochheimer, Camille; Sabo, Roy T; Kashiri, Paulette; Jones, Resa M; Lafata, Jennifer Elston; Etz, Rebecca S; Tu, Shin-Ping
2017-05-01
Technology could transform routine decision making by anticipating patients' information needs, assessing where patients are with decisions and preferences, personalizing educational experiences, facilitating patient-clinician information exchange, and supporting follow-up. This study evaluated whether patients and clinicians will use such a decision module and its impact on care, using 3 cancer screening decisions as test cases. Twelve practices with 55,453 patients using a patient portal participated in this prospective observational cohort study. Participation was open to patients who might face a cancer screening decision: women aged 40 to 49 who had not had a mammogram in 2 years, men aged 55 to 69 who had not had a prostate-specific antigen test in 2 years, and adults aged 50 to 74 overdue for colorectal cancer screening. Data sources included module responses, electronic health record data, and a postencounter survey. In 1 year, one-fifth of the portal users (11,458 patients) faced a potential cancer screening decision. Among these patients, 20.6% started and 7.9% completed the decision module. Fully 47.2% of module completers shared responses with their clinician. After their next office visit, 57.8% of those surveyed thought their clinician had seen their responses, and many reported the module made their appointment more productive (40.7%), helped engage them in the decision (47.7%), broadened their knowledge (48.1%), and improved communication (37.5%). Many patients face decisions that can be anticipated and proactively facilitated through technology. Although use of technology has the potential to make visits more efficient and effective, cultural, workflow, and technical changes are needed before it could be widely disseminated. © 2017 Annals of Family Medicine, Inc.
A Computable Definition of Sepsis Facilitates Screening and Performance Improvement Tracking
Warmus, Holly R.; Schaffner, Erin K.; Kantawala, Sajel; Carcillo, Joseph; Rosen, Johanna; Horvat, Christopher M.
2018-01-01
Background: Sepsis kills almost 5,000 children annually, accounting for 16% of pediatric health care spending in the United States. Objectives: We sought to identify sepsis within the Electronic Health Record (EHR) of a quaternary children’s hospital to characterize disease incidence, improve recognition and response, and track performance metrics. Methods: Methods are organized in a plan-do-study-act cycle. During the “plan” phase, electronic definitions of sepsis (blood culture and antibiotic within 24 hours) and septic shock (sepsis plus vasoactive medication) were created to establish benchmark data and track progress with statistical process control. The performance of a screening tool was evaluated in the emergency department. During the “do” phase, a novel inpatient workflow is being piloted, which involves regular sepsis screening by nurses using the tool and a regimented response to high-risk patients. Results: Screening tool use in the emergency department reduced time to antibiotics (Fig. 1). Of the 6,159 admissions between July and December 2016, the EHR definitions identified 1,433 (23.3%) with sepsis, of which 159 (11.1%) had septic shock. Hospital mortality was 2.2% for all sepsis patients and 15.7% for septic shock (Table 1). These findings approximate epidemiologic studies of sepsis and severe sepsis, which report a prevalence range of 0.45–8.2% and a mortality range of 8.2–25% (Table 2) [1–5]. Conclusions/Implications: Implementation of a sepsis screening tool is associated with improved performance. The prevalence of sepsis conditions identified with electronic definitions approximates the epidemiologic landscape characterized by other point-prevalence and administrative studies, providing face validity to this approach and proving useful for tracking performance improvement. PMID:29732457
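A minimal sketch of the computable definitions quoted above (sepsis: blood culture plus antibiotic within 24 hours; septic shock: sepsis plus a vasoactive medication); the record structure and field names are assumptions, not the hospital's EHR schema.

```python
# Sketch of applying the computable sepsis/septic-shock definitions to a list
# of timestamped EHR events. Event structure is an illustrative assumption.
from datetime import datetime, timedelta

def classify_encounter(events):
    """events: list of dicts with 'type' and 'time' keys."""
    cultures = [e["time"] for e in events if e["type"] == "blood_culture"]
    antibiotics = [e["time"] for e in events if e["type"] == "antibiotic"]
    vasoactives = any(e["type"] == "vasoactive" for e in events)

    sepsis = any(abs(c - a) <= timedelta(hours=24)
                 for c in cultures for a in antibiotics)
    if sepsis and vasoactives:
        return "septic shock"
    return "sepsis" if sepsis else "no sepsis flag"

encounter = [
    {"type": "blood_culture", "time": datetime(2016, 7, 1, 8, 0)},
    {"type": "antibiotic",    "time": datetime(2016, 7, 1, 10, 30)},
    {"type": "vasoactive",    "time": datetime(2016, 7, 1, 12, 0)},
]
print(classify_encounter(encounter))   # -> septic shock
```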
NASA Astrophysics Data System (ADS)
Ferreira da Silva, R.; Filgueira, R.; Deelman, E.; Atkinson, M.
2016-12-01
We present Asterism, an open source data-intensive framework, which combines the Pegasus and dispel4py workflow systems. Asterism aims to simplify the effort required to develop data-intensive applications that run across multiple heterogeneous resources, without users having to: re-formulate their methods according to different enactment systems; manage the data distribution across systems; parallelize their methods; co-place and schedule their methods with computing resources; and store and transfer large/small volumes of data. Asterism's key element is to leverage the strengths of each workflow system: dispel4py allows developing scientific applications locally and then automatically parallelize and scale them on a wide range of HPC infrastructures with no changes to the application's code; Pegasus orchestrates the distributed execution of applications while providing portability, automated data management, recovery, debugging, and monitoring, without users needing to worry about the particulars of the target execution systems. Asterism leverages the level of abstractions provided by each workflow system to describe hybrid workflows where no information about the underlying infrastructure is required beforehand. The feasibility of Asterism has been evaluated using the seismic ambient noise cross-correlation application, a common data-intensive analysis pattern used by many seismologists. The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The Asterism workflow is implemented as a Pegasus workflow composed of two tasks (Phase1 and Phase2), where each phase represents a dispel4py workflow. Pegasus tasks describe the in/output data at a logical level, the data dependency between tasks, and the e-Infrastructures and the execution engine to run each dispel4py workflow. We have instantiated the workflow using data from 1000 stations from the IRIS services, and run it across two heterogeneous resources described as Docker containers: MPI (Container2) and Storm (Container3) clusters (Figure 1). Each dispel4py workflow is mapped to a particular execution engine, and data transfers between resources are automatically handled by Pegasus. Asterism is freely available online at http://github.com/dispel4py/pegasus_dispel4py.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, J; Wang, J; Peng, J
Purpose: To implement an entire-workflow quality assurance (QA) process in the radiotherapy department and to reduce radiotherapy error rates through entire-workflow management in a developing country. Methods: The entire-workflow QA process starts from patient registration and runs to the end of the last treatment, including all steps of the radiotherapy process. The chart-check error rate is used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data for a total of around 6000 patients before and after implementing the entire-workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), treatment QA documents, and QA of the treatment history. The error rate derived from chart checks decreased from 1.7% to 0.9% after implementing the entire-workflow QA process. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed to prevent those errors. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider our QA experience applicable to heavily loaded radiotherapy departments in developing countries.
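A back-of-the-envelope check of the reported chart-check error rates; the per-period patient counts below are hypothetical, since the abstract only states that around 6000 patients were compared in total.

```python
# Hypothetical counts consistent with ~6000 patients split across the
# before/after periods; only the resulting rates (1.7%, 0.9%) come from the text.
before_patients, before_errors = 3000, 51     # 51/3000 = 1.7%
after_patients, after_errors = 3000, 27       # 27/3000 = 0.9%

for label, errors, patients in [("before", before_errors, before_patients),
                                ("after", after_errors, after_patients)]:
    print(f"{label}: {100.0 * errors / patients:.1f}% chart-check error rate")
```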
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynch, Vickie E.; Borreguero, Jose M.; Bhowmik, Debsindhu
Graphical abstract: - Highlights: • An automated workflow to optimize force-field parameters. • The workflow was used to optimize force-field parameters for a system containing nanodiamond and tRNA. • The mechanism relies on molecular dynamics simulation and neutron scattering experimental data. • The workflow can be generalized to other experimental and simulation techniques. - Abstract: Large-scale simulations and data analysis are often required to explain neutron scattering experiments and to establish a connection between the fundamental physics at the nanoscale and the data probed by neutrons. However, to perform simulations at experimental conditions it is critical to use correct force-field (FF) parameters, which are unfortunately not available for most complex experimental systems. In this work, we have developed a workflow optimization technique to provide optimized FF parameters by comparing molecular dynamics (MD) to neutron scattering data. We describe the workflow in detail using an example system consisting of tRNA and hydrophilic nanodiamonds in a deuterated water (D2O) environment. Quasi-elastic neutron scattering (QENS) data show a faster motion of the tRNA in the presence of the nanodiamond than without it. To compare the QENS and MD results quantitatively, a proper choice of FF parameters is necessary. We use an efficient workflow to optimize the FF parameters between the hydrophilic nanodiamond and water by comparing to the QENS data. Our results show that accurate FF parameters can be obtained using this technique. The workflow can be generalized to other types of neutron data for FF optimization, such as vibrational spectroscopy and spin echo.
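The optimization loop can be sketched conceptually as minimizing the discrepancy between a simulated observable and the measured QENS signal; the placeholder function, the toy observable, and the parameter names below are assumptions, not the actual workflow code.

```python
# Conceptual sketch of the force-field optimization loop: adjust FF parameters,
# simulate, compare with the measured signal, and minimize the discrepancy.
import numpy as np
from scipy.optimize import minimize

q_values = np.linspace(0.3, 1.9, 9)            # momentum transfer grid (1/Angstrom)
measured = np.exp(-0.5 * q_values ** 2)        # stand-in for QENS-derived data

def run_md_and_compute_observable(params, q):
    """Placeholder: in practice this would launch an MD run with the candidate
    nanodiamond-water parameters and compute the corresponding observable."""
    epsilon, sigma = params
    return np.exp(-epsilon * sigma * q ** 2)

def discrepancy(params):
    simulated = run_md_and_compute_observable(params, q_values)
    return np.sum((simulated - measured) ** 2)

result = minimize(discrepancy, x0=[0.2, 1.0], method="Nelder-Mead")
print("optimized parameters:", result.x)
```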
Decaf: Decoupled Dataflows for In Situ High-Performance Workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dreher, M.; Peterka, T.
Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.
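Decaf's own Python API is not reproduced here; the sketch below only illustrates the general idea of describing a workflow graph whose edges carry dataflow attributes, using networkx as a neutral container.

```python
# Not the actual Decaf API: a generic description of a producer/consumer
# coupling where the connecting edge carries dataflow attributes.
import networkx as nx

graph = nx.DiGraph()
graph.add_node("md_sim", nprocs=512, executable="molecular_dynamics")
graph.add_node("viz", nprocs=32, executable="visualization_tool")
# The edge models the dataflow: resources reserved for it and the
# redistribution to apply while forwarding data between the tasks.
graph.add_edge("md_sim", "viz", dataflow_procs=16, redistribution="contiguous")

for src, dst, attrs in graph.edges(data=True):
    print(f"{src} -> {dst}: {attrs}")
```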
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.
2017-12-01
This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; (4) setting up data access mechanisms; (5) setting up the data in data tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate and realize efficiencies in the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
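The publication sequence above can be sketched as a simple pipeline of per-dataset steps; the function bodies and the placeholder URL/DOI values are illustrative only.

```python
# Schematic pipeline for the publication steps listed above; each stage is a
# placeholder so the sequence and per-dataset tracking are explicit.
def accept_package(ds):     ds["files_verified"] = True; return ds
def check_quality(ds):      ds["qc_passed"] = True; return ds
def assemble_metadata(ds):  ds["metadata"] = "standardized"; return ds
def setup_access(ds):       ds["access_url"] = "https://example.org/data"; return ds
def register_catalogue(ds): ds["catalogued"] = True; return ds
def mint_doi(ds):           ds["doi"] = "10.xxxx/placeholder"; return ds

PIPELINE = [accept_package, check_quality, assemble_metadata,
            setup_access, register_catalogue, mint_doi]

def publish(dataset):
    for step in PIPELINE:
        dataset = step(dataset)
        dataset.setdefault("log", []).append(step.__name__)  # track progress
    return dataset

print(publish({"name": "example_dataset"}))
```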
Structured recording of intraoperative surgical workflows
NASA Astrophysics Data System (ADS)
Neumuth, T.; Durstewitz, N.; Fischer, M.; Strauss, G.; Dietz, A.; Meixensberger, J.; Jannin, P.; Cleary, K.; Lemke, H. U.; Burgert, O.
2006-03-01
Surgical Workflows are used for the methodical and scientific analysis of surgical interventions. The approach described here is a step towards developing surgical assist systems based on Surgical Workflows and integrated control systems for the operating room of the future. This paper describes concepts and technologies for the acquisition of Surgical Workflows by monitoring surgical interventions, and for their presentation. Establishing systems which support the Surgical Workflow in operating rooms requires a multi-staged development process beginning with the description of these workflows. A formalized description of surgical interventions is needed to create a Surgical Workflow. This description can be used to analyze and evaluate surgical interventions in detail. We discuss the subdivision of surgical interventions into work steps at different levels of granularity and propose a recording scheme for the acquisition of manual surgical work steps from running interventions. To support the recording process during the intervention, we introduce a new software architecture. The core of the architecture is our Surgical Workflow editor, which is intended to deal with the manifold, complex and concurrent relations during an intervention. Furthermore, a method for the automatic generation of graphs is shown, which can display the recorded surgical work steps of the interventions. Finally, we conclude with considerations about extensions of our recording scheme to close the gap to S-PACS systems. The approach was used to record 83 surgical interventions from 6 intervention types in 3 different surgical disciplines: ENT surgery, neurosurgery and interventional radiology. The interventions were recorded at the University Hospital Leipzig, Germany and at the Georgetown University Hospital, Washington, D.C., USA.
NASA Astrophysics Data System (ADS)
Kalinin, Sergei V.; Kim, Yunseok; Fong, Dillon D.; Morozovska, Anna N.
2018-03-01
For over 70 years, ferroelectric materials have been one of the central research topics for condensed matter physics and material science, an interest driven both by fundamental science and applications. However, ferroelectric surfaces, the key component of ferroelectric films and nanostructures, still present a significant theoretical and even conceptual challenge. Indeed, stability of ferroelectric phase per se necessitates screening of polarization charge. At surfaces, this can lead to coupling between ferroelectric and semiconducting properties of material, or with surface (electro) chemistry, going well beyond classical models applicable for ferroelectric interfaces. In this review, we summarize recent studies of surface-screening phenomena in ferroelectrics. We provide a brief overview of the historical understanding of the physics of ferroelectric surfaces, and existing theoretical models that both introduce screening mechanisms and explore the relationship between screening and relevant aspects of ferroelectric functionalities starting from phase stability itself. Given that the majority of ferroelectrics exist in multiple-domain states, we focus on local studies of screening phenomena using scanning probe microscopy techniques. We discuss recent studies of static and dynamic phenomena on ferroelectric surfaces, as well as phenomena observed under lateral transport, light, chemical, and pressure stimuli. We also note that the need for ionic screening renders polarization switching a coupled physical–electrochemical process and discuss the non-trivial phenomena such as chaotic behavior during domain switching that stem from this.
Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Ramachandran, R.; Lynnes, C.
2009-12-01
A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using “service casts” and “interest casts” (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH’s Mining Workflow Composer and the open-source Active BPEL engine, and JPL’s SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicate ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the “sociological” problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).
Extension of specification language for soundness and completeness of service workflow
NASA Astrophysics Data System (ADS)
Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn
2018-05-01
A Service Workflow is an aggregation of distributed services that fulfills specific functionalities. With the ever-increasing number of available services, methodologies for selecting services against given requirements have become main research subjects in multiple disciplines. A few researchers have contributed formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec is demonstrated by examples addressing soundness, completeness, and consistency.
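One soundness condition a composition checker might verify, namely that every service's required inputs are produced upstream, can be sketched as follows; the service descriptions are invented, and SWSpec itself is a formal specification language rather than this Python structure.

```python
# Toy illustration of a soundness check for a sequential service composition:
# each service must have its required inputs produced earlier (or given initially).
services = [
    {"name": "ingest",   "requires": set(),          "provides": {"raw_data"}},
    {"name": "validate", "requires": {"raw_data"},   "provides": {"clean_data"}},
    {"name": "report",   "requires": {"clean_data"}, "provides": {"summary"}},
]

def composition_is_sound(workflow, initial=frozenset()):
    available = set(initial)
    for svc in workflow:
        missing = svc["requires"] - available
        if missing:
            print(f"unsound: {svc['name']} is missing {missing}")
            return False
        available |= svc["provides"]
    return True

print(composition_is_sound(services))   # -> True
```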
Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J
2015-01-01
Imaging-based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, software development of custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging-based clinical trials. The system enables a project coordinator to build a data collection and management system specifically tied to the study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging-related studies. The system was initially evaluated in an imaging-based rehabilitation clinical trial. The evaluation shows that the cost of development can be much reduced compared with a custom-built system. By providing a solution to customize a system and automate the workflow, the system will save development time and reduce errors, especially for imaging clinical trials. PMID:25870169
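A hedged sketch of retrieving a DICOM object through a WADO-URI request, the kind of access such a module provides; the endpoint URL and UIDs are placeholders, and authentication and error handling are omitted.

```python
# Sketch of a WADO-URI retrieval. The server URL and UIDs are hypothetical.
import requests

WADO_ENDPOINT = "https://pacs.example.org/wado"   # placeholder endpoint

params = {
    "requestType": "WADO",
    "studyUID": "1.2.840.0000.1",                 # placeholder UIDs
    "seriesUID": "1.2.840.0000.1.1",
    "objectUID": "1.2.840.0000.1.1.1",
    "contentType": "application/dicom",
}

response = requests.get(WADO_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
with open("image.dcm", "wb") as f:
    f.write(response.content)                     # save the retrieved object
```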
Optimizing high performance computing workflow for protein functional annotation.
Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene
2014-09-10
Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool, the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.
Optimizing high performance computing workflow for protein functional annotation
Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene
2014-01-01
Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool, the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data. PMID:25313296
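The fan-out pattern used in the annotation workflow above can be sketched with a placeholder classifier; the real workflow dispatches PSI-BLAST-based classification on HPC schedulers rather than a local multiprocessing pool.

```python
# Schematic of partitioning a protein set and classifying sequences in parallel.
# classify_protein is a stand-in for the PSI-BLAST-based assignment step.
from multiprocessing import Pool

def classify_protein(record):
    """Placeholder classifier: returns (protein_id, assigned_cluster)."""
    protein_id, sequence = record
    cluster = "COG%04d" % (hash(sequence) % 1000)   # fake assignment
    return protein_id, cluster

proteins = [("prot_%d" % i, "MKT" * (i % 7 + 1)) for i in range(1000)]

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        assignments = pool.map(classify_protein, proteins)
    print(assignments[:3])
```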
Standardized assessment of infrared thermographic fever screening system performance
NASA Astrophysics Data System (ADS)
Ghassemi, Pejhman; Pfefer, Joshua; Casamento, Jon; Wang, Quanzeng
2017-03-01
Thermal modalities represent the only currently viable mass fever screening approach for outbreaks of infectious disease pandemics such as Ebola and SARS. Non-contact infrared thermometers (NCITs) and infrared thermographs (IRTs) have been previously used for mass fever screening in transportation hubs such as airports to reduce the spread of disease. While NCITs remain a more popular choice for fever screening in the field and at fixed locations, there has been increasing evidence in the literature that IRTs can provide greater accuracy in estimating core body temperature if appropriate measurement practices are applied - including the use of technically suitable thermographs. Therefore, the purpose of this study was to develop a battery of evaluation test methods for standardized, objective and quantitative assessment of thermograph performance characteristics critical to assessing suitability for clinical use. These factors include stability, drift, uniformity, minimum resolvable temperature difference, and accuracy. Two commercial IRT models were characterized. An external temperature reference source with high temperature accuracy was utilized as part of the screening thermograph. Results showed that both IRTs are relatively accurate and stable (<1% error of reading with stability of +/-0.05°C). Overall, results of this study may facilitate development of standardized consensus test methods to enable consistent and accurate use of IRTs for fever screening.
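A few of the listed performance characteristics can be computed from repeated readings of a blackbody reference, as sketched below with synthetic data and simplified metric definitions.

```python
# Sketch: estimate accuracy error, stability, and drift of a thermograph from
# repeated readings of a reference source held at a known temperature.
import numpy as np

reference_temp = 35.0                       # degC, external reference source
rng = np.random.default_rng(1)
readings = (reference_temp
            + 0.02 * rng.standard_normal(600)   # short-term noise
            + 0.0005 * np.arange(600))          # slow drift over the session

accuracy_error = readings.mean() - reference_temp          # mean offset (degC)
stability = readings.std(ddof=1)                            # short-term scatter
drift = np.polyfit(np.arange(600), readings, 1)[0] * 600    # degC over session

print(f"accuracy error: {accuracy_error:+.3f} degC")
print(f"stability (1 sigma): {stability:.3f} degC")
print(f"drift over session: {drift:+.3f} degC")
```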
COSMOS: Python library for massively parallel workflows
Gafni, Erik; Luquette, Lovelace J.; Lancaster, Alex K.; Hawkins, Jared B.; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P.; Tonellato, Peter J.
2014-01-01
Summary: Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Availability and implementation: Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. Contact: dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24982428
OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.
Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A
2016-01-01
High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.
OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid
Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.
2016-01-01
High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617
Using Kepler for Tool Integration in Microarray Analysis Workflows.
Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C
Increasing numbers of genomic technologies are generating massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses.
COSMOS: Python library for massively parallel workflows.
Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J
2014-10-15
Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
Niazkhani, Zahra; Pirnejad, Habibollah; Berg, Marc; Aarts, Jos
2009-01-01
Previous studies have shown the importance of workflow issues in the implementation of CPOE systems and patient safety practices. To understand the impact of CPOE on clinical workflow, we developed a conceptual framework and conducted a literature search for CPOE evaluations published between 1990 and June 2007. Fifty-one publications were identified that disclosed mixed effects of CPOE systems. Among the frequently reported workflow advantages were legible orders, remote accessibility of the systems, and shorter order turnaround times. Among the frequently reported disadvantages were time-consuming and problematic user-system interactions, and the enforcement of a predefined relationship between clinical tasks and between providers. Given the diversity of findings in the literature, we conclude that more multi-method research is needed to explore CPOE's multidimensional and collective impact, especially on collaborative workflow.
Load-sensitive dynamic workflow re-orchestration and optimisation for faster patient healthcare.
Meli, Christopher L; Khalil, Ibrahim; Tari, Zahir
2014-01-01
Hospital waiting times are considerably long, with no signs of decreasing anytime soon. A number of factors, including population growth, the ageing population, and a lack of new infrastructure, are expected to further exacerbate waiting times in the near future. In this work, we show how healthcare services can be modelled as queueing nodes, together with healthcare service workflows, such that these workflows can be optimised during execution in order to reduce patient waiting times. Services such as X-ray, computed tomography, and magnetic resonance imaging often form queues; thus, by taking into account the waiting times of each service, the workflow can be re-orchestrated and optimised. Experimental results indicate that average waiting time reductions are achievable by optimising workflows using dynamic re-orchestration. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.
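A toy M/M/1 view of the queueing idea is sketched below: given current arrival and service rates, the next schedulable step is routed to the service with the smallest expected time in system, W = 1/(mu - lambda). The rates are invented, and the paper's re-orchestration acts on whole workflows rather than a single greedy choice.

```python
# Toy M/M/1 sketch: pick which pending imaging step to schedule next based on
# the current expected time in system of each service queue. Rates are invented.
pending_services = {
    "x_ray": {"arrival_rate": 5.5, "service_rate": 6.0},   # patients/hour
    "ct":    {"arrival_rate": 3.0, "service_rate": 4.0},
    "mri":   {"arrival_rate": 1.5, "service_rate": 2.0},
}

def expected_time_in_system(arrival_rate, service_rate):
    if arrival_rate >= service_rate:
        return float("inf")                 # queue is unstable, avoid it
    return 1.0 / (service_rate - arrival_rate)

waits = {name: expected_time_in_system(s["arrival_rate"], s["service_rate"])
         for name, s in pending_services.items()}
print(waits)                                # expected hours in system
print("schedule next:", min(waits, key=waits.get))
```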
Scientific workflows as productivity tools for drug discovery.
Shon, John; Ohkawa, Hitomi; Hammer, Juergen
2008-05-01
Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.
Optimization of tomographic reconstruction workflows on geographically distributed resources
Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...
2016-01-01
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
Optimization of tomographic reconstruction workflows on geographically distributed resources
Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.
2016-01-01
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Moreover, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks. PMID:27359149
Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission
NASA Technical Reports Server (NTRS)
Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan
2010-01-01
The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground-truth, sensing, and retrieval chain and assessing retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by its respective domain experts.
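A minimal sketch of the chaining pattern the abstract describes, in which each wrapped component passes its output payload to the next stage downstream; the component names and payload structure are illustrative assumptions, not the mission's actual services.

```python
# Sketch of an end-to-end modeling chain where every stage consumes and
# enriches a shared payload (e.g., a spectrum); stage names are hypothetical.
from functools import reduce
from typing import Callable, Sequence

Spectrum = dict  # placeholder for a reflectance/radiance payload

def surface_model(s: Spectrum) -> Spectrum:
    return {**s, "reflectance": "modeled"}

def radiative_transfer(s: Spectrum) -> Spectrum:
    return {**s, "radiance": "at-sensor"}

def instrument_model(s: Spectrum) -> Spectrum:
    return {**s, "counts": "detected"}

def retrieval(s: Spectrum) -> Spectrum:
    return {**s, "retrieved_reflectance": "estimate"}

def run_chain(stages: Sequence[Callable[[Spectrum], Spectrum]],
              ground_truth: Spectrum) -> Spectrum:
    """Fold the stages left to right over the initial ground-truth payload."""
    return reduce(lambda payload, stage: stage(payload), stages, ground_truth)

result = run_chain([surface_model, radiative_transfer, instrument_model, retrieval],
                   {"scene": "test"})
print(result)
```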
NASA Astrophysics Data System (ADS)
Tomlin, M. C.; Jenkyns, R.
2015-12-01
Ocean Networks Canada (ONC) collects data from observatories in the northeast Pacific, Salish Sea, Arctic Ocean, Atlantic Ocean, and land-based sites in British Columbia. Data are streamed, collected autonomously, or transmitted via satellite from a variety of instruments. The Software Engineering group at ONC develops and maintains Oceans 2.0, an in-house software system that acquires and archives data from sensors, and makes data available to scientists, the public, government and non-government agencies. The Oceans 2.0 workflow tool was developed by ONC to manage a large volume of tasks and processes required for instrument installation, recovery and maintenance activities. Since 2013, the workflow tool has supported 70 expeditions and grown to include 30 different workflow processes for the increasing complexity of infrastructures at ONC. The workflow tool strives to keep pace with an increasing heterogeneity of sensors, connections and environments by supporting versioning of existing workflows, and allowing the creation of new processes and tasks. Despite challenges in training and gaining mutual support from multidisciplinary teams, the workflow tool has become invaluable in project management in an innovative setting. It provides a collective place to contribute to ONC's diverse projects and expeditions and encourages more repeatable processes, while promoting interactions between the multidisciplinary teams who manage various aspects of instrument development and the data they produce. The workflow tool inspires documentation of terminologies and procedures, and effectively links to other tools at ONC such as JIRA, Alfresco and Wiki. Motivated by growing sensor schemes, modes of collecting data, archiving, and data distribution at ONC, the workflow tool ensures that infrastructure is managed completely from instrument purchase to data distribution. It integrates all areas of expertise and helps fulfill ONC's mandate to offer quality data to users.
Legrain, Fleur; Carrete, Jesús; van Roekeghem, Ambroise; Madsen, Georg K H; Mingo, Natalio
2018-01-18
Machine learning (ML) is increasingly becoming a helpful tool in the search for novel functional compounds. Here we use classification via random forests to predict the stability of half-Heusler (HH) compounds, using only experimentally reported compounds as a training set. Cross-validation yields an excellent agreement between the fraction of compounds classified as stable and the actual fraction of truly stable compounds in the ICSD. The ML model is then employed to screen 71 178 different 1:1:1 compositions, yielding 481 likely stable candidates. The predicted stability of HH compounds from three previous high-throughput ab initio studies is critically analyzed from the perspective of the alternative ML approach. The incomplete consistency among the three separate ab initio studies and between them and the ML predictions suggests that additional factors beyond those considered by ab initio phase stability calculations might be determinant to the stability of the compounds. Such factors can include configurational entropies and quasiharmonic contributions.
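A hedged sketch of this kind of stability screening with a random forest classifier and cross-validation; the descriptors and labels below are synthetic stand-ins, not the ICSD training data or the actual compositional features used in the study.

```python
# Random-forest stability classification with cross-validation, then
# screening of unseen candidate compositions; data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                    # stand-in compositional descriptors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # 1 = "stable", 0 = "unstable"

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy:", scores.mean())

# Screen unseen candidates and keep those predicted stable.
candidates = rng.normal(size=(1000, 8))
clf.fit(X, y)
likely_stable = candidates[clf.predict(candidates) == 1]
print(len(likely_stable), "candidates predicted stable")
```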
Schlesinger, Joseph J; Burdick, Kendall; Baum, Sarah; Bellomy, Melissa; Mueller, Dorothee; MacDonald, Alistair; Chern, Alex; Chrouser, Kristin; Burger, Christie
2018-03-01
The concept of clinical workflow borrows from management and leadership principles outside of medicine. The only way to rethink clinical workflow is to understand the neuroscience principles that underlie attention and vigilance. With any implementation to improve practice, there are human factors that can promote or impede progress. Modulating the environment and working as a team to take care of patients is paramount. Clinicians must continually rethink clinical workflow, evaluate progress, and understand that other industries have something to offer. Then, novel approaches can be implemented to take the best care of patients. Copyright © 2017 Elsevier Inc. All rights reserved.
Mirel, Barbara; Eichinger, Felix; Keller, Benjamin J; Kretzler, Matthias
2011-03-21
Bioinformatics visualization tools are often not robust enough to support biomedical specialists’ complex exploratory analyses. Tools need to accommodate the workflows that scientists actually perform for specific translational research questions. To understand and model one of these workflows, we conducted a case-based, cognitive task analysis of a biomedical specialist’s exploratory workflow for the question: What functional interactions among gene products of high throughput expression data suggest previously unknown mechanisms of a disease? From our cognitive task analysis four complementary representations of the targeted workflow were developed. They include: usage scenarios, flow diagrams, a cognitive task taxonomy, and a mapping between cognitive tasks and user-centered visualization requirements. The representations capture the flows of cognitive tasks that led a biomedical specialist to inferences critical to hypothesizing. We created representations at levels of detail that could strategically guide visualization development, and we confirmed this by making a trial prototype based on user requirements for a small portion of the workflow. Our results imply that visualizations should make available to scientific users “bundles of features” consonant with the compositional cognitive tasks purposefully enacted at specific points in the workflow. We also highlight certain aspects of visualizations that: (a) need more built-in flexibility; (b) are critical for negotiating meaning; and (c) are necessary for essential metacognitive support.
Managing and Communicating Operational Workflow
Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.
2016-01-01
Background: Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective: To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods: The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results: The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions: The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407
Feliubadaló, Lídia; Lopez-Doriga, Adriana; Castellsagué, Ester; del Valle, Jesús; Menéndez, Mireia; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Gómez, Carolina; Campos, Olga; Pineda, Marta; González, Sara; Moreno, Victor; Brunet, Joan; Blanco, Ignacio; Serra, Eduard; Capellá, Gabriel; Lázaro, Conxi
2013-01-01
Next-generation sequencing (NGS) is changing genetic diagnosis due to its huge sequencing capacity and cost-effectiveness. The aim of this study was to develop an NGS-based workflow for routine diagnostics for hereditary breast and ovarian cancer syndrome (HBOCS), to improve genetic testing for BRCA1 and BRCA2. An NGS-based workflow was designed using BRCA MASTR kit amplicon libraries followed by GS Junior pyrosequencing. Data analysis combined the freely available Variant Identification Pipeline software and ad hoc R scripts, including a cascade of filters to generate coverage and variant calling reports. A BRCA homopolymer assay was performed in parallel. A research scheme was designed in two parts. A Training Set of 28 DNA samples containing 23 unique pathogenic mutations and 213 other variants (33 unique) was used. The workflow was validated in a set of 14 samples from HBOCS families in parallel with the current diagnostic workflow (Validation Set). The NGS-based workflow developed permitted the identification of all pathogenic mutations and genetic variants, including those located in or close to homopolymers. The use of NGS for detecting copy-number alterations was also investigated. The workflow meets the sensitivity and specificity requirements for the genetic diagnosis of HBOCS and improves on the cost-effectiveness of current approaches. PMID:23249957
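As a rough illustration of a filter cascade over variant calls, the sketch below applies made-up coverage and allele-fraction thresholds and flags homopolymer-adjacent calls for separate confirmation; none of the thresholds or fields are taken from the published pipeline.

```python
# Toy variant filter cascade: coverage filter, allele-fraction filter,
# then flag homopolymer-adjacent calls for confirmation by another assay.
variants = [
    {"pos": 101, "depth": 250, "alt_frac": 0.48, "homopolymer": False},
    {"pos": 202, "depth": 18,  "alt_frac": 0.45, "homopolymer": False},
    {"pos": 303, "depth": 400, "alt_frac": 0.45, "homopolymer": True},
]

def passes(v, min_depth=30, min_alt_frac=0.2) -> bool:
    if v["depth"] < min_depth:        # insufficient coverage
        return False
    if v["alt_frac"] < min_alt_frac:  # likely sequencing noise
        return False
    return True

reportable = [v for v in variants if passes(v)]
flagged = [v for v in reportable if v["homopolymer"]]  # confirm separately
print(len(reportable), "reportable;", len(flagged), "need homopolymer confirmation")
```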
Rare earth phosphors and phosphor screens
Buchanan, Robert A.; Maple, T. Grant; Sklensky, Alden F.
1981-01-01
This invention relates to rare earth phosphor screens for converting image carrying incident radiation to image carrying visible or near-visible radiation and to the rare earth phosphor materials utilized in such screens. The invention further relates to methods for converting image carrying charged particles to image carrying radiation principally in the blue and near-ultraviolet region of the spectrum and to stabilized rare earth phosphors characterized by having a continuous surface layer of the phosphors of the invention. More particularly, the phosphors of the invention are oxychlorides and oxybromides of yttrium, lanthanum and gadolinium activated with trivalent cerium and the conversion screens are of the type illustratively including x-ray conversion screens, image amplifier tube screens, neutron imaging screens, cathode ray tube screens, high energy gamma ray screens, scintillation detector screens and screens for real-time translation of image carrying high energy radiation to image carrying visible or near-visible radiation.
Opportunistic Computing with Lobster: Lessons Learned from Scaling up to 25k Non-Dedicated Cores
NASA Astrophysics Data System (ADS)
Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Yannakopoulos, Anna; Tovar, Benjamin; Donnelly, Patrick; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas
2017-10-01
We previously described Lobster, a workflow management tool for exploiting volatile opportunistic computing resources for computation in HEP. We will discuss the various challenges that have been encountered while scaling up the simultaneous CPU core utilization and the software improvements required to overcome these challenges. Categories: Workflows can now be divided into categories based on their required system resources. This allows the batch queueing system to optimize assignment of tasks to nodes with the appropriate capabilities. Within each category, limits can be specified for the number of running jobs to regulate the utilization of communication bandwidth. System resource specifications for a task category can now be modified while a project is running, avoiding the need to restart the project if resource requirements differ from the initial estimates. Lobster now implements time limits on each task category to voluntarily terminate tasks. This allows partially completed work to be recovered. Workflow dependency specification: One workflow often requires data from other workflows as input. Rather than waiting for earlier workflows to be completed before beginning later ones, Lobster now allows dependent tasks to begin as soon as sufficient input data has accumulated. Resource monitoring: Lobster utilizes a new capability in Work Queue to monitor the system resources each task requires in order to identify bottlenecks and optimally assign tasks. The capability of the Lobster opportunistic workflow management system for HEP computation has been significantly increased. We have demonstrated efficient utilization of 25 000 non-dedicated cores and achieved a data input rate of 30 Gb/s and an output rate of 500 GB/h. This has required new capabilities in task categorization, workflow dependency specification, and resource monitoring.
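A toy sketch of the per-category resource limits and dependency-aware start condition described above; the class names, fields, and values are assumptions made for illustration and are not Lobster's actual configuration interface.

```python
# Per-category resource limits plus a "start as soon as some upstream output
# exists" rule, in the spirit of the features described; all names are invented.
from dataclasses import dataclass, field

@dataclass
class Category:
    name: str
    cores: int
    memory_mb: int
    max_running: int          # cap on concurrently running tasks
    wall_time_s: int          # voluntary time limit per task

@dataclass
class Workflow:
    name: str
    category: Category
    depends_on: list = field(default_factory=list)

    def can_start(self, available_inputs: dict) -> bool:
        """Start once every upstream workflow has produced some input."""
        return all(available_inputs.get(dep, 0) > 0 for dep in self.depends_on)

sim = Workflow("simulation", Category("cpu_heavy", cores=4, memory_mb=8000,
                                      max_running=500, wall_time_s=3600))
reco = Workflow("reconstruction", Category("io_heavy", cores=1, memory_mb=2000,
                                           max_running=200, wall_time_s=1800),
                depends_on=["simulation"])
print(reco.can_start({"simulation": 120}))   # True once partial output exists
```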
Web-video-mining-supported workflow modeling for laparoscopic surgeries.
Liu, Rui; Zhang, Xiaoli; Zhang, Hao
2016-11-01
As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as knowledge of the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. For generating a robust and reliable SWM, a large amount of training data is required. However, training data collected by physically recording surgery operations is often limited, and data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy surgery. The generated workflow was evaluated against 4 web-retrieved videos and 4 operating-room-recorded videos, respectively. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. The satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
Brady, Anne-Marie; Byrne, Gobnait; Quirke, Mary Brigid; Lynch, Aine; Ennis, Shauna; Bhangu, Jaspreet; Prendergast, Meabh
2017-11-01
This study aimed to evaluate the nature and type of communication and workflow arrangements between nurses and doctors out-of-hours (OOH). Effective communication and workflow arrangements between nurses and doctors are essential to minimize risk in hospital settings, particularly in the out-of-hours period. Timely patient flow is a priority for all healthcare organizations and the quality of communication and workflow arrangements influences patient safety. Qualitative descriptive design and data collection methods included focus groups and individual interviews. A 500-bed tertiary referral acute hospital in Ireland. Junior and senior Non-Consultant Hospital Doctors, staff nurses and nurse managers. Both nurses and doctors acknowledged the importance of good interdisciplinary communication and collaborative working, in sustaining effective workflow and enabling a supportive working environment and patient safety. Indeed, issues of safety and missed care OOH were found to be primarily due to difficulties of communication and workflow. Medical workflow OOH is often dependent on cues and communication to/from nursing. However, communication systems and, in particular, the bleep system, considered central to the process of communication between doctors and nurses OOH, can contribute to workflow challenges and increased staff stress. It was reported as commonplace for routine work that should be completed during normal hours to fall into OOH when resources were most limited, further compounding risk to patient safety. Enhancement of communication strategies between nurses and doctors has the potential to remove barriers to effective decision-making and patient flow. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
Widening the adoption of workflows to include human and human-machine scientific processes
NASA Astrophysics Data System (ADS)
Salayandia, L.; Pinheiro da Silva, P.; Gates, A. Q.
2010-12-01
Scientific workflows capture knowledge in the form of technical recipes to access and manipulate data that help scientists manage and reuse established expertise to conduct their work. Libraries of scientific workflows are being created in particular fields, e.g., Bioinformatics, which, combined with cyber-infrastructure environments that provide on-demand access to data and tools, result in powerful workbenches for scientists of those communities. The focus in these particular fields, however, has been more on automating rather than documenting scientific processes. As a result, technical barriers have impeded a wider adoption of scientific workflows by scientific communities that do not rely as heavily on cyber-infrastructure and computing environments. Semantic Abstract Workflows (SAWs) are introduced to widen the applicability of workflows as a tool to document scientific recipes or processes. SAWs intend to capture a scientist’s perspective about the process of how she or he would collect, filter, curate, and manipulate data to create the artifacts that are relevant to her/his work. In contrast, scientific workflows describe the process from the point of view of how technical methods and tools are used to conduct the work. By focusing on a higher level of abstraction that is closer to a scientist’s understanding, SAWs effectively capture the controlled vocabularies that reflect a particular scientific community, as well as the types of datasets and methods used in a particular domain. From there on, SAWs provide the flexibility to adapt to different environments to carry out the recipes or processes. These environments range from manual fieldwork to highly technical cyber-infrastructure environments, such as those already supported by scientific workflows. Two cases, one from Environmental Science and another from Geophysics, are presented as illustrative examples.
Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E
To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events with the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause-analysis was performed to determine causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, after two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting which requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
TB Mobile: a mobile app for anti-tuberculosis molecules with known targets
2013-01-01
Background: An increasing number of researchers are focused on strategies for developing inhibitors of Mycobacterium tuberculosis (Mtb) as tuberculosis (TB) drugs. Results: In order to learn from prior work we have collated information on molecules screened versus Mtb and their targets, which has been made available in the Collaborative Drug Discovery (CDD) database. This dataset contains published data on target, essentiality, links to PubMed, TBDB, TBCyc (which provides a pathway-based visualization of the entire cellular biochemical network) and human homolog information. The development of mobile cheminformatics apps could lower the barrier to drug discovery and promote collaboration. Therefore we have used this set of over 700 molecules screened versus Mtb and their targets to create a free mobile app (TB Mobile) that displays molecule structures and links to the bioinformatics data. By inputting a molecular structure and performing a similarity search within the app, users can infer potential targets, or search by targets to retrieve compounds known to be active. Conclusions: TB Mobile may assist researchers as part of their workflow in identifying potential targets for hits generated from phenotypic screening and in prioritizing them for further follow-up. The app is designed to lower the barriers to accessing this information, so that all researchers with an interest in combatting this deadly disease can use it freely to the benefit of their own efforts. PMID:23497706
NASA Astrophysics Data System (ADS)
Maffucci, Irene; Hu, Xiao; Fumagalli, Valentina; Contini, Alessandro
2018-03-01
Nwat-MMGBSA is a variant of MM-PB/GBSA based on the inclusion of a number of explicit water molecules that are the closest to the ligand in each frame of a molecular dynamics trajectory. This method demonstrated improved correlations between calculated and experimental binding energies in both protein-protein interactions and ligand-receptor complexes, in comparison to the standard MM-GBSA. A protocol optimization, aimed to maximize efficacy and efficiency, is discussed here considering penicillopepsin, HIV1-protease, and BCL-XL as test cases. Calculations were performed in triplicates on both classic HPC environments and on standard workstations equipped by a GPU card, evidencing no statistical differences in the results. No relevant differences in correlation to experiments were also observed when performing Nwat-MMGBSA calculations on 4 ns or 1 ns long trajectories. A fully automatic workflow for structure-based virtual screening, performing from library set-up to docking and Nwat-MMGBSA rescoring, has then been developed. The protocol has been tested against no rescoring or standard MM-GBSA rescoring within a retrospective virtual screening of inhibitors of AmpC β-lactamase and of the Rac1-Tiam1 protein-protein interaction. In both cases, Nwat-MMGBSA rescoring provided a statistically significant increase in the ROC AUCs of between 20% and 30%, compared to docking scoring or to standard MM-GBSA rescoring.
Modern approaches to accelerate discovery of new antischistosomal drugs.
Neves, Bruno Junior; Muratov, Eugene; Machado, Renato Beilner; Andrade, Carolina Horta; Cravo, Pedro Vitor Lemos
2016-06-01
The almost exclusive use of only praziquantel for the treatment of schistosomiasis has raised concerns about the possible emergence of drug-resistant schistosomes. Consequently, there is an urgent need for new antischistosomal drugs. The identification of leads and the generation of high quality data are crucial steps in the early stages of schistosome drug discovery projects. Herein, the authors focus on the current developments in antischistosomal lead discovery, specifically referring to the use of automated in vitro target-based and whole-organism screens and virtual screening of chemical databases. They highlight the strengths and pitfalls of each of the above-mentioned approaches, and suggest possible roadmaps towards the integration of several strategies, which may contribute to optimizing research outputs and lead to more successful and cost-effective drug discovery endeavors. Increasing partnerships and access to funding for drug discovery have strengthened the battle against schistosomiasis in recent years. However, the authors believe this battle also includes innovative strategies to overcome scientific challenges. In this context, significant advances of in vitro screening as well as computer-aided drug discovery have contributed to increase the success rate and reduce the costs of drug discovery campaigns. Although some of these approaches were already used in current antischistosomal lead discovery pipelines, the integration of these strategies in a solid workflow should allow the production of new treatments for schistosomiasis in the near future.
Green, Beverly B; Anderson, Melissa L; Chubak, Jessica; Baldwin, Laura Mae; Tuzzio, Leah; Catz, Sheryl; Cole, Alison; Vernon, Sally W
2016-01-01
The patient-centered medical home (PCMH) includes comprehensive chronic illness and preventive services, including identifying patients who are overdue for colorectal cancer screening (CRCS). The association between PCMH implementation and CRCS during the Systems of Support to Increase Colorectal Cancer Screening Trial (SOS) is described. The SOS enrolled 4664 patients from 21 clinics from August 2008 to November 2009. Patients were randomized to usual care, mailed fecal kits, kits plus brief assistance, or kits plus assistance and navigation. A PCMH model that included a workflow for facilitating CRCS was implemented at all study clinics in late 2009. Patients enrolled early had little exposure to the PCMH, whereas patients enrolled later were exposed during most of their first year in the trial. Logistic regression models were used to assess the association between PCMH exposure and CRCS. Usual care patients with ≥8 months in the PCMH had higher CRCS rates than those with ≤4 months in the PCMH (adjusted difference, 10.1%; 95% confidence interval, 5.7-14.6). SOS interventions led to significant increases in CRCS, but the magnitude of effect was attenuated by exposure to the PCMH (P for interaction = .01). Exposure to a PCMH was associated with higher CRCS rates. Automated mailed and centrally delivered stepped interventions increased CRCS rates, even in the presence of a PCMH. © Copyright 2016 by the American Board of Family Medicine.
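A hedged sketch of the kind of logistic regression analysis described, built on a synthetic data frame with made-up column names rather than the SOS trial data (statsmodels is assumed available; the coefficients are arbitrary).

```python
# Logistic regression relating months of PCMH exposure and intervention arm
# to screening completion, on simulated data; nothing here is trial data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "pcmh_months": rng.integers(0, 13, n),    # months of PCMH exposure
    "intervention": rng.integers(0, 2, n),    # mailed-kit arm vs usual care
})
logit_p = -1.0 + 0.08 * df["pcmh_months"] + 0.6 * df["intervention"]
df["crcs"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))  # screening completed

model = smf.logit("crcs ~ pcmh_months + intervention", data=df).fit(disp=False)
print(np.exp(model.params))   # odds ratios per covariate
```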
Chen, Wen-Chi; Park, Sung-Hyun; Hoffman, Carol; Philip, Cecil; Robinson, Linda; West, James; Grunig, Gabriele
2013-01-16
The function of the right heart is to pump blood through the lungs, thus linking right heart physiology and pulmonary vascular physiology. Inflammation is a common modifier of heart and lung function, by elaborating cellular infiltration, production of cytokines and growth factors, and by initiating remodeling processes. Compared to the left ventricle, the right ventricle is a low-pressure pump that operates in a relatively narrow zone of pressure changes. Increased pulmonary artery pressures are associated with increased pressure in the lung vascular bed and pulmonary hypertension. Pulmonary hypertension is often associated with inflammatory lung diseases, for example chronic obstructive pulmonary disease, or autoimmune diseases. Because pulmonary hypertension confers a bad prognosis for quality of life and life expectancy, much research is directed towards understanding the mechanisms that might be targets for pharmaceutical intervention. The main challenge for the development of effective management tools for pulmonary hypertension remains the complexity of the simultaneous understanding of molecular and cellular changes in the right heart, the lungs and the immune system. Here, we present a procedural workflow for the rapid and precise measurement of pressure changes in the right heart of mice and the simultaneous harvest of samples from heart, lungs and immune tissues. The method is based on the direct catheterization of the right ventricle via the jugular vein in close-chested mice, first developed in the late 1990s as surrogate measure of pressures in the pulmonary artery. The organized team-approach facilitates a very rapid right heart catheterization technique. This makes it possible to perform the measurements in mice that spontaneously breathe room air. The organization of the work-flow in distinct work-areas reduces time delay and opens the possibility to simultaneously perform physiology experiments and harvest immune, heart and lung tissues. The procedural workflow outlined here can be adapted for a wide variety of laboratory settings and study designs, from small, targeted experiments, to large drug screening assays. The simultaneous acquisition of cardiac physiology data that can be expanded to include echocardiography and harvest of heart, lung and immune tissues reduces the number of animals needed to obtain data that move the scientific knowledge basis forward. The procedural workflow presented here also provides an ideal basis for gaining knowledge of the networks that link immune, lung and heart function. The same principles outlined here can be adapted to study other or additional organs as needed.
Knowledge Extraction and Semantic Annotation of Text from the Encyclopedia of Life
Thessen, Anne E.; Parr, Cynthia Sims
2014-01-01
Numerous digitization and ontological initiatives have focused on translating biological knowledge from narrative text to machine-readable formats. In this paper, we describe two workflows for knowledge extraction and semantic annotation of text data objects featured in an online biodiversity aggregator, the Encyclopedia of Life. One workflow tags text with DBpedia URIs based on keywords. Another workflow finds taxon names in text using GNRD for the purpose of building a species association network. Both workflows work well: the annotation workflow has an F1 Score of 0.941 and the association algorithm has an F1 Score of 0.885. Existing text annotators such as Terminizer and DBpedia Spotlight performed well, but require some optimization to be useful in the ecology and evolution domain. Important future work includes scaling up and improving accuracy through the use of distributional semantics. PMID:24594988
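For reference, an F1 score such as those reported is the harmonic mean of precision and recall; the true-positive, false-positive, and false-negative counts in the small example below are invented purely for illustration.

```python
# F1 from annotation counts (counts are made up, not from the paper).
def f1_score(tp: int, fp: int, fn: int) -> float:
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

print(round(f1_score(tp=940, fp=55, fn=63), 3))  # ~0.941 for these toy counts
```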
A framework for service enterprise workflow simulation with multi-agents cooperation
NASA Astrophysics Data System (ADS)
Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun
2013-11-01
Dynamic process modelling for service business is the key technique for Service-Oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used for the dynamic analysis of service business processes. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as a social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
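A toy sketch of agent bidding that mixes self-interest with a social-rationality weight, in the spirit of the framework described; the weights, costs, and queue lengths are invented for illustration and are not the paper's scheduling rules.

```python
# Each agent's bid blends its own processing cost with a social term that
# penalizes already-loaded agents; the task goes to the lowest bid.
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    own_cost: dict            # task -> processing cost for this agent
    social_weight: float      # 0 = purely selfish, 1 = purely social

def bid(agent: Agent, task: str, queue_length: int) -> float:
    selfish = agent.own_cost.get(task, float("inf"))
    social = queue_length     # favor agents with shorter queues
    return (1 - agent.social_weight) * selfish + agent.social_weight * social

agents = [Agent("A", {"invoice": 3.0, "review": 5.0}, 0.3),
          Agent("B", {"invoice": 4.0, "review": 2.0}, 0.7)]
queues = {"A": 2, "B": 6}
winner = min(agents, key=lambda a: bid(a, "invoice", queues[a.name]))
print("assign 'invoice' to", winner.name)
```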
Hu, Meng; Krauss, Martin; Brack, Werner; Schulze, Tobias
2016-11-01
Liquid chromatography-high resolution mass spectrometry (LC-HRMS) is a well-established technique for nontarget screening of contaminants in complex environmental samples. Automatic peak detection is essential, but its performance has only rarely been assessed and optimized so far. With the aim of filling this gap, we used pristine water extracts spiked with 78 contaminants as a test case to evaluate and optimize chromatogram and spectral data processing. To assess whether data acquisition strategies have a significant impact on peak detection, three values of MS cycle time (CT) of an LTQ Orbitrap instrument were tested. Furthermore, the key parameter settings of the data processing software MZmine 2 were optimized to detect the maximum number of target peaks from the samples by the design of experiments (DoE) approach and compared to a manual evaluation. The results indicate that short CT significantly improves the quality of automatic peak detection, which means that full-scan acquisition without additional MS2 experiments is suggested for nontarget screening. MZmine 2 detected 75-100% of the peaks compared to manual peak detection at an intensity level of 10⁵ in a validation dataset on both spiked and real water samples under optimal parameter settings. Finally, we provide an optimization workflow of MZmine 2 for LC-HRMS data processing that is applicable to environmental samples for nontarget screening. The results also show that the DoE approach is useful and effort-saving for optimizing data processing parameters.
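A sketch of a DoE-style full-factorial search over peak-detection parameters, scoring each setting by how many spiked targets it recovers; the parameter names, levels, and the stand-in scoring function are assumptions for illustration, not MZmine 2's actual settings or API.

```python
# Full-factorial sweep over hypothetical peak-detection parameters; the
# scoring function is a placeholder for running detection and matching the
# 78 spiked targets.
from itertools import product

noise_levels = [1e3, 5e3, 1e4]
min_peak_heights = [1e4, 5e4, 1e5]
mz_tolerances_ppm = [5, 10, 20]

def targets_recovered(noise, height, tol_ppm) -> int:
    """Stand-in score: sensitive settings recover more of the 78 targets."""
    return int(78 * min(1.0, (1e4 / noise) * (1e4 / height) * (20 / tol_ppm)))

best = max(product(noise_levels, min_peak_heights, mz_tolerances_ppm),
           key=lambda p: targets_recovered(*p))
print("best setting (noise, height, ppm):", best)
```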
Patel, Preeti; Singh, Avineesh; Patel, Vijay K; Jain, Deepak K; Veerasamy, Ravichandran; Rajak, Harish
2016-01-01
Histone deacetylase (HDAC) inhibitors can reactivate gene expression and inhibit the growth and survival of cancer cells. To identify the important pharmacophoric features and correlate 3D chemical structure with biological activity using 3D-QSAR and pharmacophore modeling studies. The pharmacophore hypotheses were developed using the e-pharmacophore script and the Phase module. A pharmacophore hypothesis represents the 3D arrangement of molecular features necessary for activity. A series of 55 compounds with well-assigned HDAC inhibitory activity was used for 3D-QSAR model development. The best 3D-QSAR model, a five-factor partial least squares (PLS) model with good statistics and predictive ability, achieved Q² = 0.7293 and R² = 0.9811, with a cross-validated coefficient r²cv = 0.9807 and R²pred = 0.7147 and a low standard deviation (0.0952). Additionally, the selected pharmacophore model DDRRR.419 was used as a 3D query for virtual screening against the ZINC database. In the virtual screening workflow, docking studies (HTVS, SP and XP) were carried out by selecting multiple receptors (PDB ID: 1T69, 1T64, 4LXZ, 4LY1, 3MAX, 2VQQ, 3C10, 1W22). Finally, six compounds were obtained based on high scoring function (dock score −11.2278 to −10.2222 kcal/mol) and diverse structures. The structure-activity correlation was established using virtual screening, docking, energy-based pharmacophore modeling, pharmacophore and atom-based 3D-QSAR models, and their validation. The outcomes of these studies could be further employed for the design of novel HDAC inhibitors for anticancer activity.
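For context, the cross-validated Q² cited for such PLS models is conventionally defined as below (a standard definition, not a formula taken from the paper):

```latex
Q^{2} \;=\; 1 - \frac{\mathrm{PRESS}}{\mathrm{SS}_{\mathrm{tot}}}
      \;=\; 1 - \frac{\sum_{i}\bigl(y_i - \hat{y}_{i,\mathrm{cv}}\bigr)^{2}}
                     {\sum_{i}\bigl(y_i - \bar{y}\bigr)^{2}}
```

where \hat{y}_{i,\mathrm{cv}} is the prediction for compound i when it is left out of the training folds and \bar{y} is the mean observed activity.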
Automated High-Throughput Permethylation for Glycosylation Analysis of Biologics Using MALDI-TOF-MS.
Shubhakar, Archana; Kozak, Radoslaw P; Reiding, Karli R; Royle, Louise; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred
2016-09-06
Monitoring glycoprotein therapeutics for changes in glycosylation throughout the drug's life cycle is vital, as glycans significantly modulate the stability, biological activity, serum half-life, safety, and immunogenicity. Biopharma companies are increasingly adopting Quality by Design (QbD) frameworks for measuring, optimizing, and controlling drug glycosylation. Permethylation of glycans prior to analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a valuable tool for glycan characterization and for screening of large numbers of samples in QbD drug realization. However, the existing protocols for manual permethylation and liquid-liquid extraction (LLE) steps are labor intensive and are thus not practical for high-throughput (HT) studies. Here we present a glycan permethylation protocol, based on 96-well microplates, that has been developed into a kit suitable for HT work. The workflow is largely automated using a liquid handling robot and includes N-glycan release, enrichment of N-glycans, permethylation, and LLE. The kit has been validated according to industry analytical performance guidelines and applied to characterize biopharmaceutical samples, including IgG4 monoclonal antibodies (mAbs) and recombinant human erythropoietin (rhEPO). The HT permethylation enabled glycan characterization and relative quantitation with minimal side reactions: the MALDI-TOF-MS profiles obtained were in good agreement with hydrophilic interaction liquid chromatography (HILIC) and ultrahigh performance liquid chromatography (UHPLC) data. Automated permethylation and extraction of 96 glycan samples was achieved in less than 5 h, and automated data acquisition on MALDI-TOF-MS took on average less than 1 min per sample. This automated and HT glycan preparation and permethylation proved to be convenient, fast, and reliable and can be applied for drug glycan profiling and clinical glycan biomarker studies.
Singh, Nidhi; Shah, Priyanka; Dwivedi, Hemlata; Mishra, Shikha; Tripathi, Renu; Sahasrabuddhe, Amogh A; Siddiqi, Mohammad Imran
2016-11-15
N-Myristoyltransferase (NMT) catalyzes the transfer of myristate to the amino-terminal glycine of a subset of proteins, a co-translational modification involved in trafficking substrate proteins to membrane locations, stabilization and protein-protein interactions. It is a studied and validated pre-clinical drug target for fungal and parasitic infections. In the present study, a machine learning approach, docking studies and CoMFA analysis have been integrated with the objective of translating this knowledge into a pipelined workflow for the identification of putative hits through the screening of large compound libraries. In the proposed pipeline, the reported parasitic NMT inhibitors have been used to develop predictive machine learning classification models. Simultaneously, a TbNMT complex model was generated to establish the relationship between the binding modes of the inhibitors for LmNMT and TbNMT through molecular dynamics simulation studies. A 3D-QSAR model was developed and used to predict the activity of the proposed hits in the subsequent step. The hits classified as active based on the machine learning model were assessed as potential anti-trypanosomal NMT inhibitors through molecular docking studies, predicted activity using the QSAR model and visual inspection. In the final step, the proposed pipeline was validated through in vitro experiments. A total of seven hits were proposed and tested in vitro for evaluation of dual inhibitory activity against Leishmania donovani and Trypanosoma brucei. Of these, five compounds showed significant inhibition against both organisms. The most active common compound, SEW04173, belongs to a pyrazole carboxylate scaffold and is anticipated to enrich the chemical space with enhanced potency through optimization.
A Six‐Stage Workflow for Robust Application of Systems Pharmacology
Gadkar, K; Kirouac, DC; Mager, DE; van der Graaf, PH
2016-01-01
Quantitative and systems pharmacology (QSP) is increasingly being applied in pharmaceutical research and development. One factor critical to the ultimate success of QSP is the establishment of commonly accepted language, technical criteria, and workflows. We propose an integrated workflow that bridges conceptual objectives with underlying technical detail to support the execution, communication, and evaluation of QSP projects. PMID:27299936
Using Workflow Diagrams to Address Hand Hygiene in Pediatric Long-Term Care Facilities
Carter, Eileen J.; Cohen, Bevin; Murray, Meghan T.; Saiman, Lisa; Larson, Elaine L.
2015-01-01
Hand hygiene (HH) in pediatric long-term care settings has been found to be sub-optimal. Multidisciplinary teams at three pediatric long-term care facilities developed step-by-step workflow diagrams of commonly performed tasks highlighting HH opportunities. Diagrams were validated through observation of tasks and concurrent diagram assessment. Facility teams developed six workflow diagrams that underwent 22 validation observations. Four main themes emerged: 1) diagram specificity, 2) wording and layout, 3) timing of HH indications, and 4) environmental hygiene. The development of workflow diagrams is an opportunity to identify and address the complexity of HH in pediatric long-term care facilities. PMID:25773517
High-volume workflow management in the ITN/FBI system
NASA Astrophysics Data System (ADS)
Paulson, Thomas L.
1997-02-01
The Identification Tasking and Networking (ITN) Federal Bureau of Investigation system will manage the processing of more than 70,000 submissions per day. The workflow manager controls the routing of each submission through a combination of automated and manual processing steps whose exact sequence is dynamically determined by the results at each step. For most submissions, one or more of the steps involve the visual comparison of fingerprint images. The ITN workflow manager is implemented within a scalable client/server architecture. The paper describes the key aspects of the ITN workflow manager design which allow the high volume of daily processing to be successfully accomplished.
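A toy sketch of result-driven routing, in which the outcome of an automated step decides whether a manual comparison step is inserted for a submission; the step names and threshold are illustrative assumptions, not the ITN design.

```python
# Dynamic routing: low-confidence automated matches are sent to a manual
# fingerprint-comparison step; everything else skips straight to record update.
def automated_match(submission) -> float:
    return submission.get("match_score", 0.0)

def route(submission):
    steps = ["automated_match"]
    if automated_match(submission) < 0.90:   # low confidence -> examiner review
        steps.append("manual_fingerprint_comparison")
    steps.append("record_update")
    return steps

print(route({"id": 1, "match_score": 0.95}))
print(route({"id": 2, "match_score": 0.42}))
```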
Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses
Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T
2014-01-01
Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analyses tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analyses workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analyses tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and the support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600
Development of the workflow kine systems for support on KAIZEN.
Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro
2012-01-01
In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of the workflow line investigation, we considered the anticipated effects and the problems for KAIZEN. Workflow line information included location information and action-content information. These technologies suggest viewpoints to help improvement, for example, the exclusion of useless movement, the redesign of layout, and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time. As a result of this investigation, a more efficient layout was suggested by this system. In the case of the hospital, similarly, it was pointed out that the workflow has problems of layout and setup operations, based on the effective movement patterns of the experts. This system could adapt to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.
[Integration of the radiotherapy irradiation planning in the digital workflow].
Röhner, F; Schmucker, M; Henne, K; Momm, F; Bruggmoser, G; Grosu, A-L; Frommhold, H; Heinemann, F E
2013-02-01
At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a major challenge; it requires interdisciplinary expertise, and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards approved by the responsible authority.
Tang, Xiaoxiao; Qiao, Xiuying; Miller, Reinhard; Sun, Kang
2016-12-01
The amphiphilic character and surface activity endows silk fibroin with the ability to reside at fluid interfaces and effectively stabilize emulsions. However, the influence of relevant factors and their actual effect on the interfacial viscoelasticity and stability of silk fibroin at the oil/water interface has received less attention. In the present study, the effect of ionic strength on the interfacial viscoelasticity, emulsification effectiveness and stability of silk fibroin at the oil/water interface was investigated in detail. A higher ion concentration facilitates greater adsorption, stronger molecular interaction and faster structure reorganization of silk fibroin at the oil/water interface, thus causing quicker interfacial saturation adsorption, greater interfacial strength and lower interfacial structural fracture on large deformation. However, the presence of concentrated ions screens the charges in silk fibroin molecules and the zeta potential decreases as a result of electrostatic screening and ion-binding effects, which may result in emulsion droplet coalescence and a decrease in emulsion stability. The positively-charged ions significantly affect the interfacial elasticity and stability of silk fibroin layers at the oil/water interface as a result of the strong electrostatic interactions between counter-ions and the negatively-charged groups of silk fibroin. © 2016 Society of Chemical Industry. © 2016 Society of Chemical Industry.
Achyuthan, Komandoor E.; Wheeler, David R.
2015-08-27
Evaluating the stability of coupling reagents, quality control (QC), and surface functionalization metrology are all critical to the production of high quality peptide microarrays. We describe a broadly applicable screening technique for evaluating the fidelity of solid phase peptide synthesis (SPPS), the stability of activation/coupling reagents, and a microarray surface metrology tool. This technique was used to assess the stability of the activation reagent 1-{[1-(Cyano-2-ethoxy-2-oxo-ethylidenaminooxy)dimethylamino-morpholinomethylene]}methaneaminium hexafluorophosphate (COMU) (Sigma-Aldrich, St. Louis, MO, USA) by SPPS of Leu-Enkephalin (YGGFL) or the coupling of commercially synthesized YGGFL peptides to (3-aminopropyl)triethoxysilane-modified glass surfaces. Coupling efficiency was quantitated by fluorescence signaling based on immunoreactivity of the YGGFL motif. It was concluded that COMU solutions should be prepared fresh and used within 5 h when stored at ~23 °C and not beyond 24 h if stored refrigerated, both in closed containers. Caveats to gauging COMU stability by absorption spectroscopy are discussed. Commercial YGGFL peptides needed independent QC, due to immunoreactivity variations for the same sequence synthesized by different vendors. This technique is useful in evaluating the stability of other activation/coupling reagents besides COMU and as a metrology tool for SPPS and peptide microarrays.
Autism in the Faroe Islands: Diagnostic Stability from Childhood to Early Adult Life
Kočovská, Eva; Billstedt, Eva; Ellefsen, Asa; Kampmann, Hanna; Gillberg, I. Carina; Biskupstø, Rannvá; Andorsdóttir, Guðrið; Stóra, Tormóður; Minnis, Helen; Gillberg, Christopher
2013-01-01
Childhood autism or autism spectrum disorder (ASD) has been regarded as one of the most stable diagnostic categories applied to young children with psychiatric/developmental disorders. The stability over time of a diagnosis of ASD is theoretically interesting and important for various diagnostic and clinical reasons. We studied the diagnostic stability of ASD from childhood to early adulthood in the Faroe Islands: a total school age population sample (8–17-year-olds) was screened and diagnostically assessed for AD in 2002 and 2009. This paper compares both independent clinical diagnosis and Diagnostic Interview for Social and Communication Disorders (DISCO) algorithm diagnosis at two time points, separated by seven years. The stability of clinical ASD diagnosis was perfect for AD, good for “atypical autism”/PDD-NOS, and less than perfect for Asperger syndrome (AS). Stability of the DISCO algorithm subcategory diagnoses was more variable but still good for AD. Both systems showed excellent stability over the seven-year period for “any ASD” diagnosis, although a number of clear cases had been missed at the original screening in 2002. The findings support the notion that subcategories of ASD should be collapsed into one overarching diagnostic entity with subgrouping achieved on other “non-autism” variables, such as IQ and language levels and overall adaptive functioning. PMID:23476144
Wells, Stewart; Bullen, Chris
2008-01-01
This article describes the near failure of an information technology (IT) system designed to support a government-funded, primary care-based hepatitis B screening program in New Zealand. Qualitative methods were used to collect data and construct an explanatory model. Multiple incorrect assumptions were made about participants, primary care workflows and IT capacity, software vendor user knowledge, and the health IT infrastructure. Political factors delayed system development and it was implemented untested, almost failing. An intensive rescue strategy included system modifications, relaxation of data validity rules, close engagement with software vendors, and provision of intensive on-site user support. This case study demonstrates that consideration of the social, political, technological, and health care contexts is important for successful implementation of public health informatics projects.
Streamlining workflow and automation to accelerate laboratory scale protein production.
Konczal, Jennifer; Gray, Christopher H
2017-05-01
Protein production facilities are often required to produce diverse arrays of proteins for demanding methodologies including crystallography, NMR, ITC and other reagent intensive techniques. It is common for these teams to find themselves a bottleneck in the pipeline of ambitious projects. This pressure to deliver has resulted in the evolution of many novel methods to increase capacity and throughput at all stages in the pipeline for generation of recombinant proteins. This review aims to describe current and emerging options to accelerate the success of protein production in Escherichia coli. We emphasize technologies that have been evaluated and implemented in our laboratory, including innovative molecular biology and expression vectors, small-scale expression screening strategies and the automation of parallel and multidimensional chromatography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kim, Byung-Ho; Hwang, Jonghee; Lee, Young Jin; Kim, Jin-Ho; Jeon, Dae-Woo; Lee, Mi Jai
2016-08-01
We developed a fabrication method for remote phosphor by a screen-printing process, using green phosphor, red phosphor, and thermally stable glass frit. The glass frit was introduced for long-term stability. The optical properties of the remote phosphor were observed via an integrating sphere; the photoluminescence spectrum dramatically changed on incorporating a minor amount of the red phosphor. These unique optical properties were elucidated using four factors: phosphor ratio, scattering induced by packing density, light intensity per unit volume, and reabsorption. The thermal stability of the remote phosphor was investigated at 500°C, demonstrating its outstanding thermal properties.
Neonatal endocrine emergencies: a primer for the emergency physician.
Park, Elizabeth; Pearson, Nadia M; Pillow, M Tyson; Toledo, Alexander
2014-05-01
The resuscitation principles of securing the airway and stabilizing hemodynamics remain the same in any neonatal emergency. However, stabilizing endocrine disorders may prove especially challenging. Several organ systems are affected simultaneously and the clinical presentation can be subtle. Although not all-inclusive, the implementation of newborn screening tests has significantly reduced morbidity and mortality in neonates. Implementing routine screening tests worldwide and improving the accuracy of present tests remains the challenge for healthcare providers. With further study of these disorders and best treatment practices we can provide neonates presenting to the emergency department with the best possible outcomes. Copyright © 2014 Elsevier Inc. All rights reserved.
DOT National Transportation Integrated Search
1975-08-01
The purpose of this study was to determine the feasibility of using an expansive cement, TXI 4C Chem Comp, in lieu of the regular Type I Portland Cement in a cement-stabilized gravel screenings base so as to eliminate or reduce cracks associate...
An end-to-end workflow for engineering of biological networks from high-level specifications.
Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun
2012-08-17
We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells.
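The staged translation described above (specification, compilation, part assignment, assembly) can be pictured as a small pipeline of data transformations. The following Python sketch is only an illustration of that idea, using invented data structures and part names; it is not the authors' Proto-based toolchain.

```python
# Illustrative sketch only: a minimal four-stage pipeline in the spirit of the
# Specification -> Compilation -> Part Assignment -> Assembly toolchain described
# above. All function bodies, motifs, and part names are hypothetical placeholders.

def compile_specification(boolean_spec):
    """Translate a Boolean specification into an abstract regulatory network (AGRN)."""
    # Hypothetical: one repressor motif per input/output rule.
    return [{"motif": "NOT-gate", "input": a, "output": b} for a, b in boolean_spec]

def assign_parts(agrn, part_db):
    """Map each abstract node to a concrete DNA part from a platform database."""
    return [{**node, "part": part_db.get(node["motif"], "unassigned")} for node in agrn]

def plan_assembly(assigned):
    """Order the assigned parts into a simple sequential assembly plan."""
    return [node["part"] for node in assigned if node["part"] != "unassigned"]

if __name__ == "__main__":
    spec = [("aTc", "GFP")]                      # hypothetical sensor -> reporter rule
    part_db = {"NOT-gate": "pTet-TetR-GFP"}      # hypothetical part library entry
    agrn = compile_specification(spec)
    print(plan_assembly(assign_parts(agrn, part_db)))
```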
Berggren, Elisabet; White, Andrew; Ouedraogo, Gladys; Paini, Alicia; Richarz, Andrea-Nicole; Bois, Frederic Y; Exner, Thomas; Leite, Sofia; Grunsven, Leo A van; Worth, Andrew; Mahony, Catherine
2017-11-01
We describe and illustrate a workflow for chemical safety assessment that completely avoids animal testing. The workflow, which was developed within the SEURAT-1 initiative, is designed to be applicable to cosmetic ingredients as well as to other types of chemicals, e.g. active ingredients in plant protection products, biocides or pharmaceuticals. The aim of this work was to develop a workflow to assess chemical safety without relying on any animal testing, but instead by constructing a hypothesis based on existing data, in silico modelling and biokinetic considerations, and then by targeted non-animal testing. For illustrative purposes, we consider a hypothetical new ingredient x as a new component in a body lotion formulation. The workflow is divided into tiers in which points of departure are established through in vitro testing and in silico prediction, as the basis for estimating a safe external dose in a repeated use scenario. The workflow includes a series of possible exit (decision) points, with increasing levels of confidence, based on the sequential application of the Threshold of Toxicological Concern (TTC) approach, read-across, followed by an "ab initio" assessment, in which chemical safety is determined entirely by new in vitro testing and in vitro to in vivo extrapolation by means of mathematical modelling. We believe that this workflow could be applied as a tool to inform targeted and toxicologically relevant in vitro testing, where necessary, and to gain confidence in safety decision making without the need for animal testing.
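The tiered exit points described above amount to stopping at the first level of evidence that supports a safety decision. The sketch below illustrates that control flow only; the tier names follow the abstract, but the threshold value and dictionary fields are hypothetical assumptions, not SEURAT-1 values.

```python
# Minimal sketch (not the SEURAT-1 tooling) of tiered exit-point logic:
# stop at the first tier that yields sufficient confidence.

def assess(ingredient, estimated_exposure_mg_per_kg):
    tiers = [
        ("TTC", lambda: estimated_exposure_mg_per_kg < 0.0025),            # hypothetical cut-off
        ("read-across", lambda: ingredient.get("analogue_noael") is not None),
        ("ab initio", lambda: ingredient.get("in_vitro_pod") is not None),
    ]
    for name, sufficient in tiers:
        if sufficient():
            return f"exit at {name} tier"
    return "insufficient data: further targeted testing needed"

# Example: an ingredient with a read-across analogue but exposure above the TTC cut-off.
print(assess({"analogue_noael": 50.0}, 0.1))
```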
Stabilization of glucose-oxidase in the graphene paste for screen-printed glucose biosensor
NASA Astrophysics Data System (ADS)
Pepłowski, Andrzej; Janczak, Daniel; Jakubowska, Małgorzata
2015-09-01
Various methods and materials for enzyme stabilization within a screen-printed graphene sensor were analyzed. The main goal was to develop a technology allowing immediate printing of the biosensors in a single printing process. The factors considered were the toxicity of the materials used, the ability of the material to be screen-printed (squeezed through the printing mesh) and the temperatures required in the fabrication process. Performance of the examined sensors was measured by amperometry, and the measurements were then analyzed and compared with the medical requirements. The parameters calculated were the correlation coefficient between the concentration of the analyte and the measured electrical current (0.986) and the coefficient of variation at each analyte concentration used as a calibration point. Variation of the measured values was significant only in ranges close to 0, decreasing for the concentrations of clinical importance. These outcomes justify further development of graphene-based biosensors fabricated through printing techniques.
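The two reported figures of merit, the concentration-current correlation coefficient and the per-concentration coefficient of variation, can be computed as in the following sketch. All replicate values are invented placeholders (the paper's raw data are not reproduced here), and Python 3.10+ is assumed for statistics.correlation.

```python
# Sketch of the calibration statistics mentioned above; numbers are hypothetical.
import statistics

concentrations = [0.0, 2.0, 4.0, 8.0, 16.0]          # mmol/L, hypothetical calibration points
replicate_currents = {                                # µA, hypothetical replicate readings
    0.0: [0.02, 0.05, 0.03],
    2.0: [1.10, 1.20, 1.15],
    4.0: [2.30, 2.25, 2.40],
    8.0: [4.60, 4.70, 4.55],
    16.0: [9.10, 9.30, 9.20],
}

means = [statistics.mean(replicate_currents[c]) for c in concentrations]
r = statistics.correlation(concentrations, means)     # Pearson r (requires Python >= 3.10)

# Coefficient of variation per calibration point; it is largest near zero concentration,
# consistent with the trend described in the abstract.
cv = {c: statistics.stdev(v) / statistics.mean(v) * 100 for c, v in replicate_currents.items()}

print(f"r = {r:.3f}")
print({c: f"{value:.1f}%" for c, value in cv.items()})
```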
Geotechnical applications of LiDAR pertaining to geomechanical evaluation and hazard identification
NASA Astrophysics Data System (ADS)
Lato, Matthew J.
Natural hazards related to ground movement that directly affect the safety of motorists and highway infrastructure include, but are not limited to, rockfalls, rockslides, debris flows, and landslides. This thesis specifically deals with the evaluation of rockfall hazards through the analysis of LiDAR data. Light Detection And Ranging (LiDAR) is an imaging technology that can be used to delineate and evaluate geomechanically-controlled hazards. LiDAR has been adopted to conduct hazard evaluations pertaining to rockfall, rock-avalanches, debris flows, and landslides. Characteristics of LiDAR surveying, such as rapid data acquisition rates, mobile data collection, and high data densities, pose problems to traditional CAD or GIS-based mapping methods. New analysis methods, including tools specifically oriented to geomechanical analyses, are needed. The research completed in this thesis supports development of new methods, including improved survey techniques, innovative software workflows, and processing algorithms to aid in the detection and evaluation of geomechanically controlled rockfall hazards. The scientific research conducted between 2006 and 2010, as presented in this thesis, is divided into five chapters, each of which has been published by or is under review by an international journal. The five research foci are: (i) geomechanical feature extraction and analysis using LiDAR data in active mining environments; (ii) engineered monitoring of rockfall hazards along transportation corridors using mobile terrestrial LiDAR; (iii) optimization of LiDAR scanning and processing for automated structural evaluation of discontinuities in rockmasses; (iv) location orientation bias when using static LiDAR data for geomechanical analysis; and (v) evaluating roadside rockmasses for rockfall hazards from LiDAR data: optimizing data collection and processing protocols. The research conducted for this thesis has direct and significant implications with respect to numerous engineering projects that are affected by geomechanical stability issues. The ability to efficiently and accurately map discontinuities, detect changes, and standardize roadside geomechanical stability analyses from remote locations will fundamentally change the state-of-practice of geotechnical investigation workflows and repeatable monitoring. This, in turn, will lead to earlier detection and definition of potential zones of instability, will allow for progressive monitoring and risk analysis, and will indicate the need for pro-active slope improvement and stabilization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalinin, Sergei V.; Kim, Yunseok; Fong, Dillon D.
For over 70 years, ferroelectric materials have been one of the central research topics for condensed matter physics and materials science, an interest driven both by fundamental science and applications. However, ferroelectric surfaces, the key component of ferroelectric films and nanostructures, still present a significant theoretical and even conceptual challenge. Indeed, stability of the ferroelectric phase per se necessitates screening of the polarization charge. At surfaces, this can lead to coupling between the ferroelectric and semiconducting properties of the material, or with surface (electro)chemistry, going well beyond the classical models applicable to ferroelectric interfaces. In this review, we summarize recent studies of surface-screening phenomena in ferroelectrics. We provide a brief overview of the historical understanding of the physics of ferroelectric surfaces, and existing theoretical models that both introduce screening mechanisms and explore the relationship between screening and relevant aspects of ferroelectric functionalities starting from phase stability itself. Given that the majority of ferroelectrics exist in multiple-domain states, we focus on local studies of screening phenomena using scanning probe microscopy techniques. We discuss recent studies of static and dynamic phenomena on ferroelectric surfaces, as well as phenomena observed under lateral transport, light, chemical, and pressure stimuli. We also note that the need for ionic screening renders polarization switching a coupled physical-electrochemical process and discuss non-trivial phenomena such as chaotic behavior during domain switching that stem from this.
Molecular Determinants of Estrogen Receptor Alpha Stability
2008-07-01
presence of E2. This question can be addressed by a T7 phage display screen using a breast cancer cell library and DNA-bound ERα in the presence of...conformation of ERα induced by 27HC versus E2. To accomplish this, we performed combinatorial phage display using a modified M13 phage display screen
Cao, Xuan; Lau, Christian; Liu, Yihang; Wu, Fanqi; Gui, Hui; Liu, Qingzhou; Ma, Yuqiang; Wan, Haochuan; Amer, Moh R; Zhou, Chongwu
2016-11-22
Semiconducting single-wall carbon nanotubes are ideal semiconductors for printed electronics due to their advantageous electrical and mechanical properties, intrinsic printability in solution, and desirable stability in air. However, fully printed, large-area, high-performance, and flexible carbon nanotube active-matrix backplanes are still difficult to realize for future displays and sensing applications. Here, we report fully screen-printed active-matrix electrochromic displays employing carbon nanotube thin-film transistors. Our fully printed backplane shows high electrical performance with mobility of 3.92 ± 1.08 cm² V⁻¹ s⁻¹, on-off current ratio Ion/Ioff ∼ 10⁴, and good uniformity. The printed backplane was then monolithically integrated with an array of printed electrochromic pixels, resulting in an entirely screen-printed active-matrix electrochromic display (AMECD) with good switching characteristics, facile manufacturing, and long-term stability. Overall, our fully screen-printed AMECD is promising for the mass production of large-area and low-cost flexible displays for applications such as disposable tags, medical electronics, and smart home appliances.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guan, Qiang
At exascale, the challenge becomes to develop applications that run at scale and use exascale platforms reliably, efficiently, and flexibly. Workflows become much more complex because they must seamlessly integrate simulation and data analytics. They must include down-sampling, post-processing, feature extraction, and visualization. Power and data transfer limitations require these analysis tasks to be run in-situ or in-transit. We expect successful workflows will comprise multiple linked simulations along with tens of analysis routines. Users will have limited development time at scale and, therefore, must have rich tools to develop, debug, test, and deploy applications. At this scale, successful workflows will compose linked computations from an assortment of reliable, well-defined computation elements, ones that can come and go as required, based on the needs of the workflow over time. We propose a novel framework that utilizes both virtual machines (VMs) and software containers to create a workflow system that establishes a uniform build and execution environment (BEE) beyond the capabilities of current systems. In this environment, applications will run reliably and repeatably across heterogeneous hardware and software. Containers, both commercial (Docker and Rocket) and open-source (LXC and LXD), define a runtime that isolates all software dependencies from the machine operating system. Workflows may contain multiple containers that run different operating systems, different software, and even different versions of the same software. We will run containers in open-source virtual machines (KVM) and emulators (QEMU) so that workflows run on any machine entirely in user-space. On this platform of containers and virtual machines, we will deliver workflow software that provides services, including repeatable execution, provenance, checkpointing, and future proofing. We will capture provenance about how containers were launched and how they interact to annotate workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware to preserve native performance. Containers will link development to continuous integration. When application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.
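As a rough illustration of the container-per-task idea described above (and not the BEE framework itself), the sketch below pins each workflow step to a container image and prints the launch command as a dry run. The image names, commands, and the choice of `docker run` as the runtime are assumptions made for illustration only.

```python
# Illustrative sketch: each workflow task declares a container image so the same
# step can run identically across hosts. Images and commands are hypothetical;
# the launch commands are printed (dry run) rather than executed.

def container_command(image, command, workdir="/work"):
    # 'docker run --rm' with a bind mount is a common invocation; a rootless or
    # HPC-specific container runtime could be substituted here.
    return f"docker run --rm -v $PWD:{workdir} -w {workdir} {image} {command}"

workflow = [
    {"name": "simulate",   "image": "example/sim:1.0",      "cmd": "./run_simulation --steps 1000"},
    {"name": "downsample", "image": "example/analysis:2.3", "cmd": "python downsample.py out.h5"},
    {"name": "visualize",  "image": "example/viz:0.9",      "cmd": "python render.py out_small.h5"},
]

for task in workflow:
    print(task["name"], "->", container_command(task["image"], task["cmd"]))
```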
PGen: large-scale genomic variations analysis workflow and browser in SoyKB.
Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti
2016-10-06
With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version in GitHub ( https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow ) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB), ( http://soykb.org/Pegasus/index.php ). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage. 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects adding to 500+ soybean germplasm lines in total have been integrated. These SNPs are being utilized for trait improvement using genotype to phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser ( http://soykb.org/NGS_Resequence/NGS_index.php ) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. PGen workflow has been optimized for the most efficient analysis of soybean data using thorough testing and validation. This research serves as an example of best practices for development of genomics data analysis workflows by integrating remote HPC resources and efficient data management with ease of use for biological users. PGen workflow can also be easily customized for analysis of data in other species.
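Conceptually, a resequencing workflow like PGen chains alignment, variant calling, annotation, and CNV detection as a dependency graph executed on remote resources. The sketch below shows only that orchestration pattern; the step names and placeholder commands are hypothetical and do not reflect the actual PGen/Pegasus configuration.

```python
# Minimal sketch, not the PGen implementation: a resequencing analysis expressed
# as a dependency graph of steps, executed in topological order. The commands are
# hypothetical placeholders, not real tool invocations.
from graphlib import TopologicalSorter

steps = {
    "align":         {"after": [],                "cmd": "aligner sample.fastq ref.fa > sample.bam"},
    "call_variants": {"after": ["align"],         "cmd": "caller sample.bam ref.fa > sample.vcf"},
    "annotate":      {"after": ["call_variants"], "cmd": "annotator sample.vcf > annotated.vcf"},
    "cnv":           {"after": ["align"],         "cmd": "cnv_tool sample.bam > cnv_regions.bed"},
}

order = TopologicalSorter({name: spec["after"] for name, spec in steps.items()}).static_order()
for name in order:
    # A real workflow engine would submit these tasks to HPC or cloud resources.
    print(f"[{name}] {steps[name]['cmd']}")
```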
Distributed Data Integration Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Critchlow, T; Ludaescher, B; Vouk, M
The Internet is becoming the preferred method for disseminating scientific data from a variety of disciplines. This can result in information overload on the part of the scientists, who are unable to query all of the relevant sources, even if they knew where to find them, what they contained, how to interact with them, and how to interpret the results. A related issue is that keeping up with current trends in information technology often taxes the end-user's expertise and time. Thus, instead of benefiting from this information-rich environment, scientists become experts on a small number of sources and technologies, use them almost exclusively, and develop a resistance to innovations that can enhance their productivity. Enabling information-based scientific advances, in domains such as functional genomics, requires fully utilizing all available information and the latest technologies. In order to address this problem we are developing an end-user-centric, domain-sensitive, workflow-based infrastructure, shown in Figure 1, that will allow scientists to design complex scientific workflows that reflect the data manipulation required to perform their research without an undue burden. We are taking a three-tiered approach to designing this infrastructure utilizing (1) abstract workflow definition, construction, and automatic deployment, (2) complex agent-based workflow execution and (3) automatic wrapper generation. In order to construct a workflow, the scientist defines an abstract workflow (AWF) in terminology (semantics and context) that is familiar to him/her. This AWF includes all of the data transformations, selections, and analyses required by the scientist, but does not necessarily specify particular data sources. This abstract workflow is then compiled into an executable workflow (EWF, in our case XPDL) that is then evaluated and executed by the workflow engine. This EWF contains references to specific data sources and interfaces capable of performing the desired actions. In order to provide access to the largest number of resources possible, our lowest level utilizes automatic wrapper generation techniques to create information and data wrappers capable of interacting with the complex interfaces typical in scientific analysis. The remainder of this document outlines our work in these three areas, the impact our work has made, and our plans for the future.
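The compilation of an abstract workflow (AWF) into an executable workflow (EWF) can be thought of as binding each abstract operation to a concrete, wrapped data source. The following sketch illustrates that binding step only, using invented operation names, a hypothetical source registry, and plain dictionaries standing in for XPDL.

```python
# Illustrative sketch of abstract-to-executable workflow compilation: an abstract
# step names only the operation, and a compiler binds it to a concrete source and
# endpoint. All names and URLs below are hypothetical examples.

ABSTRACT_WORKFLOW = [
    {"op": "fetch_sequences", "organism": "E. coli"},
    {"op": "blast_search",    "database": "protein"},
]

SOURCE_REGISTRY = {  # hypothetical registry of wrapped data sources
    "fetch_sequences": {"source": "genbank_wrapper", "endpoint": "https://example.org/genbank"},
    "blast_search":    {"source": "blast_wrapper",   "endpoint": "https://example.org/blast"},
}

def compile_workflow(awf):
    """Bind each abstract step to a concrete source, producing an executable workflow."""
    return [{**step, **SOURCE_REGISTRY[step["op"]]} for step in awf]

for step in compile_workflow(ABSTRACT_WORKFLOW):
    print(step)
```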
NASA Astrophysics Data System (ADS)
Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.
2007-12-01
Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but is needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.
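Attribute-value pairing in plain text, as mentioned above for the seismological metadata, can be handled with a few lines of parsing code. The sketch below uses invented keys and values purely as an example of the format, not actual PetaSHA records.

```python
# Tiny sketch of parsing plain-text attribute-value metadata; the entries are
# invented examples.

sample = """\
velocity_model = CVM-S4
source_id = hypothetical_event_001
min_vs = 500 m/s
"""

def parse_attribute_values(text):
    """Return a dict of key/value pairs from 'key = value' lines."""
    pairs = {}
    for line in text.splitlines():
        if "=" in line:
            key, value = line.split("=", 1)
            pairs[key.strip()] = value.strip()
    return pairs

print(parse_attribute_values(sample))
```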
Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Collins, Patrick; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian; von Delft, Frank
2017-03-01
XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein-ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011), Acta Cryst. D67, 235-242] or PHENIX [Adams et al. (2010), Acta Cryst. D66, 213-221] have entrenched the paradigm that a `project' is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement up until data dissemination. It provides multiple entry points depending on the need of each project, enables batch processing of multiple data sets and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system, and the only dependency is to have the latest version of CCP4 installed. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects.
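As an illustration of the kind of per-dataset bookkeeping described above, the sketch below records stage and annotation information in an SQLite table. The schema and column names are hypothetical and are not XCE's actual database layout.

```python
# Sketch (not the actual XCE schema) of recording per-dataset progress and
# annotations in SQLite, in the spirit of batch tracking for fragment screens.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE datasets (
        crystal_id TEXT PRIMARY KEY,
        ligand_id  TEXT,
        stage      TEXT,   -- e.g. 'processed', 'refined', 'deposited' (hypothetical stages)
        annotation TEXT
    )""")
con.execute("INSERT INTO datasets VALUES (?, ?, ?, ?)",
            ("xtal_0001", "frag_17", "refined", "clear density for ligand"))

for row in con.execute("SELECT crystal_id, stage, annotation FROM datasets"):
    print(row)
```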
Sherwood, Carly A; Eastham, Ashley; Lee, Lik Wee; Risler, Jenni; Mirzaei, Hamid; Falkner, Jayson A; Martin, Daniel B
2009-07-01
Multiple reaction monitoring (MRM) is a highly sensitive method of targeted mass spectrometry (MS) that can be used to selectively detect and quantify peptides based on the screening of specified precursor peptide-to-fragment ion transitions. MRM-MS sensitivity depends critically on the tuning of instrument parameters, such as collision energy and cone voltage, for the generation of maximal product ion signal. Although generalized equations and values exist for such instrument parameters, there is no clear indication that optimal signal can be reliably produced for all types of MRM transitions using such an algorithmic approach. To address this issue, we have devised a workflow functional on both Waters Quattro Premier and ABI 4000 QTRAP triple quadrupole instruments that allows rapid determination of the optimal value of any programmable instrument parameter for each MRM transition. Here, we demonstrate the strategy for the optimizations of collision energy and cone voltage, but the method could be applied to other instrument parameters, such as declustering potential, as well. The workflow makes use of the incremental adjustment of the precursor and product m/z values at the hundredth decimal place to create a series of MRM targets at different collision energies that can be cycled through in rapid succession within a single run, avoiding any run-to-run variability in execution or comparison. Results are easily visualized and quantified using the MRM software package Mr. M to determine the optimal instrument parameters for each transition.
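The core trick described above is to offset duplicate transitions at the hundredth decimal place of the m/z values so that each copy can carry its own collision energy within a single run. The sketch below generates such a transition list; the m/z values and collision-energy range are hypothetical, and a real method would be built in the instrument vendor's software.

```python
# Sketch of the transition-multiplexing idea: copies of one MRM transition are
# offset at the hundredth decimal place so each copy can be assigned a different
# collision energy. All numerical values are hypothetical.

precursor_mz, product_mz = 523.28, 400.20
collision_energies = [15, 20, 25, 30, 35]   # eV, hypothetical ramp

transitions = []
for i, ce in enumerate(collision_energies):
    offset = i * 0.01                        # hundredth-place increment keeps copies distinct
    transitions.append({
        "precursor_mz": round(precursor_mz + offset, 2),
        "product_mz": round(product_mz + offset, 2),
        "collision_energy": ce,
    })

for t in transitions:
    print(t)
```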
The equivalency between logic Petri workflow nets and workflow nets.
Wang, Jing; Yu, ShuXia; Du, YuYue
2015-01-01
Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. In this paper, an online shop in electronic commerce is modeled to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin
The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules, an in-memory data store, with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.
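One way to picture the task-placement aspect mentioned above is locality-aware scheduling: run a task on the node that already holds most of its input data in the in-memory store. The sketch below shows that heuristic only, with invented node and block names; it is not the Hercules scheduler.

```python
# Minimal sketch of locality-aware task placement: prefer the node holding most
# of a task's input blocks, otherwise fall back to the least-loaded node.
# Node names, block names, and loads are hypothetical.

data_location = {"blockA": "node1", "blockB": "node2", "blockC": "node1"}
node_load = {"node1": 0, "node2": 0}

def place(task_inputs):
    votes = {}
    for block in task_inputs:
        node = data_location.get(block)
        if node:
            votes[node] = votes.get(node, 0) + 1
    chosen = max(votes, key=votes.get) if votes else min(node_load, key=node_load.get)
    node_load[chosen] += 1
    return chosen

print(place(["blockA", "blockC"]))   # -> node1 (data locality wins)
print(place(["blockB"]))             # -> node2
```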
Ergonomic design for dental offices.
Ahearn, David J; Sanders, Martha J; Turcotte, Claudia
2010-01-01
The increasing complexity of the dental office environment influences productivity and workflow for dental clinicians. Advances in technology, and with it the range of products needed to provide services, have led to sprawl in operatory setups and the potential for awkward postures for dental clinicians during the delivery of oral health services. Although ergonomics often addresses the prevention of musculoskeletal disorders for specific populations of workers, concepts of workflow and productivity are integral to improved practice in work environments. This article provides suggestions for improving workflow and productivity for dental clinicians. The article applies ergonomic principles to dental practice issues such as equipment and supply management, office design, and workflow management. Implications for improved ergonomic processes and future research are explored.
Diabetes screening anxiety and beliefs.
Skinner, T C; Davies, M J; Farooqi, A M; Jarvis, J; Tringham, J R; Khunti, K
2005-11-01
This study assesses the impact of screening for diabetes on anxiety levels in an ethnically mixed population in the UK, and explores whether beliefs about Type 2 diabetes account for these anxiety levels. This cross-sectional study recruited individuals who were identified at high risk of developing diabetes through general practitioners' (GPs) lists or through public media recruitment. Participants completed an oral glucose tolerance test (OGTT). Between blood tests, participants completed the Spielberger State Anxiety Scale Short Form, the Emotional Stability Scale of the Big Five Inventory 44 and three scales from the Diabetes Illness Representations Questionnaire, revised for this study. Of the 1339 who completed the OGTT and questionnaire booklet, 54% were female, with 21% from an Asian background. Forty-five per cent of participants reported little to moderate amounts of anxiety at screening (mean 35.2; sd = 11.6). There was no significant effect of family history of diabetes, ethnic group or recruitment method on anxiety. The only variable significantly associated (negatively) with anxiety was the personality trait of emotional stability. Of responders, 64% and 61% agreed that diabetes was caused by diet or hereditary factors, respectively. Only 155 individuals (12%) agreed that diabetes was serious, shortens life and causes complications. The results of this study replicate that of previous studies, indicating that screening for diabetes does not induce significant anxiety. Bivariate analysis indicated that individuals who perceived diabetes to be serious, life shortening and resulting in complications had higher anxiety scores, the personality trait of emotional stability being the strongest predictor of anxiety.
He, Dong; Luo, Wen; Wang, Zhiyuan; Lv, Pengmei; Yuan, Zhenhong; Huang, Shaowei; Xv, Jingliang
2017-07-01
Directed evolution has proved to be an effective way to improve the stability of proteins, but high-throughput screening assays for directed evolution that simultaneously improve two or more properties are still rare. In this study, we aimed to establish a membrane-blot assay for use in the high-throughput screening of Rhizomucor miehei lipases (RMLs). With the assistance of the membrane-blot screening assay, a mutant E47K named G10 that showed improved thermal stability was detected in the first round of error-prone PCR. Using G10 as the parent, two variants, G10-11 and G10-20, were obtained that showed improved thermal stability and methanol tolerance without loss of activity compared to the wild-type RML. The T50^60 value of G10-11 and G10-20 increased by 12°C and 6.5°C, respectively. After incubation for 1 h, the residual activity of G10-11 and G10-20 was 63.45% and 74.33%, respectively, in 50% methanol, and 15.98% and 30.22%, respectively, in 80% methanol. Thus, we successfully developed a membrane-blot assay that could be used for the high-throughput screening of RMLs with improved thermostability and methanol tolerance. Based on our findings, we believe that our newly developed membrane-blot assay will have potential applications in directed evolution in the future. Copyright © 2017 Elsevier Inc. All rights reserved.