Sample records for workflow design implementation

  1. Managing and Communicating Operational Workflow: Designing and Implementing an Electronic Outpatient Whiteboard.

    PubMed

    Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M

    2016-01-01

    Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. The objective was to describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.

  2. DEWEY: the DICOM-enabled workflow engine system.

    PubMed

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.

  3. Design and implementation of workflow engine for service-oriented architecture

    NASA Astrophysics Data System (ADS)

    Peng, Shuqing; Duan, Huining; Chen, Deyun

    2009-04-01

    As computer networks develop rapidly and enterprise applications become increasingly distributed, traditional workflow engines show several deficiencies, such as complex structure, poor stability, poor portability, limited reusability, and difficult maintenance. In this paper, in order to improve the stability, scalability, and flexibility of workflow management systems, a four-layer workflow engine architecture based on SOA is put forward according to the XPDL standard of the Workflow Management Coalition. The route control mechanism of the control model is realized, a scheduling strategy covering both cyclic and acyclic routing is designed, and the workflow engine is implemented using technologies such as XML, JSP, and EJB.
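
    The routing and scheduling ideas above can be made concrete with a small sketch. The following Python fragment is illustrative only (the paper's engine is built on XPDL, XML, JSP, and EJB, none of which appear here): it schedules an acyclic route by topological ordering of activities and raises an error when a cyclic route, which would need the separate cyclic-routing strategy, is detected.

```python
"""Minimal sketch of acyclic route scheduling in a workflow engine.

Illustrative only: the Workflow class and activity names are invented for
this example; they are not part of the XPDL standard or the paper's engine.
"""
from collections import deque


class Workflow:
    def __init__(self):
        self.activities = {}   # activity name -> callable
        self.transitions = {}  # activity name -> list of successor names

    def add_activity(self, name, func, after=()):
        self.activities[name] = func
        self.transitions.setdefault(name, [])
        for predecessor in after:
            self.transitions.setdefault(predecessor, []).append(name)

    def run(self):
        # Kahn's algorithm: execute activities in dependency order and
        # refuse to run a route that contains a cycle.
        indegree = {name: 0 for name in self.activities}
        for successors in self.transitions.values():
            for s in successors:
                indegree[s] += 1
        ready = deque(n for n, d in indegree.items() if d == 0)
        executed = 0
        while ready:
            name = ready.popleft()
            self.activities[name]()
            executed += 1
            for s in self.transitions[name]:
                indegree[s] -= 1
                if indegree[s] == 0:
                    ready.append(s)
        if executed != len(self.activities):
            raise RuntimeError("cyclic route detected; needs the cyclic-routing strategy")


if __name__ == "__main__":
    wf = Workflow()
    wf.add_activity("receive_order", lambda: print("receive order"))
    wf.add_activity("review", lambda: print("review"), after=["receive_order"])
    wf.add_activity("archive", lambda: print("archive"), after=["review"])
    wf.run()
```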

  4. Information Issues and Contexts that Impair Team Based Communication Workflow: A Palliative Sedation Case Study.

    PubMed

    Cornett, Alex; Kuziemsky, Craig

    2015-01-01

    Implementing team based workflows can be complex because of the scope of providers involved and the extent of information exchange and communication that needs to occur. While a workflow may represent the ideal structure of communication that needs to occur, information issues and contextual factors may impact how the workflow is implemented in practice. Understanding these issues will help us better design systems to support team based workflows. In this paper we use a case study of palliative sedation therapy (PST) to model a PST workflow and then use it to identify purposes of communication, information issues and contextual factors that impact them. We then suggest how our findings could inform health information technology (HIT) design to support team based communication workflows.

  5. Managing and Communicating Operational Workflow

    PubMed Central

    Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.

    2016-01-01

    Background: Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective: To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods: The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results: The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions: The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407

  6. CamBAfx: Workflow Design, Implementation and Application for Neuroimaging

    PubMed Central

    Ooi, Cinly; Bullmore, Edward T.; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing and designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows, since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism, designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs. PMID:19826470

  7. [Integration of the radiotherapy irradiation planning in the digital workflow].

    PubMed

    Röhner, F; Schmucker, M; Henne, K; Momm, F; Bruggmoser, G; Grosu, A-L; Frommhold, H; Heinemann, F E

    2013-02-01

    At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a major challenge: it requires interdisciplinary expertise, and the interfaces between the professions therefore also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards was approved by the responsible authority.

  8. An end-to-end workflow for engineering of biological networks from high-level specifications.

    PubMed

    Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun

    2012-08-17

    We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells.
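
    As a rough illustration of the four-stage toolchain (Specification, Compilation, Part Assignment, Assembly), the Python sketch below chains four placeholder stages. The stage names come from the abstract; the data structures, example motifs, and parts are invented and do not reflect the authors' actual tools (Proto programs, AGRN optimization, or robotic assembly planning).

```python
"""Sketch of the four-stage toolchain described above (Specification ->
Compilation -> Part Assignment -> Assembly). The stage bodies and data
structures are placeholders, not the authors' actual algorithms."""

def specify(program_text):
    # Stage 1: capture a high-level Boolean specification.
    return {"spec": program_text}

def compile_to_agrn(spec, motif_library):
    # Stage 2: map the specification onto an abstract genetic regulatory
    # network (AGRN) using a library of network motifs.
    return {"agrn": [motif_library.get(token, token) for token in spec["spec"].split()]}

def assign_parts(agrn, part_database):
    # Stage 3: pick concrete DNA parts for each abstract network node.
    return [part_database.get(node, f"<missing part for {node}>") for node in agrn["agrn"]]

def plan_assembly(parts):
    # Stage 4: order the parts into an assembly plan (trivially concatenated here).
    return " + ".join(parts)

if __name__ == "__main__":
    motifs = {"AND": "and_gate_motif"}
    parts = {"and_gate_motif": "pAND_v1", "sensor": "pSense_v2", "reporter": "pGFP"}
    spec = specify("sensor AND reporter")
    agrn = compile_to_agrn(spec, motifs)
    print(plan_assembly(assign_parts(agrn, parts)))
```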

  9. Agile parallel bioinformatics workflow management using Pwrake.

    PubMed

    Mishima, Hiroyuki; Sasaki, Kensaku; Tanaka, Masahiro; Tatebe, Osamu; Yoshiura, Koh-Ichiro

    2011-09-08

    In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of the scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows.
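
    Pwrake workflows themselves are Ruby rakefiles; the following Python sketch only illustrates the two-phase pattern the authors describe, keeping the workflow definition (a fixed task dependency graph) separate from the parameter adjustment phase (an external parameter mapping). Task names and parameters are invented and are not a real GATK or Dindel configuration.

```python
"""Analogous Python sketch (Pwrake itself uses Ruby rakefiles) of separating
the workflow definition phase from the parameter adjustment phase: the task
graph stays fixed while tool parameters live in a separate mapping."""

# Parameter-adjustment phase: edit only this mapping between runs.
PARAMS = {
    "align": {"threads": 4},
    "call":  {"min_quality": 30},
}

# Workflow-definition phase: tasks and their dependencies.
def align(sample, threads):
    return f"{sample}.bam (aligned with {threads} threads)"

def call_variants(bam, min_quality):
    return f"variants from {bam} (min quality {min_quality})"

TASKS = {
    "align": (align, []),               # task name -> (function, dependencies)
    "call":  (call_variants, ["align"]),
}

def run(task_name, seed):
    # Resolve dependencies recursively, then apply the task's parameters.
    func, deps = TASKS[task_name]
    inputs = [run(d, seed) for d in deps] or [seed]
    return func(*inputs, **PARAMS[task_name])

if __name__ == "__main__":
    print(run("call", "sampleA.fastq"))
```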

  10. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background: In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings: We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions: Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows. PMID:21899774

  11. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

    Introducing workflow management technology into healthcare appears promising for addressing the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system, taking the radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated by transforming non-workflow-aware interfaces into standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhance process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems, with more flexibility, more process management functionality and more workflow-aware integration. The work of this paper is an initial endeavor for introducing workflow management technology into healthcare.
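
    A minimal sketch of the component-per-task design described above, assuming invented class and task names: each task is a component with a common execute interface, a legacy system is wrapped by an adapter exposing the same interface, and a simple manager assembles the components into the process.

```python
"""Sketch of the component-per-task design: each process task is a component
with a common interface, a legacy system is wrapped by an adapter exposing
that interface, and a manager assembles them. Names are illustrative only."""

class Registration:
    def execute(self, ctx):
        ctx["order"] = f"order for {ctx['patient']}"
        return ctx

class LegacyPacs:
    """Pretend legacy system with a non-workflow-aware interface."""
    def store_study(self, order):
        return f"study stored for {order}"

class LegacyPacsAdapter:
    """Wraps the legacy interface so the workflow manager can schedule it."""
    def __init__(self, legacy):
        self.legacy = legacy
    def execute(self, ctx):
        ctx["study"] = self.legacy.store_study(ctx["order"])
        return ctx

class Reporting:
    def execute(self, ctx):
        ctx["report"] = f"report on {ctx['study']}"
        return ctx

def run_process(components, ctx):
    # The workflow manager assembles the components and runs them in task order.
    for component in components:
        ctx = component.execute(ctx)
    return ctx

if __name__ == "__main__":
    pipeline = [Registration(), LegacyPacsAdapter(LegacyPacs()), Reporting()]
    print(run_process(pipeline, {"patient": "P001"})["report"])
```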

  12. Examining the Multi-level Fit between Work and Technology in a Secure Messaging Implementation.

    PubMed

    Ozkaynak, Mustafa; Johnson, Sharon; Shimada, Stephanie; Petrakis, Beth Ann; Tulu, Bengisu; Archambeault, Cliona; Fix, Gemmae; Schwartz, Erin; Woods, Susan

    2014-01-01

    Secure messaging (SM) allows patients to communicate with their providers for non-urgent health issues. Like other health information technologies, the design and implementation of SM should account for workflow to avoid suboptimal outcomes. SM may present unique workflow challenges because patients add a layer of complexity, as they are also direct users of the system. This study explores SM implementation at two Veterans Health Administration facilities. We interviewed twenty-nine members of eight primary care teams using semi-structured interviews. Questions addressed staff opinions about the integration of SM with daily practice, and team members' attitudes and experiences with SM. We describe the clinical workflow for SM, examining complexity and variability. We identified eight workflow issues directly related to efficiency and patient satisfaction, based on an exploration of the technology fit with multilevel factors. These findings inform organizational interventions that will accommodate SM implementation and lead to more patient-centered care.

  13. FAST: A fully asynchronous and status-tracking pattern for geoprocessing services orchestration

    NASA Astrophysics Data System (ADS)

    Wu, Huayi; You, Lan; Gui, Zhipeng; Gao, Shuang; Li, Zhenqiang; Yu, Jingmin

    2014-09-01

    Geoprocessing service orchestration (GSO) provides a unified and flexible way to implement cross-application, long-lived, and multi-step geoprocessing service workflows by coordinating geoprocessing services collaboratively. Usually, geoprocessing services and geoprocessing service workflows are data and/or computing intensive. This intensity can make the execution of a workflow time-consuming. Since it initiates an execution request without blocking other interactions on the client side, an asynchronous mechanism is especially appropriate for GSO workflows. Many critical problems remain to be solved in existing asynchronous patterns for GSO, including difficulties in improving performance, tracking status, and clarifying the workflow structure. These problems are a challenge when orchestrating for performance efficiency, making statuses instantly available, and constructing clearly structured GSO workflows. A Fully Asynchronous and Status-Tracking (FAST) pattern that adopts asynchronous interactions throughout the whole communication tier of a workflow is proposed for GSO. The proposed FAST pattern includes a mechanism that actively pushes the latest status to clients instantly and economically. An independent proxy was designed to isolate the status tracking logic from the geoprocessing business logic, which helps form a clear GSO workflow structure. A workflow was implemented in the FAST pattern to simulate the flooding process in the Poyang Lake region. Experimental results show that the proposed FAST pattern can efficiently tackle data- and computing-intensive geoprocessing tasks. The performance of all collaborative partners was improved due to the asynchronous mechanism throughout the communication tier. The status-tracking mechanism helps users retrieve the latest running status of a GSO workflow in an efficient and instant way. The clear structure of the GSO workflow lowers the barriers for geospatial domain experts and model designers to compose asynchronous GSO workflows. Most importantly, it provides better support for locating and diagnosing potential exceptions.
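
    The fully asynchronous, status-pushing idea can be sketched with Python's asyncio as follows. The sketch is an analogy, not the paper's implementation: the geoprocessing logic only reports progress to an independent status proxy, and the proxy pushes the latest status to subscribed clients while the client remains unblocked. All names (StatusProxy, flood_simulation) are invented.

```python
"""Illustrative asyncio sketch of the FAST idea: a non-blocking submission,
business logic that reports progress to an independent status proxy, and a
proxy that pushes status updates to subscribers."""
import asyncio


class StatusProxy:
    """Keeps status tracking out of the geoprocessing business logic."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def push(self, job_id, status):
        for callback in self.subscribers:
            callback(job_id, status)


async def flood_simulation(job_id, proxy):
    # Stand-in for a data/computing-intensive geoprocessing step.
    for step in ("preprocessing", "simulating", "postprocessing"):
        proxy.push(job_id, step)
        await asyncio.sleep(0.1)
    proxy.push(job_id, "finished")
    return f"{job_id}: result"


async def main():
    proxy = StatusProxy()
    proxy.subscribe(lambda job, status: print(f"[client] {job} -> {status}"))
    # Fire the request asynchronously; the client stays responsive.
    task = asyncio.create_task(flood_simulation("poyang-flood", proxy))
    print("[client] request submitted, doing other work...")
    print(await task)


if __name__ == "__main__":
    asyncio.run(main())
```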

  14. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    End-to-end interoperability and workflows from building architecture design to one or more simulations may, in one aspect, comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  15. Improving Clinical Workflow in Ambulatory Care: Implemented Recommendations in an Innovation Prototype for the Veteran’s Health Administration

    PubMed Central

    Patterson, Emily S.; Lowry, Svetlana Z.; Ramaiah, Mala; Gibbons, Michael C.; Brick, David; Calco, Robert; Matton, Greg; Miller, Anne; Makar, Ellen; Ferrer, Jorge A.

    2015-01-01

    Introduction: Human factors workflow analyses in healthcare settings, conducted prior to technology implementation, are recommended to improve workflow in ambulatory care settings. In this paper we describe how insights from a workflow analysis conducted by NIST were implemented in a software prototype developed for a Veteran's Health Administration (VHA) VAi2 innovation project, and associated lessons learned. Methods: We organize the original recommendations and associated stages and steps visualized in process maps from NIST and the VA's lessons learned from implementing the recommendations in the VAi2 prototype according to four stages: 1) before the patient visit, 2) during the visit, 3) discharge, and 4) visit documentation. NIST recommendations to improve workflow in ambulatory care (outpatient) settings and process map representations were based on reflective statements collected during one-hour discussions with three physicians. The development of the VAi2 prototype was initially conducted independently of the NIST recommendations, but at a midpoint in the process development, all of the implementation elements were compared with the NIST recommendations and lessons learned were documented. Findings: Story-based displays and templates with default preliminary order sets were used to support scheduling, time-critical notifications, drafting medication orders, and supporting a diagnosis-based workflow. These templates enabled customization to the level of diagnostic uncertainty. Functionality was designed to support cooperative work across interdisciplinary team members, including shared documentation sessions with tracking of text modifications, medication lists, and patient education features. Displays were customized to the role and included access for consultants and site-defined educator teams. Discussion: Workflow, usability, and patient safety can be enhanced through clinician-centered design of electronic health records. The lessons learned from implementing NIST recommendations to improve workflow in ambulatory care using an EHR provide a first step in moving from a billing-centered perspective on how to maintain accurate, comprehensive, and up-to-date information about a group of patients to a clinician-centered perspective. These recommendations point the way towards a "patient visit management system," which incorporates broader notions of supporting workload management, supporting flexible flow of patients and tasks, enabling accountable distributed work across members of the clinical team, and supporting dynamic tracking of steps in tasks that have longer time distributions. PMID:26290887

  16. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    PubMed

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that improves programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈ 43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching the HTGS implementation achieves similar performance to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k size matrices, respectively.
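
    HTGS itself is a C++ framework and API; the Python sketch below is only an analogue of the underlying idea: tasks connected by queues, each running on its own thread, so that data loading and computation overlap. The task names and data are invented.

```python
"""Analogous Python sketch (HTGS itself is a C++ API) of a two-task graph:
tasks connected by queues, each on its own thread, so that loading and
processing overlap, in the spirit of overlapping I/O with computation."""
import queue
import threading

STOP = object()  # sentinel marking the end of the stream


def loader(out_q):
    # Producer task: pretend to read image tiles from disk.
    for tile in range(5):
        out_q.put(f"tile-{tile}")
    out_q.put(STOP)


def worker(in_q, out_q):
    # Consumer/producer task: pretend to process each tile.
    while True:
        item = in_q.get()
        if item is STOP:
            out_q.put(STOP)
            break
        out_q.put(f"processed {item}")


if __name__ == "__main__":
    q1, q2 = queue.Queue(maxsize=2), queue.Queue()
    threads = [
        threading.Thread(target=loader, args=(q1,)),
        threading.Thread(target=worker, args=(q1, q2)),
    ]
    for t in threads:
        t.start()
    while True:
        result = q2.get()
        if result is STOP:
            break
        print(result)
    for t in threads:
        t.join()
```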

  17. Design and implementation of a secure workflow system based on PKI/PMI

    NASA Astrophysics Data System (ADS)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have several weaknesses in privilege management: low privilege management efficiency, an overburdened administrator, and a lack of trusted authority. After an in-depth study of the security requirements of workflow systems, a secure workflow model based on PKI/PMI is proposed. The model achieves static and dynamic authorization by verifying the user's identity through a PKC and validating the user's privilege information using an AC in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security, but also ensures integrity, confidentiality, availability and non-repudiation of the data in the system.

  18. AstroGrid: Taverna in the Virtual Observatory.

    NASA Astrophysics Data System (ADS)

    Benson, K. M.; Walton, N. A.

    This paper reports on the implementation of the Taverna workbench by AstroGrid, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analyses over public and private data while using computational resources as minimal as a desktop computer. Some integration issues and future work are discussed in this article.

  19. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous geoprocessing workflows.
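
    A minimal sketch of the interoperation idea, assuming an invented resource layout and JSON instead of Atom for brevity: heterogeneous workflows (for example an XPDL data-access workflow and a BPEL processing workflow) are registered behind one REST facade so they can be discovered and invoked uniformly. Flask is used purely for illustration; this is not the paper's system.

```python
"""Minimal Flask sketch exposing heterogeneous workflows as REST resources.
The resource layout, engine stubs, and use of JSON (instead of Atom) are
simplifications for illustration only."""
from flask import Flask, jsonify, request

app = Flask(__name__)

# Registry of heterogeneous workflows behind a single REST facade.
WORKFLOWS = {
    "no2-access":  {"language": "XPDL", "run": lambda p: f"NO2 scenes for {p.get('region')}"},
    "no2-process": {"language": "BPEL", "run": lambda p: f"NO2 concentration map ({p.get('date')})"},
}

@app.route("/workflows", methods=["GET"])
def list_workflows():
    # Discovery: list the registered workflow resources.
    return jsonify({"workflows": sorted(WORKFLOWS)})

@app.route("/workflows/<name>/executions", methods=["POST"])
def execute(name):
    # Access/coordination: invoke a workflow regardless of its definition language.
    wf = WORKFLOWS.get(name)
    if wf is None:
        return jsonify({"error": f"unknown workflow {name}"}), 404
    params = request.get_json(silent=True) or {}
    return jsonify({"workflow": name, "language": wf["language"], "result": wf["run"](params)})

if __name__ == "__main__":
    app.run(port=5000)
```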

  20. Towards an intelligent hospital environment: OR of the future.

    PubMed

    Sutherland, Jeffrey V; van den Heuvel, Willem-Jan; Ganous, Tim; Burton, Matthew M; Kumar, Animesh

    2005-01-01

    Patients, providers, payers, and government demand more effective and efficient healthcare services, and the healthcare industry needs innovative ways to re-invent core processes. Business process reengineering (BPR) showed that adopting new hospital information systems can leverage this transformation and that workflow management technologies can automate process management. Our research indicates workflow technologies in healthcare require real time patient monitoring, detection of adverse events, and adaptive responses to breakdown in normal processes. Adaptive workflow systems are rarely implemented, making current workflow implementations inappropriate for healthcare. The advent of evidence based medicine, guideline based practice, and better understanding of cognitive workflow combined with novel technologies including Radio Frequency Identification (RFID), mobile/wireless technologies, internet workflow, intelligent agents, and Service Oriented Architectures (SOA) opens up new and exciting ways of automating business processes. Total situational awareness of events, timing, and location of healthcare activities can generate self-organizing change in behaviors of humans and machines. A test bed of a novel approach towards continuous process management was designed for the new Weinburg Surgery Building at the University of Maryland Medical Center. Early results based on clinical process mapping and analysis of patient flow bottlenecks demonstrated 100% improvement in delivery of supplies and instruments at surgery start time. This work has been directly applied to the design of the DARPA Trauma Pod research program where robotic surgery will be performed on wounded soldiers on the battlefield.

  21. BPELPower—A BPEL execution engine for geospatial web services

    NASA Astrophysics Data System (ADS)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements are especially in its capabilities in handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high performance parallel processing and broad Web paradigms.

  22. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roelof Versteeg; Trevor Rowe

    2006-03-01

    We describe the design and implementation of a web accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server side data management and information visualization through flexible browser based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third party applications which are invoked by the back-end using webservices. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.

  23. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    PubMed

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost effectively evolve such applications over a long lifetime.
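
    The decision-point injection idea can be sketched as follows, with invented names and example policies: a workflow step consults a named decision point, and stakeholder policies are registered against that point at runtime instead of being encoded into the workflow at development time.

```python
"""Sketch of runtime decision-point injection: policies contributed by
different stakeholder groups are registered against named decision points
without editing the workflow itself. Names and policies are illustrative."""

POLICIES = {}  # decision point name -> list of policy callables

def register_policy(point, policy):
    POLICIES.setdefault(point, []).append(policy)

def decide(point, context, default=True):
    # Every registered policy must allow the action at this decision point.
    return all(policy(context) for policy in POLICIES.get(point, [])) and default

def publish_dataset(user, dataset):
    # Workflow step with an injected decision point rather than a hard-coded rule.
    if not decide("access_control", {"user": user, "dataset": dataset}):
        return f"{user} denied access to {dataset}"
    return f"{dataset} published by {user}"

if __name__ == "__main__":
    # A stakeholder group adds its access-control policy at runtime.
    register_policy("access_control", lambda ctx: ctx["user"] != "guest")
    print(publish_dataset("alice", "sensor-feed"))
    print(publish_dataset("guest", "sensor-feed"))
```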

  24. Modeling Complex Workflow in Molecular Diagnostics

    PubMed Central

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  25. Integrating Remote and Social Sensing Data for a Scenario on Secure Societies in Big Data Platform

    NASA Astrophysics Data System (ADS)

    Albani, Sergio; Lazzarini, Michele; Koubarakis, Manolis; Taniskidou, Efi Karra; Papadakis, George; Karkaletsis, Vangelis; Giannakopoulos, George

    2016-08-01

    In the framework of the Horizon 2020 project BigDataEurope (Integrating Big Data, Software & Communities for Addressing Europe's Societal Challenges), a pilot for the Secure Societies Societal Challenge was designed considering the requirements coming from relevant stakeholders. The pilot focuses on integrating data from remote and social sensing in a Big Data platform. The information on land changes coming from the Copernicus Sentinel-1A sensor (Change Detection workflow) is integrated with information coming from selected Twitter and news agency accounts (Event Detection workflow) in order to provide the user with multiple sources of information. The Change Detection workflow implements a processing chain in a distributed parallel manner, exploiting the Big Data capabilities in place; the Event Detection workflow implements parallel and distributed social media and news agency monitoring as well as suitable mechanisms to detect and geo-annotate the related events.

  26. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    PubMed

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks recur frequently, e.g. geometry optimizations, benchmarking series, etc. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes them almost impossible to share. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined research domain specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical abstract: meta-workflows and embedded workflows in the template representation.
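
    A minimal sketch of the atomic-workflow/meta-workflow distinction, with illustrative step names standing in for real quantum chemistry workflows: atomic workflows each deliver one well-defined function and sit in a repository-like registry, and a meta-workflow orchestrates them by name.

```python
"""Sketch of composing atomic workflows into a meta-workflow. The step names
are illustrative placeholders, not real quantum chemistry implementations."""

def geometry_optimization(molecule):
    return f"optimized({molecule})"

def single_point_energy(structure):
    return f"energy({structure})"

# Repository-like registry of reusable atomic workflows.
ATOMIC = {
    "geometry_optimization": geometry_optimization,
    "single_point_energy": single_point_energy,
}

def meta_workflow(steps, data):
    # Orchestrate atomic workflows by name, feeding each output forward.
    for step in steps:
        data = ATOMIC[step](data)
    return data

if __name__ == "__main__":
    # A benchmarking-style meta-workflow applied to several molecules.
    for molecule in ("H2O", "NH3"):
        print(meta_workflow(["geometry_optimization", "single_point_energy"], molecule))
```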

  27. An Automated Design Framework for Multicellular Recombinase Logic.

    PubMed

    Guiziou, Sarah; Ulliana, Federico; Moreau, Violaine; Leclere, Michel; Bonnet, Jerome

    2018-05-18

    Tools to systematically reprogram cellular behavior are crucial to address pressing challenges in manufacturing, environment, or healthcare. Recombinases can very efficiently encode Boolean and history-dependent logic in many species, yet current designs are performed on a case-by-case basis, limiting their scalability and requiring time-consuming optimization. Here we present an automated workflow for designing recombinase logic devices executing Boolean functions. Our theoretical framework uses a reduced library of computational devices distributed into different cellular subpopulations, which are then composed in various manners to implement all desired logic functions at the multicellular level. Our design platform called CALIN (Composable Asynchronous Logic using Integrase Networks) is broadly accessible via a web server, taking truth tables as inputs and providing corresponding DNA designs and sequences as outputs (available at http://synbio.cbs.cnrs.fr/calin). We anticipate that this automated design workflow will streamline the implementation of Boolean functions in many organisms and for various applications.

  28. High-volume workflow management in the ITN/FBI system

    NASA Astrophysics Data System (ADS)

    Paulson, Thomas L.

    1997-02-01

    The Identification Tasking and Networking (ITN) Federal Bureau of Investigation system will manage the processing of more than 70,000 submissions per day. The workflow manager controls the routing of each submission through a combination of automated and manual processing steps whose exact sequence is dynamically determined by the results at each step. For most submissions, one or more of the steps involve the visual comparison of fingerprint images. The ITN workflow manager is implemented within a scalable client/server architecture. The paper describes the key aspects of the ITN workflow manager design which allow the high volume of daily processing to be successfully accomplished.

  29. Conceptual-level workflow modeling of scientific experiments using NMR as a case study

    PubMed Central

    Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R

    2007-01-01

    Background: Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results: We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion: Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting a biomolecular analysis experiment using NMR spectroscopy. PMID:17263870

  30. Conceptual-level workflow modeling of scientific experiments using NMR as a case study.

    PubMed

    Verdi, Kacy K; Ellis, Heidi Jc; Gryk, Michael R

    2007-01-30

    Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting a biomolecular analysis experiment using NMR spectroscopy.

  31. Scaffolding the Development of Problem-Solving Skills in Chemistry: Guiding Novice Students out of Dead Ends and False Starts

    ERIC Educational Resources Information Center

    Yuriev, Elizabeth; Naidu, Som; Schembri, Luke S.; Short, Jennifer L.

    2017-01-01

    To scaffold the development of problem-solving skills in chemistry, chemistry educators are exploring a variety of instructional techniques. In this study, we have designed, implemented, and evaluated a problem-solving workflow, "Goldilocks Help". This workflow builds on work done in the field of problem solving in chemistry and provides…

  32. Metaworkflows and Workflow Interoperability for Heliophysics

    NASA Astrophysics Data System (ADS)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited for workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows, usually developed in TAVERNA. They implement Science Cases (the definition of a scientific challenge) by composing different Basic Workflows. The third and last layer, Iterative Science Workflows, is developed in WS-PGRADE. It executes sub-workflows (either Basic or Science Workflows) as parameter sweep jobs to investigate Science Cases on large multiple data sets. So far, this approach has proven fruitful for three Science Cases, of which one has been completed and two are still being tested.

  33. Development of a digital clinical pathway for emergency medicine: Lessons from usability testing and implementation failure.

    PubMed

    Gutenstein, Marc; Pickering, John W; Than, Martin

    2018-06-01

    Clinical pathways are used to support the management of patients in emergency departments. An existing document-based clinical pathway was used as the foundation on which to design and build a digital clinical pathway for acute chest pain, with the aim of improving clinical calculations, clinician decision-making, documentation, and data collection. Established principles of decision support system design were used to build an application within the existing electronic health record, before testing with a multidisciplinary team of doctors using a think-aloud protocol. Technical authoring was successful; however, usability testing revealed that the user experience and the flexibility of workflow within the application were critical barriers to implementation. Emergency medicine and acute care decision support systems face particular challenges to existing models of linear workflow that should be deliberately addressed in digital pathway design. We make key recommendations regarding digital pathway design in emergency medicine.

  34. Facilitating hydrological data analysis workflows in R: the RHydro package

    NASA Astrophysics Data System (ADS)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, a HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges of the RHydro package, including integration with big data technologies, web technologies, and emerging data models in hydrology.

  35. Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems

    DOE PAGES

    Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...

    2016-07-21

    The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad-hoc, they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows showing Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
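
    The template idea can be sketched in Python as follows. The function names are not the actual Tigres API; they only illustrate how sequence and parallel templates compose a small computational pipeline.

```python
"""Sketch of sequence/parallel templates in the spirit of Tigres. These
function names are NOT the real Tigres API; they are illustrative only."""
from concurrent.futures import ThreadPoolExecutor


def sequence(tasks, data):
    # Run tasks one after another, feeding each result into the next.
    for task in tasks:
        data = task(data)
    return data


def parallel(task, items):
    # Apply one task to many inputs concurrently (a simple parallel template).
    with ThreadPoolExecutor() as pool:
        return list(pool.map(task, items))


if __name__ == "__main__":
    clean = lambda record: record.strip().lower()
    score = lambda record: (record, len(record))
    # split -> parallel -> merge, expressed with the two templates above
    results = parallel(lambda r: sequence([clean, score], r), ["  Alpha ", "BETA", " gamma"])
    print(results)
```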

  36. Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrix, Valerie; Fox, James; Ghoshal, Devarshi

    The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad-hoc, they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows showing Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.

  17. A tracking and verification system implemented in a clinical environment for partial HIPAA compliance

    NASA Astrophysics Data System (ADS)

    Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.

    2006-03-01

    The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system using wireless and facial biometric technology to monitor and automatically identify patients and staff in a clinical environment was developed in order to streamline patient workflow, protect against erroneous examinations, and create a security zone to prevent and audit unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology based on initial clinical workflow studies within a clinical environment. An outpatient center was chosen as the first step for the development and implementation of this system.

  18. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    PubMed Central

    Mathew, Cherian; Obst, Matthias; Vicario, Saverio; Haines, Robert; Williams, Alan R.; de Jong, Yde; Goble, Carole

    2014-01-01

    The compilation and cleaning of data needed for analyses and prediction of species distributions is a time-consuming process requiring a solid understanding of data formats and service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of underlying service infrastructures. The workflow can be freely used both locally and through a web portal which does not require additional software installations by users. PMID:25535486
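    As a rough illustration of the three stages the workflow integrates (retrieval, cleaning, selection), the following Python sketch chains plain functions in the same order. The record structure, filters, and the retrieve_occurrences stand-in are hypothetical; the real workflow calls Taverna-wrapped biodiversity services.

```python
# Generic sketch of a retrieval -> cleaning -> selection chain, hiding service details
# behind plain functions. The record structure and filters are hypothetical.
from typing import Dict, List

Record = Dict[str, str]


def retrieve_occurrences(taxon: str) -> List[Record]:
    """Stand-in for a call to a biodiversity occurrence data service."""
    return [
        {"name": taxon, "lat": "59.3", "lon": "18.1"},
        {"name": taxon, "lat": "", "lon": "18.1"},              # incomplete record
        {"name": taxon.upper(), "lat": "59.3", "lon": "18.1"},  # non-canonical name
    ]


def clean(records: List[Record], canonical_name: str) -> List[Record]:
    """Drop incomplete records and normalise taxon names."""
    return [{**r, "name": canonical_name} for r in records if r["lat"] and r["lon"]]


def select(records: List[Record], min_records: int = 1) -> List[Record]:
    """Keep the data set only if it meets a simple quality threshold."""
    return records if len(records) >= min_records else []


taxon = "Mytilus edulis"
refined = select(clean(retrieve_occurrences(taxon), taxon))
print(refined)
```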

  19. Describing and Modeling Workflow and Information Flow in Chronic Disease Care

    PubMed Central

    Unertl, Kim M.; Weinger, Matthew B.; Johnson, Kevin B.; Lorenzi, Nancy M.

    2009-01-01

    Objectives: The goal of the study was to develop an in-depth understanding of work practices, workflow, and information flow in chronic disease care, to facilitate development of context-appropriate informatics tools. Design: The study was conducted over a 10-month period in three ambulatory clinics providing chronic disease care. The authors iteratively collected data using direct observation and semi-structured interviews. Measurements: The authors observed all aspects of care in three different chronic disease clinics for over 150 hours, including 157 patient-provider interactions. Observation focused on interactions among people, processes, and technology. Observation data were analyzed through an open coding approach. The authors then developed models of workflow and information flow using Hierarchical Task Analysis and Soft Systems Methodology. The authors also conducted nine semi-structured interviews to confirm and refine the models. Results: The study had three primary outcomes: models of workflow for each clinic, models of information flow for each clinic, and an in-depth description of work practices and the role of health information technology (HIT) in the clinics. The authors identified gaps between the existing HIT functionality and the needs of chronic disease providers. Conclusions: In response to the analysis of workflow and information flow, the authors developed ten guidelines for design of HIT to support chronic disease care, including recommendations to pursue modular approaches to design that would support disease-specific needs. The study demonstrates the importance of evaluating workflow and information flow in HIT design and implementation. PMID:19717802

  20. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    PubMed

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adopts the "4+1" architectural view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can be reasonably substituted by locally adapted applications and can be multiplied for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system would provide powerful query and statistical functions for managing resources and improving productivity. This work could potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.

  1. Workflow management systems in radiology

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim

    1998-07-01

    In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely redesign their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. In this paper, we discuss the need for and the benefits of such an approach. The separation of workflow management systems and application systems is emphasized, as are the consequences that arise for the architecture of workflow-oriented information systems, including an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.

  2. The impact of electronic medical record systems on outpatient workflows: a longitudinal evaluation of its workflow effects.

    PubMed

    Vishwanath, Arun; Singh, Sandeep Rajan; Winkelstein, Peter

    2010-11-01

    The promise of the electronic medical record (EMR) lies in its ability to reduce the costs of health care delivery and improve the overall quality of care--a promise that is realized through major changes in workflows within the health care organization. Yet little systematic information exists about the workflow effects of EMRs. Moreover, some of the research to date points to reduced satisfaction among physicians after implementation of the EMR and increased time, i.e., negative workflow effects. A better understanding of the impact of the EMR on workflows is, hence, vital to understanding what the technology really does offer that is new and unique. (i) To empirically develop a physician-centric conceptual model of the workflow effects of EMRs; (ii) To use the model to understand the antecedents to the physicians' workflow expectations from the new EMR; (iii) To track physicians' satisfaction over time, 3 months and 20 months after implementation of the EMR; (iv) To explore the impact of technology learning curves on physicians' reported satisfaction levels. The current research uses the mixed-method technique of concept mapping to empirically develop the conceptual model of an EMR's workflow effects. The model is then used within a controlled study to track physician expectations from a new EMR system as well as their assessments of the EMR's performance 3 months and 20 months after implementation. The research tracks the actual implementation of a new EMR within the outpatient clinics of a large northeastern research hospital. The pre-implementation survey netted 20 physician responses; the post-implementation Time 1 survey netted 22 responses, and the Time 2 survey netted 26 physician responses. The implementation of the actual EMR served as the intervention. Since the study was conducted within the same setting and tracked a homogeneous group of respondents, the overall study design ensured against extraneous influences on the results. Outcome measures were derived empirically from the conceptual model. They included 85 items that measured physician perceptions of the EMR's workflow effect on the following eight issues: (1) administration, (2) efficiency in patient processing, (3) basic clinical processes, (4) documentation of the patient encounter, (5) economic challenges and reimbursement, (6) technical issues, (7) patient safety and care, and (8) communication and confidentiality. The items were used to track expectations prior to implementation and served as retrospective measures of satisfaction with the EMR in post-implementation Time 1 and Time 2. The findings suggest that physicians conceptualize EMRs as an incremental extension of older computerized provider order entries (CPOEs) rather than as a new innovation. The EMR's major functional advantages are seen to be very similar to, if not the same as, those of CPOEs. Technology learning curves play a statistically significant though minor role in shaping physician perceptions. The physicians' expectations from the EMR are based on their prior beliefs rather than on a rational evaluation of the EMR's fit, functionality, or performance. Their decision regarding the usefulness of the EMR is made very early, within the first few months of use of the EMR. These early perceptions then remain stable and become the lens through which subsequent experience with the EMR is interpreted. The findings suggest a need for communication-based interventions aimed at explaining the value, fit, and usefulness of EMRs to physicians early in the pre- and immediate post-EMR implementation stages. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  3. Worklist handling in workflow-enabled radiological application systems

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

    For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which provide workflow participants automatically with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user-friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
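    The contrast between the two approaches can be made concrete with a small sketch: a data-driven worklist derived from a filtered view over application data versus a toy enactment service that assigns explicit work items from a process model. The schema, statuses, and two-step process below are hypothetical.

```python
# Contrast of the two worklist approaches described above, with a hypothetical schema.
# Data-driven: the worklist is just a filtered view over application data.
# Process-oriented: an (abstracted) workflow engine assigns explicit work items.
from typing import Dict, List

exams: List[Dict[str, str]] = [
    {"accession": "A1", "status": "ACQUIRED", "modality": "CT"},
    {"accession": "A2", "status": "REPORTED", "modality": "MR"},
]


def data_driven_worklist(role: str) -> List[Dict[str, str]]:
    """Worklist derived from application state (a filtered database view)."""
    if role == "radiologist":
        return [e for e in exams if e["status"] == "ACQUIRED"]
    return []


class WorkflowEngine:
    """Toy enactment service: work items come from an explicit process model."""

    def __init__(self) -> None:
        self.work_items: List[Dict[str, str]] = []

    def advance(self, accession: str, completed_step: str) -> None:
        # Hypothetical two-step process model: ACQUIRE -> REPORT.
        next_step = {"ACQUIRE": ("REPORT", "radiologist")}.get(completed_step)
        if next_step:
            task, role = next_step
            self.work_items.append({"accession": accession, "task": task, "role": role})

    def worklist(self, role: str) -> List[Dict[str, str]]:
        return [w for w in self.work_items if w["role"] == role]


engine = WorkflowEngine()
engine.advance("A1", "ACQUIRE")
print(data_driven_worklist("radiologist"))
print(engine.worklist("radiologist"))
```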

  4. Integrated pathway-based transcription regulation network mining and visualization based on gene expression profiles.

    PubMed

    Kibinge, Nelson; Ono, Naoaki; Horie, Masafumi; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Saito, Akira; Kanaya, Shigehiko

    2016-06-01

    Conventionally, workflows examining transcription regulation networks from gene expression data involve distinct analytical steps. There is a need for pipelines that unify data mining and inference deduction into a singular framework to enhance interpretation and hypothesis generation. We propose a workflow that merges network construction with gene expression data mining, focusing on regulation processes in the context of transcription factor driven gene regulation. The pipeline implements pathway-based modularization of expression profiles into functional units to improve biological interpretation. The integrated workflow was implemented as a web application (TransReguloNet) with functions that enable pathway visualization and comparison of transcription factor activity between sample conditions defined in the experimental design. The pipeline merges differential expression, network construction, pathway-based abstraction, clustering and visualization. The framework was applied in analysis of actual expression datasets related to lung, breast and prostate cancer. Copyright © 2016 Elsevier Inc. All rights reserved.
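    The pipeline stages named above (differential expression, network construction, and pathway-based abstraction) can be sketched schematically as follows; the data, thresholds, and pathway definitions are invented for illustration and do not reflect TransReguloNet's actual implementation.

```python
# Schematic version of the pipeline stages named above (differential expression ->
# network construction -> pathway-based grouping). Data and thresholds are invented.
import math
from typing import Dict, List, Tuple


def differential_expression(case: Dict[str, float], control: Dict[str, float],
                            min_abs_log2fc: float = 1.0) -> Dict[str, float]:
    """Return log2 fold changes for genes exceeding a simple threshold."""
    fc = {g: math.log2(case[g] / control[g]) for g in case if g in control}
    return {g: v for g, v in fc.items() if abs(v) >= min_abs_log2fc}


def build_edges(tf_targets: Dict[str, List[str]],
                de_genes: Dict[str, float]) -> List[Tuple[str, str]]:
    """Keep TF -> target edges whose target is differentially expressed."""
    return [(tf, t) for tf, targets in tf_targets.items()
            for t in targets if t in de_genes]


def group_by_pathway(edges: List[Tuple[str, str]],
                     pathways: Dict[str, List[str]]) -> Dict[str, List[Tuple[str, str]]]:
    """Modularise the network into pathway-based functional units."""
    return {p: [e for e in edges if e[1] in genes] for p, genes in pathways.items()}


case = {"MYC": 8.0, "TP53": 2.0, "EGFR": 9.0}
control = {"MYC": 2.0, "TP53": 2.1, "EGFR": 3.0}
de = differential_expression(case, control)
edges = build_edges({"TF1": ["MYC", "EGFR"], "TF2": ["TP53"]}, de)
modules = group_by_pathway(edges, {"growth": ["MYC", "EGFR"]})
print(modules)
```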

  5. Medical Devices Transition to Information Systems: Lessons Learned

    PubMed Central

    Charters, Kathleen G.

    2012-01-01

    Medical devices designed to network can share data with a Clinical Information System (CIS), making that data available within clinician workflow. Some lessons learned by transitioning anesthesia reporting and monitoring devices (ARMDs) on a local area network (LAN) to integration of anesthesia documentation within a CIS include the following categories: access, contracting, deployment, implementation, planning, security, support, training and workflow integration. Areas identified for improvement include: Reconcile vendor requirements for access with the organizations’ security policies and procedures. Include clauses supporting transition from stand-alone devices to information integrated into clinical workflow in the medical device procurement contract. Resolve deployment and implementation barriers that make the process less efficient and more costly. Include effective field communication and creative alternatives in planning. Build training on the baseline knowledge of trainees. Include effective help desk processes and metrics. Have a process for determining where problems originate when systems share information. PMID:24199054

  6. Understanding the dispensary workflow at the Birmingham Free Clinic: a proposed framework for an informatics intervention.

    PubMed

    Fisher, Arielle M; Herbert, Mary I; Douglas, Gerald P

    2016-02-19

    The Birmingham Free Clinic (BFC) in Pittsburgh, Pennsylvania, USA is a free, walk-in clinic that serves medically uninsured populations through the use of volunteer health care providers and an on-site medication dispensary. The introduction of an electronic medical record (EMR) has improved several aspects of clinic workflow. However, pharmacists' tasks involving medication management and dispensing have become more challenging since EMR implementation due to its inability to support workflows between the medical and pharmaceutical services. To inform the design of a systematic intervention, we conducted a needs assessment study to identify workflow challenges and process inefficiencies in the dispensary. We used contextual inquiry to document the dispensary workflow and facilitate identification of critical aspects of intervention design specific to the user. Pharmacists were observed according to contextual inquiry guidelines. Graphical models were produced to aid data and process visualization. We created a list of themes describing workflow challenges and asked the pharmacists to rank them in order of significance to narrow the scope of intervention design. Three pharmacists were observed at the BFC. Observer notes were documented and analyzed to produce 13 themes outlining the primary challenges pharmacists encounter during dispensation at the BFC. The dispensary workflow is labor intensive, redundant, and inefficient when integrated with the clinical service. Observations identified inefficiencies that may benefit from the introduction of informatics interventions including: medication labeling, insufficient process notification, triple documentation, and inventory control. We propose a system for Prescription Management and General Inventory Control (RxMAGIC). RxMAGIC is a framework designed to mitigate workflow challenges and improve the processes of medication management and inventory control. While RxMAGIC is described in the context of the BFC dispensary, we believe it will be generalizable to pharmacies in other low-resource settings, both domestically and internationally.

  7. A high throughput geocomputing system for remote sensing quantitative retrieval and a case study

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting

    2011-12-01

    The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computation application, which is one of the research issues of high throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, which is used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study middleware components of the remote sensing Grid - the dynamic Grid workflow based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed a graphical user interface (GUI) tool to compose remote sensing processing Grid workflows, and took aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.

  8. A virtual data language and system for scientific workflow management in data grid environments

    NASA Astrophysics Data System (ADS)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.
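    The central idea of separating logical typing from physical representation can be illustrated with a small sketch in which a logical dataset is declared with a type and a mapper resolves it to physical files. This is only an illustration of the concept; it is not the virtual data language's actual syntax or runtime.

```python
# Sketch of separating logical types from physical representation: a logical dataset
# is declared with a type, and a mapper resolves it to files, hiding storage layout.
# This is an illustration of the idea, not the actual virtual data language.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class LogicalDataset:
    name: str
    dtype: str                 # e.g. "Image[]" or "Spectrum"


@dataclass
class Mapper:
    """Resolves logical datasets to physical files, hiding directory structure."""
    layout: Dict[str, List[str]]

    def resolve(self, ds: LogicalDataset) -> List[str]:
        return self.layout.get(ds.name, [])


@dataclass
class Procedure:
    """A typed workflow step: declared inputs/outputs plus an executable body."""
    name: str
    inputs: List[LogicalDataset]
    outputs: List[LogicalDataset]
    body: Callable[[List[str]], List[str]]

    def run(self, mapper: Mapper) -> List[str]:
        physical_inputs = [p for ds in self.inputs for p in mapper.resolve(ds)]
        return self.body(physical_inputs)


raw = LogicalDataset("raw_images", "Image[]")
mapper = Mapper({"raw_images": ["/archive/run1/img_001.fits", "/archive/run1/img_002.fits"]})
align = Procedure("align", [raw], [LogicalDataset("aligned", "Image[]")],
                  body=lambda files: [f + ".aligned" for f in files])
print(align.run(mapper))
```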

  9. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    NASA Technical Reports Server (NTRS)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by their respective domain experts.
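    The wrapping-and-chaining pattern described above can be sketched by exposing each modeling component behind a uniform call-with-parameterization interface and piping the output of one component into the next. The component names and formulas below are invented placeholders, and plain Python callables stand in for the actual Web Service endpoints.

```python
# Sketch of the wrapping-and-chaining idea: each modelling component, whatever its
# original implementation, is exposed behind a uniform params -> params interface,
# and the chain pipes parameterisations downstream. Components are hypothetical.
from typing import Callable, Dict, List

Params = Dict[str, float]
Component = Callable[[Params], Params]


def surface_reflectance(params: Params) -> Params:
    """Stand-in for a wrapped legacy surface/albedo model (formula is invented)."""
    return {**params, "reflectance": 0.3 * params["albedo"]}


def radiative_transfer(params: Params) -> Params:
    """Stand-in for a wrapped atmospheric radiative transfer code."""
    return {**params, "radiance": params["reflectance"] * params["solar_irradiance"]}


def retrieval(params: Params) -> Params:
    """Stand-in for the instrument/retrieval model at the end of the chain."""
    return {**params, "retrieved_reflectance": params["radiance"] / params["solar_irradiance"]}


def run_chain(chain: List[Component], params: Params) -> Params:
    for component in chain:   # each component could equally be a remote service call
        params = component(params)
    return params


result = run_chain([surface_reflectance, radiative_transfer, retrieval],
                   {"albedo": 0.5, "solar_irradiance": 1361.0})
print(result)
```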

  10. Hydrography for the non-Hydrographer: A Paradigm shift in Data Processing

    NASA Astrophysics Data System (ADS)

    Malzone, C.; Bruce, S.

    2017-12-01

    Advancements in technology have led to overall systematic improvements, including hardware design, software architecture, and data transmission/telepresence. Historically, utilization of this technology has required a high knowledge level obtained through many years of experience, training and/or education. High training costs are incurred to achieve and maintain an acceptable level of proficiency within an organization. Recently, engineers have developed off-the-shelf software technology called Qimera that has simplified the processing of hydrographic data. The core technology is centered on the isolation of tasks within the workflow, capitalizing on advances in computing technology to automate the mundane, error-prone tasks and bring more value to the stages in which the human brain adds value. Key design features include: guided workflow, transcription automation, processing state management, real-time QA, dynamic workflow for validation, collaborative cleaning and production line processing. Since Qimera is designed to guide the user, it allows expedition leaders to focus on science while providing an educational opportunity for students to quickly learn the hydrographic processing workflow, including ancillary data analysis, troubleshooting, calibration and cleaning. This paper provides case studies on how Qimera is currently implemented in scientific expeditions, the benefits of implementation, and how it is directing the future of on-board research for the non-hydrographer.

  11. A case study on the impacts of computerized provider order entry (CPOE) system on hospital clinical workflow.

    PubMed

    Mominah, Maher; Yunus, Faisel; Househ, Mowafa S

    2013-01-01

    Computerized provider order entry (CPOE) is a health informatics system that helps health care providers create and manage orders for medications and other health care services. Through the automation of the ordering process, CPOE has improved the overall efficiency of hospital processes and workflow. In Saudi Arabia, CPOE has been used for years, with only a few studies evaluating the impacts of CPOE on clinical workflow. In this paper, we discuss the experience of a local hospital with the use of CPOE and its impacts on clinical workflow. Results show that there are many issues related to the implementation and use of CPOE within Saudi Arabia that must be addressed, including design, training, medication errors, alert fatigue, and system dependencies. Recommendations for improving CPOE use within Saudi Arabia are also discussed.

  12. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    NASA Technical Reports Server (NTRS)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks through the process of propagating organization name changes that originate in the HRMS module to other applications without changing application code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.
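    The propagation mechanism described here can be illustrated generically with a publish/subscribe sketch: an HR system raises a business event once, and every subscribed application reacts, so a name change reaches downstream systems without modifying their code. This is a generic illustration, not the Oracle Workflow or Business Event System API.

```python
# Generic publish/subscribe sketch of the pattern described above: an HR system raises
# a business event and subscribed applications react, so name changes propagate without
# modifying application code. This is NOT the Oracle Workflow / Business Event System API.
from collections import defaultdict
from typing import Callable, Dict, List

subscriptions: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)


def subscribe(event_name: str, handler: Callable[[dict], None]) -> None:
    """Register a downstream application's handler for a named business event."""
    subscriptions[event_name].append(handler)


def raise_event(event_name: str, payload: dict) -> None:
    """Notify every subscriber of the event; the publisher knows nothing about them."""
    for handler in subscriptions[event_name]:
        handler(payload)


# Two downstream applications subscribe to the same (hypothetical) event name.
subscribe("hr.organization.renamed",
          lambda p: print("Finance app updated org:", p["old"], "->", p["new"]))
subscribe("hr.organization.renamed",
          lambda p: print("Badging app updated org:", p["old"], "->", p["new"]))

# The HR module raises the event once; every subscriber is notified.
raise_event("hr.organization.renamed", {"old": "Flight Ops", "new": "Flight Operations"})
```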

  13. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    PubMed

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.

  14. Evaluation of user interface and workflow design of a bedside nursing clinical decision support system.

    PubMed

    Yuan, Michael Juntao; Finley, George Mike; Long, Ju; Mills, Christy; Johnson, Ron Kim

    2013-01-31

    Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations. Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure to rescue cases. A robust user interface and implementation strategy that fit into existing workflows was key for the success of the CDSS. Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, listed functions, and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations. The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time. Each nurse was asked to use the National Aeronautics and Space Administration (NASA) Task Load Index to self-evaluate the amount of cognitive and physical burden associated with using the device. A total of 83 heuristic violations were identified in the studies. The distribution of the heuristic violations and their average severity are reported. The nurse evaluators successfully completed all 30 sessions of the performance evaluations. All nurses were able to use the device after a single training session. On average, the nurses took 111 seconds (SD 30 seconds) to complete the simulated task. The NASA Task Load Index results indicated that the work overhead on the nurses was low. In fact, most of the burden measures were consistent with zero. The only potentially significant burden was temporal demand, which was consistent with the primary use case of the tool. The evaluation has shown that our design was functional and met the requirements demanded by the nurses' tight schedules and heavy workloads. The user interface embedded in the tool provided compelling utility to the nurse with minimal distraction.

  15. Overcoming the Challenges of Implementing a Multi-Mission Distributed Workflow System

    NASA Technical Reports Server (NTRS)

    Sayfi, Elias; Cheng, Cecilia; Lee, Hyun; Patel, Rajesh; Takagi, Atsuya; Yu, Dan

    2009-01-01

    A multi-mission approach to solving the same problems for various projects is enticing. However, the multi-mission approach leads to the need to develop a configurable, adaptable and distributed system to meet unique project requirements. That, in turn, leads to a set of challenges varying from handling synchronization issues to coming up with a smart design that allows the "unknowns" to be decided later. This paper discusses the challenges that the Multi-mission Automated Task Invocation Subsystem (MATIS) team has come up against while designing the distributed workflow system, and elaborates on the solutions that were implemented. The first of these is to design an easily adaptable system that requires no code changes as a result of configuration changes. The number of formal deliveries is often limited because each delivery costs time and money; changes such as the sequence of programs being called or a change of a parameter value in the program being automated should not result in code changes or redelivery.
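    A minimal sketch of the configuration-driven idea follows: the sequence of programs and their parameters live in a configuration document, so reordering steps or changing parameter values means editing data rather than code. The configuration format and task commands are hypothetical ('echo' is used as a harmless stand-in command on Unix-like systems).

```python
# Illustration of the configuration-driven idea: the sequence of programs and their
# parameters live in data, so changing them requires no code change or redelivery.
# The config format and task commands are hypothetical; 'echo' is a harmless stand-in.
import json
import subprocess
from typing import Dict, List

CONFIG = json.loads("""
{
  "tasks": [
    {"cmd": "echo", "args": ["calibrate", "--gain", "1.2"]},
    {"cmd": "echo", "args": ["generate-products", "--level", "2"]}
  ]
}
""")


def run_pipeline(config: Dict[str, List[dict]]) -> None:
    """Invoke each configured task in order; editing CONFIG reorders or
    re-parameterises the pipeline without touching this code."""
    for task in config["tasks"]:
        subprocess.run([task["cmd"], *task["args"]], check=True)


run_pipeline(CONFIG)
```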

  16. Cyberinfrastructure for End-to-End Environmental Explorations

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Kumar, S.; Song, C.; Zhao, L.; Govindaraju, R.; Niyogi, D.

    2007-12-01

    The design and implementation of a cyberinfrastructure for End-to-End Environmental Exploration (C4E4) is presented. The C4E4 framework addresses the need for an integrated data/computation platform for studying broad environmental impacts by combining heterogeneous data resources with state-of-the-art modeling and visualization tools. With Purdue being a TeraGrid Resource Provider, C4E4 builds on top of the Purdue TeraGrid data management system and Grid resources, and integrates them through a service-oriented workflow system. It allows researchers to construct environmental workflows for data discovery, access, transformation, modeling, and visualization. Using the C4E4 framework, we have implemented an end-to-end SWAT simulation and analysis workflow that connects our TeraGrid data and computation resources. It enables researchers to conduct comprehensive studies on the impact of land management practices in the St. Joseph watershed using data from various sources in hydrologic, atmospheric, agricultural, and other related disciplines.

  17. An Adaptable Seismic Data Format for Modern Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Podhorszki, N.; Tromp, J.

    2013-12-01

    Data storage, exchange, and access play a critical role in modern seismology. Current seismic data formats, such as SEED, SAC, and SEG-Y, were designed with specific applications in mind and are frequently a major bottleneck in implementing efficient workflows. We propose a new modern parallel format that can be adapted for a variety of seismic workflows. The Adaptable Seismic Data Format (ASDF) features high-performance parallel read and write support and the ability to store an arbitrary number of traces of varying sizes. Provenance information is stored inside the file so that users know the origin of the data as well as the precise operations that have been applied to the waveforms. The design of the new format is based on several real-world use cases, including earthquake seismology and seismic interferometry. The metadata is based on the proven XML schemas StationXML and QuakeML. Existing time-series analysis tool-kits are easily interfaced with this new format so that seismologists can use robust, previously developed software packages, such as ObsPy and the SAC library. ADIOS, netCDF4, and HDF5 can be used as the underlying container format. At Princeton University, we have chosen to use ADIOS as the container format because it has shown superior scalability for certain applications, such as dealing with big data on HPC systems. In the context of high-performance computing, we have implemented ASDF into the global adjoint tomography workflow on Oak Ridge National Laboratory's supercomputer Titan.
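    Two of the format's core ideas, many traces of varying length in a single container and provenance stored alongside the data, can be illustrated with HDF5 via h5py. The group layout, attribute names, and metadata below are invented for the sketch and do not follow the actual ASDF specification.

```python
# Illustration of two ideas behind the format: many traces of varying length in one
# container, with provenance stored alongside the data. HDF5 via h5py is used as the
# container, and the layout is invented; this is not the actual ASDF specification.
import h5py
import numpy as np

with h5py.File("example_waveforms.h5", "w") as f:
    waveforms = f.create_group("Waveforms")
    for station, npts in [("IU.ANMO", 4000), ("II.BFO", 5500)]:
        trace = waveforms.create_dataset(station, data=np.random.randn(npts))
        trace.attrs["sampling_rate_hz"] = 20.0
        trace.attrs["provenance"] = "bandpass 0.01-0.1 Hz; instrument response removed"
    f.attrs["quakeml_event_id"] = "smi:example/event/2013abc"   # hypothetical event reference

with h5py.File("example_waveforms.h5", "r") as f:
    for name, dset in f["Waveforms"].items():
        print(name, dset.shape, dset.attrs["provenance"])
```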

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  19. Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    Last year we presented in RSNA an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to perform DICOM Query and C-Move requests by a physician from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for offsite distribution. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented in a clinical environment and evaluated. A PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in health care delivery both within the radiology department and offsite by improving their clinical workflow.

  20. Big Data Challenges in Global Seismic 'Adjoint Tomography' (Invited)

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Smith, J.

    2013-12-01

    The challenge of imaging Earth's interior on a global scale is closely linked to the challenge of handling large data sets. The related iterative workflow involves five distinct phases, namely, 1) data gathering and culling, 2) synthetic seismogram calculations, 3) pre-processing (time-series analysis and time-window selection), 4) data assimilation and adjoint calculations, 5) post-processing (pre-conditioning, regularization, model update). In order to implement this workflow on modern high-performance computing systems, a new seismic data format is being developed. The Adaptable Seismic Data Format (ASDF) is designed to replace currently used data formats with a more flexible format that allows for fast parallel I/O. The metadata is divided into abstract categories, such as "source" and "receiver", along with provenance information for complete reproducibility. The structure of ASDF is designed keeping in mind three distinct applications: earthquake seismology, seismic interferometry, and exploration seismology. Existing time-series analysis tool kits, such as SAC and ObsPy, can be easily interfaced with ASDF so that seismologists can use robust, previously developed software packages. ASDF accommodates an automated, efficient workflow for global adjoint tomography. Manually managing the large number of simulations associated with the workflow can rapidly become a burden, especially with increasing numbers of earthquakes and stations. Therefore, it is of importance to investigate the possibility of automating the entire workflow. Scientific Workflow Management Software (SWfMS) allows users to execute workflows almost routinely. SWfMS provides additional advantages. In particular, it is possible to group independent simulations in a single job to fit the available computational resources. They also give a basic level of fault resilience as the workflow can be resumed at the correct state preceding a failure. Some of the best candidates for our particular workflow are Kepler and Swift, and the latter appears to be the most serious candidate for a large-scale workflow on a single supercomputer, remaining sufficiently simple to accommodate further modifications and improvements.

  1. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.

    PubMed

    Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A

    2005-04-07

    Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
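    The notion of treating control constructs (conditionals, iteration, resumable execution) as generic, parameterizable operators rather than hard-coded logic can be sketched as follows; the operator names and the checkpoint format are hypothetical and are not GPIPE's implementation.

```python
# Sketch of workflow control constructs as generic, parameterisable operators
# rather than hard-coded logic. All names are hypothetical; not GPIPE's code.
import json
from typing import Any, Callable, List


def conditional(predicate: Callable[[Any], bool],
                if_true: Callable[[Any], Any],
                if_false: Callable[[Any], Any]) -> Callable[[Any], Any]:
    """Reusable conditional operator: returns a new step built from the parameters."""
    return lambda x: if_true(x) if predicate(x) else if_false(x)


def iterate(step: Callable[[Any], Any], until: Callable[[Any], bool]) -> Callable[[Any], Any]:
    """Reusable iteration operator: repeat a step until the stop condition holds."""
    def run(x: Any) -> Any:
        while not until(x):
            x = step(x)
        return x
    return run


def checkpointed(steps: List[Callable[[Any], Any]], state_file: str) -> Callable[[Any], Any]:
    """Suspend/resume: persist the next step index and the intermediate value so an
    interrupted pipeline can be resumed instead of re-run from scratch."""
    def run(x: Any) -> Any:
        try:
            with open(state_file) as fh:
                state = json.load(fh)
            start, x = state["next"], state["value"]
        except FileNotFoundError:
            start = 0
        for i in range(start, len(steps)):
            x = steps[i](x)
            with open(state_file, "w") as fh:
                json.dump({"next": i + 1, "value": x}, fh)
        return x
    return run


pipeline = checkpointed(
    [iterate(lambda n: n * 2, until=lambda n: n > 100),
     conditional(lambda n: n % 2 == 0, if_true=lambda n: n // 2, if_false=lambda n: n)],
    state_file="pipeline_state.json",
)
print(pipeline(3))
```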

  2. The use of workflows in the design and implementation of complex experiments in macromolecular crystallography.

    PubMed

    Brockhauser, Sandor; Svensson, Olof; Bowler, Matthew W; Nanao, Max; Gordon, Elspeth; Leal, Ricardo M F; Popov, Alexander; Gerring, Matthew; McCarthy, Andrew A; Gotz, Andy

    2012-08-01

    The automation of beam delivery, sample handling and data analysis, together with increasing photon flux, diminishing focal spot size and the appearance of fast-readout detectors on synchrotron beamlines, have changed the way that many macromolecular crystallography experiments are planned and executed. Screening for the best diffracting crystal, or even the best diffracting part of a selected crystal, has been enabled by the development of microfocus beams, precise goniometers and fast-readout detectors that all require rapid feedback from the initial processing of images in order to be effective. All of these advances require the coupling of data feedback to the experimental control system and depend on immediate online data-analysis results during the experiment. To facilitate this, a Data Analysis WorkBench (DAWB) for the flexible creation of complex automated protocols has been developed. Here, example workflows designed and implemented using DAWB are presented for enhanced multi-step crystal characterizations, experiments involving crystal reorientation with kappa goniometers, crystal-burning experiments for empirically determining the radiation sensitivity of a crystal system and the application of mesh scans to find the best location of a crystal to obtain the highest diffraction quality. Beamline users interact with the prepared workflows through a specific brick within the beamline-control GUI MXCuBE.

  3. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to and architecture of workflow code for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
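    The submit-then-poll asynchrony pattern at the heart of such workflows can be sketched with Python's asyncio: the client kicks off a long-running geoprocessing job, immediately receives a job reference, continues other work, and checks the status later. The job registry and service behavior below are hypothetical; real deployments would use WS-BPEL engines or OGC web services.

```python
# Minimal sketch of the submit-then-poll asynchrony pattern: the client starts a
# long-running job, gets a job handle immediately, keeps working, and checks later.
# Service and job names are hypothetical.
import asyncio
import uuid

jobs: dict = {}


async def long_running_process(job_id: str) -> None:
    await asyncio.sleep(2)                       # stand-in for a slow processing chain
    jobs[job_id] = "COMPLETED"


def submit(pending_tasks: list) -> str:
    """Return a job reference right away instead of blocking until completion."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = "RUNNING"
    pending_tasks.append(asyncio.create_task(long_running_process(job_id)))
    return job_id


async def main() -> None:
    tasks: list = []
    job_id = submit(tasks)
    print("submitted", job_id, jobs[job_id])
    await asyncio.sleep(0.5)                     # client does other work here
    print("still", jobs[job_id])                 # a status poll while the job runs
    await asyncio.gather(*tasks)                 # later: collect the finished result
    print("finally", jobs[job_id])


asyncio.run(main())
```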

  4. Evaluation of User Interface and Workflow Design of a Bedside Nursing Clinical Decision Support System

    PubMed Central

    Yuan, Michael Juntao; Finley, George Mike; Mills, Christy; Johnson, Ron Kim

    2013-01-01

    Background: Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations. Objective: Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure to rescue cases. A robust user interface and implementation strategy that fit into existing workflows was key for the success of the CDSS. Methods: Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, listed functions, and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations. The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time. Each nurse was asked to use the National Aeronautics and Space Administration (NASA) Task Load Index to self-evaluate the amount of cognitive and physical burden associated with using the device. Results: A total of 83 heuristic violations were identified in the studies. The distribution of the heuristic violations and their average severity are reported. The nurse evaluators successfully completed all 30 sessions of the performance evaluations. All nurses were able to use the device after a single training session. On average, the nurses took 111 seconds (SD 30 seconds) to complete the simulated task. The NASA Task Load Index results indicated that the work overhead on the nurses was low. In fact, most of the burden measures were consistent with zero. The only potentially significant burden was temporal demand, which was consistent with the primary use case of the tool. Conclusions: The evaluation has shown that our design was functional and met the requirements demanded by the nurses’ tight schedules and heavy workloads. The user interface embedded in the tool provided compelling utility to the nurse with minimal distraction. PMID:23612350

  5. Bridging the Guideline Implementation Gap: A Systematic, Document-Centered Approach to Guideline Implementation

    PubMed Central

    Shiffman, Richard N.; Michel, George; Essaihi, Abdelwaheb; Thornquist, Elizabeth

    2004-01-01

    Objective: A gap exists between the information contained in published clinical practice guidelines and the knowledge and information that are necessary to implement them. This work describes a process to systematize and make explicit the translation of document-based knowledge into workflow-integrated clinical decision support systems. Design: This approach uses the Guideline Elements Model (GEM) to represent the guideline knowledge. Implementation requires a number of steps to translate the knowledge contained in guideline text into a computable format and to integrate the information into clinical workflow. The steps include: (1) selection of a guideline and specific recommendations for implementation, (2) markup of the guideline text, (3) atomization, (4) deabstraction and (5) disambiguation of recommendation concepts, (6) verification of rule set completeness, (7) addition of explanations, (8) building executable statements, (9) specification of origins of decision variables and insertions of recommended actions, (10) definition of action types and selection of associated beneficial services, (11) choice of interface components, and (12) creation of requirement specification. Results: The authors illustrate these component processes using examples drawn from recent experience translating recommendations from the National Heart, Lung, and Blood Institute's guideline on management of chronic asthma into a workflow-integrated decision support system that operates within the Logician electronic health record system. Conclusion: Using the guideline document as a knowledge source promotes authentic translation of domain knowledge and reduces the overall complexity of the implementation task. From this framework, we believe that a better understanding of activities involved in guideline implementation will emerge. PMID:15187061
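    The "building executable statements" step can be illustrated with a toy rule over named decision variables whose origins in the record are declared explicitly, as the framework requires. The thresholds, field names, and the recommendation itself are invented for illustration and do not reproduce the NHLBI guideline content.

```python
# Illustration of the "build executable statements" step: a single, deabstracted
# recommendation becomes a rule over named decision variables whose origins in the
# record are declared explicitly. Thresholds and field names are invented and do not
# reproduce the NHLBI guideline content.
from typing import Dict

DECISION_VARIABLE_ORIGINS = {
    "daytime_symptoms_per_week": "flowsheet: asthma symptom diary",
    "night_awakenings_per_month": "flowsheet: asthma symptom diary",
    "current_controller_therapy": "medication list",
}


def recommend_step_up(patient: Dict[str, object]) -> bool:
    """Executable form of a hypothetical recommendation: frequent daytime symptoms
    with no controller therapy -> recommend initiating daily controller therapy."""
    return (patient["daytime_symptoms_per_week"] > 2
            and patient["current_controller_therapy"] is None)


patient = {"daytime_symptoms_per_week": 4,
           "night_awakenings_per_month": 3,
           "current_controller_therapy": None}
if recommend_step_up(patient):
    print("Recommended action: initiate daily controller therapy (hypothetical rule).")
```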

  6. Implementation of workflow engine technology to deliver basic clinical decision support functionality.

    PubMed

    Huser, Vojtech; Rasmussen, Luke V; Oberg, Ryan; Starren, Justin B

    2011-04-10

    Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment and the challenge of user-friendly representation of clinical logic. We present our implementation of a workflow engine technology that addresses the two above-described challenges in delivering clinical decision support. Our system is based on a cross-industry standard of XML (extensible markup language) process definition language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform which evaluates well against an established clinical decision support architecture evaluation framework. Due to cross-industry usage of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform.
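    The retrospective-testing capability described above can be illustrated with a small sketch in which the same step-based decision logic is replayed over historical encounters to count how often it would have fired before being wired to live clinical events. The rule, field names, and data are hypothetical; this is not the XPDL-based implementation or the healthflow suite itself.

```python
# Sketch of the retrospective-testing idea: the same step-based decision logic is
# replayed over historical encounters to see how often it would have fired, before
# being connected to real-time events. Rule and record fields are hypothetical.
from typing import Dict, List


def decision_logic(encounter: Dict[str, object]) -> bool:
    """One flowchart branch: elevated creatinine plus an active nephrotoxic drug -> alert."""
    return encounter["creatinine_mg_dl"] > 1.5 and encounter["on_nephrotoxic_drug"]


def replay_retrospectively(encounters: List[Dict[str, object]]) -> Dict[str, int]:
    """Count how many historical encounters would have triggered the alert."""
    fired = sum(decision_logic(e) for e in encounters)
    return {"encounters": len(encounters), "alerts": fired}


historical = [
    {"creatinine_mg_dl": 2.1, "on_nephrotoxic_drug": True},
    {"creatinine_mg_dl": 0.9, "on_nephrotoxic_drug": True},
    {"creatinine_mg_dl": 1.8, "on_nephrotoxic_drug": False},
]
print(replay_retrospectively(historical))   # e.g. {'encounters': 3, 'alerts': 1}
```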

  7. The role of technical advances in the adoption and integration of patient-reported outcomes in clinical care.

    PubMed

    Jensen, Roxanne E; Rothrock, Nan E; DeWitt, Esi M; Spiegel, Brennan; Tucker, Carole A; Crane, Heidi M; Forrest, Christopher B; Patrick, Donald L; Fredericksen, Rob; Shulman, Lisa M; Cella, David; Crane, Paul K

    2015-02-01

    Patient-reported outcomes (PROs) are gaining recognition as key measures for improving the quality of patient care in clinical care settings. Three factors have made the implementation of PROs in clinical care more feasible: increased use of modern measurement methods in PRO design and validation, rapid progression of technology (eg, touchscreen tablets, Internet accessibility, and electronic health records), and greater demand for measurement and monitoring of PROs by regulators, payers, accreditors, and professional organizations. As electronic PRO collection and reporting capabilities have improved, the challenges of collecting PRO data have changed. To update information on PRO adoption considerations in clinical care, highlighting electronic and technical advances with respect to measure selection, clinical workflow, data infrastructure, and outcomes reporting. Five practical case studies across diverse health care settings and patient populations are used to explore how implementation barriers were addressed to promote the successful integration of PRO collection into the clinical workflow. The case studies address selecting and reporting of relevant content, workflow integration, previsit screening, effective evaluation, and electronic health record integration. These case studies exemplify elements of well-designed electronic systems, including response automation, tailoring of item selection and reporting algorithms, flexibility of collection location, and integration with patient health care data elements. They also highlight emerging logistical barriers in this area, such as the need for specialized technological and methodological expertise, and design limitations of current electronic data capture systems.

  8. The Role of Technical Advances in the Adoption and Integration of Patient-Reported Outcomes in Clinical Care

    PubMed Central

    Jensen, Roxanne E.; Rothrock, Nan E.; DeWitt, Esi Morgan; Spiegel, Brennan; Tucker, Carole A.; Crane, Heidi M.; Forrest, Christopher B.; Patrick, Donald L.; Fredericksen, Rob; Shulman, Lisa M.; Cella, David; Crane, Paul K.

    2016-01-01

    Background Patient-reported outcomes (PROs) are gaining recognition as key measures for improving the quality of patient care in clinical care settings. Three factors have made the implementation of PROs in clinical care more feasible: increased use of modern measurement methods in PRO design and validation, rapid progression of technology (e.g., touch screen tablets, Internet accessibility, and electronic health records (EHRs)), and greater demand for measurement and monitoring of PROs by regulators, payers, accreditors, and professional organizations. As electronic PRO collection and reporting capabilities have improved, the challenges of collecting PRO data have changed. Objectives To update information on PRO adoption considerations in clinical care, highlighting electronic and technical advances with respect to measure selection, clinical workflow, data infrastructure, and outcomes reporting. Methods Five practical case studies across diverse healthcare settings and patient populations are used to explore how implementation barriers were addressed to promote the successful integration of PRO collection into the clinical workflow. The case studies address selecting and reporting of relevant content, workflow integration, pre-visit screening, effective evaluation, and EHR integration. Conclusions These case studies exemplify elements of well-designed electronic systems, including response automation, tailoring of item selection and reporting algorithms, flexibility of collection location, and integration with patient health care data elements. They also highlight emerging logistical barriers in this area, such as the need for specialized technological and methodological expertise, and design limitations of current electronic data capture systems. PMID:25588135

  9. SISYPHUS: A high performance seismic inversion factory

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In recent years, massively parallel high-performance computers have become the standard instruments for solving forward and inverse problems in seismology. The software packages dedicated to forward and inverse waveform modelling and specially designed for such computers (SPECFEM3D, SES3D) have become mature and widely available. These packages achieve significant computational performance and give researchers the opportunity to solve larger problems at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset the performance benefits provided by even the most powerful modern supercomputers. Furthermore, the typical system architecture of modern supercomputing platforms is oriented towards maximum computational performance and provides limited standard facilities for automating the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for modern massively parallel high-performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. The inversion state database is a hierarchical structure with branches for the static process setup, inversion iterations, and solver runs, each branch specifying information at the event, station, and channel levels. The workflow management framework is based on an embedded scripting engine that allows various workflow scenarios to be defined in a high-level scripting language and provides access to all available inversion components, represented as standard library functions. At present the SES3D wave propagation solver is integrated in the solution; work is in progress on interfacing with SPECFEM3D. A separate framework is designed for interoperability with an optimization module; the workflow manager and the optimization process run in parallel and cooperate by exchanging messages according to a specially designed protocol. A library of high-performance modules implementing signal pre-processing and misfit and adjoint computations according to established good practice is included. Monitoring is based on information stored in the inversion state database and at present provides a command-line interface; design of a graphical user interface is in progress. The software design fits well into the common massively parallel system architecture featuring a large number of computational nodes running distributed applications under the control of batch-oriented resource managers. The solution prototype has been implemented on the "Piz Daint" supercomputer provided by the Swiss National Supercomputing Centre (CSCS).
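
    The hierarchical inversion state database and the message-based cooperation between the workflow manager and the optimization module can be pictured with the Python sketch below. The keys, branch names, and message fields are illustrative assumptions, not the SISYPHUS schema or protocol.

    ```python
    # Hedged sketch of the described hierarchy (setup -> iterations -> solver runs,
    # resolved down to events, stations, and channels) and of a cooperation message.
    import json

    inversion_state = {
        "setup": {"solver": "SES3D", "misfit": "time-frequency phase"},
        "iterations": {
            1: {
                "solver_runs": {
                    "forward_event_001": {
                        "events": {
                            "001": {"stations": {"CH.DAVOX": {"channels": ["BHZ", "BHN", "BHE"]}}}
                        }
                    }
                }
            }
        },
    }

    # The workflow manager and optimization module run in parallel and exchange messages;
    # a simple JSON message of the kind such a protocol might carry (fields are invented):
    message = json.dumps({"type": "MISFIT_READY", "iteration": 1, "misfit": 3.42e-2})
    print(message)
    ```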

  10. [Change in process management by implementing RIS, PACS and flat-panel detectors].

    PubMed

    Imhof, H; Dirisamer, A; Fischer, H; Grampp, S; Heiner, L; Kaderk, M; Krestan, C; Kainberger, F

    2002-05-01

    Implementation of radiological information systems (RIS) and picture archiving and communication systems (PACS) results in significant changes to the workflow of a radiology department. Additional connection with flat-panel detectors further shortens the work process. RIS and PACS implementation alone reduces the complete workflow by 21-80%. With flat-panel technology, the image production process is shortened by a further 25-30%. The number of workflow steps is reduced from the original 17 to 12 with the implementation of RIS and PACS, and to 5 with the integrated use of flat panels. These clearly recognizable workflow advantages require a corresponding financial investment. Several studies have shown that the capitalization factor calculated over eight years is positive, with gains ranging between 5% and 25%. Whether the additional implementation of flat-panel detectors also results in a positive capitalization over the years cannot yet be estimated exactly, because experience is still too limited. The interfaces are particularly critical and require constant quality control. Our flat-panel detector system is fixed; special images--needed in about 3-5% of all cases--still require conventional film-screen or phosphor-plate systems. Full-spine and long-leg examinations cannot be performed with sufficient accuracy. Without question, implementation of an integrated RIS, PACS, and flat-panel detector system requires excellent training of the employees because of the changes in workflow. The main benefits of such an integrated implementation are an increase in the quality of image and report data, easier handling--almost no cassettes are necessary--and a substantial shortening of the workflow.

  11. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: among other gaps, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  12. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    PubMed Central

    Ho, Jonhan; Aridor, Orly; Parwani, Anil V.

    2012-01-01

    Background: For decades anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users’ needs and is utilized for collecting, interpreting, and aggregating detailed aspects of work. Objective: Contextual inquiry was utilized to document current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists’ needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. (1998). Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchical organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories: technology, communication, synthesis/preparation, organization, and workflow. Current AP workflow was labor intensive and lacked scalability. A large number of processes that could potentially improve following the introduction of digital pathology systems were identified. These work processes included case management, case examination and review, and final case reporting. Furthermore, a digital slide system should integrate with the anatomic pathology laboratory information system. Conclusions: To our knowledge, this is the first study that utilized the contextual inquiry method to document AP workflow. Findings were used to establish key requirements for the design of digital pathology systems. PMID:23243553

  13. 48 CFR 1.301 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... organizational level (e.g., designations and delegations of authority, assignments of responsibilities, work-flow....) as implemented in 5 CFR part 1320 (see 1.105) and the Regulatory Flexibility Act (5 U.S.C. 601, et seq.). Normally, when a law requires publication of a proposed regulation, the Regulatory Flexibility...

  14. UK Health and Social Care Case Studies: Iterative Technology Development.

    PubMed

    Blanchard, Adie; Gilbert, Laura; Dawson, Tom

    2017-01-01

    As a result of increasing demand in the face of reduced resources, technology has been implemented in many social and health care services to improve service efficiency. This paper outlines the experiences of deploying a 'Software as a Service' application in the UK social and health care sectors. The case studies demonstrate that every implementation is different and unique to each organisation. Technology design and integration can be facilitated by ongoing engagement and collaboration with all stakeholders, flexible design, and attention to interoperability to suit services and their workflows.

  15. Biowep: a workflow enactment portal for bioinformatics applications.

    PubMed

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack these skills. A portal enabling such researchers to take advantage of these new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. Biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of the workflows are annotated on the basis of their input and output, elaboration type, and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of effective workflows can significantly improve automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed within the Laboratory of Interdisciplinary Technologies in Bioinformatics - LITBIO.
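
    The annotation-and-search idea described above can be illustrated with a short Python sketch (the portal itself is not implemented in Python): predefined workflows carry annotations for input, output, elaboration type, and application domain, and are selected by matching a user's profile. All field names and values are assumptions, not the biowep data model.

    ```python
    # Minimal illustrative sketch of annotation-based workflow selection.
    workflows = [
        {"name": "blast_annotation", "input": "protein sequence", "output": "GO terms",
         "elaboration": "similarity search", "domain": "proteomics"},
        {"name": "snp_report", "input": "gene symbol", "output": "SNP list",
         "elaboration": "database lookup", "domain": "genomics"},
    ]

    def search(catalog, **criteria):
        """Return workflows whose annotations match every given criterion."""
        return [w for w in catalog if all(w.get(k) == v for k, v in criteria.items())]

    user_profile = {"domain": "genomics"}      # hypothetical profile-derived filter
    print([w["name"] for w in search(workflows, **user_profile)])
    ```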

  16. Biowep: a workflow enactment portal for bioinformatics applications

    PubMed Central

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-01-01

    Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack these skills. A portal enabling such researchers to take advantage of these new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. Biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of the workflows are annotated on the basis of their input and output, elaboration type, and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of effective workflows can significantly improve automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed within the Laboratory of Interdisciplinary Technologies in Bioinformatics – LITBIO. PMID:17430563

  17. Rethinking Clinical Workflow.

    PubMed

    Schlesinger, Joseph J; Burdick, Kendall; Baum, Sarah; Bellomy, Melissa; Mueller, Dorothee; MacDonald, Alistair; Chern, Alex; Chrouser, Kristin; Burger, Christie

    2018-03-01

    The concept of clinical workflow borrows from management and leadership principles outside of medicine. The only way to rethink clinical workflow is to understand the neuroscience principles that underlie attention and vigilance. With any implementation to improve practice, there are human factors that can promote or impede progress. Modulating the environment and working as a team to take care of patients is paramount. Clinicians must continually rethink clinical workflow, evaluate progress, and understand that other industries have something to offer. Then, novel approaches can be implemented to take the best care of patients. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Security and Dependability Solutions for Web Services and Workflows

    NASA Astrophysics Data System (ADS)

    Kokolakis, Spyros; Rizomiliotis, Panagiotis; Benameur, Azzedine; Sinha, Smriti Kumar

    In this chapter we present an innovative approach towards the design and application of Security and Dependability (S&D) solutions for Web services and service-based workflows. Recently, several standards have been published that prescribe S&D solutions for Web services, e.g., OASIS WS-Security. However, the application of these solutions in specific contexts has been proven problematic. We propose a new framework for the application of such solutions based on the SERENITY S&D Pattern concept. An S&D Pattern comprises all the necessary information for the implementation, verification, deployment, and active monitoring of an S&D Solution. Thus, system developers may rely on proven solutions that are dynamically deployed and monitored by the Serenity Runtime Framework. Finally, we further extend this approach to cover the case of executable workflows which are realised through the orchestration of Web services.
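
    The chapter describes an S&D Pattern as bundling implementation, verification, deployment, and monitoring information for a solution. The Python sketch below merely pictures that bundle as a record type; the field names and example values are assumptions, not the SERENITY pattern schema.

    ```python
    # Illustrative record of the kinds of information an S&D Pattern is said to carry.
    from dataclasses import dataclass, field

    @dataclass
    class SDPattern:
        name: str
        solution: str                    # e.g. reference to a WS-Security configuration
        implementation: str              # concrete artefact deployed at runtime (hypothetical)
        verification: str                # proof or test evidence that the solution is sound
        monitoring_rules: list = field(default_factory=list)  # checked by the runtime framework

    pattern = SDPattern(
        name="message-integrity",
        solution="OASIS WS-Security XML signature",
        implementation="ws-security-signature-v1.jar",
        verification="formal proof reference (placeholder)",
        monitoring_rules=["signature present on every SOAP envelope"],
    )
    print(pattern.name, "->", pattern.solution)
    ```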

  19. Implementation of workflow engine technology to deliver basic clinical decision support functionality

    PubMed Central

    2011-01-01

    Background Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present an application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment and the challenge of user-friendly representation of clinical logic. Results We present our implementation of a workflow engine technology that addresses the two above-described challenges in delivering clinical decision support. Our system is based on a cross-industry standard of XML (extensible markup language) process definition language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as an architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. Conclusions We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform which evaluates well against an established clinical decision support architecture evaluation framework. Due to cross-industry usage of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform. PMID:21477364

  20. A big data approach for climate change indicators processing in the CLIP-C project

    NASA Astrophysics Data System (ADS)

    D'Anca, Alessandro; Conte, Laura; Palazzo, Cosimo; Fiore, Sandro; Aloisio, Giovanni

    2016-04-01

    Defining and implementing processing chains with multiple (e.g., tens or hundreds of) data analytics operators can be a real challenge in many practical scientific use cases such as climate change indicators. This is usually done via scripts (e.g., bash) on the client side and requires climate scientists to implement and replicate workflow-like control logic (which can be error-prone) in their scripts, alongside the expected application-level code. Moreover, the large amount of data and the heavy I/O demand pose additional performance challenges. In this regard, production-level tools for climate data analysis are mostly sequential, and there is a lack of big data analytics solutions implementing fine-grain data parallelism or adopting stronger parallel I/O strategies, data locality, workflow optimization, etc. High-level solutions leveraging workflow-enabled big data analytics frameworks for eScience could help scientists define and implement the workflows related to their experiments through a more declarative, efficient, and powerful approach. This talk will start by introducing the main needs and challenges regarding big data analytics workflow management for eScience and will then provide insights into the implementation of real use cases related to climate change indicators on large datasets produced in the context of the CLIP-C project - an EU FP7 project aiming to provide access to climate information of direct relevance to a wide variety of users, from scientists to policy makers and private-sector decision makers. All the proposed use cases have been implemented using the Ophidia big data analytics framework. The software stack includes an internal workflow management system, which coordinates, orchestrates, and optimises the execution of multiple scientific data analytics and visualization tasks. Real-time monitoring of workflow execution is also supported through a graphical user interface. In order to address the challenges of the use cases, the implemented data analytics workflows include parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. The use cases have been implemented on an 8-node (16 cores/node) HPC cluster of the Athena system available at the CMCC Supercomputing Centre. Benchmark results will also be presented during the talk.
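
    The declarative, workflow-based approach described above (import a NetCDF dataset, compute an indicator, export the result) can be pictured with the heavily simplified Python sketch below. The task structure, operation names, and file names are illustrative assumptions only and do not follow the actual Ophidia workflow schema.

    ```python
    # Schematic sketch of a declarative climate-indicator workflow description.
    import json

    workflow = {
        "name": "tropical_nights_indicator",
        "tasks": [
            {"name": "import", "operation": "import_netcdf",
             "arguments": {"src": "tasmin_EUR-11_day.nc"}},
            {"name": "indicator", "operation": "count_days_above",
             "arguments": {"threshold_K": 293.15},          # nights with tasmin > 20 degC
             "depends_on": ["import"]},
            {"name": "export", "operation": "export_netcdf",
             "arguments": {"dst": "tropical_nights.nc"},
             "depends_on": ["indicator"]},
        ],
    }
    print(json.dumps(workflow, indent=2))
    ```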

  1. A Web application for the management of clinical workflow in image-guided and adaptive proton therapy for prostate cancer treatments.

    PubMed

    Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-05-08

    Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post-treatment DIPS data were analyzed to determine if an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state-of-the-art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secure user access and role management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies, such as jQuery, jQuery plug-ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logic. Data entry, analysis, workflow logic, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
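
    To illustrate the kind of analysis the record describes, here is a hedged Python sketch (the application itself is ASP.NET, not Python): daily post-treatment DIPS residual shifts are pooled per patient and a van Herk-style margin recipe, in its commonly quoted simplified form M ≈ 2.5Σ + 0.7σ, is evaluated to judge whether the planned margin still covers the observed setup errors. The numbers and variable names are invented for illustration.

    ```python
    # Hedged sketch: systematic (Sigma) and random (sigma) setup errors from daily shifts.
    import statistics as st

    # Post-treatment residual shifts (mm) per fraction, one list per patient (invented data).
    shifts = {
        "pt1": [1.2, 0.8, 1.5, 0.9, 1.1],
        "pt2": [-0.4, 0.2, -0.1, 0.3, 0.0],
    }

    patient_means = [st.mean(v) for v in shifts.values()]
    sigma_systematic = st.pstdev(patient_means)                      # Sigma: spread of per-patient means
    sigma_random = st.mean([st.pstdev(v) for v in shifts.values()])  # sigma: mean per-patient spread

    margin_mm = 2.5 * sigma_systematic + 0.7 * sigma_random          # simplified van Herk recipe
    print(f"Suggested CTV-to-PTV margin: {margin_mm:.1f} mm")
    ```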

  2. A web accessible scientific workflow system for vadose zone performance monitoring: design and implementation examples

    NASA Astrophysics Data System (ADS)

    Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.

    2005-12-01

    Long-term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost effective and transparent. Over the last several years INL staff has designed and implemented a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic javascript and html/css) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadose Zone Research Park and an alternative cover landfill. Implementations for other vadose zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing and WSDL-compliant web services for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending functionality (e.g. adding novel models) is relatively straightforward. As system access requires a standard web browser and uses intuitive functionality, stakeholders with diverse degrees of technical insight can use this system with little or no training.

  3. A workflow learning model to improve geovisual analytics utility

    PubMed Central

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2011-01-01

    Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545

  4. A workflow learning model to improve geovisual analytics utility.

    PubMed

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.

  5. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    NASA Astrophysics Data System (ADS)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple, and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized to achieve the required operation. Only three Python programs and a simple library, which unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start, and stop any machine and/or its different tasks when necessary. For every machine, a single dedicated zookeeper program is used to start the different functions or tasks (stompShell programs) needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple, and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems. The JSON format is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It is also possible to extend the use of the framework to monitoring the IDC pipeline. The detailed design, implementation, conclusions, and future work of the proposed framework will be presented.
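
    In the spirit of the design above, the minimal Python sketch below hands a JSON-described job to the broker over STOMP. It assumes a local ActiveMQ broker listening on the default STOMP port (61613) and the third-party stomp.py client library; the queue naming pattern and message fields are illustrative assumptions, not the framework's actual conventions.

    ```python
    # Minimal sketch: publish one JSON job message to the broker over STOMP.
    import json
    import stomp

    job = {
        "workflow": "radionuclide_daily",        # hypothetical workflow and task names
        "task": "filter_spectra",
        "args": {"station": "RN38", "date": "2014-05-01"},
    }

    conn = stomp.Connection([("localhost", 61613)])   # assumes a local ActiveMQ broker
    conn.connect("admin", "admin", wait=True)         # credentials are placeholders
    # One destination per machine and task type, following a unified naming pattern.
    conn.send(destination="/queue/node01.stompShell", body=json.dumps(job))
    conn.disconnect()
    ```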

  6. The standard-based open workflow system in GeoBrain (Invited)

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels: conceptual and concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflow for product generation. A provenance capturing service has been implemented to generate the ISO 19115-compliant complete product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119 (transparent, translucent, and opaque) are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models is conducted. Automated workflow composition based on ontologies and artificial intelligence technology has also been demonstrated successfully. The GeoBrain workflow system has been used in multiple Earth science applications, including the monitoring of global agricultural drought, the assessment of flood damage, the derivation of national crop condition and progress information, and the detection of nuclear proliferation facilities and events.
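
    The two-level idea above, in which a conceptual workflow over data and service types is instantiated into a concrete, executable workflow when a request matches a virtual product type, can be pictured with the Python sketch below. The product names, service types, and endpoint URLs are hypothetical; this is not GeoBrain code or its BPEL encoding.

    ```python
    # Illustrative sketch: bind abstract service types to concrete endpoints on request.
    conceptual_workflow = {
        "virtual_product": "NDVI_anomaly_map",
        "steps": [
            {"needs": "SurfaceReflectance", "service_type": "NDVI_computation"},
            {"needs": "NDVI", "service_type": "anomaly_detection"},
        ],
    }

    service_catalog = {                                   # hypothetical endpoints
        "NDVI_computation": "http://example.org/wps/ndvi",
        "anomaly_detection": "http://example.org/wps/anomaly",
    }

    def instantiate(conceptual, catalog):
        """Turn a type-level workflow into a concrete one by resolving each service type."""
        return [{"needs": s["needs"], "endpoint": catalog[s["service_type"]]}
                for s in conceptual["steps"]]

    print(instantiate(conceptual_workflow, service_catalog))
    ```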

  7. A conceptual framework for managing clinical processes.

    PubMed

    Buffone, G J; Moreau, D

    1997-01-01

    Reengineering of the health care delivery system is underway, as is the transformation of the processes and methods used for recording information describing patient care (i.e., the development of a computer-based record). This report describes the use of object-oriented analysis and design to develop and implement clinical process reengineering as well as the organization of clinical data. In addition, the facility of the proposed framework for implementing workflow computing is discussed.

  8. Game engines and immersive displays

    NASA Astrophysics Data System (ADS)

    Chang, Benjamin; Destefano, Marc

    2014-02-01

    While virtual reality and digital games share many core technologies, the programming environments, toolkits, and workflows for developing games and VR environments are often distinct. VR toolkits designed for applications in visualization and simulation often have a different feature set or design philosophy than game engines, while popular game engines often lack support for VR hardware. Extending a game engine to support systems such as the CAVE gives developers a unified development environment and the ability to easily port projects, but involves challenges beyond just adding stereo 3D visuals. In this paper we outline the issues involved in adapting a game engine for use with an immersive display system including stereoscopy, tracking, and clustering, and present example implementation details using Unity3D. We discuss application development and workflow approaches including camera management, rendering synchronization, GUI design, and issues specific to Unity3D, and present examples of projects created for a multi-wall, clustered, stereoscopic display.

  9. Implementing bioinformatic workflows within the bioextract server

    USDA-ARS?s Scientific Manuscript database

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  10. Implementing Single Source: The STARBRITE Proof-of-Concept Study

    PubMed Central

    Kush, Rebecca; Alschuler, Liora; Ruggeri, Roberto; Cassells, Sally; Gupta, Nitin; Bain, Landen; Claise, Karen; Shah, Monica; Nahm, Meredith

    2007-01-01

    Objective Inefficiencies in clinical trial data collection cause delays, increase costs, and may reduce clinician participation in medical research. In this proof-of-concept study, we examine the feasibility of using point-of-care data capture for both the medical record and clinical research in the setting of a working clinical trial. We hypothesized that by doing so, we could increase reuse of patient data, eliminate redundant data entry, and minimize disruption to clinic workflow. Design We developed and used a point-of-care electronic data capture system to record data during patient visits. The standards-based system was used for clinical research and to generate the clinic note for the medical record. The system worked in parallel with data collection procedures already in place for an ongoing multicenter clinical trial. Our system was iteratively designed after analyzing case report forms and clinic notes, and observing clinic workflow patterns and business procedures. Existing data standards from CDISC and HL7 were used for database insertion and clinical document exchange. Results Our system was successfully integrated into the clinic environment and used in two live test cases without disrupting existing workflow. Analyses performed during system design yielded detailed information on practical issues affecting implementation of systems that automatically extract, store, and reuse healthcare data. Conclusion Although subject to the limitations of a small feasibility study, our study demonstrates that electronic patient data can be reused for prospective multicenter clinical research and patient care, and demonstrates a need for further development of therapeutic area standards that can facilitate researcher use of healthcare data. PMID:17600107

  11. Innovations in Medication Preparation Safety and Wastage Reduction: Use of a Workflow Management System in a Pediatric Hospital.

    PubMed

    Davis, Stephen Jerome; Hurtado, Josephine; Nguyen, Rosemary; Huynh, Tran; Lindon, Ivan; Hudnall, Cedric; Bork, Sara

    2017-01-01

    Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. Of those 826,220 medication doses, there were 16 reported incorrect volume errors. The error rate after the implementation of the workflow management system averaged 8.4%, which was a 1.6% reduction. After the implementation of the workflow management system, the average number of reported oral liquid medication and injectable medication errors decreased to 0.4 and 0.2 times per week, respectively. Conclusion: The organization was able to achieve its purpose and goal of improving the provision of quality pharmacy care through optimal medication use and safety by reducing medication preparation errors. Error rates decreased and the workflow processes were streamlined, which has led to seamless operations within the pharmacy department. There has been significant cost avoidance and waste reduction and enhanced interdepartmental satisfaction due to the reduction of reported medication errors.

  12. Improving adherence to the Epic Beacon ambulatory workflow.

    PubMed

    Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana

    2017-06-01

    Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System of computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance with the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance with this workflow in the ambulatory oncology setting. This retrospective, quasi-experimental study analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was the comparison of the composite compliance rate with the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, which was education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multi-disciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the determined intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study supports that a pharmacist-initiated educational intervention can improve compliance with an ambulatory oncology infusion workflow.

  13. ScyFlow: An Environment for the Visual Specification and Execution of Scientific Workflows

    NASA Technical Reports Server (NTRS)

    McCann, Karen M.; Yarrow, Maurice; DeVivo, Adrian; Mehrotra, Piyush

    2004-01-01

    With the advent of grid technologies, scientists and engineers are building more and more complex applications to utilize distributed grid resources. The core grid services provide a path for accessing and utilizing these resources in a secure and seamless fashion. However, what scientists need is an environment that allows them to specify their application runs at a high organizational level and then supports efficient execution across any given set or sets of resources. We have been designing and implementing ScyFlow, a dual-interface architecture (both GUI and API) that addresses this problem. The scientist/user specifies the application tasks along with the necessary control and data flow, and monitors and manages the execution of the resulting workflow across the distributed resources. In this paper, we utilize two scenarios to provide the details of the project's two modules, the visual editor and the runtime workflow engine.
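
    The core notion above, specifying tasks plus their control and data flow at a high organizational level so that a runtime engine can schedule them over whatever resources are available, is sketched in Python below. The task names, commands, and ordering routine are illustrative assumptions, not the ScyFlow API.

    ```python
    # Illustrative sketch: a tiny task graph and a dependency-respecting execution order.
    tasks = {
        "mesh":  {"cmd": "gen_mesh --size 128"},      # hypothetical commands
        "solve": {"cmd": "flow_solver --in mesh.dat"},
        "post":  {"cmd": "postproc --in solution.dat"},
    }
    edges = [("mesh", "solve"), ("solve", "post")]    # data flows along these edges

    def topological_order(tasks, edges):
        """Order tasks so each one runs only after all tasks it depends on."""
        remaining, order = dict(tasks), []
        while remaining:
            ready = [t for t in remaining
                     if all(dst != t or src not in remaining for src, dst in edges)]
            order.extend(ready)
            for t in ready:
                del remaining[t]
        return order

    print(topological_order(tasks, edges))   # -> ['mesh', 'solve', 'post']
    ```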

  14. A Drupal-Based Collaborative Framework for Science Workflows

    NASA Astrophysics Data System (ADS)

    Pinheiro da Silva, P.; Gandara, A.

    2010-12-01

    Cyber-infrastructure builds on technical infrastructure to support the organizational practices and social norms of scientific teams that work together, or depend on each other, to conduct scientific research. Such cyber-infrastructure enables the sharing of information and data so that scientists can leverage knowledge and expertise through automation. Scientific workflow systems have been used to build automated scientific systems used by scientists to conduct scientific research and, as a result, create artifacts in support of scientific discoveries. These complex systems are often developed by teams of scientists who are located in different places, e.g., scientists working in distinct buildings, and sometimes in different time zones, e.g., scientists working in distinct national laboratories. The sharing of these specifications is currently supported by the use of version control systems such as CVS or Subversion. Discussions about the design, improvement, and testing of these specifications, however, often happen elsewhere, e.g., through the exchange of email messages and IM chatting. Carrying on a discussion about these specifications is challenging because comments and specifications are not necessarily connected. For instance, the person reading a comment about a given workflow specification may not be able to see the workflow, and even if the person can see the workflow, they may not know specifically to which part of the workflow a given comment applies. In this paper, we discuss the design, implementation and use of CI-Server, a Drupal-based infrastructure, to support the collaboration of both local and distributed teams of scientists using scientific workflows. CI-Server has three primary goals: to enable information sharing by providing tools that scientists can use within their scientific research to process data, publish and share artifacts; to build community by providing tools that support discussions between scientists about artifacts used or created through scientific processes; and to leverage the knowledge collected within the artifacts and scientific collaborations to support scientific discoveries.

  15. Changes in the cardiac rehabilitation workflow process needed for the implementation of a self-management system.

    PubMed

    Wiggers, Anne-Marieke; Vosbergen, Sandra; Kraaijenhagen, Roderik; Jaspers, Monique; Peek, Niels

    2013-01-01

    E-health interventions are of a growing importance for self-management of chronic conditions. This study aimed to describe the process adaptations that are needed in cardiac rehabilitation (CR) to implement a self-management system, called MyCARDSS. We created a generic workflow model based on interviews and observations at three CR clinics. Subsequently, a workflow model of the ideal situation after implementation of MyCARDSS was created. We found that the implementation will increase the complexity of existing working procedures because 1) not all patients will use MyCARDSS, 2) there is a transfer of tasks and responsibilities from professionals to patients, and 3) information in MyCARDSS needs to be synchronized with the EPR system for professionals.

  16. Interplay between Clinical Guidelines and Organizational Workflow Systems. Experience from the MobiGuide Project.

    PubMed

    Shabo, Amnon; Peleg, Mor; Parimbelli, Enea; Quaglini, Silvana; Napolitano, Carlo

    2016-12-07

    Implementing a decision-support system within a healthcare organization requires integration of clinical domain knowledge with resource constraints. Computer-interpretable guidelines (CIG) are excellent instruments for addressing clinical aspects while business process management (BPM) languages and Workflow (Wf) engines manage the logistic organizational constraints. Our objective is the orchestration of all the relevant factors needed for a successful execution of patient's care pathways, especially when spanning the continuum of care, from acute to community or home care. We considered three strategies for integrating CIGs with organizational workflows: extending the CIG or BPM languages and their engines, or creating an interplay between them. We used the interplay approach to implement a set of use cases arising from a CIG implementation in the domain of Atrial Fibrillation. To provide a more scalable and standards-based solution, we explored the use of Cross-Enterprise Document Workflow Integration Profile. We describe our proof-of-concept implementation of five use cases. We utilized the Personal Health Record of the MobiGuide project to implement a loosely-coupled approach between the Activiti BPM engine and the Picard CIG engine. Changes in the PHR were detected by polling. IHE profiles were used to develop workflow documents that orchestrate cross-enterprise execution of cardioversion. Interplay between CIG and BPM engines can support orchestration of care flows within organizational settings.
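
    A minimal sketch of the loosely-coupled interplay described above, in which changes in the PHR are detected by polling and dispatched to both engines; the engine callbacks and PHR fetcher below are hypothetical stand-ins, not the Activiti or Picard interfaces:

    ```python
    # Sketch only: poll a PHR and notify the BPM and CIG engines whenever its
    # content changes since the last snapshot.
    import time
    from typing import Callable, Dict

    def poll_phr(fetch_phr: Callable[[], Dict],
                 on_change: Callable[[Dict], None],
                 interval_s: float = 5.0,
                 max_polls: int = 3) -> None:
        last: Dict = {}
        for _ in range(max_polls):
            current = fetch_phr()
            if current != last:
                on_change(current)
                last = current
            time.sleep(interval_s)

    if __name__ == "__main__":
        state = {"inr": 2.1}
        def fake_phr() -> Dict:            # stands in for the MobiGuide PHR service
            return dict(state)
        def notify_engines(snapshot: Dict) -> None:
            print("BPM engine: schedule cardioversion logistics", snapshot)
            print("CIG engine: re-evaluate guideline recommendations", snapshot)
        poll_phr(fake_phr, notify_engines, interval_s=0.1)
    ```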

  17. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project aims to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best-suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
    We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
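
    A minimal sketch related to the workflow mapping goal mentioned above: estimating the minimum end-to-end delay of a workflow DAG as its critical-path length from per-task cost estimates. This is a generic illustration, not the SWAMP scheduler:

    ```python
    # Sketch only: longest path through the DAG of task costs; tasks without
    # dependencies start at time zero.
    from typing import Dict, List, Tuple

    def critical_path_delay(cost: Dict[str, float],
                            edges: List[Tuple[str, str]]) -> float:
        preds: Dict[str, List[str]] = {t: [] for t in cost}
        for u, v in edges:
            preds[v].append(u)
        finish: Dict[str, float] = {}
        def finish_time(t: str) -> float:        # memoized recursion over the DAG
            if t not in finish:
                finish[t] = cost[t] + max((finish_time(p) for p in preds[t]), default=0.0)
            return finish[t]
        return max(finish_time(t) for t in cost)

    # Example: transfer -> (filter, catalog) -> visualize, costs in seconds.
    cost = {"transfer": 30.0, "filter": 12.0, "catalog": 5.0, "visualize": 8.0}
    edges = [("transfer", "filter"), ("transfer", "catalog"),
             ("filter", "visualize"), ("catalog", "visualize")]
    print(critical_path_delay(cost, edges))   # 50.0 along transfer -> filter -> visualize
    ```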

  18. A network collaboration implementing technology to improve medication dispensing and administration in critical access hospitals.

    PubMed

    Wakefield, Douglas S; Ward, Marcia M; Loes, Jean L; O'Brien, John

    2010-01-01

    We report how seven independent critical access hospitals collaborated with a rural referral hospital to standardize workflow policies and procedures while jointly implementing the same health information technologies (HITs) to enhance medication care processes. The study hospitals implemented the same electronic health record, computerized provider order entry, pharmacy information systems, automated dispensing cabinets (ADC), and barcode medication administration systems. We conducted interviews and examined project documents to explore factors underlying the successful implementation of ADC and barcode medication administration across the network hospitals. These included a shared culture of collaboration; strategic sequencing of HIT component implementation; interface among HIT components; strategic placement of ADCs; disciplined use and sharing of workflow analyses linked with HIT applications; planning for workflow efficiencies; acquisition of adequate supply of HIT-related devices; and establishing metrics to monitor HIT use and outcomes.

  19. A task-based support architecture for developing point-of-care clinical decision support systems for the emergency department.

    PubMed

    Wilk, S; Michalowski, W; O'Sullivan, D; Farion, K; Sayyad-Shirabad, J; Kuziemsky, C; Kukawka, B

    2013-01-01

    The purpose of this study was to create a task-based support architecture for developing clinical decision support systems (CDSSs) that assist physicians in making decisions at the point-of-care in the emergency department (ED). The backbone of the proposed architecture was established by a task-based emergency workflow model for a patient-physician encounter. The architecture was designed according to an agent-oriented paradigm. Specifically, we used the O-MaSE (Organization-based Multi-agent System Engineering) method that allows for iterative translation of functional requirements into architectural components (e.g., agents). The agent-oriented paradigm was extended with ontology-driven design to implement ontological models representing knowledge required by specific agents to operate. The task-based architecture allows for the creation of a CDSS that is aligned with the task-based emergency workflow model. It facilitates decoupling of executable components (agents) from embedded domain knowledge (ontological models), thus supporting their interoperability, sharing, and reuse. The generic architecture was implemented as a pilot system, MET3-AE--a CDSS to help with the management of pediatric asthma exacerbation in the ED. The system was evaluated in a hospital ED. The architecture allows for the creation of a CDSS that integrates support for all tasks from the task-based emergency workflow model, and interacts with hospital information systems. The proposed architecture also allows for reusing and sharing system components and knowledge across disease-specific CDSSs.
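
    A minimal sketch of the decoupling idea described above (executable agents separated from externally supplied domain knowledge); the dictionary "ontology" and the clinical content are illustrative placeholders only, not the MET3-AE models and not clinical advice:

    ```python
    # Sketch only: a task agent whose behavior is driven by an external
    # "ontological model" (here a plain dict), so the agent can be retargeted by
    # swapping the knowledge rather than rewriting the code.
    from typing import Dict, List

    ASTHMA_ONTOLOGY: Dict[str, Dict] = {        # illustrative content only
        "triage": {"required_findings": ["respiratory_rate", "oxygen_saturation"]},
    }

    class TaskAgent:
        def __init__(self, task: str, ontology: Dict[str, Dict]):
            self.task = task
            self.knowledge = ontology[task]     # swap the ontology to retarget the agent

        def missing_data(self, encounter: Dict[str, float]) -> List[str]:
            return [f for f in self.knowledge["required_findings"] if f not in encounter]

    agent = TaskAgent("triage", ASTHMA_ONTOLOGY)
    print(agent.missing_data({"respiratory_rate": 28}))   # ['oxygen_saturation']
    ```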

  20. Implementation of Epic Beaker Anatomic Pathology at an Academic Medical Center.

    PubMed

    Blau, John Larry; Wilford, Joseph D; Dane, Susan K; Karandikar, Nitin J; Fuller, Emily S; Jacobsmeier, Debbie J; Jans, Melissa A; Horning, Elisabeth A; Krasowski, Matthew D; Ford, Bradley A; Becker, Kent R; Beranek, Jeanine M; Robinson, Robert A

    2017-01-01

    Beaker is a relatively new laboratory information system (LIS) offered by Epic Systems Corporation as part of its suite of health-care software and bundled with its electronic medical record, EpicCare. It is divided into two modules, Beaker anatomic pathology (Beaker AP) and Beaker Clinical Pathology. In this report, we describe our experience implementing Beaker AP version 2014 at an academic medical center with a go-live date of October 2015. This report covers preimplementation preparations and challenges beginning in September 2014, issues discovered soon after go-live in October 2015, and some post go-live optimizations using data from meetings, debriefings, and the project closure document. We share specific issues that we encountered during implementation, including difficulties with the proposed frozen section workflow, developing a shared specimen source dictionary, and implementation of the standard Beaker workflow in a large institution with trainees. We share specific strategies that we used to overcome these issues for a successful Beaker AP implementation. Several areas of the laboratory required adaptation of the default Beaker build parameters to meet the needs of the workflow in a busy academic medical center. In a few areas, our laboratory was unable to use the Beaker functionality to support our workflow, and we have continued to use paper or have altered our workflow. In spite of several difficulties that required creative solutions before go-live, the implementation has been successful based on satisfaction surveys completed by pathologists and others who use the software. However, optimization of Beaker workflows has continued to be an ongoing process after go-live to the present time. The Beaker AP LIS can be successfully implemented at an academic medical center but requires significant forethought, creative adaptation, and continued shared management of the ongoing product by institutional and departmental information technology staff as well as laboratory managers to meet the needs of the laboratory.

  1. ProcessGene-Connect: SOA Integration between Business Process Models and Enactment Transactions of Enterprise Software Systems

    NASA Astrophysics Data System (ADS)

    Wasser, Avi; Lincoln, Maya

    In recent years, both practitioners and applied researchers have become increasingly interested in methods for integrating business process models and enterprise software systems through the deployment of enabling middleware. Integrative BPM research has mainly focused on the conversion of workflow notations into enacted application procedures, and less effort has been invested in enhancing the connectivity between design-level, non-workflow business process models and related enactment systems such as ERP, SCM and CRM. This type of integration is useful at several stages of an IT system lifecycle, from design and implementation through change management, upgrades and rollout. The paper presents an integration method that utilizes SOA for connecting business process models with corresponding enterprise software systems. The method is then demonstrated through an Oracle E-Business Suite procurement process and its ERP transactions.

  2. Evolution of Medication Administration Workflow in Implementing Electronic Health Record System

    ERIC Educational Resources Information Center

    Huang, Yuan-Han

    2013-01-01

    This study focused on the clinical workflow evolutions when implementing the health information technology (HIT). The study especially emphasized on administrating medication when the electronic health record (EHR) systems were adopted at rural healthcare facilities. Mixed-mode research methods, such as survey, observation, and focus group, were…

  3. A Web application for the management of clinical workflow in image‐guided and adaptive proton therapy for prostate cancer treatments

    PubMed Central

    Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-01-01

    Image‐guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X‐rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post‐treatment DIPS data were analyzed to determine if an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state‐of‐the‐art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secured user access and roles management, and graphing and analysis tools. The Model‐View‐Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client‐side technologies, such as jQuery, jQuery Plug‐ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logics, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process. PACS number: 87 PMID:26103504
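
    A minimal sketch of the kind of analysis described above (summarizing daily post-treatment correction vectors to decide whether adaptive replanning is needed); the decision rule and the 3 mm action level are illustrative assumptions, not the paper's clinical criteria:

    ```python
    # Sketch only: per-axis mean (systematic) and SD (random) of daily residual
    # shifts, flagging a patient when the mean exceeds an illustrative action level.
    import statistics
    from typing import Dict, List

    def needs_adaptive_plan(daily_shifts_mm: Dict[str, List[float]],
                            action_level_mm: float = 3.0) -> bool:
        for axis, shifts in daily_shifts_mm.items():
            mean_shift = statistics.mean(shifts)      # systematic component for this patient
            random_sd = statistics.stdev(shifts)      # day-to-day (random) component
            print(f"{axis}: mean={mean_shift:.1f} mm, sd={random_sd:.1f} mm")
            if abs(mean_shift) > action_level_mm:
                return True
        return False

    shifts = {"x": [1.2, 0.8, 1.5, 1.1], "y": [-3.4, -3.1, -3.8, -3.6], "z": [0.2, -0.1, 0.3, 0.0]}
    print(needs_adaptive_plan(shifts))   # True: y shows a persistent ~3.5 mm systematic shift
    ```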

  4. SU-E-T-544: A Radiation Oncology-Specific Multi-Institutional Federated Database: Initial Implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrickson, K; Phillips, M; Fishburn, M

    Purpose: To implement a common database structure and user-friendly web-browser based data collection tools across several medical institutions to better support evidence-based clinical decision making and comparative effectiveness research through shared outcomes data. Methods: A consortium of four academic medical centers agreed to implement a federated database, known as Oncospace. Initial implementation has addressed issues of differences between institutions in workflow and types and breadth of structured information captured. This requires coordination of data collection from departmental oncology information systems (OIS), treatment planning systems, and hospital electronic medical records in order to include as much as possible the multi-disciplinary clinical data associated with a patient's care. Results: The original database schema was well-designed and required only minor changes to meet institution-specific data requirements. Mobile browser interfaces for data entry and review for both the OIS and the Oncospace database were tailored for the workflow of individual institutions. Federation of database queries--the ultimate goal of the project--was tested using artificial patient data. The tests serve as proof-of-principle that the system as a whole--from data collection and entry to providing responses to research queries of the federated database--was viable. The resolution of inter-institutional use of patient data for research is still not completed. Conclusions: The migration from unstructured data mainly in the form of notes and documents to searchable, structured data is difficult. Making the transition requires cooperation of many groups within the department and can be greatly facilitated by using the structured data to improve clinical processes and workflow. The original database schema design is critical to providing enough flexibility for multi-institutional use to improve each institution's ability to study outcomes, determine best practices, and support research. The project has demonstrated the feasibility of deploying a federated database environment for research purposes to multiple institutions.
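
    A minimal sketch of federating a research query across institutions, in the spirit of the federation goal described above; the per-site query functions, schema, and aggregation are hypothetical stand-ins, not the Oncospace interfaces:

    ```python
    # Sketch only: fan a query out to each institution's local service and
    # aggregate de-identified summary counts at the coordinating site.
    from typing import Callable, Dict

    Query = Dict[str, object]
    SiteQueryFn = Callable[[Query], int]

    def federated_count(sites: Dict[str, SiteQueryFn], query: Query) -> Dict[str, int]:
        results = {name: run(query) for name, run in sites.items()}
        results["TOTAL"] = sum(results.values())
        return results

    # Stand-ins for per-institution query services operating on local data.
    sites = {
        "site_A": lambda q: 42,
        "site_B": lambda q: 17,
    }
    print(federated_count(sites, {"diagnosis": "head_and_neck", "min_dose_gy": 60}))
    ```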

  5. Research and Implementation of Key Technologies in Multi-Agent System to Support Distributed Workflow

    NASA Astrophysics Data System (ADS)

    Pan, Tianheng

    2018-01-01

    In recent years, the combination of workflow management systems and multi-agent technology has become an active research field. The lack of flexibility in workflow management systems can be mitigated by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which addresses the fragility of the traditional centralized workflow structure. In this paper, the agents of the distributed workflow management system are divided according to their functions, the execution process of each type of agent is analyzed, and key technologies such as process execution and resource management are examined.

  6. Quality improvement through implementation of discharge order reconciliation.

    PubMed

    Lu, Yun; Clifford, Pamela; Bjorneby, Andreas; Thompson, Bruce; VanNorman, Samuel; Won, Katie; Larsen, Kevin

    2013-05-01

    A coordinated multidisciplinary process to reduce medication errors related to patient discharges to skilled-nursing facilities (SNFs) is described. After determining that medication errors were a frequent cause of readmission among patients discharged to SNFs, a medical center launched a two-phase quality-improvement project focused on cardiac and medical patients. Phase one of the project entailed a three-month failure modes and effects analysis of existing discharge procedures, followed by the development and pilot testing of a multidisciplinary, closed-loop workflow process involving staff and resident physicians, clinical nurse coordinators, and clinical pharmacists. During pilot testing of the new workflow process, the rate of discharge medication errors involving SNF patients was tracked, and data on medication-related readmissions in a designated intervention group (n = 87) and a control group of patients (n = 1893) discharged to SNFs via standard procedures during a nine-month period were collected, with the data stratified using severity of illness (SOI) classification. Analysis of the collected data indicated a cumulative 30-day medication-related readmission rate for study group patients in the minor, moderate, and major SOI categories of 5.4% (4 of 74 patients), compared with a rate of 9.5% (169 of 1780 patients) in the control group. In phase 2 of the project, the revised SNF discharge medication reconciliation procedure was implemented throughout the hospital; since hospitalwide implementation of the new workflow, the readmission rate for SNF patients has been maintained at about 6.7%. Implementing a standardized discharge order reconciliation process that includes pharmacists led to decreased readmission rates and improved care for patients discharged to SNFs.

  7. Military Interoperable Digital Hospital Testbed (MIDHT)

    DTIC Science & Technology

    2015-12-01

    Research and analyze the resulting technological impact of the pharmacy robotics implementation on medication errors, pharmacist productivity, nurse satisfaction/workflow, and patient satisfaction.

  8. Case Report: Activity Diagrams for Integrating Electronic Prescribing Tools into Clinical Workflow

    PubMed Central

    Johnson, Kevin B.; FitzHenry, Fern

    2006-01-01

    To facilitate the future implementation of an electronic prescribing system, this case study modeled prescription management processes in various primary care settings. The Vanderbilt e-prescribing design team conducted initial interviews with clinic managers, physicians and nurses, and then represented the sequences of steps carried out to complete prescriptions in activity diagrams. The diagrams covered outpatient prescribing for patients during a clinic visit and between clinic visits. Practice size, practice setting, and practice specialty type influenced the prescribing processes used. The model developed may be useful to others engaged in building or tailoring an e-prescribing system to meet the specific workflows of various clinic settings. PMID:16622168

  9. A Model of Workflow Composition for Emergency Management

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    Commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. The software system for constructing and composing business process resources is implemented and integrated into the Emergency Plan Management Application System.
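
    A minimal sketch of composing reusable workflow segments; the abstract does not name the four operations, so the two operators shown here (sequence and choice) are assumed examples of commonly used composition operators, not the paper's formal abstraction:

    ```python
    # Sketch only: workflow segments as functions over a shared context, composed
    # into an emergency-plan workflow with sequence and choice operators.
    from typing import Callable, Dict

    Segment = Callable[[Dict], Dict]

    def sequence(*segments: Segment) -> Segment:
        def run(ctx: Dict) -> Dict:
            for seg in segments:
                ctx = seg(ctx)
            return ctx
        return run

    def choice(predicate: Callable[[Dict], bool],
               if_true: Segment, if_false: Segment) -> Segment:
        return lambda ctx: if_true(ctx) if predicate(ctx) else if_false(ctx)

    # Example segments for a flood-response plan.
    alert = lambda ctx: {**ctx, "alerted": True}
    evacuate = lambda ctx: {**ctx, "action": "evacuate"}
    monitor = lambda ctx: {**ctx, "action": "monitor"}

    plan = sequence(alert, choice(lambda c: c["water_level_m"] > 2.0, evacuate, monitor))
    print(plan({"water_level_m": 2.6}))   # alerted and instructed to evacuate
    ```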

  10. Toward the Design of Personalized Continuum Surgical Robots.

    PubMed

    Morimoto, Tania K; Greer, Joseph D; Hawkes, Elliot W; Hsieh, Michael H; Okamura, Allison M

    2018-05-31

    Robot-assisted minimally invasive surgical systems enable procedures with reduced pain, recovery time, and scarring compared to traditional surgery. While these improvements benefit a large number of patients, safe access to diseased sites is not always possible for specialized patient groups, including pediatric patients, due to their anatomical differences. We propose a patient-specific design paradigm that leverages the surgeon's expertise to design and fabricate robots based on preoperative medical images. The components of the patient-specific robot design process are a virtual reality design interface enabling the surgeon to design patient-specific tools, 3-D printing of these tools with a biodegradable polyester, and an actuation and control system for deployment. The designed robot is a concentric tube robot, a type of continuum robot constructed from precurved, elastic, nesting tubes. We demonstrate the overall patient-specific design workflow, from preoperative images to physical implementation, for an example clinical scenario: nonlinear renal access to a pediatric kidney. We also measure the system's behavior as it is deployed through real and artificial tissue. System integration and successful benchtop experiments in ex vivo liver and in a phantom patient model demonstrate the feasibility of using a patient-specific design workflow to plan, fabricate, and deploy personalized, flexible continuum robots.

  11. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled application of remote sensing quantitative retrieval. Workflow averts low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.
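
    A minimal sketch of the two checks named above (data source consistency and parameter matching) against a domain ontology; the ontology content below is an illustrative dictionary, not the paper's Protégé model:

    ```python
    # Sketch only: validate that a workflow node's data source and parameter types
    # match what the domain ontology expects for that retrieval model.
    from typing import Dict, List

    ONTOLOGY = {   # model name -> expected data source and parameter types
        "aerosol_optical_depth": {"source": "MODIS_L1B", "params": {"wavelength_nm": float}},
    }

    def validate_node(model: str, source: str, params: Dict[str, object]) -> List[str]:
        errors: List[str] = []
        spec = ONTOLOGY.get(model)
        if spec is None:
            return [f"unknown model: {model}"]
        if source != spec["source"]:
            errors.append(f"data source mismatch: expected {spec['source']}, got {source}")
        for name, expected_type in spec["params"].items():
            if name not in params:
                errors.append(f"missing parameter: {name}")
            elif not isinstance(params[name], expected_type):
                errors.append(f"parameter {name} should be {expected_type.__name__}")
        return errors

    print(validate_node("aerosol_optical_depth", "AVHRR", {"wavelength_nm": "550"}))
    ```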

  12. Implementation of a departmental picture archiving and communication system: a productivity and cost analysis.

    PubMed

    Macyszyn, Luke; Lega, Brad; Bohman, Leif-Erik; Latefi, Ahmad; Smith, Michelle J; Malhotra, Neil R; Welch, William; Grady, Sean M

    2013-09-01

    Digital radiology enhances productivity and results in long-term cost savings. However, the viewing, storage, and sharing of outside imaging studies on compact discs at ambulatory offices and hospitals pose a number of unique challenges to a surgeon's efficiency and clinical workflow. To improve the efficiency and clinical workflow of an academic neurosurgical practice when evaluating patients with outside radiological studies. Open-source software and commercial hardware were used to design and implement a departmental picture archiving and communications system (PACS). The implementation of a departmental PACS system significantly improved productivity and enhanced collaboration in a variety of clinical settings. Using published data on the rate of information technology problems associated with outside studies on compact discs, this system produced cost savings ranging from $6,250 to $33,600 and from $43,200 to $72,000 for 2 cohorts, urgent transfer and spine clinic patients, respectively, therefore justifying the costs of the system in less than a year. The implementation of a departmental PACS system using open-source software is straightforward and cost-effective and results in significant gains in surgeon productivity when evaluating patients with outside imaging studies.

  13. Applying lean principles to continuous renal replacement therapy processes.

    PubMed

    Benfield, C Brett; Brummond, Philip; Lucarotti, Andrew; Villarreal, Maria; Goodwin, Adam; Wonnacott, Rob; Talley, Cheryl; Heung, Michael

    2015-02-01

    The application of lean principles to continuous renal replacement therapy (CRRT) processes in an academic medical center is described. A manual audit over six consecutive weeks revealed that 133 5-L bags of CRRT solution were discarded after being dispensed from pharmacy but before clinical use. Lean principles were used to examine the workflow for CRRT preparation and develop and implement an intervention. An educational program was developed to encourage and enhance direct communication between nursing and pharmacy about changes in a patient's condition or CRRT order. It was through this education program that the reordering workflow shifted from nurses to pharmacy technicians. The primary outcome was the number of CRRT solution bags delivered in the preintervention and postintervention periods. Nurses and pharmacy technicians were surveyed to determine their satisfaction with the workflow change. After implementation of lean principles, the mean number of CRRT solution bags dispensed per day of CRRT decreased substantially. Respondents' overall satisfaction with the CRRT solution preparation process increased during the postintervention period, and satisfaction scores for each individual component of the workflow also improved after implementation of lean principles. The decreased solution waste resulted in projected annual cost savings exceeding $70,000 in product alone. The use of lean principles to identify medication waste in the CRRT workflow and implementation of an intervention to shift the workload from intensive care unit nurses to pharmacy technicians led to reduced CRRT solution waste, improved efficiency of CRRT workflow, and increased satisfaction among staff. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  14. Maximum unbiased validation (MUV) data sets for virtual screening based on PubChem bioactivity data.

    PubMed

    Rohrer, Sebastian G; Baumann, Knut

    2009-02-01

    Refined nearest neighbor analysis was recently introduced for the analysis of virtual screening benchmark data sets. It constitutes a technique from the field of spatial statistics and provides a mathematical framework for the nonparametric analysis of mapped point patterns. Here, refined nearest neighbor analysis is used to design benchmark data sets for virtual screening based on PubChem bioactivity data. A workflow is devised that purges data sets of compounds active against pharmaceutically relevant targets from unselective hits. Topological optimization using experimental design strategies monitored by refined nearest neighbor analysis functions is applied to generate corresponding data sets of actives and decoys that are unbiased with regard to analogue bias and artificial enrichment. These data sets provide a tool for Maximum Unbiased Validation (MUV) of virtual screening methods. The data sets and a software package implementing the MUV design workflow are freely available at http://www.pharmchem.tu-bs.de/lehre/baumann/MUV.html.
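
    A simplified stand-in for the nearest-neighbor analysis mentioned above (not the MUV design workflow itself): comparing how close actives are to other actives versus to decoys in a toy descriptor space, the kind of spread/separation statistic that analogue bias and artificial enrichment checks rest on:

    ```python
    # Sketch only: mean nearest-neighbor distances of actives to actives
    # (self-clumping) and of actives to decoys (separation).
    import math
    from typing import List, Sequence

    def nearest_distance(point: Sequence[float], others: List[Sequence[float]]) -> float:
        return min(math.dist(point, o) for o in others)

    def mean_nn_distance(actives: List[Sequence[float]],
                         references: List[Sequence[float]]) -> float:
        return sum(nearest_distance(a, [r for r in references if r is not a])
                   for a in actives) / len(actives)

    actives = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8)]
    decoys  = [(0.5, 0.5), (0.4, 0.6), (0.7, 0.3)]
    print("active-active:", round(mean_nn_distance(actives, actives), 3))
    print("active-decoy :", round(mean_nn_distance(actives, decoys), 3))
    ```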

  15. Leaf LIMS: A Flexible Laboratory Information Management System with a Synthetic Biology Focus.

    PubMed

    Craig, Thomas; Holland, Richard; D'Amore, Rosalinda; Johnson, James R; McCue, Hannah V; West, Anthony; Zulkower, Valentin; Tekotte, Hille; Cai, Yizhi; Swan, Daniel; Davey, Robert P; Hertz-Fowler, Christiane; Hall, Anthony; Caddick, Mark

    2017-12-15

    This paper presents Leaf LIMS, a flexible laboratory information management system (LIMS) designed to address the complexity of synthetic biology workflows. At the project's inception, no LIMS designed specifically for synthetic biology processes was available, with most systems focused on either next generation sequencing or biobanks and clinical sample handling. Leaf LIMS implements integrated project, item, and laboratory stock tracking, offering complete sample and construct genealogy, materials and lot tracking, and modular assay data capture. Hence, it enables highly configurable task-based workflows and supports data capture from project inception to completion. As such, in addition to supporting synthetic biology, it is well suited to many laboratory environments with multiple projects and users. The system is deployed as a web application through Docker and is provided under a permissive MIT license. It is freely available for download at https://leaflims.github.io .

  16. Clinical implementation of 3D printing in the construction of patient specific bolus for electron beam radiotherapy for non-melanoma skin cancer.

    PubMed

    Canters, Richard A; Lips, Irene M; Wendling, Markus; Kusters, Martijn; van Zeeland, Marianne; Gerritsen, Rianne M; Poortmans, Philip; Verhoef, Cornelia G

    2016-10-01

    Creating an individualized tissue equivalent material build-up (i.e. bolus) for electron beam radiation therapy is complex and highly labour-intensive. We implemented a new clinical workflow in which 3D printing technology is used to create the bolus. A patient-specific bolus is designed in the treatment planning system (TPS) and a shell around it is created in the TPS. The shell is printed and subsequently filled with silicone rubber to make the bolus. Before clinical implementation we performed a planning study with 11 patients to evaluate the difference in tumour coverage between the designed 3D-print bolus and the clinically delivered plan with manually created bolus. For the first 15 clinical patients a second CT scan with the 3D-print bolus was performed to verify the geometrical accuracy. The planning study showed that the V85% of the CTV was on average 97% (3D-print) vs 88% (conventional). Geometric comparison of the 3D-print bolus to the originally contoured bolus showed a high similarity (DSC=0.89). The dose distributions on the second CT scan with the 3D print bolus in position showed only small differences in comparison to the original planning CT scan. The implemented workflow is feasible, patient friendly, safe, and results in high quality dose distributions. This new technique increases time efficiency. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
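
    A minimal sketch of the Dice similarity coefficient (DSC) used above to compare the printed bolus geometry against the originally contoured bolus, computed here on small illustrative binary masks rather than clinical data:

    ```python
    # Sketch only: DSC = 2|A ∩ B| / (|A| + |B|) for boolean volumes or slices.
    import numpy as np

    def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    a = np.zeros((20, 20), dtype=bool); a[5:15, 5:15] = True   # contoured bolus (one slice)
    b = np.zeros((20, 20), dtype=bool); b[6:16, 5:15] = True   # printed bolus, shifted 1 voxel
    print(round(dice(a, b), 2))   # 0.9, comparable in spirit to the DSC = 0.89 reported above
    ```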

  17. Streamlining the Capstone Process: A Time-Saving Approval System for Graduate Theses/Projects

    ERIC Educational Resources Information Center

    Grooms, James; Kline, Douglas; Cummings, Jeffrey

    2016-01-01

    Capstones have become an integral part of many information systems programs, both at the undergraduate and graduate level. One of the challenges can be tracking the process from the start of the capstone to completion. This paper describes the analysis, design and implementation of a web application for the approval workflow of a master's program…

  18. A standard-enabled workflow for synthetic biology.

    PubMed

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce various types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  19. Design and implementation of an automated compound management system in support of lead optimization.

    PubMed

    Quintero, Catherine; Kariv, Ilona

    2009-06-01

    To meet the needs of the increasingly rapid and parallelized lead optimization process, a fully integrated local compound storage and liquid handling system was designed and implemented to automate the generation of assay-ready plates directly from newly submitted and cherry-picked compounds. A key feature of the system is the ability to create project- or assay-specific compound-handling methods, which provide flexibility for any combination of plate types, layouts, and plate bar-codes. Project-specific workflows can be created by linking methods for processing new and cherry-picked compounds and control additions to produce a complete compound set for both biological testing and local storage in one uninterrupted workflow. A flexible cherry-pick approach allows for multiple, user-defined strategies to select the most appropriate replicate of a compound for retesting. Examples of custom selection parameters include available volume, compound batch, and number of freeze/thaw cycles. This adaptable and integrated combination of software and hardware provides a basis for reducing cycle time, fully automating compound processing, and ultimately increasing the rate at which accurate, biologically relevant results can be produced for compounds of interest in the lead optimization process.
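
    A minimal sketch of the flexible cherry-pick idea described above (selecting the most appropriate replicate of a compound by user-defined criteria); the field names and thresholds are illustrative assumptions, not the production system's schema:

    ```python
    # Sketch only: pick a replicate with sufficient volume, preferring the fewest
    # freeze/thaw cycles and breaking ties by largest remaining volume.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Replicate:
        compound_id: str
        batch: str
        available_volume_ul: float
        freeze_thaw_cycles: int

    def pick_replicate(replicates: List[Replicate], min_volume_ul: float = 20.0) -> Replicate:
        candidates = [r for r in replicates if r.available_volume_ul >= min_volume_ul]
        if not candidates:
            raise ValueError("no replicate with sufficient volume")
        return min(candidates, key=lambda r: (r.freeze_thaw_cycles, -r.available_volume_ul))

    stock = [Replicate("CPD-1", "B1", 35.0, 4), Replicate("CPD-1", "B2", 120.0, 1),
             Replicate("CPD-1", "B3", 15.0, 0)]
    print(pick_replicate(stock).batch)   # B2: enough volume and only one freeze/thaw cycle
    ```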

  20. Leveraging workflow control patterns in the domain of clinical practice guidelines.

    PubMed

    Kaiser, Katharina; Marcos, Mar

    2016-02-10

    Clinical practice guidelines (CPGs) include recommendations describing appropriate care for the management of patients with a specific clinical condition. A number of representation languages have been developed to support executable CPGs, with associated authoring/editing tools. Even with tool assistance, authoring of CPG models is a labor-intensive task. We aim at facilitating the early stages of the CPG modeling task. In this context, we propose to support the authoring of CPG models based on a set of suitable procedural patterns described in an implementation-independent notation that can be then semi-automatically transformed into one of the alternative executable CPG languages. We have started with the workflow control patterns which have been identified in the fields of workflow systems and business process management. We have analyzed the suitability of these patterns by means of a qualitative analysis of CPG texts. Following our analysis we have implemented a selection of workflow patterns in the Asbru and PROforma CPG languages. As implementation-independent notation for the description of patterns we have chosen BPMN 2.0. Finally, we have developed XSLT transformations to convert the BPMN 2.0 version of the patterns into the Asbru and PROforma languages. We showed that although a significant number of workflow control patterns are suitable to describe CPG procedural knowledge, not all of them are applicable in the context of CPGs due to their focus on single-patient care. Moreover, CPGs may require additional patterns not included in the set of workflow control patterns. We also showed that nearly all the CPG-suitable patterns can be conveniently implemented in the Asbru and PROforma languages. Finally, we demonstrated that individual patterns can be semi-automatically transformed from a process specification in BPMN 2.0 to executable implementations in these languages. We propose a pattern and transformation-based approach for the development of CPG models. Such an approach can form the basis of a valid framework for the authoring of CPG models. The identification of adequate patterns and the implementation of transformations to convert patterns from a process specification into different executable implementations are the first necessary steps for our approach.
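
    A minimal sketch of applying an XSLT transformation with the lxml library, the general mechanism behind the BPMN-to-executable conversion described above; the stylesheet and element names are toy placeholders, not the paper's BPMN 2.0, Asbru, or PROforma schemas:

    ```python
    # Sketch only: transform a toy "pattern" document into a toy "plan" document.
    from lxml import etree

    XSLT_SRC = b"""<xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/pattern">
        <plan><xsl:for-each select="task">
          <step name="{@name}"/>
        </xsl:for-each></plan>
      </xsl:template>
    </xsl:stylesheet>"""

    PATTERN_SRC = b"""<pattern><task name="assess"/><task name="treat"/></pattern>"""

    transform = etree.XSLT(etree.fromstring(XSLT_SRC))
    result = transform(etree.fromstring(PATTERN_SRC))
    print(etree.tostring(result, pretty_print=True).decode())
    ```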

  1. Workflows for Full Waveform Inversions

    NASA Astrophysics Data System (ADS)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.

  2. Preclinical Feasibility of a Technology Framework for MRI-guided Iliac Angioplasty

    PubMed Central

    Rube, Martin A.; Fernandez-Gutierrez, Fabiola; Cox, Benjamin F.; Holbrook, Andrew B.; Houston, J. Graeme; White, Richard D.; McLeod, Helen; Fatahi, Mahsa; Melzer, Andreas

    2015-01-01

    Purpose Interventional MRI has significant potential for image guidance of iliac angioplasty and related vascular procedures. A technology framework with in-room image display, control, communication and MRI-guided intervention techniques was designed and tested for its potential to provide safe, fast and efficient MRI-guided angioplasty of the iliac arteries. Methods A 1.5T MRI scanner was adapted for interactive imaging during endovascular procedures using new or modified interventional devices such as guidewires and catheters. A perfused vascular phantom was used for testing. Pre-, intra- and post-procedural visualization and measurement of vascular morphology and flow was implemented. A detailed analysis of X-Ray fluoroscopic angiography workflow was conducted and applied. Two interventional radiologists and one physician in training performed 39 procedures. All procedures were timed and analyzed. Results MRI-guided iliac angioplasty procedures were successfully performed with progressive adaptation of techniques and workflow. The workflow, setup and protocol enabled a reduction in table time for a dedicated MRI-guided procedure to 6 min 33 s with a mean procedure time of 9 min 2 s, comparable to the mean procedure time of 8 min 42 s for the standard X-Ray guided procedure. Conclusions MRI-guided iliac vascular interventions were found to be feasible and practical using this framework and optimized workflow. In particular the real-time flow analysis was found to be helpful for pre- and post-interventional assessments. Design optimization of the catheters and in vivo experiments are required before clinical evaluation. PMID:25102933

  3. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  4. Application of Six Sigma methodology to a diagnostic imaging process.

    PubMed

    Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M

    2012-01-01

    This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the design, measure, analyse, improve and control (DMAIC) improvement cycle, workflow chart, fishbone diagrams and Pareto charts were employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure, mode and effect analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is the impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis. This enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. Having eliminated repeat examinations, the risk of being exposed to more radiation was also minimised. This paper supports the need to apply Six Sigma and present an evaluation of the process in an imaging department.
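
    A minimal sketch of the sigma-level arithmetic behind the improvement reported above (3.5 to 4.2); the defect counts below are illustrative only, since the abstract reports only the resulting sigma levels, and the conventional 1.5-sigma shift is assumed:

    ```python
    # Sketch only: convert a defect rate (defects per million opportunities, DPMO)
    # into a sigma level using the conventional 1.5-sigma shift.
    from statistics import NormalDist

    def sigma_level(defects: int, opportunities: int) -> float:
        dpmo = 1e6 * defects / opportunities
        return NormalDist().inv_cdf(1 - dpmo / 1e6) + 1.5

    print(round(sigma_level(2275, 100_000), 1))   # ~3.5 (illustrative pre-improvement rate)
    print(round(sigma_level(347, 100_000), 1))    # ~4.2 (illustrative post-improvement rate)
    ```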

  5. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  6. Workflow Dynamics and the Imaging Value Chain: Quantifying the Effect of Designating a Nonimage-Interpretive Task Workflow.

    PubMed

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J

    To assess the impact of separate non-image interpretive task (NIT) and image-interpretive task (IIT) workflows in an academic neuroradiology practice. A prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording frequency and duration of tasks performed. Tasks were categorized into separate image interpretive and non-image interpretive workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant (CA) responsible for non-image interpretive tasks. Pre- and post-intervention data were compared. Following separation of image-interpretive and non-image interpretive workflows, time spent on image-interpretive tasks by the primary fellow increased from 53.8% to 73.2% while non-image interpretive tasks decreased from 20.4% to 4.4%. Mean time duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific non-image interpretive tasks, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The consult assistant experienced 29.4 task switching events (TSEs) per hour. Rates of specific non-image interpretive tasks for the CA were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into NIT and IIT workflows substantially increased image interpretation time and decreased TSEs for the primary fellow. Consolidation of NITs into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Workflow computing. Improving management and efficiency of pathology diagnostic services.

    PubMed

    Buffone, G J; Moreau, D; Beck, J R

    1996-04-01

    Traditionally, information technology in health care has helped practitioners to collect, store, and present information and also to add a degree of automation to simple tasks (instrument interfaces supporting result entry, for example). Thus commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health-care and the inherent advantages of the technology, and it defines workflow system requirements for use in health-care delivery with special reference to diagnostic pathology.

  8. Disruption of Radiologist Workflow.

    PubMed

    Kansagra, Akash P; Liu, Kevin; Yu, John-Paul J

    2016-01-01

    The effect of disruptions has been studied extensively in surgery and emergency medicine, and a number of solutions-such as preoperative checklists-have been implemented to enforce the integrity of critical safety-related workflows. Disruptions of the highly complex and cognitively demanding workflow of modern clinical radiology have only recently attracted attention as a potential safety hazard. In this article, we describe the variety of disruptions that arise in the reading room environment, review approaches that other specialties have taken to mitigate workflow disruption, and suggest possible solutions for workflow improvement in radiology. Copyright © 2015 Mosby, Inc. All rights reserved.

  9. Semantics-enabled service discovery framework in the SIMDAT pharma grid.

    PubMed

    Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert

    2008-03-01

    We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.
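
    A minimal sketch of subsumption-based matchmaking in the spirit of the semantic matchmaker described above; the tiny class hierarchy is a toy stand-in for the OWL-DL domain ontology, and the matching rule shown is only the simple exact-or-plugin case:

    ```python
    # Sketch only: an advertised service matches a request if its output class
    # equals or specializes the requested class in the ontology's hierarchy.
    from typing import Dict, Optional

    SUBCLASS_OF: Dict[str, Optional[str]] = {      # child -> parent
        "Sequence": None,
        "ProteinSequence": "Sequence",
        "DNASequence": "Sequence",
    }

    def is_subclass(cls: Optional[str], ancestor: str) -> bool:
        while cls is not None:
            if cls == ancestor:
                return True
            cls = SUBCLASS_OF.get(cls)
        return False

    def match(advertised_output: str, requested_output: str) -> bool:
        return is_subclass(advertised_output, requested_output)

    print(match("ProteinSequence", "Sequence"))   # True: exact-or-plugin match
    print(match("Sequence", "DNASequence"))       # False: too general for the request
    ```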

  10. Digital pathology in clinical use: where are we now and what is holding us back?

    PubMed

    Griffin, Jon; Treanor, Darren

    2017-01-01

    Whole slide imaging is being used increasingly in research applications and in frozen section, consultation and external quality assurance practice. Digital pathology, when integrated with other digital tools such as barcoding, specimen tracking and digital dictation, can be integrated into the histopathology workflow, from specimen accession to report sign-out. These elements can bring about improvements in the safety, quality and efficiency of a histopathology department. The present paper reviews the evidence for these benefits. We then discuss the challenges of implementing a fully digital pathology workflow, including the regulatory environment, validation of whole slide imaging and the evidence for the design of a digital pathology workstation. © 2016 John Wiley & Sons Ltd.

  11. Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline

    PubMed Central

    Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur

    2010-01-01

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408

  12. An Intelligent Automation Platform for Rapid Bioprocess Design.

    PubMed

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
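
    As a rough illustration of the integration the abstract describes (analysis, experiment design, and script generation coordinated by communicating agents), the sketch below wires two toy agents together with message queues. It is not the authors' platform; the agent behaviour, threshold, and generated script format are assumptions.

        # Illustrative sketch only: a minimal multi-agent loop in which an analysis
        # agent summarises results and a design agent proposes the next experiment.
        from queue import Queue
        from statistics import mean

        def analysis_agent(inbox: Queue, outbox: Queue) -> None:
            results = inbox.get()                     # raw yields from the robot
            outbox.put({"mean_yield": mean(results)})

        def design_agent(inbox: Queue, outbox: Queue) -> None:
            summary = inbox.get()
            # Naive rule standing in for model-based experiment design.
            next_conc = 0.5 if summary["mean_yield"] < 0.8 else 0.25
            outbox.put(f"dispense --reagent lysozyme --conc {next_conc}")  # robot script

        if __name__ == "__main__":
            raw, summary, script = Queue(), Queue(), Queue()
            raw.put([0.71, 0.74, 0.69])
            analysis_agent(raw, summary)
            design_agent(summary, script)
            print(script.get())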

  13. An Intelligent Automation Platform for Rapid Bioprocess Design

    PubMed Central

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  14. Integrating Electronic Health Record Competencies into Undergraduate Health Informatics Education.

    PubMed

    Borycki, Elizabeth M; Griffith, Janessa; Kushniruk, Andre W

    2016-01-01

    In this paper we report on our findings arising from a qualitative, interview study of students' experiences in an undergraduate health informatics program. Our findings suggest that electronic health record competencies need to be integrated into an undergraduate curriculum. Participants suggested that there is a need to educate students about the use of the EHR, followed by best practices around interface design, workflow, and implementation with this work culminating in students spearheading the design of the technology as part of their educational program of study.

  15. Design and implementation of an internet-based electrical engineering laboratory.

    PubMed

    He, Zhenlei; Shen, Zhangbiao; Zhu, Shanan

    2014-09-01

    This paper describes an internet-based electrical engineering laboratory (IEE-Lab) with virtual and physical experiments at Zhejiang University. In order to combine the advantages of both experiment styles, the IEE-Lab is built on a Client/Server/Application framework and integrates virtual and physical experiments. The design and workflow of the IEE-Lab are introduced. An analog electronics experiment is taken as an example to show the Flex plug-in design, data communication based on XML (Extensible Markup Language), experiment simulation modeled in Modelica, and the design of the control terminals. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  16. The ATLAS Public Web Pages: Online Management of HEP External Communication Content

    NASA Astrophysics Data System (ADS)

    Goldfarb, S.; Marcelloni, C.; Eli Phoboo, A.; Shaw, K.

    2015-12-01

    The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal [1] content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves application of the HTML design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through clearly presented themes, consistent audience targeting and messaging, and the enforcement of a well-defined visual identity.

  17. Exploring the impact of an automated prescription-filling device on community pharmacy technician workflow.

    PubMed

    Walsh, Kristin E; Chui, Michelle Anne; Kieser, Mara A; Williams, Staci M; Sutter, Susan L; Sutter, John G

    2011-01-01

    To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign.

  18. Engaging primary care patients to use a patient-centered personal health record.

    PubMed

    Krist, Alex H; Woolf, Steven H; Bello, Ghalib A; Sabo, Roy T; Longo, Daniel R; Kashiri, Paulette; Etz, Rebecca S; Loomis, John; Rothemich, Stephen F; Peele, J Eric; Cohn, Jeffrey

    2014-01-01

    Health care leaders encourage clinicians to offer portals that enable patients to access personal health records, but implementation has been a challenge. Although large integrated health systems have promoted use through costly advertising campaigns, other implementation methods are needed for small to medium-sized practices where most patients receive their care. We conducted a mixed methods assessment of a proactive implementation strategy for a patient portal (an interactive preventive health record [IPHR]) offered by 8 primary care practices. The practices implemented a series of learning collaboratives with practice champions and redesigned workflow to integrate portal use into care. Practice implementation strategies, portal use, and factors influencing use were assessed prospectively. A proactive and customized implementation strategy designed by practices resulted in 25.6% of patients using the IPHR, with the rate increasing 1.0% per month over 31 months. Fully 23.5% of IPHR users signed up within 1 day of their office visit. Older patients and patients with comorbidities were more likely to use the IPHR, but blacks and Hispanics were less likely. Older age diminished as a factor after adjusting for comorbidities. Implementation by practice varied considerably (from 22.1% to 27.9%, P <.001) based on clinician characteristics and workflow innovations adopted by practices to enhance uptake. By directly engaging patients to use a portal and supporting practices to integrate use into care, primary care practices can match or potentially surpass the usage rates achieved by large health systems. © 2014 Annals of Family Medicine, Inc.

  19. Communication spaces

    PubMed Central

    Coiera, Enrico

    2014-01-01

    Background and objective Annotations to physical workspaces such as signs and notes are ubiquitous. When densely annotated, work areas become communication spaces. This study aims to characterize the types and purpose of such annotations. Methods A qualitative observational study was undertaken in two wards and the radiology department of a 440-bed metropolitan teaching hospital. Images were purposefully sampled; 39 were analyzed after excluding inferior images. Results Annotation functions included signaling identity, location, capability, status, availability, and operation. They encoded data, rules or procedural descriptions. Most aggregated into groups that either created a workflow by referencing each other, supported a common workflow without reference to each other, or were heterogeneous, referring to many workflows. Higher-level assemblies of such groupings were also observed. Discussion Annotations make visible the gap between work done and the capability of a space to support work. Annotations are repairs of an environment, improving fitness for purpose, fixing inadequacy in design, or meeting emergent needs. Annotations thus record the missing information needed to undertake tasks, typically added post-implementation. Measuring annotation levels post-implementation could help assess the fit of technology to task. Physical and digital spaces could meet broader user needs by formally supporting user customization, ‘programming through annotation’. Augmented reality systems could also directly support annotation, addressing existing information gaps, and enhancing work with context-sensitive annotation. Conclusions Communication spaces offer a model of how work unfolds. Annotations make visible local adaptation that makes technology fit for purpose post-implementation and suggest an important role for annotatable information systems and digital augmentation of the physical environment. PMID:24005797

  20. Utilization of building information modeling in infrastructure’s design and construction

    NASA Astrophysics Data System (ADS)

    Zak, Josef; Macadam, Helen

    2017-09-01

    Building Information Modeling (BIM) is a concept that has gained its place in the design, construction and maintenance of buildings in the Czech Republic during recent years. This paper describes the usage, applications, and potential benefits and disadvantages connected with implementing BIM principles in the preparation and construction of infrastructure projects. Part of the paper describes the status of BIM implementation in the Czech Republic, and there is a review of several virtual design and construction practices in the Czech Republic. Examples of best practice are presented from current infrastructure projects. The paper further summarizes experiences with new technologies gained from the application of BIM-related workflows. The focus is on the BIM model utilization for machine control systems on site, quality assurance, quality management and construction management.

  1. Digital transformation in home care. A case study.

    PubMed

    Bennis, Sandy; Costanzo, Diane; Flynn, Ann Marie; Reidy, Agatha; Tronni, Catherine

    2007-01-01

    Simply implementing software and technology does not assure that an organization's targeted clinical and financial goals will be realized. No longer is it possible to roll out a new system--by solely providing end user training and overlaying it on top of already inefficient workflows and outdated roles--and know with certainty that targets will be met. At Virtua Health's Home Care, based in south New Jersey, implementation of their electronic system initially followed this more traditional approach. Unable to completely attain their earlier identified return on investment, they enlisted the help of a new role within their health system, that of the nurse informaticist. Knowledgeable in complex clinical processes and not bound by the technology at hand, the informaticist analyzed physical workflow, digital workflow, roles and physical layout. Leveraging specific tools such as change acceleration, workouts and LEAN, the informaticist was able to redesign workflow and support new levels of functionality. This article provides a view from the "finish line", recounting how this role worked with home care to assimilate information delivery into more efficient processes and align resources to support the new workflow, ultimately achieving real tangible returns.

  2. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud

    PubMed Central

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation. PMID:26501966

  3. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    PubMed

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation.

  4. Essential attributes identified in the design of a Laboratory Information Management System for a high throughput siRNA screening laboratory.

    PubMed

    Grandjean, Geoffrey; Graham, Ryan; Bartholomeusz, Geoffrey

    2011-11-01

    In recent years high throughput screening operations have become a critical application in functional and translational research. Although a seemingly unmanageable amount of data is generated by these high-throughput, large-scale techniques, through careful planning an effective Laboratory Information Management System (LIMS) can be developed and implemented in order to streamline all phases of a workflow. Just as important as data mining and analysis procedures at the end of complex processes is the tracking of the individual steps of the applications that generate such data. Ultimately, the use of a customized LIMS will enable users to extract meaningful results from large datasets while trusting the robustness of their assays. To illustrate the design of a custom LIMS, this practical example highlights the important aspects of designing a LIMS to effectively manage all aspects of an siRNA screening service. This system incorporates inventory management, control of workflow, data handling and interaction with investigators, statisticians and administrators. All these modules are regulated in a synchronous manner within the LIMS. © 2011 Bentham Science Publishers.

  5. Computer imaging and workflow systems in the business office.

    PubMed

    Adams, W T; Veale, F H; Helmick, P M

    1999-05-01

    Computer imaging and workflow technology automates many business processes that currently are performed using paper processes. Documents are scanned into the imaging system and placed in electronic patient account folders. Authorized users throughout the organization, including preadmission, verification, admission, billing, cash posting, customer service, and financial counseling staff, have online access to the information they need when they need it. Such streamlining of business functions can increase collections and customer satisfaction while reducing labor, supply, and storage costs. Because the costs of a comprehensive computer imaging and workflow system can be considerable, healthcare organizations should consider implementing parts of such systems that can be cost-justified or include implementation as part of a larger strategic technology initiative.

  6. Beyond usability: designing effective technology implementation systems to promote patient safety.

    PubMed

    Karsh, B-T

    2004-10-01

    Evidence is emerging that certain technologies such as computerized provider order entry may reduce the likelihood of patient harm. However, many technologies that should reduce medical errors have been abandoned because of problems with their design, their impact on workflow, and general dissatisfaction with them by end users. Patient safety researchers have therefore looked to human factors engineering for guidance on how to design technologies to be usable (easy to use) and useful (improving job performance, efficiency, and/or quality). While this is a necessary step towards improving the likelihood of end user satisfaction, it is still not sufficient. Human factors engineering research has shown that the manner in which technologies are implemented also needs to be designed carefully if benefits are to be realized. This paper reviews the theoretical knowledge on what leads to successful technology implementation and how this can be translated into specifically designed processes for successful technology change. The literature on diffusion of innovations, technology acceptance, organisational justice, participative decision making, and organisational change is reviewed and strategies for promoting successful implementation are provided. Given the rapid and ever increasing pace of technology implementation in health care, it is critical for the science of technology implementation to be understood and incorporated into efforts to improve patient safety.

  7. Development of a user customizable imaging informatics-based intelligent workflow engine system to enhance rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Martinez, Clarisa; Wang, Jing; Liu, Ye; Liu, Brent

    2014-03-01

    Clinical trials usually need to collect, track and analyze multimedia data according to a defined workflow. Currently, clinical trial data management requirements are normally addressed with custom-built systems, and workflow design differs across trials. A traditional pre-defined custom-built system is usually limited to a specific clinical trial and normally requires time-consuming and resource-intensive software development. To provide a solution, we present a user-customizable, imaging informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials. The intelligent workflow engine provides flexibility in building and tailoring the workflow in various stages of clinical trials. By providing a solution to tailor and automate the workflow, the system will save time and reduce errors for clinical trials. Although our system is designed for rehabilitation clinical trials, it may be extended to other imaging-based clinical trials as well.

  8. Design of admission medication reconciliation technology: a human factors approach to requirements and prototyping.

    PubMed

    Lesselroth, Blake J; Adams, Kathleen; Tallett, Stephanie; Wood, Scott D; Keeling, Amy; Cheng, Karen; Church, Victoria L; Felder, Robert; Tran, Hanna

    2013-01-01

    Our objectives were to (1) develop an in-depth understanding of the workflow and information flow in medication reconciliation, and (2) design medication reconciliation support technology using a combination of rapid-cycle prototyping and human-centered design. Although medication reconciliation is a national patient safety goal, limitations both of physical environment and in workflow can make it challenging to implement durable systems. We used several human factors techniques to gather requirements and develop a new process to collect a medication history at hospital admission. We completed an ethnography and time and motion analysis of pharmacists in order to illustrate the processes used to reconcile medications. We then used the requirements to design prototype multimedia software for collecting a bedside medication history. We observed how pharmacists incorporated the technology into their physical environment and documented usability issues. Admissions occurred in three phases: (1) list compilation, (2) order processing, and (3) team coordination. Current medication reconciliation processes at the hospital average 19 minutes to complete and do not include a bedside interview. Use of our technology during a bedside interview required an average of 29 minutes. The software represents a viable proof-of-concept to automate parts of history collection and enhance patient communication. However, we discovered several usability issues that require attention. We designed a patient-centered technology to enhance how clinicians collect a patient's medication history. By using multiple human factors methods, our research team identified system themes and design constraints that influence the quality of the medication reconciliation process and implementation effectiveness of new technology. Keywords: evidence-based design, human factors, patient-centered care, safety, technology.

  9. Potential of knowledge discovery using workflows implemented in the C3Grid

    NASA Astrophysics Data System (ADS)

    Engel, Thomas; Fink, Andreas; Ulbrich, Uwe; Schartner, Thomas; Dobler, Andreas; Fritzsch, Bernadette; Hiller, Wolfgang; Bräuer, Benny

    2013-04-01

    With the increasing number of climate simulations, reanalyses and observations, new infrastructures to search and analyse distributed data are necessary. In recent years, the Grid architecture has become an important technology to fulfill these demands. For the German project "Collaborative Climate Community Data and Processing Grid" (C3Grid), computer scientists and meteorologists developed a system that offers its users a web interface to search and download climate data and to apply implemented analysis tools (called workflows) for further investigation. In this contribution, two workflows implemented in the C3Grid architecture are presented: the Cyclone Tracking (CT) and Stormtrack workflows. They serve as examples of how to perform numerous investigations of midlatitude winter storms on large amounts of analysis and climate model data without insight into the data source or program code, and with only a low-to-moderate understanding of the theoretical background. CT is based on the work of Murray and Simmonds (1991) and identifies and tracks local minima in the mean sea level pressure (MSLP) field of the selected dataset. Adjustable thresholds for the curvature of the isobars as well as the minimum lifetime of a cyclone allow weak subtropical heat low systems to be distinguished from stronger midlatitude cyclones, e.g., in the North Atlantic. The user receives the resulting track data, including statistics on track density, average central pressure, average central curvature, cyclogenesis and cyclolysis, as well as pre-built visualizations of these results. Stormtrack calculates the 2.5-6 day bandpass-filtered standard deviation of the geopotential height on a selected pressure level. Although this workflow needs much less computational effort than CT, it shows structures that are in good agreement with the track density of the CT workflow; comparing the two indicates to what extent changes in the mid-level tropospheric storm track are reflected in changes in the density and intensity of surface cyclones. A specific feature of C3Grid is the flexible Workflow Scheduling Service (WSS), which also allows automated nightly analysis runs of CT, Stormtrack, etc. with different input parameter sets. The statistical results of these workflows can afterwards be accumulated by a scheduled final analysis step, thereby providing a tool for data-intensive analytics on the massive amounts of climate model data accessible through C3Grid. First tests with these automated analysis workflows show promising results for speeding up the investigation of high-volume model data. This example is relevant to the thorough analysis of future changes in storminess in Europe and is just one example of the potential of knowledge discovery using automated workflows implemented in the C3Grid architecture.
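
    The CT workflow's core step, identifying candidate cyclone centres as local minima of the MSLP field, can be sketched as follows. This is a hedged illustration of the general idea only; the real Murray and Simmonds scheme adds curvature thresholds and temporal tracking, and the toy pressure field, window size, and function name here are assumptions.

        # Sketch: flag grid points that are local minima of mean sea level
        # pressure (MSLP) within a small neighbourhood window.
        import numpy as np
        from scipy.ndimage import minimum_filter

        def local_pressure_minima(mslp: np.ndarray, size: int = 3) -> np.ndarray:
            """Boolean mask of grid points that are minima in a size x size window."""
            return mslp == minimum_filter(mslp, size=size, mode="nearest")

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            field = 1013.0 + rng.normal(0, 5, size=(20, 30))   # toy MSLP field in hPa
            minima = local_pressure_minima(field)
            print("candidate cyclone centres:", np.argwhere(minima)[:5])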

  10. The Impact of Electronic Health Records on Workflow and Financial Measures in Primary Care Practices

    PubMed Central

    Fleming, Neil S; Becker, Edmund R; Culler, Steven D; Cheng, Dunlei; McCorkle, Russell; da Graca, Briget; Ballard, David J

    2014-01-01

    Objective To estimate a commercially available ambulatory electronic health record’s (EHR’s) impact on workflow and financial measures. Data Sources/Study Setting Administrative, payroll, and billing data were collected for 26 primary care practices in a fee-for-service network that rolled out an EHR on a staggered schedule from June 2006 through December 2008. Study Design An interrupted time series design was used. Staffing, visit intensity, productivity, volume, practice expense, payments received, and net income data were collected monthly for 2004–2009. Changes were evaluated 1–6, 7–12, and >12 months postimplementation. Data Collection/Extraction Methods Data were accessed through a SQLserver database, transformed into SAS®, and aggregated by practice. Practice-level data were divided by full-time physician equivalents for comparisons across practices by month. Principal Findings Staffing and practice expenses increased following EHR implementation (3 and 6 percent after 12 months). Productivity, volume, and net income decreased initially but recovered to/close to preimplementation levels after 12 months. Visit intensity did not change significantly, and a secular trend offset the decrease in payments received. Conclusions Expenses increased and productivity decreased following EHR implementation, but not as much or as persistently as might be expected. Longer term effects still need to be examined. PMID:24359533
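
    Studies of this kind are commonly analysed with an interrupted time series regression that estimates a post-implementation level shift and trend change. The sketch below shows that generic model form on simulated data; it is not the study's actual model, and the variable names and simulated values are assumptions.

        # Generic interrupted time series sketch: outcome regressed on time,
        # a post-implementation level shift, and a post-implementation trend change.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        months = np.arange(60)                      # 5 years of monthly data
        post = (months >= 30).astype(int)           # assumed EHR go-live at month 30
        expense = (100 + 0.2 * months + 6 * post + 0.3 * post * (months - 30)
                   + rng.normal(0, 2, size=60))     # simulated outcome
        df = pd.DataFrame({"expense": expense, "t": months, "post": post,
                           "t_after": post * (months - 30)})

        model = smf.ols("expense ~ t + post + t_after", data=df).fit()
        print(model.params)   # level shift ('post') and slope change ('t_after')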

  11. Radiology Workflow Dynamics: How Workflow Patterns Impact Radiologist Perceptions of Workplace Satisfaction.

    PubMed

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum; Field, Aaron; Wiegmann, Douglas; Yu, John-Paul J

    2017-04-01

    The study aimed to assess perceptions of reading room workflow and the impact separating image-interpretive and nonimage-interpretive task workflows can have on radiologist perceptions of workplace disruptions, workload, and overall satisfaction. A 14-question survey instrument was developed to measure radiologist perceptions of workplace interruptions, satisfaction, and workload prior to and following implementation of separate image-interpretive and nonimage-interpretive reading room workflows. The results were collected over 2 weeks preceding the intervention and 2 weeks following the end of the intervention. The results were anonymized and analyzed using univariate analysis. A total of 18 people responded to the preintervention survey: 6 neuroradiology fellows and 12 attending neuroradiologists. Fifteen people who were then present for the 1-month intervention period responded to the postintervention survey. Perceptions of workplace disruptions, image interpretation, quality of trainee education, ability to perform nonimage-interpretive tasks, and quality of consultations (P < 0.0001) all improved following the intervention. Mental effort and workload also improved across all assessment domains, as did satisfaction with quality of image interpretation and consultative work. Implementation of parallel dedicated image-interpretive and nonimage-interpretive workflows may improve markers of radiologist perceptions of workplace satisfaction. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  12. Managing the life cycle of electronic clinical documents.

    PubMed

    Payne, Thomas H; Graham, Gail

    2006-01-01

    To develop a model of the life cycle of clinical documents from inception to use in a person's medical record, including workflow requirements from clinical practice, local policy, and regulation. We propose a model for the life cycle of clinical documents as a framework for research on documentation within electronic medical record (EMR) systems. Our proposed model includes three axes: the stages of the document, the roles of those involved with the document, and the actions those involved may take on the document at each stage. The model includes the rules to describe who (in what role) can perform what actions on the document, and at what stages they can perform them. Rules are derived from needs of clinicians, and requirements of hospital bylaws and regulators. Our model encompasses current practices for paper medical records and workflow in some EMR systems. Commercial EMR systems include methods for implementing document workflow rules. Workflow rules that are part of this model mirror functionality in the Department of Veterans Affairs (VA) EMR system, where the Authorization/Subscription Utility permits document life cycle rules to be written in English-like fashion. Creating a model of the life cycle of clinical documents serves as a framework for discussion of document workflow, how rules governing workflow can be implemented in EMR systems, and future research of electronic documentation.
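
    A minimal way to picture the model's three axes (stage, role, action) is a rule table keyed by stage and role. The Python sketch below is an illustration under assumed stages, roles, and actions; it is not the VA Authorization/Subscription Utility.

        # Sketch: document life cycle rules as (stage, role) -> allowed actions.
        RULES = {
            ("draft", "author"):      {"edit", "sign"},
            ("signed", "cosigner"):   {"cosign", "addendum"},
            ("signed", "author"):     {"addendum"},
            ("amended", "him_staff"): {"release"},
        }

        def allowed(stage: str, role: str, action: str) -> bool:
            """True if someone acting in `role` may perform `action` at `stage`."""
            return action in RULES.get((stage, role), set())

        if __name__ == "__main__":
            print(allowed("draft", "author", "sign"))    # True
            print(allowed("signed", "author", "edit"))   # False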

  13. Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications

    PubMed Central

    Stoppe, Jannis; Drechsler, Rolf

    2015-01-01

    The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC. PMID:25946632

  14. Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications.

    PubMed

    Stoppe, Jannis; Drechsler, Rolf

    2015-05-04

    The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC.

  15. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; Peng, J

    Purpose: To implement an entire workflow quality assurance (QA) process in the radiotherapy department and to reduce radiotherapy error rates through entire workflow management in a developing country. Methods: The entire workflow QA process spans patient registration through the end of the last treatment, including all steps of the radiotherapy process. The chart-check error rate is used to evaluate the entire workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data from around 6000 patients before and after implementing the entire workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), treatment QA documents, and QA of the treatment history. The error rate derived from the chart check decreased from 1.7% to 0.9% after introduction of the entire workflow QA process. All errors detected before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and staff training was reinforced accordingly to prevent those errors. Conclusion: The entire workflow QA process improved the safety and quality of radiotherapy in our department, and we consider that our QA experience can be applicable to heavily loaded radiotherapy departments in developing countries.

  16. Characterising and correcting batch variation in an automated direct infusion mass spectrometry (DIMS) metabolomics workflow.

    PubMed

    Kirwan, J A; Broadhurst, D I; Davidson, R L; Viant, M R

    2013-06-01

    Direct infusion mass spectrometry (DIMS)-based untargeted metabolomics measures many hundreds of metabolites in a single experiment. While every effort is made to reduce within-experiment analytical variation in untargeted metabolomics, unavoidable sources of measurement error are introduced. This is particularly true for large-scale multi-batch experiments, necessitating the development of robust workflows that minimise batch-to-batch variation. Here, we conducted a purpose-designed, eight-batch DIMS metabolomics study using nanoelectrospray (nESI) Fourier transform ion cyclotron resonance mass spectrometric analyses of mammalian heart extracts. First, we characterised the intrinsic analytical variation of this approach to determine whether our existing workflows are fit for purpose when applied to a multi-batch investigation. Batch-to-batch variation was readily observed across the 7-day experiment, both in terms of its absolute measurement using quality control (QC) and biological replicate samples, as well as its adverse impact on our ability to discover significant metabolic information within the data. Subsequently, we developed and implemented a computational workflow that includes total-ion-current filtering, QC-robust spline batch correction and spectral cleaning, and provide conclusive evidence that this workflow reduces analytical variation and increases the proportion of significant peaks. We report an overall analytical precision of 15.9%, measured as the median relative standard deviation (RSD) for the technical replicates of the biological samples, across eight batches and 7 days of measurements. When compared against the FDA guidelines for biomarker studies, which specify an RSD of <20% as an acceptable level of precision, we conclude that our new workflows are fit for purpose for large-scale, high-throughput nESI DIMS metabolomics studies.
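
    The precision metric quoted above, the median relative standard deviation (RSD) across technical replicates, can be computed as in the following sketch. This is a generic illustration, not the authors' workflow; the simulated intensity matrix and function name are assumptions.

        # Sketch: per-feature percent RSD across technical replicates and its median.
        import numpy as np

        def median_rsd(intensities: np.ndarray) -> float:
            """intensities: replicates x features matrix of peak intensities."""
            mean = intensities.mean(axis=0)
            sd = intensities.std(axis=0, ddof=1)
            rsd = 100.0 * sd / mean                 # percent RSD per feature
            return float(np.median(rsd))

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            base = rng.uniform(1e4, 1e6, size=200)                   # 200 toy features
            reps = base * (1 + rng.normal(0, 0.15, size=(3, 200)))   # 3 replicates
            print(f"median RSD: {median_rsd(reps):.1f}%")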

  17. Querying Provenance Information: Basic Notions and an Example from Paleoclimate Reconstruction

    NASA Astrophysics Data System (ADS)

    Stodden, V.; Ludaescher, B.; Bocinsky, K.; Kintigh, K.; Kohler, T.; McPhillips, T.; Rush, J.

    2016-12-01

    Computational models are used to reconstruct and explain past environments and to predict likely future environments. For example, Bocinsky and Kohler have performed a 2,000-year reconstruction of the rain-fed maize agricultural niche in the US Southwest. The resulting academic publications not only contain traditional method descriptions, figures, etc. but also links to code and data for basic transparency and reproducibility. Examples include ResearchCompendia.org and the new project "Merging Science and Cyberinfrastructure Pathways: The Whole Tale." Provenance information provides a further critical element to understand a published study and to possibly extend or challenge the findings of the original authors. We present different notions and uses of provenance information using a computational archaeology example, e.g., the common use of "provenance for others" (for transparency and reproducibility), but also the more elusive but equally important use of "provenance for self". To this end, we distinguish prospective provenance (a.k.a. workflow) from retrospective provenance (a.k.a. data lineage) and show how combinations of both forms of provenance can be used to answer different kinds of important questions about a workflow and its execution. Since many workflows are developed using scripting or special purpose languages such as Python and R, we employ an approach and toolkit called YesWorkflow that brings provenance modeling, capture, and querying into the realm of scripting. YesWorkflow employs the basic W3C PROV standard, as well as the ProvONE extension, for sharing and exchanging retrospective and prospective provenance information, respectively. Finally, we argue that the utility of provenance information should be maximized by developing different kinds of provenance questions and queries during the early phases of computational workflow design and implementation.
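
    YesWorkflow-style prospective provenance is declared through structured comments embedded in an ordinary script. The sketch below shows the general flavour of such annotations in Python; the script body, file names, and block names are illustrative assumptions rather than code from the paleoclimate study.

        # Sketch of YesWorkflow-style comment annotations (prospective provenance
        # declared directly in a script). Tags follow the @BEGIN/@IN/@OUT/@END
        # convention described by the YesWorkflow project.

        # @BEGIN reconstruct_niche
        # @IN precipitation_file @URI file:precip.csv
        # @OUT niche_map @URI file:niche.csv
        def reconstruct_niche(precipitation_file: str, out_path: str) -> None:
            # @BEGIN read_precipitation
            # @IN precipitation_file
            # @OUT precip_rows
            with open(precipitation_file) as f:
                precip_rows = [line.strip().split(",") for line in f if line.strip()]
            # @END read_precipitation

            # @BEGIN write_niche_map
            # @IN precip_rows
            # @OUT niche_map
            with open(out_path, "w") as f:
                for row in precip_rows:
                    f.write(",".join(row) + "\n")
            # @END write_niche_map
        # @END reconstruct_niche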

  18. Emailing Drones: From Design to Test Range to ARS Offices and into the Field

    NASA Astrophysics Data System (ADS)

    Fuka, D. R.; Singer, S.; Rodriguez, R., III; Collick, A.; Cunningham, A.; Kleinman, P. J. A.; Manoukis, N. C.; Matthews, B.; Ralston, T.; Easton, Z. M.

    2017-12-01

    Unmanned aerial vehicles (UAVs or `drones') are one of the newest tools available for collecting geo- and biological-science data in the field, though today's commercial drones only come in a small range of options. While scientific research has benefitted from the enhanced topographic and surface characterization data that UAVs can provide through traditional image-based remote sensing techniques, drones have significantly greater mission-specific potential than is currently utilized. The reasons for this under-utilization are twofold: 1) with their broad capabilities comes the need to be careful in implementation, and as such, the FAA and other regulatory agencies around the world have blanket regulations that can inhibit new designs from being implemented, and 2) current multi-mission, multi-payload commercial drones have to be over-designed to compensate for the fact that they are very difficult to stabilize for multiple payloads, leading to a much higher cost than necessary. For this project, we explore and demonstrate a workflow to optimize the design, testing, approval, and implementation of embarrassingly inexpensive mission-specific drones, with two use cases. The first will follow the process from design (at VTech and UH Hilo) to field implementation (by USDA-ARS in PA and Extension in VA) of several custom water quality monitoring drones, printed on demand at ARS and Extension offices after testing at the Pan-Pacific UAS Test Range Complex (PPUTRC). This type of customized drone can allow for an increased understanding of the transition from non-point-source to point-source agri-chemical and pollutant transport in watershed systems. The second use case will follow the same process, resulting in customized drones with pest-specific traps built into the design. This class of customized drone can facilitate IPM pest monitoring programs nationwide, decreasing the intensive and costly quarantine and population elimination measures that currently exist. This multi-institutional project works toward an optimized workflow where scientists can quickly 1) customize drones to meet specific purposes, 2) have them tested in FAA Test Ranges, and 3) get them certified and working in the field, while 4) cutting their cost to significantly less than what is currently available.

  19. Exploring the impact of an automated prescription-filling device on community pharmacy technician workflow

    PubMed Central

    Walsh, Kristin E.; Chui, Michelle Anne; Kieser, Mara A.; Williams, Staci M.; Sutter, Susan L.; Sutter, John G.

    2012-01-01

    Objective To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. Methods At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Results Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Conclusion Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign. PMID:21896459

  20. Improving efficiency and safety in external beam radiation therapy treatment delivery using a Kaizen approach.

    PubMed

    Kapur, Ajay; Adair, Nilda; O'Brien, Mildred; Naparstek, Nikoleta; Cangelosi, Thomas; Zuvic, Petrina; Joseph, Sherin; Meier, Jason; Bloom, Beatrice; Potters, Louis

    Modern external beam radiation therapy treatment delivery processes potentially increase the number of tasks to be performed by therapists and thus opportunities for errors, yet the need to treat a large number of patients daily requires a balanced allocation of time per treatment slot. The goal of this work was to streamline the underlying workflow in such time-interval constrained processes to enhance both execution efficiency and active safety surveillance using a Kaizen approach. A Kaizen project was initiated by mapping the workflow within each treatment slot for 3 Varian TrueBeam linear accelerators. More than 90 steps were identified, and average execution times for each were measured. The time-consuming steps were stratified into a 2 × 2 matrix arranged by potential workflow improvement versus the level of corrective effort required. A work plan was created to launch initiatives with high potential for workflow improvement but modest effort to implement. Time spent on safety surveillance and average durations of treatment slots were used to assess corresponding workflow improvements. Three initiatives were implemented to mitigate unnecessary therapist motion, overprocessing of data, and wait time for data transfer defects, respectively. A fourth initiative was implemented to make the division of labor by treating therapists as well as peer review more explicit. The average duration of treatment slots reduced by 6.7% in the 9 months following implementation of the initiatives (P = .001). A reduction of 21% in duration of treatment slots was observed on 1 of the machines (P < .001). Time spent on safety reviews remained the same (20% of the allocated interval), but the peer review component increased. The Kaizen approach has the potential to improve operational efficiency and safety with quick turnaround in radiation therapy practice by addressing non-value-adding steps characteristic of individual department workflows. Higher effort opportunities are identified to guide continual downstream quality improvements. Copyright © 2017 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  1. Impact of digital imaging and communications in medicine workflow on the integration of patient demographics and ophthalmic test data.

    PubMed

    Pandit, Ravi R; Boland, Michael V

    2015-02-01

    To determine the impact of a Digital Imaging and Communications in Medicine (DICOM) workflow on the linkage of demographic information to ophthalmic testing data. Evaluation of technology. Six hundred ninety-nine visual field testing encounters performed by 6 ophthalmic technicians and the transfer error queue of 37 442 ophthalmic test results. At 3 months before and 6 and 18 months after implementation of a DICOM workflow, technicians recorded the work required to enter, confirm, or edit patient demographics in each visual field device. We also determined the proportion of imaging tests sent to an error queue for manual reconciliation because of incorrect demographic information before and 3, 6, and 18 months after the DICOM workflow was established. The proportion of testing encounters for which staff had to enter, edit, or merge patient demographics and the proportion of misfiled images. Staff entered, edited, or merged data for 48% of patients before implementation (n = 237). This decreased to 24% within 6 months and 20% within 18 months of implementing the DICOM archive (n = 230 and n = 232, respectively). Staff could locate a patient in a DICOM work list for 97% of encounters at 3 months and 99% at 18 months. Before implementation, 9.2% of the images required additional intervention to be associated with the correct patient (n = 3581). This decreased by 85% over 6 months to 1.4% (n = 9979; P < 0.01). There was an increase in the percentage of misfiled images between 6 and 18 months from 1.4% to 2.2% (n = 24 549; P < 0.01), representing an overall 76% decrease over 18 months relative to the pre-DICOM period. Implementation of a DICOM-compatible workflow in an ophthalmology clinic reduced the need to enter or edit patient demographic information into imaging or testing devices by more than 50% and reduced the need to manage misfiled images by 76%. In a clinical environment that demands both efficiency and patient safety, the DICOM workflow is an important update to current practice. Copyright © 2015 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
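
    The gain reported above comes from letting devices and archives carry patient demographics in the DICOM header rather than re-keying them. As a rough illustration, the sketch below reads demographic fields from a DICOM file with the pydicom library; the file path is an assumption and this is not the clinic's actual integration.

        # Sketch: read patient demographics from a DICOM header with pydicom.
        import pydicom

        def read_demographics(path: str) -> dict:
            # Skip pixel data; only the header is needed for demographics.
            ds = pydicom.dcmread(path, stop_before_pixels=True)
            return {
                "name": str(ds.get("PatientName", "")),
                "id": ds.get("PatientID", ""),
                "birth_date": ds.get("PatientBirthDate", ""),
            }

        if __name__ == "__main__":
            print(read_demographics("visual_field_test.dcm"))  # assumed file name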

  2. Workflows for microarray data processing in the Kepler environment.

    PubMed

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.
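
    Each workflow ultimately chains together local processing steps of the kind shown below: a small, self-contained transformation of a local file. The sketch is a generic GFF filtering step written in Python rather than a Kepler actor; the file names and feature type are assumptions.

        # Sketch: a local GFF-processing step that keeps header lines and rows
        # whose feature column matches the requested feature type.
        def filter_gff(in_path: str, out_path: str, feature: str = "gene") -> int:
            kept = 0
            with open(in_path) as src, open(out_path, "w") as dst:
                for line in src:
                    if line.startswith("#"):
                        dst.write(line)                 # keep GFF header lines
                        continue
                    fields = line.rstrip("\n").split("\t")
                    if len(fields) > 2 and fields[2] == feature:
                        dst.write(line)
                        kept += 1
            return kept

        if __name__ == "__main__":
            print(filter_gff("chip_chip.gff", "genes_only.gff"))  # assumed file names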

  3. Modeling workflow to design machine translation applications for public health practice

    PubMed Central

    Turner, Anne M.; Brownstein, Megumu K.; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin

    2014-01-01

    Objective: Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited English proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). Materials and Methods: We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. Results: The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. Discussion: This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary amongst these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. Conclusion: The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice. PMID:25445922

  4. JTSA: an open source framework for time series abstractions.

    PubMed

    Sacchi, Lucia; Capozzi, Davide; Bellazzi, Riccardo; Larizza, Cristiana

    2015-10-01

    The evaluation of the clinical status of a patient is frequently based on the temporal evolution of some parameters, making the detection of temporal patterns a priority in data analysis. Temporal abstraction (TA) is a methodology widely used in medical reasoning for summarizing and abstracting longitudinal data. This paper describes JTSA (Java Time Series Abstractor), a framework including a library of algorithms for time series preprocessing and abstraction and an engine to execute a workflow for temporal data processing. The JTSA framework is grounded on a comprehensive ontology that models temporal data processing both from the data storage and the abstraction computation perspective. The JTSA framework is designed to allow users to build their own analysis workflows by combining different algorithms. Thanks to the modular structure of a workflow, simple to highly complex patterns can be detected. The JTSA framework has been developed in Java 1.7 and is distributed under GPL as a jar file. JTSA provides: a collection of algorithms to perform temporal abstraction and preprocessing of time series, a framework for defining and executing data analysis workflows based on these algorithms, and a GUI for workflow prototyping and testing. The whole JTSA project relies on a formal model of the data types and of the algorithms included in the library. This model is the basis for the design and implementation of the software application. Taking into account this formalized structure, the user can easily extend the JTSA framework by adding new algorithms. Results are shown in the context of the EU project MOSAIC, where JTSA was used to extract relevant patterns from data related to the long-term monitoring of diabetic patients. The proof that JTSA is a versatile tool to be adapted to different needs is given by its possible uses, both as a standalone tool for data summarization and as a module to be embedded into other architectures to select specific phenotypes based on TAs in a large dataset. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
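
    JTSA itself is a Java library, but the core idea of a state temporal abstraction is easy to sketch in Python: a timestamped series is mapped to labelled intervals that summarize its evolution. The thresholds and example values below are illustrative only, neither clinical guidance nor the JTSA API.

    ```python
    # Minimal state-abstraction sketch (JTSA itself is a Java framework).
    # Thresholds and sample values are illustrative, not clinical guidance.
    def state_abstraction(samples, low=70, high=180):
        """samples: list of (time, value); returns list of (start, end, label)."""
        def label(v):
            return "LOW" if v < low else "HIGH" if v > high else "NORMAL"

        intervals = []
        for t, v in samples:
            lab = label(v)
            if intervals and intervals[-1][2] == lab:
                intervals[-1] = (intervals[-1][0], t, lab)  # extend the open interval
            else:
                intervals.append((t, t, lab))               # start a new interval
        return intervals

    series = [(0, 95), (1, 110), (2, 190), (3, 210), (4, 160), (5, 65)]
    print(state_abstraction(series))
    # [(0, 1, 'NORMAL'), (2, 3, 'HIGH'), (4, 4, 'NORMAL'), (5, 5, 'LOW')]
    ```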

  5. The Synthetic Biology Open Language (SBOL) provides a community standard for communicating designs in synthetic biology.

    PubMed

    Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M

    2014-06-01

    The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow.
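
    The serialization idea can be illustrated with a small rdflib sketch that emits RDF/XML for a toy part description. The namespace and property names below are illustrative stand-ins, not the normative SBOL vocabulary or its official libraries.

    ```python
    # Sketch of an XML/RDF serialization in the spirit of SBOL, using rdflib.
    # The namespace and terms are illustrative stand-ins, not the normative
    # SBOL vocabulary.
    from rdflib import Graph, Literal, Namespace, RDF, URIRef

    EX = Namespace("http://example.org/sbol-like#")   # hypothetical namespace
    g = Graph()
    g.bind("ex", EX)

    part = URIRef("http://example.org/parts/pTet")
    g.add((part, RDF.type, EX.DnaComponent))
    g.add((part, EX.displayId, Literal("pTet")))
    g.add((part, EX.nucleotides, Literal("tccctatcagtgatagaga")))

    print(g.serialize(format="xml"))  # RDF/XML that tools could exchange
    ```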

  6. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    NASA Astrophysics Data System (ADS)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of the Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  7. Evaluation of Standardization of Transfer of Accountability between Inpatient Pharmacists.

    PubMed

    Tsoi, Vivian; Dewhurst, Norman; Tom, Elaine

    2018-01-01

    A compelling body of evidence supports the notion that transfer of accountability (TOA) improves communication, continuity of care, and patient safety. TOA involves the transmission and receipt of information between clinicians at each transition of care. Without a notification system alerting pharmacists to patient transfers, pharmacists' ability to seek out and complete TOA may be hindered. A standardized policy and process for TOA, with automated workflow, was implemented at the study hospital in 2015, to ensure consistency and timeliness of documentation by pharmacists. To evaluate pharmacists' adherence to and satisfaction with the TOA policy and process. A retrospective audit was conducted, using a random sample of individuals who were inpatients between June 2014 and February 2016. Transition points for TOA were identified, and the computerized pharmacy system was reviewed to determine whether TOA had been documented at each transition point. After the audit, an online survey was distributed to assess pharmacists' response to and satisfaction with the TOA policy and workflow. Before the TOA workflow was implemented, TOA documentation by pharmacists ranged from 11% (10/93) to 43% (48/111) of transitions. Eight months after implementation of the workflow, the rate of TOA documentation was 87% (68/78), exceeding the institution's target of 70%. Of the 32 pharmacists surveyed, most were satisfied with the TOA policy and agreed that the standardized workflow was simple to use, increased the number of TOAs provided and received, and improved the quality of completed TOAs. Respondents also indicated that the TOA workflow had improved patient care (mean score 4.09/5, standard deviation 0.64). The standardized TOA policy and process were well received by pharmacists, and resulted in consistent TOA documentation and a TOA documentation rate that exceeded the institutional target.

  8. DyNAMiC Workbench: an integrated development environment for dynamic DNA nanotechnology

    PubMed Central

    Grun, Casey; Werfel, Justin; Zhang, David Yu; Yin, Peng

    2015-01-01

    Dynamic DNA nanotechnology provides a promising avenue for implementing sophisticated assembly processes, mechanical behaviours, sensing and computation at the nanoscale. However, design of these systems is complex and error-prone, because the need to control the kinetic pathway of a system greatly increases the number of design constraints and possible failure modes for the system. Previous tools have automated some parts of the design workflow, but an integrated solution is lacking. Here, we present software implementing a three ‘tier’ design process: a high-level visual programming language is used to describe systems, a molecular compiler builds a DNA implementation and nucleotide sequences are generated and optimized. Additionally, our software includes tools for analysing and ‘debugging’ the designs in silico, and for importing/exporting designs to other commonly used software systems. The software we present is built on many existing pieces of software, but is integrated into a single package—accessible using a Web-based interface at http://molecular-systems.net/workbench. We hope that the deep integration between tools and the flexibility of this design process will lead to better experimental results, fewer experimental design iterations and the development of more complex DNA nanosystems. PMID:26423437

  9. Financial and workflow analysis of radiology reporting processes in the planning phase of implementation of a speech recognition system

    NASA Astrophysics Data System (ADS)

    Whang, Tom; Ratib, Osman M.; Umamoto, Kathleen; Grant, Edward G.; McCoy, Michael J.

    2002-05-01

    The goal of this study is to determine the financial value and workflow improvements achievable by replacing traditional transcription services with a speech recognition system in a large, university hospital setting. Workflow metrics were measured at two hospitals, one of which exclusively uses a transcription service (UCLA Medical Center), and the other which exclusively uses speech recognition (West Los Angeles VA Hospital). Workflow metrics include time spent per report (the sum of time spent interpreting, dictating, reviewing, and editing), transcription turnaround, and total report turnaround. Compared to traditional transcription, speech recognition resulted in radiologists spending 13-32% more time per report, but it also resulted in reduction of report turnaround time by 22-62% and reduction of marginal cost per report by 94%. The model developed here helps justify the introduction of a speech recognition system by showing that the benefits of reduced operating costs and decreased turnaround time outweigh the cost of increased time spent per report. Whether the ultimate goal is to achieve a financial objective or to improve operational efficiency, it is important to conduct a thorough analysis of workflow before implementation.
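
    As a quick check of the arithmetic behind the turnaround claim, the reported baseline and post-implementation figures can be plugged into a one-line calculation; the numbers below are taken from the abstract and rounded.

    ```python
    # Worked check of the turnaround figures quoted above (simple arithmetic only).
    baseline_min = 90.0   # approximate report turnaround with transcription
    postimpl_min = 70.0   # approximate turnaround with speech recognition

    reduction = (baseline_min - postimpl_min) / baseline_min
    print(f"Turnaround reduction: {reduction:.0%}")  # ~22%, the low end of the 22-62% range
    ```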

  10. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
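
    The end product of such a workflow is an SBML document that a simulator can consume. The sketch below uses python-libsbml to build a minimal SBML model by hand; it is not the SBOLDesigner/iBioSim pipeline itself, and the model, compartment and species identifiers are hypothetical.

    ```python
    # Minimal sketch of the kind of SBML artifact such a workflow emits, built
    # with python-libsbml. Not the SBOLDesigner/iBioSim pipeline itself; the
    # identifiers are hypothetical.
    import libsbml

    doc = libsbml.SBMLDocument(3, 1)          # SBML Level 3 Version 1
    model = doc.createModel()
    model.setId("toy_genetic_circuit")

    comp = model.createCompartment()
    comp.setId("cell")
    comp.setSize(1.0)
    comp.setConstant(True)

    tetr = model.createSpecies()              # a repressor protein, for illustration
    tetr.setId("TetR")
    tetr.setCompartment("cell")
    tetr.setInitialAmount(0.0)
    tetr.setHasOnlySubstanceUnits(True)
    tetr.setBoundaryCondition(False)
    tetr.setConstant(False)

    print(libsbml.writeSBMLToString(doc))     # SBML ready for a simulator
    ```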

  11. Process improvement for the safe delivery of multidisciplinary-executed treatments: A case in Y-90 microspheres therapy.

    PubMed

    Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E

    To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events with the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, after two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting that requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or a similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  12. ERP implementation in hospitals: a case study.

    PubMed

    Agarwal, Divya; Garg, Poonam

    2012-01-01

    In a competitive healthcare sector, hospitals have to focus on their processes in order to deliver high-quality care while at the same time reducing costs. Many hospitals have decided to adopt an Enterprise Resource Planning (ERP) system to improve their businesses, but implementing an ERP system can be a demanding endeavour. The systems are difficult to implement: while some implementations succeed, many fail, causing multimillion-dollar losses. The challenge of ERP solutions lies in implementation, because they are complex, time-consuming and expensive. This paper describes the various process workflows and phases of ERP implementation at Fortis Hospital, Cunningham Road, Bangalore, India. This knowledge will provide valuable insights for researchers and practitioners to understand the different process workflows and to make informed decisions when implementing ERP in any hospital.

  13. [General practitioners' opinion and attitude towards DMPs and the change in practice routines to implement the DMP "diabetes mellitus type 2"].

    PubMed

    Miksch, Antje; Trieschmann, Johanna; Ose, Dominik; Rölz, Andreas; Heiderhoff, Marc; Szecsenyi, Joachim

    2011-01-01

    Effective implementation of disease management programmes (DMPs) in primary care practices often requires changes in practice workflows and responsibilities and acceptance by the parties involved. Within the ELSID study (evaluation study of the DMP diabetes mellitus type 2), physicians' attitudes toward DMPs were obtained and an optimised implementation of DMPs was developed by conducting a quality management cycle with primary care practice teams. The aim was to investigate which practice workflows had to be changed and which barriers to implementing these changes were perceived. In 78 primary care practices in the two German federal states of Rheinland-Pfalz and Sachsen-Anhalt, a quality management cycle was conducted using a structured analysis of the current state of DMP workflows, and the need for improvement was identified. Subsequently, an optimised workflow was developed and targets were agreed upon. After 6 months, the study team called to inquire about the current state of implementation and, if appropriate, actual barriers to change. After 6 months, 71 practices had been interviewed by phone. 64 of them (90.1%) had agreed on at least one target (e.g., to purchase new instrumentation, to regularly discuss feedback reports, to set up a patient registry). On average three targets had been formulated, and 2 out of 3 had been implemented in the meantime. In most cases lack of time was given as the reason for non-implementation. The majority of surveyed practices perceived some need for improvement. However, sufficient resources (time, staff and money) are required to ensure efficient implementation of DMPs in primary care practices and their integration with routine processes. A redefinition of responsibilities for DMPs will strengthen the role of medical assistants and promote high-quality implementation of these programmes. Copyright © 2010. Published by Elsevier GmbH.

  14. Integration of implant planning workflows into the PACS infrastructure

    NASA Astrophysics Data System (ADS)

    Gessat, Michael; Strauß, Gero; Burgert, Oliver

    2008-03-01

    The integration of imaging devices, diagnostic workstations, and image servers into Picture Archiving and Communication Systems (PACS) has had an enormous effect on the efficiency of radiology workflows. The standardization of the information exchange between the devices with the DICOM standard has been an essential precondition for that development. For surgical procedures, no such infrastructure exists. With the increasingly important role computerized planning and assistance systems play in the surgical domain, an infrastructure that unifies the communication between devices becomes necessary. In recent publications, the need for a modularized system design has been established. A reference architecture for a Therapy Imaging and Model Management System (TIMMS) has been proposed. It was accepted by the DICOM Working Group 6 as the reference architecture for DICOM developments for surgery. In this paper we propose the inclusion of implant planning systems into the PACS infrastructure. We propose a generic information model for the patient specific selection and positioning of implants from a repository according to patient image data. The information models are based on clinical workflows from ENT, cardiac, and orthopedic surgery as well as technical requirements derived from different use cases and systems. We show an exemplary implementation of the model for application in ENT surgery: the selection and positioning of an ossicular implant in the middle ear. An implant repository is stored in the PACS. It makes use of an experimental implementation of the Surface Mesh Module that is currently being developed as extension to the DICOM standard.
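
    A flavour of what patient-specific implant-selection data stored alongside image data could look like can be sketched with pydicom, using only generic standard attributes; this is not the proposed information model or the experimental Surface Mesh Module, and the values are hypothetical.

    ```python
    # Hedged sketch: attaching implant-selection metadata to a DICOM dataset with
    # pydicom. Generic standard attributes only; this does not implement the
    # paper's information model or the Surface Mesh Module. Values are hypothetical.
    from pydicom.dataset import Dataset

    ds = Dataset()
    ds.PatientName = "Doe^Jane"               # hypothetical patient
    ds.PatientID = "000123"
    ds.Modality = "OT"                        # "other"; placeholder for planning data
    ds.StudyDescription = "ENT implant planning"
    ds.SeriesDescription = "Ossicular implant selection"
    ds.Manufacturer = "ExampleImplantVendor"  # hypothetical repository entry

    print(ds)  # dataset ready to be extended and stored alongside the PACS study
    ```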

  15. A Sociotechnical Approach to Evaluating the Impact of ICT on Clinical Care Environments

    PubMed Central

    Li, Julie

    2010-01-01

    Introduction: Process-supporting information technology holds the potential to increase efficiency, reduce errors, and alter professional roles and responsibilities in a manner which allows improvement in the delivery of patient care. However, clashes between the model of health care work inscribed in these tools and the actual nature of work have resulted in staff resistance and decreased organisational uptake of ICT, as well as unexpected and negative effects on efficiency and patient safety. Sociotechnical theory provides a paradigm against which workflow and transfusion of ICT in healthcare could be better explored and understood. Design: This paper will conceptualise a formative, multi-method longitudinal evaluation process to explore the impact of ICT with an appreciation of the relationship between the social and technical systems within a clinical department. Method: Departmental culture, including clinical work processes and communication patterns, will be thoroughly explored before system implementation using both quantitative and qualitative research methods. Findings will be compared with post-implementation data, which will incorporate measurement of safety and workflow efficiency indicators. Discussion: Sociotechnical theory provides a paradigm against which workflow and transfusion of ICT in healthcare could be better explored and understood. However, sociotechnical and multimethod approaches to evaluation do not exist without criticism. Inherent in the protocol are limitations of sociotechnical theory and criticism of the multimethod approach; testing of the methodology in real clinical settings will serve to verify efficacy and refine the process. PMID:21594005

  16. Digitization workflows for flat sheets and packets of plants, algae, and fungi

    PubMed Central

    Nelson, Gil; Sweeney, Patrick; Wallace, Lisa E.; Rabeler, Richard K.; Allard, Dorothy; Brown, Herrick; Carter, J. Richard; Denslow, Michael W.; Ellwood, Elizabeth R.; Germain-Aubrey, Charlotte C.; Gilbert, Ed; Gillespie, Emily; Goertzen, Leslie R.; Legler, Ben; Marchant, D. Blaine; Marsico, Travis D.; Morris, Ashley B.; Murrell, Zack; Nazaire, Mare; Neefus, Chris; Oberreiter, Shanna; Paul, Deborah; Ruhfel, Brad R.; Sasek, Thomas; Shaw, Joey; Soltis, Pamela S.; Watson, Kimberly; Weeks, Andrea; Mast, Austin R.

    2015-01-01

    Effective workflows are essential components in the digitization of biodiversity specimen collections. To date, no comprehensive, community-vetted workflows have been published for digitizing flat sheets and packets of plants, algae, and fungi, even though latest estimates suggest that only 33% of herbarium specimens have been digitally transcribed, 54% of herbaria use a specimen database, and 24% are imaging specimens. In 2012, iDigBio, the U.S. National Science Foundation’s (NSF) coordinating center and national resource for the digitization of public, nonfederal U.S. collections, launched several working groups to address this deficiency. Here, we report the development of 14 workflow modules with 7–36 tasks each. These workflows represent the combined work of approximately 35 curators, directors, and collections managers representing more than 30 herbaria, including 15 NSF-supported plant-related Thematic Collections Networks and collaboratives. The workflows are provided for download as Portable Document Format (PDF) and Microsoft Word files. Customization of these workflows for specific institutional implementation is encouraged. PMID:26421256

  17. Modeling of workflow-engaged networks on radiology transfers across a metro network.

    PubMed

    Camorlinga, Sergio; Schofield, Bruce

    2006-04-01

    Radiology metro networks bear the challenging proposition of interconnecting several hospitals in a region to provide a comprehensive diagnostic imaging service. A poorly designed or implemented metro network can cause delays, or no access at all, when health care providers try to retrieve medical cases across the network. This can translate into limited diagnostic services for patients, with negative impacts on their medical treatment. A workflow-engaged network (WEN) is a new network paradigm. A WEN takes radiology workflows and priorities into account when using the network. A WEN greatly improves network performance by guaranteeing that critical image transfers experience minimal delay. It adjusts network settings to ensure the application's requirements are met. This means that high-priority image transfers will have guaranteed and known delay times, whereas lower-priority traffic will have increased delays. This paper introduces a model to understand the benefits that a WEN brings to a radiology metro network. The model uses actual data patterns and flows found in a hospital metro region. The workflows considered are based on the Integrating the Healthcare Enterprise profiles. The model has been applied to metropolitan workflows of a health region and helps identify the kind of metro network that supports the data patterns and flows in a metro area. The results of the modeling show that a 155-Mb/s metropolitan area network (MAN) with WEN performs virtually the same as a normal 622-Mb/s MAN without WEN, with potential cost savings for leased line services measured in the millions of dollars per year.
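
    The intuition behind the result can be conveyed with a toy calculation: on a sequential link, a small critical transfer queued behind bulk traffic finishes far sooner if it is served first, which is why a prioritised 155-Mb/s link can rival a faster unprioritised one for the traffic that matters. The sizes and bandwidths below are hypothetical, not the paper's model.

    ```python
    # Toy illustration (not the paper's model) of why priority-aware scheduling on
    # a modest link helps critical transfers. Sizes and bandwidths are hypothetical.
    def urgent_finish_time(queue_mb, urgent_mb, link_mbps, prioritise):
        """Seconds until the urgent transfer completes on a sequential link."""
        ahead = [] if prioritise else queue_mb     # traffic served before it
        data_mb = sum(ahead) + urgent_mb
        return data_mb * 8 / link_mbps             # MB -> megabits, then divide by rate

    bulk = [500.0, 500.0, 500.0]                   # routine studies queued ahead (MB)
    urgent = 50.0                                  # critical study (MB)

    print(urgent_finish_time(bulk, urgent, 155, prioritise=False))  # ~80 s
    print(urgent_finish_time(bulk, urgent, 155, prioritise=True))   # ~2.6 s
    print(urgent_finish_time(bulk, urgent, 622, prioritise=False))  # ~20 s
    ```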

  18. Nurses' experiences using a nursing information system: early stage of technology implementation.

    PubMed

    Lee, Ting-Ting

    2007-01-01

    Adoption of information technology in nursing practice has become a trend in healthcare. The impact of this technology on users has been widely studied, but little attention has been given to its influence at the beginning stage of implementation. Knowing the barriers to adopting technology could shorten this transition stage and minimize its negative influences. The purpose of this study was to explore nurses' experiences in the early stage of implementing a nursing information system. Focus groups were used to collect data at a medical center in Taiwan. The results showed that nurses had problems with the system's content design, had insufficient training, were concerned about data security, were stressed by added work, and experienced poor interdisciplinary cooperation. To smooth this beginning stage, the author recommends involving nurses early in the system design, providing sufficient training in keyboard entry skills, redesigning workflow, and improving interdisciplinary communication.

  19. Using Semantic Components to Represent Dynamics of an Interdisciplinary Healthcare Team in a Multi-Agent Decision Support System.

    PubMed

    Wilk, Szymon; Kezadri-Hamiaz, Mounira; Rosu, Daniela; Kuziemsky, Craig; Michalowski, Wojtek; Amyot, Daniel; Carrier, Marc

    2016-02-01

    In healthcare organizations, clinical workflows are executed by interdisciplinary healthcare teams (IHTs) that operate in ways that are difficult to manage. Responding to a need to support such teams, we designed and developed the MET4 multi-agent system that allows IHTs to manage patients according to presentation-specific clinical workflows. In this paper, we describe a significant extension of the MET4 system that allows for supporting rich team dynamics (understood as team formation, management and task-practitioner allocation), including selection and maintenance of the most responsible physician and more complex rules of selecting practitioners for the workflow tasks. In order to develop this extension, we introduced three semantic components: (1) a revised ontology describing concepts and relations pertinent to IHTs, workflows, and managed patients, (2) a set of behavioral rules describing the team dynamics, and (3) an instance base that stores facts corresponding to instances of concepts from the ontology and to relations between these instances. The semantic components are represented in first-order logic and they can be automatically processed using theorem proving and model finding techniques. We employ these techniques to find models that correspond to specific decisions controlling the dynamics of IHT. In the paper, we present the design of extended MET4 with a special focus on the new semantic components. We then describe its proof-of-concept implementation using the WADE multi-agent platform and the Z3 solver (theorem prover/model finder). We illustrate the main ideas discussed in the paper with a clinical scenario of an IHT managing a patient with chronic kidney disease.
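
    The model-finding step can be illustrated with a tiny Z3 (z3-solver) sketch: propositional facts encode who may be the most responsible physician, and the solver returns a model satisfying the rules. The practitioners and rules below are hypothetical and far simpler than MET4's ontology and behavioural rules.

    ```python
    # Hedged sketch of the model-finding flavour described above, using the Z3
    # Python API. Practitioners and rules are hypothetical and far simpler than
    # MET4's ontology and behavioural rules.
    from z3 import And, Bool, Not, Or, Solver, sat

    alice = Bool("mrp_alice")   # "Alice is the most responsible physician"
    bob = Bool("mrp_bob")

    s = Solver()
    s.add(Or(alice, bob))            # someone must be the MRP
    s.add(Not(And(alice, bob)))      # but only one of them
    s.add(Not(bob))                  # rule: Bob is off-service this week

    if s.check() == sat:
        m = s.model()
        print("MRP assignment:", {"alice": m[alice], "bob": m[bob]})
    ```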

  20. Data Management System

    NASA Technical Reports Server (NTRS)

    1997-01-01

    CENTRA 2000 Inc., a wholly owned subsidiary of Auto-trol technology, obtained permission to use software originally developed at Johnson Space Center for the Space Shuttle and early Space Station projects. To support their enormous information-handling needs, a product data management, electronic document management and work-flow system was designed. Initially, just 33 database tables comprised the original software, which was later expanded to about 100 tables. This system, now called CENTRA 2000, is designed for quick implementation and supports the engineering process from preliminary design through release-to-production. CENTRA 2000 can also handle audit histories and provides a means to ensure new information is distributed. The product has 30 production sites worldwide.

  1. Using lean methodology to improve productivity in a hospital oncology pharmacy.

    PubMed

    Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D

    2014-09-01

    Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  2. Scientist-Centered Workflow Abstractions via Generic Actors, Workflow Templates, and Context-Awareness for Groundwater Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.

    2011-07-04

    A drawback of existing scientific workflow systems is the lack of support for domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.
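
    The "generic actor plus reusable template" idea can be sketched in a few lines of Python: a template workflow is written against an abstract interface, and domain-specific actors plug in behind it. The class and method names below are hypothetical, not the prototype's actual API.

    ```python
    # Illustrative sketch of the generic-actor/template idea; names are
    # hypothetical, not the prototype's API.
    from abc import ABC, abstractmethod

    class ModelRunner(ABC):
        """Generic actor: hides which groundwater code actually runs."""
        @abstractmethod
        def run(self, parameters: dict) -> dict: ...

    class MockGroundwaterRunner(ModelRunner):
        def run(self, parameters: dict) -> dict:
            # stand-in for invoking a real simulator on a cluster
            return {"head_drop_m": 0.1 * parameters["pumping_rate"]}

    def calibration_template(runner: ModelRunner, rates):
        """Reusable workflow template: sweep a parameter and collect results."""
        return [runner.run({"pumping_rate": r}) for r in rates]

    print(calibration_template(MockGroundwaterRunner(), rates=[1.0, 2.0, 4.0]))
    ```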

  3. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    PubMed Central

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  4. Quantum mechanics implementation in drug-design workflows: does it really help?

    PubMed

    Arodola, Olayide A; Soliman, Mahmoud Es

    2017-01-01

    The pharmaceutical industry is progressively operating in an era where development costs are constantly under pressure, higher percentages of drugs are demanded, and the drug-discovery process is a trial-and-error run. The profit that flows in with the discovery of new drugs has always been the motivation for the industry to keep up the pace and keep abreast with the endless demand for medicines. The process of finding a molecule that binds to the target protein using in silico tools has made computational chemistry a valuable tool in drug discovery in both academic research and pharmaceutical industry. However, the complexity of many protein-ligand interactions challenges the accuracy and efficiency of the commonly used empirical methods. The usefulness of quantum mechanics (QM) in drug-protein interaction cannot be overemphasized; however, this approach has little significance in some empirical methods. In this review, we discuss recent developments in, and application of, QM to medically relevant biomolecules. We critically discuss the different types of QM-based methods and their proposed application to incorporating them into drug-design and -discovery workflows while trying to answer a critical question: are QM-based methods of real help in drug-design and -discovery research and industry?

  5. Approach to implementing a DICOM network: incorporate both economics and workflow adaptation

    NASA Astrophysics Data System (ADS)

    Beaver, S. Merritt; Sippel-Schmidt, Teresa M.

    1995-05-01

    This paper describes an approach to aid the decision-making process for the justification and design of a digital image and information management system. It identifies key technical and clinical issues that a healthcare institution needs to address during this process. Some of the issues identified here are controversial, and it may take a department months or years to determine solutions that meet its specific staffing, financial, and technical needs.

  6. An architecture model for multiple disease management information systems.

    PubMed

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program that attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers spend the majority of their time on documentation and on coordinating the members of the care team. They need a tool that supports their daily practice and optimizes the inefficient workflow. Several discussions have indicated that information technology plays an important role in the era of disease management. Although applications have been developed, it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform the inefficient workflow, and propose an architecture model that enhances reusability and reduces the time needed for information system development. The proposed architecture model was successfully implemented in two disease management information systems, and the result was evaluated through reusability analysis, time-consumed analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half of the time was consumed, and the workflow was improved. Overall user feedback was positive, and support during daily workflow was rated high. The system empowers case managers with better information and leads to better decision making.

  7. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms.

    PubMed

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2014-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow executions on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their execution in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best uses of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies.
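
    One simple heuristic in the spirit of the problem, sketched below, picks the cheapest VM type that still meets each workflow's deadline. The VM catalogue, runtimes and deadlines are hypothetical, and the paper itself proposes four heuristics tuned for different targets rather than this particular one.

    ```python
    # A simple deadline-aware, cost-minimising heuristic in the spirit of the
    # scheduling problem above. The VM catalogue and workloads are hypothetical;
    # the paper proposes four different heuristics.
    VM_TYPES = [                    # (name, relative speed, $ per hour)
        ("small",  1.0, 0.10),
        ("medium", 2.0, 0.25),
        ("large",  4.0, 0.60),
    ]

    def schedule(workflows):
        """workflows: list of (name, base_hours_on_small, deadline_hours)."""
        plan = []
        for name, base_hours, deadline in workflows:
            for vm, speed, price in VM_TYPES:       # catalogue listed cheapest-first
                runtime = base_hours / speed
                if runtime <= deadline:
                    plan.append((name, vm, runtime, runtime * price))
                    break
            else:
                plan.append((name, "unschedulable", None, None))
        return plan

    for row in schedule([("wf-A", 4.0, 3.0), ("wf-B", 1.0, 2.0)]):
        print(row)
    ```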

  8. Hospital information systems: experience at the fully digitized Seoul National University Bundang Hospital.

    PubMed

    Yoo, Sooyoung; Hwang, Hee; Jheon, Sanghoon

    2016-08-01

    The different levels of health information technology (IT) adoption and its integration into hospital workflow can affect how fully the benefits of health IT are realized. We aim to share our experiences and the journey to the successful adoption of health IT over 13 years at a tertiary university hospital in South Korea. The integrated system of comprehensive applications for direct care, support care, and smart care has been implemented with the latest IT and a rich user information platform, achieving a fully digitized hospital. The user experience design methodology, barcode and radio-frequency identification (RFID) technologies, smartphone and mobile technologies, and data analytics were integrated into the hospital workflow. Applications for a user-centered electronic medical record (EMR) and clinical decision support (CDS), closed-loop medication administration (CLMA), a mobile EMR and dashboard system for care coordination, a clinical data warehouse (CDW) system, and patient engagement solutions were designed and developed to improve quality of care, work efficiency, and patient safety. We believe that comprehensive electronic health record systems and patient-centered smart hospital applications will go a long way in ensuring seamless patient care and experience.

  9. Using Workflows to Explore and Optimise Named Entity Recognition for Chemistry

    PubMed Central

    Kolluru, BalaKrishna; Hawizy, Lezan; Murray-Rust, Peter; Tsujii, Junichi; Ananiadou, Sophia

    2011-01-01

    Chemistry text mining tools should be interoperable and adaptable regardless of system-level implementation, installation or even programming issues. We aim to abstract the functionality of these tools from the underlying implementation via reconfigurable workflows for automatically identifying chemical names. To achieve this, we refactored an established named entity recogniser (in the chemistry domain), OSCAR, and studied the impact of each component on the net performance. We developed two reconfigurable workflows from OSCAR using an interoperable text mining framework, U-Compare. These workflows can be altered using the drag-&-drop mechanism of the graphical user interface of U-Compare. These workflows also provide a platform to study the relationship between text mining components such as tokenisation and named entity recognition (using maximum entropy Markov model (MEMM) and pattern recognition based classifiers). Results indicate that, for chemistry in particular, eliminating noise generated by tokenisation techniques leads to slightly better performance than other configurations, in terms of named entity recognition (NER) accuracy. Poor tokenisation translates into poorer input to the classifier components, which in turn leads to an increase in Type I or Type II errors, thus lowering the overall performance. On the Sciborg corpus, the workflow-based system, which uses a new tokeniser whilst retaining the same MEMM component, increases the F-score from 82.35% to 84.44%. On the PubMed corpus, it recorded an F-score of 84.84% as against 84.23% by OSCAR. PMID:21633495
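
    The general point about tokenisation can be illustrated with a toy example: a chemistry-unaware tokeniser splits a systematic name into fragments that a downstream matcher never recognises. The tokenisers and one-entry lexicon below are hypothetical and far simpler than OSCAR's MEMM-based recogniser.

    ```python
    # Toy illustration of the tokenisation point above; the tokenisers and lexicon
    # are hypothetical, far simpler than OSCAR's MEMM-based recogniser.
    import re

    LEXICON = {"2-acetoxybenzoic acid"}   # aspirin, by its systematic name

    def naive_tokens(text):
        return re.findall(r"\w+", text)                 # splits on hyphens and spaces

    def chem_aware_tokens(text):
        return re.findall(r"[\w\-]+(?: acid)?", text)   # keeps hyphenated names + 'acid'

    def match_entities(tokens):
        return [t for t in tokens if t.lower() in LEXICON]

    sentence = "The sample contained 2-acetoxybenzoic acid and water."
    print(match_entities(naive_tokens(sentence)))       # [] -- the name was destroyed
    print(match_entities(chem_aware_tokens(sentence)))  # ['2-acetoxybenzoic acid']
    ```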

  10. Using workflows to explore and optimise named entity recognition for chemistry.

    PubMed

    Kolluru, Balakrishna; Hawizy, Lezan; Murray-Rust, Peter; Tsujii, Junichi; Ananiadou, Sophia

    2011-01-01

    Chemistry text mining tools should be interoperable and adaptable regardless of system-level implementation, installation or even programming issues. We aim to abstract the functionality of these tools from the underlying implementation via reconfigurable workflows for automatically identifying chemical names. To achieve this, we refactored an established named entity recogniser (in the chemistry domain), OSCAR, and studied the impact of each component on the net performance. We developed two reconfigurable workflows from OSCAR using an interoperable text mining framework, U-Compare. These workflows can be altered using the drag-&-drop mechanism of the graphical user interface of U-Compare. These workflows also provide a platform to study the relationship between text mining components such as tokenisation and named entity recognition (using maximum entropy Markov model (MEMM) and pattern recognition based classifiers). Results indicate that, for chemistry in particular, eliminating noise generated by tokenisation techniques leads to slightly better performance than other configurations, in terms of named entity recognition (NER) accuracy. Poor tokenisation translates into poorer input to the classifier components, which in turn leads to an increase in Type I or Type II errors, thus lowering the overall performance. On the Sciborg corpus, the workflow-based system, which uses a new tokeniser whilst retaining the same MEMM component, increases the F-score from 82.35% to 84.44%. On the PubMed corpus, it recorded an F-score of 84.84% as against 84.23% by OSCAR.

  11. aTRAM 2.0: An Improved, Flexible Locus Assembler for NGS Data

    PubMed Central

    Allen, Julie M; LaFrance, Raphael; Folk, Ryan A; Johnson, Kevin P; Guralnick, Robert P

    2018-01-01

    Massive strides have been made in technologies for collecting genome-scale data. However, tools for efficiently and flexibly assembling raw outputs into downstream analytical workflows are still nascent. aTRAM 1.0 was designed to assemble any locus from genome sequencing data but was neither optimized for efficiency nor able to serve as a single toolkit for all assembly needs. We have completely re-implemented aTRAM and redesigned its structure for faster read retrieval while adding a number of key features to improve flexibility and functionality. The software can now (1) assemble single- or paired-end data, (2) utilize both read directions in the database, (3) use an additional de novo assembly module, and (4) leverage new built-in pipelines to automate common workflows in phylogenomics. Owing to reimplementation of databasing strategies, we demonstrate that aTRAM 2.0 is much faster across all applications compared to the previous version. PMID:29881251

  12. COSMOS: Python library for massively parallel workflows

    PubMed Central

    Gafni, Erik; Luquette, Lovelace J.; Lancaster, Alex K.; Hawkins, Jared B.; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P.; Tonellato, Peter J.

    2014-01-01

    Summary: Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Availability and implementation: Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. Contact: dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24982428

  13. The impact of computerized provider order entry systems on inpatient clinical workflow: a literature review.

    PubMed

    Niazkhani, Zahra; Pirnejad, Habibollah; Berg, Marc; Aarts, Jos

    2009-01-01

    Previous studies have shown the importance of workflow issues in the implementation of CPOE systems and patient safety practices. To understand the impact of CPOE on clinical workflow, we developed a conceptual framework and conducted a literature search for CPOE evaluations between 1990 and June 2007. Fifty-one publications were identified that disclosed mixed effects of CPOE systems. Among the frequently reported workflow advantages were legible orders, remote accessibility of the systems, and shorter order turnaround times. Among the frequently reported disadvantages were time-consuming and problematic user-system interactions, and the enforcement of a predefined relationship between clinical tasks and between providers. Given the diversity of findings in the literature, we conclude that more multi-method research is needed to explore CPOE's multidimensional and collective impact, especially on collaborative workflow.

  14. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images.

    PubMed

    Ameisen, David; Deroulers, Christophe; Perrier, Valérie; Bouhidel, Fatiha; Battistella, Maxime; Legrès, Luc; Janin, Anne; Bertheau, Philippe; Yunès, Jean-Baptiste

    2014-01-01

    Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university hospitals and private entities. It is part of the FlexMIm R&D project, which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using the VIPS and Openslide libraries. We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real time (3 billion pixels per minute). Tests were run on 5000 single images, 200 NDPI WSI, and 100 Aperio SVS WSI converted to the Google Maps format. Applications based on our method and libraries can be used upstream, as a calibration and quality control tool for WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow.
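
    The libraries' exact blur metric is not reproduced here, but a common no-reference sharpness proxy, the variance of the Laplacian, conveys how a per-tile score can drive accept/re-scan decisions; the tile path and threshold below are hypothetical.

    ```python
    # Common no-reference sharpness proxy (variance of the Laplacian) with OpenCV.
    # This is not the paper's own metric; the tile path and threshold are hypothetical.
    import cv2

    def sharpness_score(tile_path):
        gray = cv2.imread(tile_path, cv2.IMREAD_GRAYSCALE)
        if gray is None:
            raise FileNotFoundError(tile_path)
        return cv2.Laplacian(gray, cv2.CV_64F).var()   # higher = sharper

    if __name__ == "__main__":
        score = sharpness_score("wsi_tile_0001.png")
        verdict = "sharp enough for reading" if score > 100.0 else "flag tile for re-scan"
        print(verdict, round(score, 1))
    ```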

  15. How credit scores can make a difference for your revenue cycle.

    PubMed

    Jackson, Garett

    2008-02-01

    Questions an organization should consider before implementing credit scoring include: What action will be taken once a score is obtained? Should the score impact the workflow? Are the appropriate tools available to alter the workflow? How will accounts receivable staff be redirected or allocated? What is the risk tolerance for scoring mistakes?

  16. Defining Usability Heuristics for Adoption and Efficiency of an Electronic Workflow Document Management System

    ERIC Educational Resources Information Center

    Fuentes, Steven

    2017-01-01

    Usability heuristics have been established for different uses and applications as general guidelines for user interfaces. These can affect the implementation of industry solutions and play a significant role regarding cost reduction and process efficiency. The area of electronic workflow document management (EWDM) solutions, also known as…

  17. New data model with better functionality for VLab

    NASA Astrophysics Data System (ADS)

    da Silveira, P. R.; Wentzcovitch, R. M.; Karki, B. B.

    2009-12-01

    The VLab infrastructure and architecture were further developed to support several new features. First, workflows for first-principles calculations of thermodynamic properties and static elasticity, programmed in Java as Web Services, can now be executed by multiple users. Second, jobs generated by these workflows can now be executed in batch on multiple servers. A simple internal scheduler was implemented to handle hundreds of execution packages generated by multiple users and to avoid overloading the servers. Third, a new data model was implemented to guarantee the integrity of a project (workflow execution) in case of failure. A failure can occur in an execution package or in a workflow phase. By recording all executed steps of a project, its execution can be resumed after dynamic alteration of parameters through the VLab Portal. Fourth, batch jobs can also be monitored through the portal. Better and faster interaction with servers is now achieved using Ajax technology. Finally, plots are now created on the VLab server using Gnuplot 4.2.2. Research supported by NSF grant ATM 0428774 (VLab). VLab is hosted by the Minnesota Supercomputing Institute.

  18. Scalable and cost-effective NGS genotyping in the cloud.

    PubMed

    Souilmi, Yassine; Lancaster, Alex K; Jung, Jae-Yoon; Rizzo, Ettore; Hawkins, Jared B; Powles, Ryan; Amzazi, Saaïd; Ghazal, Hassan; Tonellato, Peter J; Wall, Dennis P

    2015-10-15

    While next-generation sequencing (NGS) costs have plummeted in recent years, the cost and complexity of computation remain substantial barriers to the use of NGS in routine clinical care. The clinical potential of NGS will not be realized until robust and routine whole genome sequencing data can be accurately rendered to medically actionable reports within a time window of hours and at a cost in the tens of dollars. We take a step towards addressing this challenge by using COSMOS, a cloud-enabled workflow management system, to develop GenomeKey, an NGS whole genome analysis workflow. COSMOS implements complex workflows making optimal use of high-performance compute clusters. Here we show that the Amazon Web Services (AWS) implementation of GenomeKey via COSMOS provides a fast, scalable, and cost-effective analysis of both public benchmarking and large-scale heterogeneous clinical NGS datasets. Our systematic benchmarking reveals important new insights and considerations for achieving clinical turnaround of whole genome analysis, including workflow-management optimizations such as strategic batching of individual genomes and efficient cluster resource configuration.

  19. Top ten challenges when interfacing a laboratory information system to an electronic health record: Experience at a large academic medical center.

    PubMed

    Petrides, Athena K; Tanasijevic, Milenko J; Goonan, Ellen M; Landman, Adam B; Kantartjis, Michalis; Bates, David W; Melanson, Stacy E F

    2017-10-01

    Recent U.S. government regulations incentivize implementation of an electronic health record (EHR) with computerized order entry and structured results display. Many institutions have also chosen to interface their EHR to their laboratory information system (LIS). Reported long-term benefits include increased efficiency and improved quality and safety. In order to successfully implement an interfaced EHR-LIS, institutions must plan years in advance and anticipate the impact of an integrated system. It can be challenging to fully understand the technical, workflow and resource aspects and adequately prepare for a potentially protracted system implementation and the subsequent stabilization. We describe the top ten challenges that we encountered in our clinical laboratories following the implementation of an interfaced EHR-LIS and offer suggestions on how to overcome these challenges. This study was performed at a 777-bed, tertiary care center which recently implemented an interfaced EHR-LIS. Challenges were recorded during EHR-LIS implementation and stabilization and the authors describe the top ten. Our top ten challenges were selection and harmonization of test codes, detailed training for providers on test ordering, communication with EHR provider champions during the build process, fluid orders and collections, supporting specialized workflows, sufficient reports and metrics, increased volume of inpatient venipunctures, adequate resources during stabilization, unanticipated changes to laboratory workflow and ordering specimens for anatomic pathology. A few suggestions to overcome these challenges include regular meetings with clinical champions, advanced considerations of reports and metrics that will be needed, adequate training of laboratory staff on new workflows in the EHR and defining all tests including anatomic pathology in the LIS. EHR-LIS implementations have many challenges requiring institutions to adapt and develop new infrastructures. This article should be helpful to other institutions facing or undergoing a similar endeavor. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Towards a Unified Architecture for Data-Intensive Seismology in VERCE

    NASA Astrophysics Data System (ADS)

    Klampanos, I.; Spinuso, A.; Trani, L.; Krause, A.; Garcia, C. R.; Atkinson, M.

    2013-12-01

    Modern seismology involves managing, storing and processing large datasets, typically geographically distributed across organisations. Performing computational experiments using these data generates more data, which in turn have to be managed, further analysed and frequently made available within or outside the scientific community. As part of the EU-funded project VERCE (http://verce.eu), we research and develop a number of use-cases, interfacing technologies to satisfy the data-intensive requirements of modern seismology. Our solution seeks to support: (1) familiar programming environments to develop and execute experiments, in particular via Python/ObsPy, (2) a unified view of heterogeneous computing resources, public or private, through the adoption of workflows, (3) monitoring the experiments and validating the data products at varying granularities, via a comprehensive provenance system, (4) reproducibility of experiments and consistency in collaboration, via a shared registry of processing units and contextual metadata (computing resources, data, etc.). Here, we provide a brief account of these components and their roles in the proposed architecture. Our design integrates heterogeneous distributed systems, while allowing researchers to retain current practices and control data handling and execution via higher-level abstractions. At the core of our solution lies the workflow language Dispel. While Dispel can be used to express workflows at fine detail, it may also be used as part of meta- or job-submission workflows. User interaction can be provided through a visual editor or through custom applications on top of parameterisable workflows, which is the approach VERCE follows. According to our design, the scientist may use versions of Dispel/workflow processing elements offered by the VERCE library or override them by introducing custom scientific code, using ObsPy. This approach has the advantage that, while the scientist uses a familiar tool, the resulting workflow can be executed on a number of underlying stream-processing engines, such as STORM or OGSA-DAI, transparently. While making efficient use of arbitrarily distributed resources and large datasets is a priority, such processing requires adequate provenance tracking and monitoring. Hiding computation and orchestration details via a workflow system allows us to embed provenance harvesting where appropriate without impeding the user's regular working patterns. Our provenance model is based on the W3C PROV standard and can provide information of varying granularity regarding execution, systems and data consumption/production. A video demonstrating a prototype provenance exploration tool can be found at http://bit.ly/15t0Fz0. Keeping experimental methodology and results open and accessible, as well as encouraging reproducibility and collaboration, is of central importance to modern science. As our users are expected to be based at different geographical locations, to have access to different computing resources and to employ customised scientific codes, the use of a shared registry of workflow components, implementations, data and computing resources is critical.
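
    The pattern of overriding a library processing element with custom scientific code can be sketched as follows. This is a hedged, plain-Python stand-in: the classes ProcessingElement, Detrend and MyCustomDetrend are hypothetical, and a real VERCE workflow would use Dispel processing elements and ObsPy rather than this toy executor.

```python
# Hypothetical stand-in for a Dispel-style stream processing element (PE).
# Real VERCE workflows use Dispel and ObsPy; this only sketches the pattern
# of overriding a library PE with custom scientific code.

class ProcessingElement:
    """Base element: consumes one data item, emits zero or more items."""
    def process(self, item):
        yield item                      # default: pass-through

class Detrend(ProcessingElement):
    """Library-provided element (illustrative): remove the mean of a trace."""
    def process(self, trace):
        mean = sum(trace) / len(trace)
        yield [x - mean for x in trace]

class MyCustomDetrend(Detrend):
    """Scientist override: e.g. delegate to obspy.Trace.detrend() instead."""
    def process(self, trace):
        # custom code would go here; keep the library behaviour as a fallback
        yield from super().process(trace)

def run_pipeline(elements, stream):
    """Naive local executor; an engine such as STORM would run this remotely."""
    for item in stream:
        items = [item]
        for pe in elements:
            items = [out for i in items for out in pe.process(i)]
        yield from items

if __name__ == "__main__":
    pipeline = [MyCustomDetrend()]
    for result in run_pipeline(pipeline, [[1.0, 2.0, 3.0]]):
        print(result)
```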

  1. wft4galaxy: a workflow testing tool for galaxy.

    PubMed

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container, the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.
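
    As an illustration of what such automated workflow testing involves, the sketch below runs a workflow through a supplied runner callable and compares each output file against a reference by checksum. It is not the wft4galaxy API or its test-definition format; check_workflow and dummy_runner are hypothetical names.

```python
# Illustrative sketch of automated workflow testing in the spirit of wft4galaxy.
# The runner callable is a placeholder; wft4galaxy's real API and test-definition
# format are documented in its repository.

import hashlib
import tempfile
from pathlib import Path

def sha256(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def check_workflow(run_workflow, workflow_file, inputs, expected_outputs):
    """Run a workflow via the supplied runner and diff each produced output
    against its expected reference file; returns a list of failure messages."""
    produced = run_workflow(workflow_file, inputs)
    failures = []
    for name, expected_path in expected_outputs.items():
        if name not in produced:
            failures.append(f"missing output: {name}")
        elif sha256(produced[name]) != sha256(expected_path):
            failures.append(f"output differs: {name}")
    return failures

if __name__ == "__main__":
    # Dummy runner standing in for a real Galaxy invocation.
    def dummy_runner(workflow_file, inputs):
        out = Path(tempfile.mkdtemp()) / "result.txt"
        out.write_text("hello\n")
        return {"result.txt": out}

    expected = Path(tempfile.mkdtemp()) / "result.txt"
    expected.write_text("hello\n")
    # In a CI job, a non-empty failure list would fail the build.
    print(check_workflow(dummy_runner, "wf.ga", {}, {"result.txt": expected}))
```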

  2. Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics

    PubMed Central

    Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A.; Caron, Christophe

    2015-01-01

    Summary: The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated whatever the language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. Availability and implementation: http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). Contact: contact@workflow4metabolomics.org PMID:25527831

  3. Separating Business Logic from Medical Knowledge in Digital Clinical Workflows Using Business Process Model and Notation and Arden Syntax.

    PubMed

    de Bruin, Jeroen S; Adlassnig, Klaus-Peter; Leitich, Harald; Rappelsberger, Andrea

    2018-01-01

    Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. Implementation of a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. We used the Business Process Model and Notation language system Activiti for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B, where institution-specific medical decision-making processes could be adapted without modifying the workflow business logic. Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.
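
    The separation described above can be sketched generically: the workflow code fixes the order of steps while a swappable rule function supplies the medical knowledge. This is a minimal illustration of the pattern, not the Activiti/ARDENSUITE integration, and the rule content below is invented.

```python
# Sketch of separating workflow business logic from medical decision logic.
# It mimics the pattern (workflow step -> external rule service), not the
# actual Activiti/ARDENSUITE integration described in the paper.

def hbv_prophylaxis_rule(patient):
    """Stand-in for an Arden-Syntax medical logic module.
    Institutions could swap this function without touching the workflow."""
    if patient.get("mother_hbsag_positive"):
        return ["administer HBIG", "start hepatitis B vaccination series"]
    return ["start routine hepatitis B vaccination series"]

def newborn_workflow(patient, decision_rule):
    """Business logic only: ordering of steps, documentation, notifications."""
    steps = ["register newborn", "retrieve maternal HBsAg result"]
    steps += decision_rule(patient)          # medical knowledge is delegated
    steps.append("document actions in EHR")
    return steps

if __name__ == "__main__":
    print(newborn_workflow({"mother_hbsag_positive": True}, hbv_prophylaxis_rule))
```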

  4. Implementing conventional logic unconventionally: photochromic molecular populations as registers and logic gates.

    PubMed

    Chaplin, J C; Russell, N A; Krasnogor, N

    2012-07-01

    In this paper we detail experimental methods to implement registers, logic gates and logic circuits using populations of photochromic molecules exposed to sequences of light pulses. Photochromic molecules are molecules with two or more stable states that can be switched reversibly between states by illuminating with appropriate wavelengths of radiation. Registers are implemented by using the concentration of molecules in each state in a given sample to represent an integer value. The register's value can then be read using the intensity of a fluorescence signal from the sample. Logic gates have been implemented using a register whose inputs take the form of light pulses, realizing 1-input/1-output and 2-input/1-output logic gates. A proof-of-concept logic circuit is also demonstrated; coupled with the software workflow described, it illustrates the transition from a circuit design to the corresponding sequence of light pulses. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
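
    A toy numerical model can make the register/gate idea concrete: the fraction of molecules in one state encodes the stored value, light pulses shift the populations, and a thresholded fluorescence read-out yields a logic value. The rate constants and thresholds below are invented for illustration and are not taken from the paper.

```python
# Toy model of a photochromic "register": the fraction of molecules in state B
# encodes a value, light pulses switch populations, and a fluorescence signal
# proportional to that fraction is the read-out.  Rate constants are invented
# for illustration and are not taken from the paper.

def apply_pulse(frac_b, wavelength, duration, k_ab=0.8, k_ba=0.6):
    """Return the new fraction of molecules in state B after one light pulse."""
    if wavelength == "uv":        # drives A -> B
        return frac_b + (1.0 - frac_b) * (1.0 - (1.0 - k_ab) ** duration)
    if wavelength == "visible":   # drives B -> A
        return frac_b * (1.0 - k_ba) ** duration
    return frac_b

def read_fluorescence(frac_b, threshold=0.5):
    """Threshold the (normalised) fluorescence intensity into a logic value."""
    return 1 if frac_b >= threshold else 0

def and_gate(input1, input2):
    """2-input AND: only two consecutive UV pulses push the register above
    the read-out threshold."""
    frac_b = 0.0                               # reset register to state A
    for bit in (input1, input2):
        if bit:
            frac_b = apply_pulse(frac_b, "uv", duration=1)
    return read_fluorescence(frac_b, threshold=0.9)

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", and_gate(a, b))
```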

  5. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    NASA Astrophysics Data System (ADS)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. To run these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Considering these efforts, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires creating workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability concept (CGI). It enables recycling and sharing of workflows from different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept and DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely pre-deployed workflow engines, or submits workflow engines together with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages certificates needed to execute the workflows on different DCIs. Currently, SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within the project (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond it (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.

  6. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community; therefore, each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
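
    The idea of a platform-neutral tool description can be sketched as a single declarative record from which concrete invocations are generated. The dictionary layout below is illustrative only and is not the actual Common Tool Descriptor schema; PeakPicker and its flags are hypothetical.

```python
# Sketch of the idea behind a platform-neutral tool description: one
# declarative record of a command-line tool's parameters from which concrete
# invocations (or engine-specific wrappers) can be generated.  The layout is
# illustrative and is not the real Common Tool Descriptor schema.

TOOL = {
    "name": "PeakPicker",
    "executable": "peakpicker",
    "parameters": [
        {"name": "in",        "flag": "--in",        "type": "file",  "required": True},
        {"name": "out",       "flag": "--out",       "type": "file",  "required": True},
        {"name": "threshold", "flag": "--threshold", "type": "float", "default": 0.05},
    ],
}

def build_command(tool, values):
    """Render a concrete command line from the description and user values."""
    argv = [tool["executable"]]
    for p in tool["parameters"]:
        if p["name"] in values:
            argv += [p["flag"], str(values[p["name"]])]
        elif p.get("required"):
            raise ValueError(f"missing required parameter: {p['name']}")
        elif "default" in p:
            argv += [p["flag"], str(p["default"])]
    return argv

if __name__ == "__main__":
    print(" ".join(build_command(TOOL, {"in": "spectra.mzML", "out": "peaks.mzML"})))
```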

  7. The impact of electronic health records on workflow and financial measures in primary care practices.

    PubMed

    Fleming, Neil S; Becker, Edmund R; Culler, Steven D; Cheng, Dunlei; McCorkle, Russell; da Graca, Briget; Ballard, David J

    2014-02-01

    To estimate a commercially available ambulatory electronic health record's (EHR's) impact on workflow and financial measures. Administrative, payroll, and billing data were collected for 26 primary care practices in a fee-for-service network that rolled out an EHR on a staggered schedule from June 2006 through December 2008. An interrupted time series design was used. Staffing, visit intensity, productivity, volume, practice expense, payments received, and net income data were collected monthly for 2004-2009. Changes were evaluated 1-6, 7-12, and >12 months postimplementation. Data were accessed through a SQLserver database, transformed into SAS®, and aggregated by practice. Practice-level data were divided by full-time physician equivalents for comparisons across practices by month. Staffing and practice expenses increased following EHR implementation (3 and 6 percent after 12 months). Productivity, volume, and net income decreased initially but recovered to/close to preimplementation levels after 12 months. Visit intensity did not change significantly, and a secular trend offset the decrease in payments received. Expenses increased and productivity decreased following EHR implementation, but not as much or as persistently as might be expected. Longer term effects still need to be examined. © Health Research and Educational Trust.

  8. Biologically Relevant Heterogeneity: Metrics and Practical Insights.

    PubMed

    Gough, Albert; Stern, Andrew M; Maier, John; Lezon, Timothy; Shun, Tong-Ying; Chennubhotla, Chakra; Schurdak, Mark E; Haney, Steven A; Taylor, D Lansing

    2017-03-01

    Heterogeneity is a fundamental property of biological systems at all scales that must be addressed in a wide range of biomedical applications, including basic biomedical research, drug discovery, diagnostics, and the implementation of precision medicine. There are a number of published approaches to characterizing heterogeneity in cells in vitro and in tissue sections. However, there are no generally accepted approaches for the detection and quantitation of heterogeneity that can be applied in a relatively high-throughput workflow. This review and perspective emphasizes the experimental methods that capture multiplexed cell-level data, as well as the need for standard metrics of the spatial, temporal, and population components of heterogeneity. A recommendation is made for the adoption of a set of three heterogeneity indices that can be implemented in any high-throughput workflow to optimize the decision-making process. In addition, a pairwise mutual information method is suggested as an approach to characterizing the spatial features of heterogeneity, especially in tissue-based imaging. Furthermore, metrics for temporal heterogeneity are in the early stages of development. Example studies indicate that the analysis of functional phenotypic heterogeneity can be exploited to guide decisions in the interpretation of biomedical experiments, drug discovery, diagnostics, and the design of optimal therapeutic strategies for individual patients.
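
    One generic way to compute a pairwise mutual information measure over cell-level data is sketched below: each cell's discretised phenotype is compared with that of its nearest neighbour. This is not the authors' specific heterogeneity indices; the binning, neighbour definition and simulated data are assumptions made for illustration.

```python
# Generic sketch of a pairwise mutual information estimate between the
# discretised phenotype of a cell and that of its nearest neighbour, as one
# way to quantify spatial heterogeneity.  Binning and the neighbour definition
# are simplifications; this is not the paper's specific index.

import numpy as np

def mutual_information(x, y, bins=4):
    """MI (in bits) between two equal-length 1-D phenotype vectors."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def spatial_mi(positions, phenotype, bins=4):
    """Pair each cell with its nearest neighbour and compute MI of phenotypes."""
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = d.argmin(axis=1)
    return mutual_information(phenotype, phenotype[nn], bins=bins)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 100, size=(500, 2))
    pheno = rng.normal(size=500)                      # spatially random phenotype
    print(f"MI ~ {spatial_mi(pos, pheno):.3f} bits")  # near zero for random data
```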

  9. Using conceptual work products of health care to design health IT.

    PubMed

    Berry, Andrew B L; Butler, Keith A; Harrington, Craig; Braxton, Melissa O; Walker, Amy J; Pete, Nikki; Johnson, Trevor; Oberle, Mark W; Haselkorn, Jodie; Paul Nichol, W; Haselkorn, Mark

    2016-02-01

    This paper introduces a new, model-based design method for interactive health information technology (IT) systems. This method extends workflow models with models of conceptual work products. When the health care work being modeled is substantially cognitive, tacit, and complex in nature, graphical workflow models can become too complex to be useful to designers. Conceptual models complement and simplify workflows by providing an explicit specification for the information product they must produce. We illustrate how conceptual work products can be modeled using standard software modeling language, which allows them to provide fundamental requirements for what the workflow must accomplish and the information that a new system should provide. Developers can use these specifications to envision how health IT could enable an effective cognitive strategy as a workflow with precise information requirements. We illustrate the new method with a study conducted in an outpatient multiple sclerosis (MS) clinic. This study shows specifically how the different phases of the method can be carried out, how the method allows for iteration across phases, and how the method generated a health IT design for case management of MS that is efficient and easy to use. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Contextual cloud-based service oriented architecture for clinical workflow.

    PubMed

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Regarding the acceptance of systems within the healthcare domain, multiple papers have highlighted the importance of integrating tools with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment can be integrated with the specifications promoted by the eHealth European Interoperability Framework. The paper proposes a cloud-based service-oriented architecture that implements a context management system aligned with the HL7 standard known as CCOW.

  11. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms

    PubMed Central

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2017-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy-to-extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best uses of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237
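
    A minimal sketch of the kind of trade-off such scheduling algorithms balance is shown below: for each workflow request, pick the cheapest VM type whose estimated runtime still meets the deadline. This is an invented heuristic for illustration, not one of the four algorithms proposed in the paper.

```python
# Minimal sketch of a WFaaS-style scheduling heuristic: choose the cheapest VM
# type whose estimated runtime still meets the request deadline.  The VM types,
# prices and runtimes are invented; the paper's four algorithms are not
# reproduced here.

from dataclasses import dataclass

@dataclass
class VmType:
    name: str
    speedup: float         # relative to a baseline VM
    price_per_hour: float

@dataclass
class WorkflowRequest:
    name: str
    baseline_hours: float  # estimated runtime on the baseline VM
    deadline_hours: float

def schedule(request, vm_types):
    """Return (vm, runtime, cost) for the cheapest deadline-meeting VM type."""
    feasible = []
    for vm in vm_types:
        runtime = request.baseline_hours / vm.speedup
        if runtime <= request.deadline_hours:
            feasible.append((runtime * vm.price_per_hour, runtime, vm))
    if not feasible:
        return None                                    # no VM meets the deadline
    cost, runtime, vm = min(feasible, key=lambda t: t[0])
    return vm, runtime, cost

if __name__ == "__main__":
    vms = [VmType("small", 1.0, 0.10), VmType("large", 4.0, 0.50)]
    req = WorkflowRequest("montage-run", baseline_hours=8.0, deadline_hours=3.0)
    print(schedule(req, vms))   # -> large VM, 2 h, $1.00
```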

  12. An Auto-management Thesis Program WebMIS Based on Workflow

    NASA Astrophysics Data System (ADS)

    Chang, Li; Jie, Shi; Weibo, Zhong

    An auto-management WebMIS based on workflow for a bachelor thesis program is presented in this paper. A module for workflow dispatching is designed and realized using MySQL and J2EE according to the working principle of a workflow engine. The module can automatically dispatch the workflow according to the system date, login information and the work status of the user. The WebMIS shifts management from manual work to computer-supported work, which not only standardizes the thesis program but also keeps the data and documents clean and consistent.

  13. Ergonomic design for dental offices.

    PubMed

    Ahearn, David J; Sanders, Martha J; Turcotte, Claudia

    2010-01-01

    The increasing complexity of the dental office environment influences productivity and workflow for dental clinicians. Advances in technology, and with it the range of products needed to provide services, have led to sprawl in operatory setups and the potential for awkward postures for dental clinicians during the delivery of oral health services. Although ergonomics often addresses the prevention of musculoskeletal disorders for specific populations of workers, concepts of workflow and productivity are integral to improved practice in work environments. This article provides suggestions for improving workflow and productivity for dental clinicians. The article applies ergonomic principles to dental practice issues such as equipment and supply management, office design, and workflow management. Implications for improved ergonomic processes and future research are explored.

  14. Clinician user involvement in the real world: Designing an electronic tool to improve interprofessional communication and collaboration in a hospital setting.

    PubMed

    Tang, Terence; Lim, Morgan E; Mansfield, Elizabeth; McLachlan, Alexander; Quan, Sherman D

    2018-02-01

    User involvement is vital to the success of health information technology implementation. However, involving clinician users effectively and meaningfully in complex healthcare organizations remains challenging. The objective of this paper is to share our real-world experience of applying a variety of user involvement methods in the design and implementation of a clinical communication and collaboration platform aimed at facilitating care of complex hospitalized patients by an interprofessional team of clinicians. We designed and implemented an electronic clinical communication and collaboration platform in a large community teaching hospital. The design team consisted of both technical and healthcare professionals. Agile software development methodology was used to facilitate rapid iterative design and user input. We involved clinician users at all stages of the development lifecycle using a variety of user-centered, user co-design, and participatory design methods. Thirty-six software releases were delivered over 24 months. User involvement has resulted in improvement in user interface design, identification of software defects, creation of new modules that facilitated workflow, and identification of necessary changes to the scope of the project early on. A variety of user involvement methods were complementary and benefited the design and implementation of a complex health IT solution. Combining these methods with agile software development methodology can turn designs into functioning clinical system to support iterative improvement. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Routine Digital Pathology Workflow: The Catania Experience

    PubMed Central

    Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana

    2017-01-01

    Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized them sticking to scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also creates an opportunity for establishing standardization of workflow processes in the pathology laboratory. PMID:29416914

  16. Routine Digital Pathology Workflow: The Catania Experience.

    PubMed

    Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana

    2017-01-01

    Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized them sticking to scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also creates an opportunity for establishing standardization of workflow processes in the pathology laboratory.

  17. Bridging the guideline implementation gap: a systematic, document-centered approach to guideline implementation.

    PubMed

    Shiffman, Richard N; Michel, George; Essaihi, Abdelwaheb; Thornquist, Elizabeth

    2004-01-01

    A gap exists between the information contained in published clinical practice guidelines and the knowledge and information that are necessary to implement them. This work describes a process to systematize and make explicit the translation of document-based knowledge into workflow-integrated clinical decision support systems. This approach uses the Guideline Elements Model (GEM) to represent the guideline knowledge. Implementation requires a number of steps to translate the knowledge contained in guideline text into a computable format and to integrate the information into clinical workflow. The steps include: (1) selection of a guideline and specific recommendations for implementation, (2) markup of the guideline text, (3) atomization, (4) deabstraction and (5) disambiguation of recommendation concepts, (6) verification of rule set completeness, (7) addition of explanations, (8) building executable statements, (9) specification of origins of decision variables and insertions of recommended actions, (10) definition of action types and selection of associated beneficial services, (11) choice of interface components, and (12) creation of requirement specification. The authors illustrate these component processes using examples drawn from recent experience translating recommendations from the National Heart, Lung, and Blood Institute's guideline on management of chronic asthma into a workflow-integrated decision support system that operates within the Logician electronic health record system. Using the guideline document as a knowledge source promotes authentic translation of domain knowledge and reduces the overall complexity of the implementation task. From this framework, we believe that a better understanding of activities involved in guideline implementation will emerge.
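
    The end of this pipeline, an executable statement with declared decision-variable origins and an action type (steps 8-10), can be sketched as follows. The rule content is invented for illustration and is not taken from the NHLBI asthma guideline or the GEM markup.

```python
# Sketch of the "executable statement" end of the pipeline (steps 8-10 above):
# a recommendation reduced to decision variables with declared EHR origins and
# a recommended action.  The rule content is invented for illustration and is
# not taken from the NHLBI asthma guideline.

RULE = {
    "id": "asthma-example-1",
    "decision_variables": {
        # variable name -> where it originates in the record (step 9)
        "daytime_symptoms_per_week": "flowsheet:asthma_symptom_frequency",
        "current_controller_therapy": "medication_list",
    },
    "condition": lambda v: (v["daytime_symptoms_per_week"] > 2
                            and not v["current_controller_therapy"]),
    "action": {
        "type": "order_suggestion",                       # step 10: action type
        "text": "Consider initiating a low-dose inhaled corticosteroid.",
        "explanation": "Symptoms more than twice weekly without controller therapy.",
    },
}

def evaluate(rule, patient_values):
    """Fire the rule against values already pulled from the declared origins."""
    return rule["action"] if rule["condition"](patient_values) else None

if __name__ == "__main__":
    print(evaluate(RULE, {"daytime_symptoms_per_week": 4,
                          "current_controller_therapy": False}))
```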

  18. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses that uses multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. By adopting rationality as a social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  19. Experimental Test Rig for Optimal Control of Flexible Space Robotic Arms

    DTIC Science & Technology

    2016-12-01

    was used to refine the test bed design and the experimental workflow. Three concepts incorporated various strategies to design a robust flexible link ... designed to perform the experimentation. The first and second concepts use traditional elastic springs in varying configurations while a third uses a

  20. SU-E-J-73: Extension of a Clinical OIS/EMR/R&V System to Deliver Safe and Efficient Adaptive Plan-Of-The-Day Treatments Using a Fully Customizable Plan-Library-Based Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akhiat, A.; Elekta, Sunnyvale, CA; Kanis, A.P.

    Purpose: To extend a clinical Record and Verify (R&V) system to enable a safe and fast workflow for Plan-of-the-Day (PotD) adaptive treatments based on patient-specific plan libraries. Methods: Plan libraries for PotD adaptive treatments contain for each patient several pre-treatment generated treatment plans. They may be generated for various patient anatomies or CTV-PTV margins. For each fraction, a Cone Beam CT scan is acquired to support the selection of the plan that best fits the patient’s anatomy-of-the-day. To date, there are no commercial R&V systems that support PotD delivery strategies. Consequently, the clinical workflow requires many manual interventions. Moreover, multiple scheduled plans have a high risk of excessive dose delivery. In this work we extended a commercial R&V system (MOSAIQ) to support PotD workflows using IQ-scripting. The PotD workflow was designed after extensive risk analysis of the manual procedure, and all identified risks were incorporated as logical checks. Results: All manual PotD activities were automated. The workflow first identifies if the patient is scheduled for PotD, then performs safety checks, and continues to treatment plan selection only if no issues were found. The user selects the plan to deliver from a list of candidate plans. After plan selection, the workflow makes the treatment fields of the selected plan available for delivery by adding them to the treatment calendar. Finally, control is returned to the R&V system to commence treatment. Additional logic was added to incorporate off-line changes such as updating the plan library. After extensive testing including treatment fraction interrupts and plan-library updates during the treatment course, the workflow is running successfully in a clinical pilot, in which 35 patients have been treated since October 2014. Conclusion: We have extended a commercial R&V system for improved safety and efficiency in library-based adaptive strategies enabling a wide-spread implementation of those strategies. This work was in part funded by a research grant of Elekta AB, Stockholm, Sweden.
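
    The control flow described in the results can be outlined in plain Python as a sketch; the function names and the specific safety checks below are illustrative placeholders rather than the MOSAIQ IQ-scripting implementation or the clinical rule set.

```python
# Outline of the plan-of-the-day control flow described above, written as a
# plain-Python sketch rather than MOSAIQ IQ-scripting.  Function names and the
# safety checks are illustrative placeholders, not the clinical rule set.

def potd_fraction(patient, plan_already_delivered_today, plan_library, select_plan):
    """Run one adaptive fraction: checks, plan selection, field scheduling."""
    if not patient.get("scheduled_for_potd"):
        return "not a PotD patient: fall back to the standard workflow"

    # Safety checks derived from a risk analysis (illustrative examples).
    if plan_already_delivered_today:
        return "blocked: a plan was already delivered for this fraction"
    if not plan_library:
        return "blocked: plan library is empty or not approved"

    chosen = select_plan(plan_library)          # user picks the plan matching the CBCT
    if chosen not in plan_library:
        return "blocked: selected plan is not in the approved library"

    # Make only the chosen plan's fields available on the treatment calendar.
    return f"fields of '{chosen}' scheduled; control returned to the R&V system"

if __name__ == "__main__":
    library = ["small-bladder plan", "medium-bladder plan", "large-bladder plan"]
    print(potd_fraction({"scheduled_for_potd": True}, False, library,
                        select_plan=lambda plans: plans[1]))
```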

  1. Care provider order entry (CPOE): a perspective on factors leading to success or to failure.

    PubMed

    Ozdas, A; Miller, R A

    2007-01-01

    Authors provide a perspective on factors leading to successful care provider order entry (CPOE) implementations. Viewpoint of authors supported by background literature review. Authors review both benefits and challenges related to CPOE implementation using three guiding principles: (1) a clinical approach to clinical systems, which claims that CPOE implementation is analogous to a "good" clinician delivering care to a patient; (2) a commitment to quality, which advocates that no compromises should be made in implementing system functionality and clinical system content - the highest objective for CPOE implementation is to provide better quality of care and increased safety for patients; (3) a commitment to fairness, as evidenced by respect for individuals and support of local autonomy, which advocates for minimizing disruptions to clinician-users' workflows, and adequate local control over CPOE system design and evolution, including clinical content management. Past experiences with CPOE implementation can inform future installation attempts. Sociocultural factors dominate in determining the success of implementation, and should govern technical factors.

  2. Clinical commissioning of an in vivo range verification system for prostate cancer treatment with anterior and anterior oblique proton beams

    NASA Astrophysics Data System (ADS)

    Hoesl, M.; Deepak, S.; Moteabbed, M.; Jassens, G.; Orban, J.; Park, Y. K.; Parodi, K.; Bentefour, E. H.; Lu, H. M.

    2016-04-01

    The purpose of this work is the clinical commissioning of a recently developed in vivo range verification system (IRVS) for treatment of prostate cancer by anterior and anterior oblique proton beams. The IRVS is designed to perform a complete workflow for pre-treatment range verification and adjustment. It contains specifically designed dosimetry and electronic hardware and specific software for workflow control with database connection to the treatment and imaging systems. An essential part of the IRVS system is an array of Si-diode detectors, designed to be mounted to the endorectal water balloon routinely used for prostate immobilization. The diodes can measure dose rate as a function of time, from which the water equivalent path length (WEPL) and the dose received are extracted. The former is used for pre-treatment beam range verification and correction, if necessary, while the latter is used to monitor the dose delivered to the patient's rectum during the treatment and serves as an additional verification. The entire IRVS workflow was tested for anterior and 30-degree inclined proton beams in both solid water and anthropomorphic pelvic phantoms, with the measured WEPL and rectal doses compared to the treatment plan. Gafchromic films were also used for measurement of the rectal dose and compared to IRVS results. The WEPL measurement accuracy was on the order of 1 mm, and after beam range correction the doses received by the rectal wall deviated from the treatment plan by 1.6% and 0.4%, respectively, for the anterior and anterior oblique fields. We believe the implementation of IRVS would make the treatment of the prostate with anterior proton beams more accurate and reliable.

  3. Implementation of a 'lean' cytopathology service: towards routine same-day reporting.

    PubMed

    Hewer, Ekkehard; Hammer, Caroline; Fricke-Vetsch, Daniela; Baumann, Cinzia; Perren, Aurel; Schmitt, Anja M

    2018-05-01

    To systematically assess the effects of a Lean management intervention in an academic cytopathology service. We monitored outcomes including specimen turnaround times during stepwise implementation of a lean cytopathology workflow for gynaecological and non-gynaecological cytology. The intervention resulted in a major reduction of turnaround times for both gynaecological (3rd quartile 4.1 vs 2.3 working days) and non-gynaecological cytology (3rd quartile 1.9 vs 1.2 working days). Introduction of fully electronic reporting had an additional effect beyond continuous staining of slides alone. The rate of non-gynaecological specimens reported the same day increased from 4.5% to 56.5% of specimens received before noon. Lean management principles provide a useful framework for organization of a cytopathology workflow. Stepwise implementation beginning with a simplified gynaecological cytology workflow allowed involved staff to monitor the effects of individual changes and allowed for a smooth transition. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Using AI and Semantic Web Technologies to attack Process Complexity in Open Systems

    NASA Astrophysics Data System (ADS)

    Thompson, Simon; Giles, Nick; Li, Yang; Gharib, Hamid; Nguyen, Thuc Duong

    Recently, many vendors and groups have advocated using BPEL and WS-BPEL as a workflow language to encapsulate business logic. While encapsulating workflow and process logic in one place is a sensible architectural decision, the implementation of complex workflows suffers from the same problems that made managing and maintaining hierarchical procedural programs difficult. BPEL lacks constructs for logical modularity such as the requirements construct from the STL [12] or the ability to adapt constructs like pure abstract classes for the same purpose. We describe a system that uses semantic web and agent concepts to implement an abstraction layer for BPEL based on the notion of Goals and service typing. AI planning was used to enable process engineers to create and validate systems that used services and goals as first-class concepts and compiled processes at run time for execution.
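
    The goal/service-typing abstraction can be sketched with a registry that records which goal each service achieves and which goals it requires, plus a trivial backward-chaining planner. The services and goals below are hypothetical, and this stands in for, rather than reproduces, the authors' BPEL-based system.

```python
# Sketch of the goal/service-typing idea: services are registered with the
# goal they achieve and the goals they require, and a trivial backward-chaining
# planner assembles an ordered service list for a requested goal.  This is an
# illustration of the abstraction layer, not the authors' system.

REGISTRY = [
    {"service": "CreditCheckService",   "achieves": "credit_checked", "requires": []},
    {"service": "AddressLookupService", "achieves": "address_known",  "requires": []},
    {"service": "QuoteService",         "achieves": "quote_issued",
     "requires": ["credit_checked", "address_known"]},
]

def plan(goal, registry, achieved=None):
    """Return an ordered list of services achieving `goal`, or None if stuck."""
    achieved = set() if achieved is None else achieved
    if goal in achieved:
        return []
    for entry in registry:
        if entry["achieves"] == goal:
            steps = []
            for sub in entry["requires"]:
                sub_steps = plan(sub, registry, achieved)
                if sub_steps is None:
                    break
                steps += sub_steps
            else:
                achieved.add(goal)
                return steps + [entry["service"]]
    return None

if __name__ == "__main__":
    print(plan("quote_issued", REGISTRY))
    # -> ['CreditCheckService', 'AddressLookupService', 'QuoteService']
```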

  5. Effects of the I-PASS Nursing Handoff Bundle on communication quality and workflow.

    PubMed

    Starmer, Amy J; Schnock, Kumiko O; Lyons, Aimee; Hehn, Rebecca S; Graham, Dionne A; Keohane, Carol; Landrigan, Christopher P

    2017-12-01

    Handoff communication errors are a leading source of sentinel events. We sought to determine the impact of a handoff improvement programme for nurses. We conducted a prospective pre-post intervention study on a paediatric intensive care unit in 2011-2012. The I-PASS Nursing Handoff Bundle intervention consisted of educational training, verbal handoff I-PASS mnemonic implementation, and visual materials to provide reinforcement and sustainability. We developed handoff direct observation and time motion workflow assessment tools to measure: (1) quality of the verbal handoff, including interruption frequency and presence of key handoff data elements; and (2) duration of handoff and other workflow activities. I-PASS implementation was associated with improvements in verbal handoff communications, including inclusion of illness severity assessment (37% preintervention vs 67% postintervention, p=0.001), patient summary (81% vs 95%, p=0.05), to do list (35% vs 100%, p<0.001) and an opportunity for the receiving nurse to ask questions (34% vs 73%, p<0.001). Overall, 13/21 (62%) of verbal handoff data elements were more likely to be present following implementation whereas no data elements were less likely present. Implementation was associated with a decrease in interruption frequency pre versus post intervention (67% vs 40% of handoffs with interruptions, p=0.005) without a change in the median handoff duration (18.8 min vs 19.9 min, p=0.48) or changes in time spent in direct or indirect patient care activities. Implementation of the I-PASS Nursing Handoff Bundle was associated with widespread improvements in the verbal handoff process without a negative impact on nursing workflow. Implementation of I-PASS for nurses may therefore have the potential to significantly reduce medical errors and improve patient safety. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  6. Implementation of HbA1c Point of Care Testing in 3 German Medical Practices: Impact on Workflow and Physician, Staff, and Patient Satisfaction.

    PubMed

    Patzer, Karl-Heinz; Ardjomand, Payam; Göhring, Katharina; Klempt, Guido; Patzelt, Andreas; Redzich, Markus; Zebrowski, Mathias; Emmerich, Susanne; Schnell, Oliver

    2018-05-01

    Medical practices face challenges of time and cost pressures with scarce resources. Point-of-care testing (POCT) has the potential to accelerate processes compared to central laboratory testing and can increase satisfaction of physicians, staff members, and patients. The objective of this study was to evaluate the effects of introducing HbA1c POCT in practices specialized in diabetes. Three German practices that manage 400, 550, and 950 diabetes patients per year participated in this evaluation. The workflow and required time before and after POCT implementation (device: Alere Afinion AS100 Analyzer) was evaluated in each practice. Physician (n = 5), staff (n = 9), and patient (n = 298) satisfaction was assessed with questionnaires and interviews. After POCT implementation the number of required visits scheduled was reduced by 80% (88% vs 17.6%, P < .0001), the number of venous blood collections by 75% (91% vs 23%, P < .0001). Of patients, 82% (vs 13% prior to POCT implementation) were able to discuss their HbA1c values with treating physicians immediately during their first visit ( P < .0001). In two of the practices the POCT process resulted in significant time savings of approximately 20 and 22 working days per 1000 patients per year (95% CI 2-46; 95% CI 10-44). All physicians indicated that POCT HbA1c implementation improved the practice workflow and all experienced a relief of burden for the office and the patients. All staff members indicated that they found the POCT measurement easy to perform and experienced a relief of burden. The majority (61.3%) of patients found the capillary blood collection more pleasant and 83% saw an advantage in the immediate availability of HbA1c results. The implementation of HbA1c POCT leads to an improved practice workflow and increases satisfaction of physicians, staff members and patients.

  7. Workflow Design Using Fragment Composition

    NASA Astrophysics Data System (ADS)

    Mosser, Sébastien; Blay-Fornarino, Mireille; France, Robert

    The Service-Oriented Architecture (Soa) paradigm supports the assembly of atomic services to create applications that implement complex business processes. Assembly can be accomplished by service orchestrations defined by Soa architects. The Adore method allows Soa architects to model complex orchestrations of services by composing models of smaller orchestrations called orchestration fragments. The Adore method can also be used to weave fragments that address new concerns into existing application models. In this paper we illustrate how the Adore method can be used to separate and compose process aspects in a Soa design of the Car Crash Crisis Management System. The paper also includes a discussion of the benefits and limitations of the Adore method.

  8. Trigger Menu-aware Monitoring for the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Hoad, Xanthe; ATLAS Collaboration

    2017-10-01

    We present a “trigger menu-aware” monitoring system designed for the Run-2 data-taking of the ATLAS experiment at the LHC. Unlike Run-1, where a change in the trigger menu had to be matched by the installation of a new software release at Tier-0, the new monitoring system aims to simplify the ATLAS operational workflows. This is achieved by integrating monitoring updates in a quick and flexible manner via an Oracle DB interface. We present the design and the implementation of the menu-aware monitoring, along with lessons from the operational experience of the new system with the 2016 collision data.

  9. Preappointment testing for BRAF/KIT mutation in advanced melanoma: a model in molecular data delivery for individualized medicine.

    PubMed

    Mounajjed, Taofic; Brown, Char L; Stern, Therese K; Bjorheim, Annette M; Bridgeman, Andrew J; Rumilla, Kandelaria M; McWilliams, Robert R; Flotte, Thomas J

    2014-11-01

    The emergence of individualized medicine is driven by developments in precision diagnostics, epitomized by molecular testing. Because treatment decisions are being made based on such molecular data, data management is gaining major importance. Among data management challenges, creating workflow solutions for timely delivery of molecular data has become pivotal. This study aims to design and implement a scalable process that permits preappointment BRAF/KIT mutation analysis in melanoma patients, allowing molecular results necessary for treatment plans to be available before the patient's appointment. Process implementation aims to provide a model for efficient molecular data delivery for individualized medicine. We examined the existing process of BRAF/KIT testing in melanoma patients visiting our institution for oncology consultation. We created 5 working groups, each designing a specific segment of an alternative process that would allow preappointment BRAF/KIT testing and delivery of results. Data were captured and analyzed to evaluate the success of the alternative process. For 1 year, 35 (59%) of 55 patients had prior BRAF/KIT testing. The remaining 20 patients went through the new process of preappointment testing; results were available at the time of appointment for 12 patients (overall preappointment results availability, 85.5%). The overall process averaged 13.4 ± 4.7 days. In conclusion, we describe the successful implementation of a scalable workflow solution that permits preappointment BRAF/KIT mutation analysis and result delivery in melanoma patients. This sets the stage for further applications of this model to other conditions, answering an increasing demand for robust delivery of molecular data for individualized medicine. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Process-driven information management system at a biotech company: concept and implementation.

    PubMed

    Gobbi, Alberto; Funeriu, Sandra; Ioannou, John; Wang, Jinyi; Lee, Man-Ling; Palmer, Chris; Bamford, Bob; Hewitt, Robin

    2004-01-01

    While established pharmaceutical companies have chemical information systems in place to manage their compounds and the associated data, new startup companies need to implement these systems from scratch. Decisions made early in the design phase usually have long lasting effects on the expandability, maintenance effort, and costs associated with the information management system. Careful analysis of work and data flows, both inter- and intradepartmental, and identification of existing dependencies between activities are important. This knowledge is required to implement an information management system, which enables the research community to work efficiently by avoiding redundant registration and processing of data and by timely provision of the data whenever needed. This paper first presents the workflows existing at Anadys, then ARISE, the research information management system developed in-house at Anadys. ARISE was designed to support the preclinical drug discovery process and covers compound registration, analytical quality control, inventory management, high-throughput screening, lower throughput screening, and data reporting.

  11. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    NASA Astrophysics Data System (ADS)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-09-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers.

  12. Architecture of next-generation information management systems for digital radiology enterprises

    NASA Astrophysics Data System (ADS)

    Wong, Stephen T. C.; Wang, Huili; Shen, Weimin; Schmidt, Joachim; Chen, George; Dolan, Tom

    2000-05-01

    Few information systems today offer a clear and flexible means to define and manage the automated part of radiology processes. None of them provide a coherent and scalable architecture that can easily cope with heterogeneity and inevitable local adaptation of applications. Most importantly, they often lack a model that can integrate clinical and administrative information to aid better decisions in managing resources, optimizing operations, and improving productivity. Digital radiology enterprises require cost-effective solutions to deliver information to the right person in the right place and at the right time. We propose a new architecture of image information management systems for digital radiology enterprises. Such a system is based on the emerging technologies in workflow management, distributed object computing, and Java and Web techniques, as well as Philips' domain knowledge in radiology operations. Our design adapts the approach of '4+1' architectural view. In this new architecture, PACS and RIS will become one while the user interaction can be automated by customized workflow process. Clinical service applications are implemented as active components. They can be reasonably substituted by applications of local adaptations and can be multiplied for fault tolerance and load balancing. Furthermore, it will provide powerful query and statistical functions for managing resources and improving productivity in real time. This work will lead to a new direction of image information management in the next millennium. We will illustrate the innovative design with implemented examples of a working prototype.

  13. A population-based approach for implementing change from opt-out to opt-in research permissions

    PubMed Central

    Oates, Jim C.; Shoaibi, Azza; Obeid, Jihad S.; Habrat, Melissa L.; Warren, Robert W.; Brady, Kathleen T.; Lenert, Leslie A.

    2017-01-01

    Due to recently proposed changes in the Common Rule regarding the collection of research preferences, there is an increased need for efficient methods to document opt-in research preferences at a population level. Previously, our institution developed an opt-out paper-based workflow that could not be utilized for research in a scalable fashion. This project was designed to demonstrate the feasibility of implementing an electronic health record (EHR)-based active opt-in research preferences program. The first phase of implementation required creating and disseminating a patient questionnaire through the EHR portal to populate discrete fields within the EHR indicating patients’ preferences for future research study contact (contact) and their willingness to allow anonymised use of excess tissue and fluid specimens (biobank). In the second phase, the questionnaire was presented within a clinic nurse intake workflow in an obstetrical clinic. These permissions were tabulated in registries for use by investigators for feasibility studies and recruitment. The registry was also used for research patient contact management using a new EHR encounter type to differentiate research from clinical encounters. The research permissions questionnaire was sent to 59,670 patients via the EHR portal. Within four months, 21,814 responses (75% willing to participate in biobanking, and 72% willing to be contacted for future research) were received. Each response was recorded within a patient portal encounter to enable longitudinal analysis of responses. We obtained a significantly lower positive response from the 264 females who completed the questionnaire in the obstetrical clinic (55% volunteering for the biobank and 52% for contact). We demonstrate that it is possible to establish a research permissions registry using the EHR portal and clinic-based workflows. This patient-centric, population-based, opt-in approach documents preferences in the EHR, allowing linkage of these preferences to health record information. PMID:28441388

  14. A Coordinated Patient Transport System for ICU Patients Requiring Surgery: Impact on Operating Room Efficiency and ICU Workflow.

    PubMed

    Brown, Michael J; Kor, Daryl J; Curry, Timothy B; Marmor, Yariv; Rohleder, Thomas R

    2015-01-01

    Transfer of intensive care unit (ICU) patients to the operating room (OR) is a resource-intensive, time-consuming process that often results in patient throughput inefficiencies, deficiencies in information transfer, and suboptimal nurse to patient ratios. This study evaluates the implementation of a coordinated patient transport system (CPTS) designed to address these issues. Using data from 1,557 patient transfers covering the 2006-2010 period, interrupted time series and before and after designs were used to analyze the effect of implementing a CPTS at Mayo Clinic, Rochester. Using a segmented regression for the interrupted time series, on-time OR start time deviations were found to be significantly lower after the implementation of CPTS (p < .0001). The implementation resulted in a fourfold improvement in on-time OR starts (p < .01) while significantly reducing idle OR time (p < .01). A coordinated patient transfer process for moving patients from ICUs to ORs can significantly improve OR efficiency, reduce non-value-added time, and ensure quality of care by preserving appropriate care provider to patient ratios.
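
    A minimal sketch of the segmented-regression step used in interrupted time series analyses like the one above. The monthly data, column names, and effect sizes are invented for illustration; only the modeling pattern (baseline trend, level change, slope change) reflects the study design.

    ```python
    # Segmented regression for an interrupted time series (illustrative only;
    # the data and column names below are hypothetical, not the CPTS dataset).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 48                                   # 24 months before, 24 after the intervention
    df = pd.DataFrame({
        "month": np.arange(n),
        "post": (np.arange(n) >= 24).astype(int),     # 1 after go-live
    })
    df["months_since_change"] = np.where(df["post"] == 1, df["month"] - 24, 0)

    # Simulated outcome: OR start-time deviation (minutes) with a level and slope change.
    df["start_deviation"] = (20 - 0.1 * df["month"]
                             - 8 * df["post"]
                             - 0.2 * df["months_since_change"]
                             + rng.normal(0, 2, n))

    # Baseline trend, level change at the intervention, and change in trend afterward.
    model = smf.ols("start_deviation ~ month + post + months_since_change", data=df).fit()
    print(model.summary())
    ```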

  15. Automatic Image Processing Workflow for the Keck/NIRC2 Vortex Coronagraph

    NASA Astrophysics Data System (ADS)

    Xuan, Wenhao; Cook, Therese; Ngo, Henry; Zawol, Zoe; Ruane, Garreth; Mawet, Dimitri

    2018-01-01

    The Keck/NIRC2 camera, equipped with the vortex coronagraph, is an instrument targeted at the high contrast imaging of extrasolar planets. To uncover a faint planet signal from the overwhelming starlight, we utilize the Vortex Image Processing (VIP) library, which carries out principal component analysis to model and remove the stellar point spread function. To bridge the gap between data acquisition and data reduction, we implement a workflow that 1) downloads, sorts, and processes data with VIP, 2) stores the analysis products into a database, and 3) displays the reduced images, contrast curves, and auxiliary information on a web interface. Both angular differential imaging and reference star differential imaging are implemented in the analysis module. A real-time version of the workflow runs during observations, allowing observers to make educated decisions about time distribution on different targets, hence optimizing science yield. The post-night version performs a standardized reduction after the observation, building up a valuable database that not only helps uncover new discoveries, but also enables a statistical study of the instrument itself. We present the workflow, and an examination of the contrast performance of the NIRC2 vortex with respect to factors including target star properties and observing conditions.
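
    To make the PCA step concrete, here is a schematic NumPy/SciPy sketch of PSF subtraction on an angular-differential-imaging cube. It is not the VIP library's API; the cube, parallactic angles, rotation convention, and component count are placeholder assumptions.

    ```python
    # Schematic PCA-based stellar PSF subtraction for an ADI cube (illustration
    # of the principle only, not the VIP implementation).
    import numpy as np
    from scipy.ndimage import rotate

    def pca_psf_subtract(cube, angles, n_pc=5):
        """cube: (n_frames, ny, nx) ADI frames; angles: parallactic angles in degrees."""
        n, ny, nx = cube.shape
        X = cube.reshape(n, -1)
        Xc = X - X.mean(axis=0)
        # Model the quasi-static stellar PSF with the leading principal components.
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        psf_model = U[:, :n_pc] @ np.diag(s[:n_pc]) @ Vt[:n_pc]
        residuals = (Xc - psf_model).reshape(n, ny, nx)
        # De-rotate the residual frames and median-combine to reveal faint companions.
        derotated = np.stack([rotate(frame, -ang, reshape=False, order=1)
                              for frame, ang in zip(residuals, angles)])
        return np.median(derotated, axis=0)

    # Toy usage with a synthetic cube.
    cube = np.random.default_rng(0).normal(size=(20, 64, 64))
    angles = np.linspace(0.0, 40.0, 20)
    final_image = pca_psf_subtract(cube, angles, n_pc=3)
    print(final_image.shape)
    ```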

  16. Enterprise-wide implementation of digital radiography in oral and maxillofacial imaging: the University of Florida Dentistry System.

    PubMed

    Nair, Madhu K; Pettigrew, James C; Loomis, Jeffrey S; Bates, Robert E; Kostewicz, Stephen; Robinson, Boyd; Sweitzer, Jean; Dolan, Teresa A

    2009-06-01

    The implementation of digital radiography in dentistry in a large healthcare enterprise setting is discussed. A distinct need for a dedicated dental picture archiving and communication systems (PACS) exists for seamless integration of different vendor products across the system. Complex issues are contended with as each clinical department migrated to a digital environment with unique needs and workflow patterns. The University of Florida has had a dental PACS installed over 2 years ago. This paper describes the process of conversion from film-based imaging from the planning stages through clinical implementation. Dentistry poses many unique challenges as it strives to achieve better integration with systems primarily designed for imaging; however, the technical requirements for high-resolution image capture in dentistry far exceed those in medicine, as most routine dental diagnostic tasks are challenging. The significance of specification, evaluation, vendor selection, installation, trial runs, training, and phased clinical implementation is emphasized.

  17. Reduction in specimen labeling errors after implementation of a positive patient identification system in phlebotomy.

    PubMed

    Morrison, Aileen P; Tanasijevic, Milenko J; Goonan, Ellen M; Lobo, Margaret M; Bates, Michael M; Lipsitz, Stuart R; Bates, David W; Melanson, Stacy E F

    2010-06-01

    Ensuring accurate patient identification is central to preventing medical errors, but it can be challenging. We implemented a bar code-based positive patient identification system for use in inpatient phlebotomy. A before-after design was used to evaluate the impact of the identification system on the frequency of mislabeled and unlabeled samples reported in our laboratory. Labeling errors fell from 5.45 per 10,000 before implementation to 3.2 per 10,000 afterward (P = .0013). An estimated 108 mislabeling events were prevented by the identification system in 1 year. Furthermore, a workflow step requiring manual preprinting of labels, which was accompanied by potential labeling errors in about one quarter of blood "draws," was removed as a result of the new system. After implementation, a higher percentage of patients reported having their wristband checked before phlebotomy. Bar code technology significantly reduced the rate of specimen identification errors.
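
    As a rough illustration of how such before/after rates can be compared, the sketch below runs a two-proportion z-test on the reported 5.45 vs. 3.2 per 10,000 rates. The specimen denominators are assumed values, not figures from the study.

    ```python
    # Two-proportion z-test on mislabeling rates; sample sizes are assumed
    # for illustration and are not taken from the published study.
    from statsmodels.stats.proportion import proportions_ztest

    samples_before, samples_after = 200_000, 200_000            # assumed specimen volumes
    errors_before = round(5.45 / 10_000 * samples_before)       # ~109 mislabeled specimens
    errors_after = round(3.2 / 10_000 * samples_after)          # ~64 mislabeled specimens

    stat, p_value = proportions_ztest(
        count=[errors_before, errors_after],
        nobs=[samples_before, samples_after],
    )
    print(f"z = {stat:.2f}, p = {p_value:.4f}")
    ```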

  18. Integrating Behavioral Health in Primary Care Using Lean Workflow Analysis: A Case Study

    PubMed Central

    van Eeghen, Constance; Littenberg, Benjamin; Holman, Melissa D.; Kessler, Rodger

    2016-01-01

    Background Primary care offices are integrating behavioral health (BH) clinicians into their practices. Implementing such a change is complex, difficult, and time consuming. Lean workflow analysis may be an efficient, effective, and acceptable method for integration. Objective Observe BH integration into primary care and measure its impact. Design Prospective, mixed methods case study in a primary care practice. Measurements Change in treatment initiation (referrals generating BH visits within the system). Secondary measures: primary care visits resulting in BH referrals, referrals resulting in scheduled appointments, time from referral to scheduled appointment, and time from referral to first visit. Providers and staff were surveyed on the Lean method. Results Referrals increased from 23 to 37/1000 visits (P<.001). Referrals resulted in more scheduled (60% to 74%, P<.001) and arrived visits (44% to 53%, P=.025). Time from referral to first scheduled visit decreased (Hazard Ratio (HR) 1.60; 95% Confidence Interval (CI) 1.37, 1.88; P<0.001) as did time to first arrived visit (HR 1.36; 95% CI 1.14, 1.62; P=0.001). Surveys and comments were positive. Conclusions This pilot integration of BH showed significant improvements in treatment initiation and other measures. Strengths of Lean included workflow improvement, system perspective, and project success. Further evaluation is indicated. PMID:27170796

  19. The Boston Marathon Bombings Mass Casualty Incident: One Emergency Department's Information Systems Challenges and Opportunities.

    PubMed

    Landman, Adam; Teich, Jonathan M; Pruitt, Peter; Moore, Samantha E; Theriault, Jennifer; Dorisca, Elizabeth; Harris, Sheila; Crim, Heidi; Lurie, Nicole; Goralnick, Eric

    2015-07-01

    Emergency department (ED) information systems are designed to support efficient and safe emergency care. These same systems often play a critical role in disasters to facilitate real-time situation awareness, information management, and communication. In this article, we describe one ED's experiences with ED information systems during the April 2013 Boston Marathon bombings. During postevent debriefings, staff shared that our ED information systems and workflow did not optimally support this incident; we found challenges with our unidentified patient naming convention, real-time situational awareness of patient location, and documentation of assessments, orders, and procedures. As a result, before our next mass gathering event, we changed our unidentified patient naming convention to more clearly distinguish multiple, simultaneous, unidentified patients. We also made changes to the disaster registration workflow and enhanced roles and responsibilities for updating electronic systems. Health systems should conduct disaster drills using their ED information systems to identify inefficiencies before an actual incident. ED information systems may require enhancements to better support disasters. Newer technologies, such as radiofrequency identification, could further improve disaster information management and communication but require careful evaluation and implementation into daily ED workflow. Copyright © 2014 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  20. Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state-of-the-art in scientific workflows have focused on the following areas; progress in each of these areas is described in subsequent sections. Workflow development: the development of a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows. Generic workflow components and templates: the development of generic actors (i.e., workflow components and processes) which can be broadly applied to scientific problems. Provenance collection and analysis: the design of a flexible provenance collection and analysis infrastructure within the workflow environment. Workflow reliability and fault tolerance: the improvement of the reliability and fault-tolerance of workflow environments.

  1. 3D-printed orthodontic brackets - proof of concept.

    PubMed

    Krey, Karl-Friedrich; Darkazanly, Nawras; Kühnert, Rolf; Ruge, Sebastian

    Today, orthodontic treatment with fixed appliances is usually carried out using preprogrammed straight-wire brackets made of metal or ceramics. The goal of this study was to determine the possibility of clinically implementing a fully digital workflow with individually designed and three-dimensionally printed (3D-printed) brackets. Edgewise brackets were designed using computer-aided design (CAD) software for demonstration purposes. After segmentation of the malocclusion model generated based on intraoral scan data, the brackets were digitally positioned on the teeth and a target occlusion model created. The thus-defined tooth position was used to generate a template for an individualized arch form in the horizontal plane. The base contours of the brackets were modified to match the shape of the tooth surfaces, and a positioning guide (fabricated beforehand) was used to ensure that the brackets were bonded at the correct angle and position. The brackets, positioning guide, and retainer splint, digitally designed on the target occlusion model, were 3D printed using a Digital Light Processing (DLP) 3D printer. The archwires were individually pre-bent using the template. In the treatment sequence, it was shown for the first time that, in principle, it is possible to perform treatment with an individualized 3D-printed brackets system by using the proposed fully digital workflow. Technical aspects of the system, problems encountered in treatment, and possible future developments are discussed in this article.

  2. An Improved Publication Process for the UMVF.

    PubMed

    Renard, Jean-Marie; Brunetaud, Jean-Marc; Cuggia, Marc; Darmoni, Stephan; Lebeux, Pierre; Beuscart, Régis

    2005-01-01

    The "Université Médicale Virtuelle Francophone" (UMVF) is a federation of French medical schools. Its main goal is to share the production and use of pedagogic medical resources generated by academic medical teachers. We developed an Open-Source application based upon a workflow system which provides an improved publication process for the UMVF. For teachers, the tool permits easy and efficient upload of new educational resources. For web masters it provides a mechanism to easily locate and validate the resources. For both the teachers and the web masters, the utility provides the control and communication functions that define a workflow system.For all users, students in particular, the application improves the value of the UMVF repository by providing an easy way to find a detailed description of a resource and to check any resource from the UMVF to ascertain its quality and integrity, even if the resource is an old deprecated version. The server tier of the application is used to implement the main workflow functionalities and is deployed on certified UMVF servers using the PHP language, an LDAP directory and an SQL database. The client tier of the application provides both the workflow and the search and check functionalities and is implemented using a Java applet through a W3C compliant web browser. A unique signature for each resource, was needed to provide security functionality and is implemented using the MD5 Digest algorithm. The testing performed by Rennes and Lille verified the functionality and conformity with our specifications.

  3. Electronic health records and patient safety: co-occurrence of early EHR implementation with patient safety practices in primary care settings.

    PubMed

    Tanner, C; Gans, D; White, J; Nath, R; Pohl, J

    2015-01-01

    The role of electronic health records (EHR) in enhancing patient safety, while substantiated in many studies, is still debated. This paper examines early EHR adopters in primary care to understand the extent to which EHR implementation is associated with the workflows, policies and practices that promote patient safety, as compared to practices with paper records. Early adoption is defined as use of an EHR prior to implementation of the Meaningful Use program. We utilized the Physician Practice Patient Safety Assessment (PPPSA) to compare primary care practices with fully implemented EHR to those utilizing paper records. The PPPSA measures the extent of adoption of patient safety practices in the domains of medication management, handoffs and transitions, personnel qualifications and competencies, practice management and culture, and patient communication. Data from 209 primary care practices responding between 2006 and 2010 were included in the analysis: 117 practices used paper medical records and 92 used an EHR. Results showed that, within all domains, EHR settings showed significantly higher rates of having workflows, policies and practices that promote patient safety than paper record settings. While these results were expected in the area of medication management, EHR use was also associated with adoption of patient safety practices in areas in which the researchers had no a priori expectations of association. Sociotechnical models of EHR use point to complex interactions between technology and other aspects of the environment related to human resources, workflow, policy, and culture, among others. This study identifies that, among primary care practices in the national PPPSA database, having an EHR was strongly empirically associated with the workflow, policy, communication and cultural practices recommended for safe patient care in ambulatory settings.

  4. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, coding sequences (CDS) continue to be discovered, and ever-larger CDS datasets are being released. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; both SOAP and Java Web Services (JWS) provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This new integrated automatic workflow will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.
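
    For readers who want a local, non-Taverna starting point, the sketch below performs the neighbor-joining and bootstrap-support steps with Biopython rather than the EMBOSS/PHYLIPNEW web services described above. The input file name, distance model, and replicate count are placeholders.

    ```python
    # Local neighbor-joining tree with bootstrap support using Biopython
    # (a stand-in for the DIST-NJ step; "cds_aligned.fasta" is a hypothetical
    # pre-aligned CDS set).
    from Bio import AlignIO
    from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor
    from Bio.Phylo.Consensus import bootstrap_trees, get_support

    alignment = AlignIO.read("cds_aligned.fasta", "fasta")

    calculator = DistanceCalculator("identity")
    constructor = DistanceTreeConstructor(calculator, method="nj")

    # Distance-matrix (NJ) tree inferred from the multiple sequence alignment.
    tree = constructor.build_tree(alignment)

    # Bootstrap support (1,000 replicates here; the workflow allows up to 20,000).
    replicates = bootstrap_trees(alignment, 1000, constructor)
    tree_with_support = get_support(tree, list(replicates))
    print(tree_with_support)
    ```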

  5. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-03-01

    At present, coding sequences (CDS) continue to be discovered, and ever-larger CDS datasets are being released. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; both SOAP and Java Web Services (JWS) provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This new integrated automatic workflow will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  6. Addressing fundamental architectural challenges of an activity-based intelligence and advanced analytics (ABIAA) system

    NASA Astrophysics Data System (ADS)

    Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.

    2015-06-01

    The domain of Geospatial Intelligence Analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based Tipping and Cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than can be processed by human analysts. In doing this, they must also maintain the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short term requirements; however, they have serious limitations that preclude them from being used effectively in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations providing an overview of an advanced ABIAA system and comparisons to legacy systems. It concludes with a recommended strategy and incremental approach to the research, development, and construction of a fully automated ABIAA system.

  7. The effects of hands-free communication device systems: communication changes in hospital organizations

    PubMed Central

    Richardson, Joshua E; Ash, Joan S

    2010-01-01

    Objective To analyze the effects that hands-free communication device (HCD) systems have on healthcare organizations from multiple user perspectives. Design This exploratory qualitative study recruited 26 subjects from multiple departments in two research sites located in Portland, Oregon: an academic medical center and a community hospital. Interview and observation data were gathered January through March, 2007. Measurements Data were analyzed using a grounded theory approach. Because this study was exploratory, data were coded and patterns identified until overall themes ‘emerged’. Results Five themes arose: (1) Communication access—the perception that HCD systems provide fast and efficient communication that supports workflow; (2) Control—social and technical considerations associated with use of an HCD system; (3) Training—processes that should be used to improve use of the HCD system; (4) Organizational change—changes to organizational design and behavior caused by HCD system implementation; and (5) Environment and infrastructure—HCD system use within the context of physical workspaces. Conclusion HCD systems improve communication access but users experience challenges integrating the system into workflow. Effective HCD use depends on how well organizations train users, adapt to changes brought about by HCD systems, and integrate HCD systems into physical surroundings. PMID:20064808

  8. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts

    PubMed Central

    Ravagli, Carlo; Pognan, Francois

    2017-01-01

    Summary: The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. Availability and implementation: The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. Contact: philippe.marc@novartis.com PMID:27605099

  9. Evaluating the benefits of digital pathology implementation: Time savings in laboratory logistics.

    PubMed

    Baidoshvili, Alexi; Bucur, Anca; van Leeuwen, Jasper; van der Laak, Jeroen; Kluin, Philip; van Diest, Paul J

    2018-06-20

    The benefits of digital pathology for workflow improvement and thereby cost savings in pathology, at least partly outweighing investment costs, are increasingly recognized. Successful implementations in a variety of scenarios start to demonstrate cost benefits of digital pathology for both research and routine diagnostics, contributing to a sound business case encouraging further adoption. To further support new adopters, there is still a need for detailed assessment of the impact this technology has on the relevant pathology workflows with emphasis on time saving. To assess the impact of digital pathology adoption on logistic laboratory tasks (i.e. not including pathologists' time for diagnosis making) in LabPON, a large regional pathology laboratory in The Netherlands. To quantify the benefits of digitization we analyzed the differences between the traditional analog and new digital workflows, carried out detailed measurements of all relevant steps in key analog and digital processes, and compared time spent. We modeled and assessed the logistic savings in five workflows: (1) Routine diagnosis, (2) Multi-disciplinary meeting, (3) External revision requests, (4) Extra stainings and (5) External consultation. On average over 19 working hours were saved on a typical day by working digitally, with the highest savings in routine diagnosis and multi-disciplinary meeting workflows. By working digitally, a significant amount of time could be saved in a large regional pathology lab with a typical case mix. We also present the data in each workflow per task and concrete logistic steps to allow extrapolation to the context and case mix of other laboratories. This article is protected by copyright. All rights reserved.

  10. Practical issues in implementing whole-genome-sequencing in routine diagnostic microbiology.

    PubMed

    Rossen, J W A; Friedrich, A W; Moran-Gilad, J

    2018-04-01

    Next generation sequencing (NGS) is increasingly being used in clinical microbiology. Like every new technology adopted in microbiology, the integration of NGS into clinical and routine workflows must be carefully managed. To review the practical aspects of implementing bacterial whole genome sequencing (WGS) in routine diagnostic laboratories. Review of the literature and expert opinion. In this review, we discuss when and how to integrate whole genome sequencing (WGS) in the routine workflow of the clinical laboratory. In addition, as microbiology laboratories have to adhere to various national and international regulations and criteria for their accreditation, we deliberate on quality control issues for using WGS in microbiology, including the importance of proficiency testing. Furthermore, the current and future place of this technology in the diagnostic hierarchy of microbiology is described, as well as the necessity of maintaining backwards compatibility with already established methods. Finally, we speculate on the question of whether WGS can entirely replace routine microbiology in the future, and on the tension between the fact that most sequencers are designed to process multiple samples in parallel and the fact that, for optimal diagnosis, one-by-one processing of samples is preferred. Special reference is made to the cost and turnaround time of WGS in diagnostic laboratories. Further development is required to improve the workflow for WGS, in particular to shorten the turnaround time, reduce costs, and streamline downstream data analyses. Only when these processes reach maturity will reliance on WGS for routine patient management and infection control management become feasible, enabling the transformation of clinical microbiology into a genome-based and personalized diagnostic field. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  11. A proof of concept for epidemiological research using structured reporting with pulmonary embolism as a use case.

    PubMed

    Daniel, Pinto Dos Santos; Sonja, Scheibl; Gordon, Arnhold; Aline, Maehringer-Kunz; Christoph, Düber; Peter, Mildenberger; Roman, Kloeckner

    2018-05-10

    This paper studies the possibilities of an integrated IT-based workflow for epidemiological research in pulmonary embolism using freely available tools and structured reporting. We included a total of 521 consecutive cases which had been referred to the radiology department for computed tomography pulmonary angiography (CTPA) with suspected pulmonary embolism (PE). Free-text reports were transformed into structured reports using a freely available IHE-MRRT-compliant reporting platform. D-dimer values were retrieved from the hospital's laboratory results system. All information was stored in the platform's database and visualized using freely available tools. For further analysis, we directly accessed the platform's database with an advanced analytics tool (RapidMiner). We were able to develop an integrated workflow for epidemiological statistics from reports obtained in clinical routine. The report data allowed for automated calculation of epidemiological parameters. Prevalence of pulmonary embolism was 27.6%. The mean age in patients with and without pulmonary embolism did not differ (62.8 years and 62.0 years, respectively, p=0.987). As expected, there was a significant difference in mean D-dimer values (10.13 mg/L FEU and 3.12 mg/L FEU, respectively, p<0.001). Structured reporting can make data obtained from clinical routine more accessible. Designing practical workflows is feasible using freely available tools and allows for the calculation of epidemiological statistics on a near real-time basis. Therefore, radiologists should push for the implementation of structured reporting in clinical routine. Advances in knowledge: Theoretical benefits of structured reporting have long been discussed, but practical implementation demonstrating those benefits has been lacking. Here we present a first experience providing proof that structured reporting will make data from clinical routine more accessible.
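
    A small pandas/SciPy sketch of the kind of near-real-time summary such a structured-reporting database enables: prevalence plus a nonparametric D-dimer comparison. The data frame and column names are simulated stand-ins, not the authors' platform schema or analysis code.

    ```python
    # Prevalence and D-dimer comparison from structured-report data
    # (simulated data; the column names are hypothetical).
    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 521
    reports = pd.DataFrame({
        "pe_positive": rng.random(n) < 0.276,                     # structured PE finding
        "age": rng.normal(62, 15, n).round(),
        "d_dimer": rng.lognormal(mean=1.0, sigma=0.8, size=n),    # mg/L FEU
    })

    prevalence = reports["pe_positive"].mean()
    print(f"PE prevalence: {prevalence:.1%}")

    pe = reports.loc[reports["pe_positive"], "d_dimer"]
    no_pe = reports.loc[~reports["pe_positive"], "d_dimer"]
    u, p = stats.mannwhitneyu(pe, no_pe, alternative="two-sided")
    print(f"D-dimer difference: U = {u:.0f}, p = {p:.3g}")
    ```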

  12. Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Seah, Chin

    2009-01-01

    During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide an historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example of how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices, what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but with actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so that "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.

  13. RNA-DNA hybrid (R-loop) immunoprecipitation mapping: an analytical workflow to evaluate inherent biases

    PubMed Central

    Halász, László; Karányi, Zsolt; Boros-Oláh, Beáta; Kuik-Rózsa, Tímea; Sipos, Éva; Nagy, Éva; Mosolygó-L, Ágnes; Mázló, Anett; Rajnavölgyi, Éva; Halmos, Gábor; Székvölgyi, Lóránt

    2017-01-01

    The impact of R-loops on the physiology and pathology of chromosomes has been demonstrated extensively by chromatin biology research. The progress in this field has been driven by technological advancement of R-loop mapping methods that largely relied on a single approach, DNA-RNA immunoprecipitation (DRIP). Most of the DRIP protocols use the experimental design that was developed by a few laboratories, without paying attention to the potential caveats that might affect the outcome of RNA-DNA hybrid mapping. To assess the accuracy and utility of this technology, we pursued an analytical approach to estimate inherent biases and errors in the DRIP protocol. By performing DRIP-sequencing, qPCR, and receiver operator characteristic (ROC) analysis, we tested the effect of formaldehyde fixation, cell lysis temperature, mode of genome fragmentation, and removal of free RNA on the efficacy of RNA-DNA hybrid detection and implemented workflows that were able to distinguish complex and weak DRIP signals in a noisy background with high confidence. We also show that some of the workflows perform poorly and generate random answers. Furthermore, we found that the most commonly used genome fragmentation method (restriction enzyme digestion) led to the overrepresentation of lengthy DRIP fragments over coding ORFs, and this bias was enhanced at the first exons. Biased genome sampling severely compromised mapping resolution and prevented the assignment of precise biological function to a significant fraction of R-loops. The revised workflow presented herein is established and optimized using objective ROC analyses and provides reproducible and highly specific RNA-DNA hybrid detection. PMID:28341774
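
    The ROC comparison of workflow variants could look roughly like the scikit-learn sketch below, where two sets of DRIP signal scores are evaluated against benchmark loci. The scores and ground-truth labels are simulated; only the evaluation pattern mirrors the paper.

    ```python
    # ROC comparison of two hypothetical DRIP workflow variants against a set
    # of benchmark loci (simulated scores and labels, not the published data).
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(42)
    truth = rng.integers(0, 2, size=500)                 # 1 = validated R-loop locus

    # Workflow A yields informative scores; workflow B is close to random.
    score_a = truth * rng.normal(2.0, 1.0, 500) + rng.normal(0.0, 1.0, 500)
    score_b = rng.normal(0.0, 1.0, 500)

    for name, score in [("workflow A", score_a), ("workflow B", score_b)]:
        fpr, tpr, _ = roc_curve(truth, score)
        auc = roc_auc_score(truth, score)
        print(f"{name}: AUC = {auc:.2f}, TPR at FPR<=0.1 = {tpr[fpr <= 0.1].max():.2f}")
    ```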

  14. The impact of secure messaging on workflow in primary care: Results of a multiple-case, multiple-method study.

    PubMed

    Hoonakker, Peter L T; Carayon, Pascale; Cartmill, Randi S

    2017-04-01

    Secure messaging is a relatively new addition to health information technology (IT). Several studies have examined the impact of secure messaging on (clinical) outcomes, but very few studies have examined the impact on workflow in primary care clinics. In this study we examined the impact of secure messaging on the workflow of clinicians, staff and patients. We used a multiple case study design with multiple data collection methods (observation, interviews and survey). Results show that secure messaging has the potential to improve communication and information flow and the organization of work in primary care clinics, partly due to the possibility of asynchronous communication. However, secure messaging can also have a negative effect on communication and increase workload, especially if patients send messages that are not appropriate for the secure messaging medium (for example, messages that are too long, complex, ambiguous, or inappropriate). Results show that clinicians are ambivalent about secure messaging. Secure messaging can add to their workload, especially if there is high message volume, and currently they are not compensated for these activities. Staff members are, especially compared with clinicians, relatively positive about secure messaging, and patients are overall very satisfied with it. Finally, clinicians, staff and patients think that secure messaging can have a positive effect on quality of care and patient safety. Secure messaging is a tool that has the potential to improve communication and information flow. However, the potential of secure messaging to improve workflow depends on the way it is implemented and used. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. A reliable computational workflow for the selection of optimal screening libraries.

    PubMed

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

    The experimental screening of compound collections is a common starting point in many drug discovery projects. Successes of such screening campaigns critically depend on the quality of the screened library. Many libraries are currently available from different vendors, yet the selection of the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented and a new blood-brain barrier permeation model was developed and validated (85% and 74% success rates for the training and test sets, respectively). Diversity and similarity descriptors which demonstrated the best performance in terms of their ability to select either diverse or focused sets of compounds from three databases (Drug Bank, CMC and CHEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library, providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have devised and tested a new workflow for the rational selection of screening libraries under different scenarios. The current workflow was implemented using the Pipeline Pilot software, yet, owing to its use of generic components, it can be easily adapted and reproduced by computational groups interested in rational selection of screening libraries. Furthermore, the workflow could be readily modified to include additional components. This workflow has been routinely used in our laboratory for the selection of libraries in multiple projects and consistently selects libraries which are well balanced across multiple parameters.
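
    As an example of the ADME/T filtering step (2), the sketch below applies standard Lipinski and Veber cut-offs with RDKit. It is not the authors' Pipeline Pilot implementation; the SMILES strings and the decision to combine the two rule sets into a single pass/fail flag are illustrative choices.

    ```python
    # Lipinski rule-of-five plus Veber-style filter as one triage step
    # (illustrative; not the published Pipeline Pilot workflow).
    from rdkit import Chem
    from rdkit.Chem import Descriptors, rdMolDescriptors

    def passes_ro5_and_veber(smiles: str) -> bool:
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return False                                  # drop unparsable entries at curation
        ro5 = (Descriptors.MolWt(mol) <= 500
               and Descriptors.MolLogP(mol) <= 5
               and Descriptors.NumHDonors(mol) <= 5
               and Descriptors.NumHAcceptors(mol) <= 10)
        veber = (Descriptors.TPSA(mol) <= 140
                 and rdMolDescriptors.CalcNumRotatableBonds(mol) <= 10)
        return ro5 and veber

    # Arbitrary example molecules: aspirin passes, a long-chain fatty acid fails.
    library = ["CC(=O)Oc1ccccc1C(=O)O", "CCCCCCCCCCCCCCCCCCCC(=O)O"]
    kept = [smi for smi in library if passes_ro5_and_veber(smi)]
    print(kept)
    ```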

  16. A technology training protocol for meeting QSEN goals: Focusing on meaningful learning.

    PubMed

    Luo, Shuhong; Kalman, Melanie

    2018-01-01

    The purpose of this paper is to describe and discuss how we designed and developed a 12-step technology training protocol. The protocol is meant to improve meaningful learning in technology education so that nursing students are able to meet the informatics requirements of Quality and Safety Education in Nursing competencies. When designing and developing the training protocol, we used a simplified experiential learning model that addressed the core features of meaningful learning: to connect new knowledge with students' prior knowledge and real-world workflow. Before training, we identified students' prior knowledge and workflow tasks. During training, students learned by doing, reflected on their prior computer skills and workflow, designed individualized procedures for integration into their workflow, and practiced the self-designed procedures in real-world settings. The trainer was a facilitator who provided a meaningful learning environment, asked the right questions to guide reflective conversation, and offered scaffoldings at critical moments. This training protocol could significantly improve nurses' competencies in using technologies and increase their desire to adopt new technologies. © 2017 Wiley Periodicals, Inc.

  17. Data processing workflow for time of flight polarized neutrons inelastic measurements

    DOE PAGES

    Savici, Andrei T.; Zaliznyak, Igor A.; Ovidiu Garlea, V.; ...

    2017-06-01

    We discuss the data processing workflow for polarized neutron scattering measurements performed at HYSPEC spectrometer at the Spallation Neutron Source, Oak Ridge National Laboratory. The effects of the focusing Heusler crystal polarizer and the wide-angle supermirror transmission polarization analyzer are added to the data processing flow of the non-polarized case. The implementation is done using the Mantid software package.

  18. Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.

    PubMed

    Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher

    2012-12-01

    As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around the clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution by 94 % from 2008 to 2011. The impact of unplanned downtimes was reduced by 72 % while the impact of planned downtimes was reduced by 99.66 % over the same period. Additionally more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 utilizing the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized if not eliminated entirely.

  19. Distributed information system architecture for Primary Health Care.

    PubMed

    Grammatikou, M; Stamatelopoulos, F; Maglaris, B

    2000-01-01

    We present a distributed architectural framework for Primary Health Care (PHC) Centres. Distribution is handled through the introduction of the Roaming Electronic Health Care Record (R-EHCR) and the use of local caching and incremental update of a global index. The proposed architecture is designed to accommodate a specific PHC workflow model. Finally, we discuss a pilot implementation in progress, which is based on CORBA and web-based user interfaces. However, the conceptual architecture is generic and open to other middleware approaches like the DHE or HL7.

  20. Enabling Real-time Water Decision Support Services Using Model as a Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.

    2014-12-01

    Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, which is a river routing model developed at University of Texas Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application has been prototyped in the San Antonio and Guadalupe River Basin in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
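
    A minimal "model as a service" pattern can be sketched with Flask as below: a model run wrapped behind an HTTP endpoint that accepts parameters and returns results as JSON. The endpoint path, parameters, and run_routing_model stub are hypothetical, not the actual RAPID service interface.

    ```python
    # Toy model-as-a-service endpoint (hypothetical interface, not the RAPID
    # web service itself).
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def run_routing_model(basin_id: str, inflow_scale: float) -> dict:
        """Placeholder for launching a river-routing run and collecting its output."""
        return {"basin": basin_id, "peak_discharge_m3s": 1234.5 * inflow_scale}

    @app.route("/api/routing/run", methods=["POST"])
    def run_model():
        params = request.get_json(force=True)
        result = run_routing_model(
            basin_id=params.get("basin_id", "san-antonio-guadalupe"),
            inflow_scale=float(params.get("inflow_scale", 1.0)),
        )
        return jsonify(result)                 # results can feed a browser-based viewer

    if __name__ == "__main__":
        app.run(port=8080)
    ```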

  1. Proving Value in Radiology: Experience Developing and Implementing a Shareable Open Source Registry Platform Driven by Radiology Workflow.

    PubMed

    Gichoya, Judy Wawira; Kohli, Marc D; Haste, Paul; Abigail, Elizabeth Mills; Johnson, Matthew S

    2017-10-01

    Numerous initiatives are in place to support value based care in radiology including decision support using appropriateness criteria, quality metrics like radiation dose monitoring, and efforts to improve the quality of the radiology report for consumption by referring providers. These initiatives are largely data driven. Organizations can choose to purchase proprietary registry systems, pay for software as a service solution, or deploy/build their own registry systems. Traditionally, registries are created for a single purpose like radiation dosage or specific disease tracking like diabetes registry. This results in a fragmented view of the patient, and increases overhead to maintain such single purpose registry system by requiring an alternative data entry workflow and additional infrastructure to host and maintain multiple registries for different clinical needs. This complexity is magnified in the health care enterprise whereby radiology systems usually are run parallel to other clinical systems due to the different clinical workflow for radiologists. In the new era of value based care where data needs are increasing with demand for a shorter turnaround time to provide data that can be used for information and decision making, there is a critical gap to develop registries that are more adapt to the radiology workflow with minimal overhead on resources for maintenance and setup. We share our experience of developing and implementing an open source registry system for quality improvement and research in our academic institution that is driven by our radiology workflow.

  2. Improved framework model to allocate optimal rainwater harvesting sites in small watersheds for agro-forestry uses

    NASA Astrophysics Data System (ADS)

    Terêncio, D. P. S.; Sanches Fernandes, L. F.; Cortes, R. M. V.; Pacheco, F. A. L.

    2017-07-01

    This study introduces an improved rainwater harvesting (RWH) suitability model to support the implementation of agro-forestry projects (irrigation, wildfire combat) in catchments. The model combines a planning workflow, which defines the suitability of catchments based on physical, socio-economic and ecologic variables, with an allocation workflow, which constrains suitable RWH sites as a function of project-specific features (e.g., distance from rainfall collection to application area). The planning workflow comprises a Multi Criteria Analysis (MCA) implemented on a Geographic Information System (GIS), whereas the allocation workflow is based on a multiple-parameter ranking analysis. Compared with other similar models, the improvement comes from the flexible weights of the MCA and the entire allocation workflow. The method is tested in a contaminated watershed (the Ave River basin) located in Portugal. The pilot project encompasses the irrigation of 400 ha of crop land that consumes 2.69 Mm3 of water per year. Using harvested water for irrigation replaces stream water carrying excessive anthropogenic nutrients, which may raise nitrosamine levels in food and lead to accumulation in the food chain, with severe consequences for human health (cancer). The selected rainfall collection catchment is capable of harvesting 12 Mm3·yr-1 (≈ 4.5 × the requirement) and lies roughly 3 km from the application area, assuring crop irrigation by gravity flow with modest transport costs. The RWH system is an 8-meter-high structure that can be built in earth at reduced cost.
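
    A back-of-the-envelope check of the harvestable volume, of the kind the allocation workflow performs, is sketched below. The rainfall depth, catchment area, and runoff coefficient are assumed values chosen only to reproduce the order of magnitude quoted above.

    ```python
    # Harvestable-volume estimate: volume = rainfall depth x catchment area x runoff coefficient.
    # All three inputs below are assumptions for illustration, not values from the study.
    annual_rainfall_m = 1.5          # assumed mean annual rainfall (m)
    catchment_area_m2 = 16e6         # assumed collection catchment (~16 km^2)
    runoff_coefficient = 0.5         # assumed fraction of rainfall captured

    harvested_m3 = annual_rainfall_m * catchment_area_m2 * runoff_coefficient
    demand_m3 = 2.69e6               # irrigation demand quoted in the study

    print(f"Harvestable: {harvested_m3 / 1e6:.1f} Mm3/yr "
          f"({harvested_m3 / demand_m3:.1f} x the 2.69 Mm3 demand)")
    ```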

  3. An Internet supported workflow for the publication process in UMVF (French Virtual Medical University).

    PubMed

    Renard, Jean-Marie; Bourde, Annabel; Cuggia, Marc; Garcelon, Nicolas; Souf, Nathalie; Darmoni, Stephan; Beuscart, Régis; Brunetaud, Jean-Marc

    2007-01-01

    The " Université Médicale Virtuelle Francophone" (UMVF) is a federation of French medical schools. Its main goal is to share the production and use of pedagogic medical resources generated by academic medical teachers. We developed an Open-Source application based upon a workflow system, which provides an improved publication process for the UMVF. For teachers, the tool permits easy and efficient upload of new educational resources. For web masters it provides a mechanism to easily locate and validate the resources. For librarian it provide a way to improve the efficiency of indexation. For all, the utility provides a workflow system to control the publication process. On the students side, the application improves the value of the UMVF repository by facilitating the publication of new resources and by providing an easy way to find a detailed description of a resource and to check any resource from the UMVF to ascertain its quality and integrity, even if the resource is an old deprecated version. The server tier of the application is used to implement the main workflow functionalities and is deployed on certified UMVF servers using the PHP language, an LDAP directory and an SQL database. The client tier of the application provides both the workflow and the search and check functionalities. A unique signature for each resource, was needed to provide security functionality and is implemented using a Digest algorithm. The testing performed by Rennes and Lille verified the functionality and conformity with our specifications.

  4. Evidence for electronic health record systems in physical therapy.

    PubMed

    Vreeman, Daniel J; Taggard, Samuel L; Rhine, Michael D; Worrell, Teddy W

    2006-03-01

    With increasing pressures to better manage clinical information, we investigated the role of electronic health record (EHR) systems in physical therapist practice through a critical review of the literature. We reviewed studies that met our predefined criteria after independent review by 3 authors. The investigators in all of the reviewed studies reported benefits, including improved reporting, operational efficiency, interdepartmental communication, data accuracy, and capability for future research. In 7 studies, the investigators reported barriers, including challenges with behavior modification, equipment inadequacy, and training. The investigators in all studies reported key success factors, including end-user participation, adequate training, workflow analysis, and data standardization. This review suggests that EHRs have potential benefits for physical therapists. The authors formed the following recommendations based on the studies' themes: (1) incorporate workflow analysis into system design and implementation; (2) include end users, especially clinicians, in system development; (3) devote significant resources for training; (4) plan and test carefully to ensure adequate software and hardware performance; and (5) commit to data standards.

  5. Metabolomics fingerprint of coffee species determined by untargeted-profiling study using LC-HRMS.

    PubMed

    Souard, Florence; Delporte, Cédric; Stoffelen, Piet; Thévenot, Etienne A; Noret, Nausicaa; Dauvergne, Bastien; Kauffmann, Jean-Michel; Van Antwerpen, Pierre; Stévigny, Caroline

    2018-04-15

    Coffee bean extracts are consumed all over the world as a beverage, and there is growing interest in coffee leaf extracts as food supplements. The wild diversity in the Coffea (Rubiaceae) genus is large and could offer new opportunities and challenges. In the present work, a metabolomics approach was implemented to examine the leaf chemical composition of 9 Coffea species grown in the same environmental conditions. Leaves were analyzed by LC-HRMS and a comprehensive statistical workflow was designed. It served for univariate hypothesis testing and multivariate modeling by PCA and PLS-DA on the Workflow4Metabolomics infrastructure. The first two axes of the PCA and PLS-DA models describe more than 40% of the variance, with good values of explained variance. This strategy made it possible to investigate the metabolomics data and their relation to botanical and genetic information. Finally, several key metabolites that discriminate between species were identified and further characterized. Copyright © 2017 Elsevier Ltd. All rights reserved.
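
    The multivariate step can be reproduced in outline with scikit-learn, as sketched below: PCA on the scaled feature matrix, and a PLS-DA-style model fit as PLS regression against one-hot species labels. The feature matrix and sample counts are simulated, not the Workflow4Metabolomics data.

    ```python
    # PCA and a PLS-DA-style model on a simulated LC-HRMS feature matrix
    # (illustrative only; not the published dataset or pipeline).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler, LabelBinarizer
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(7)
    X = rng.normal(size=(54, 800))                 # 54 leaf samples x 800 features (simulated)
    species = np.repeat([f"Coffea_{i}" for i in range(9)], 6)

    Xs = StandardScaler().fit_transform(X)

    pca = PCA(n_components=2).fit(Xs)
    print("PCA explained variance ratio:", pca.explained_variance_ratio_)

    # PLS-DA: PLS regression against one-hot encoded species labels.
    Y = LabelBinarizer().fit_transform(species)
    plsda = PLSRegression(n_components=2).fit(Xs, Y)
    scores = plsda.transform(Xs)                   # latent-variable scores for plotting
    print("PLS-DA score matrix shape:", scores.shape)
    ```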

  6. Workflow and Electronic Health Records in Small Medical Practices

    PubMed Central

    Ramaiah, Mala; Subrahmanian, Eswaran; Sriram, Ram D; Lide, Bettijoyce B

    2012-01-01

    This paper analyzes the workflow and implementation of electronic health record (EHR) systems across different functions in small physician offices. We characterize the differences in the offices based on the levels of computerization in terms of workflow, sources of time delay, and barriers to using EHR systems to support the entire workflow. The study was based on a combination of questionnaires, interviews, in situ observations, and data collection efforts. This study was not intended to be a full-scale time-and-motion study with precise measurements but was intended to provide an overview of the potential sources of delays while performing office tasks. The study follows an interpretive model of case studies rather than a large-sample statistical survey of practices. To identify time-consuming tasks, workflow maps were created based on the aggregated data from the offices. The results from the study show that specialty physicians are more favorable toward adopting EHR systems than primary care physicians are. The barriers to adoption of EHR systems by primary care physicians can be attributed to the complex workflows that exist in primary care physician offices, leading to nonstandardized workflow structures and practices. Also, primary care physicians would benefit more from EHR systems if the systems could interact with external entities. PMID:22737096

  7. Biologically Relevant Heterogeneity: Metrics and Practical Insights

    PubMed Central

    Gough, A; Stern, AM; Maier, J; Lezon, T; Shun, T-Y; Chennubhotla, C; Schurdak, ME; Haney, SA; Taylor, DL

    2017-01-01

    Heterogeneity is a fundamental property of biological systems at all scales that must be addressed in a wide range of biomedical applications including basic biomedical research, drug discovery, diagnostics and the implementation of precision medicine. There are a number of published approaches to characterizing heterogeneity in cells in vitro and in tissue sections. However, there are no generally accepted approaches for the detection and quantitation of heterogeneity that can be applied in a relatively high throughput workflow. This review and perspective emphasizes the experimental methods that capture multiplexed cell level data, as well as the need for standard metrics of the spatial, temporal and population components of heterogeneity. A recommendation is made for the adoption of a set of three heterogeneity indices that can be implemented in any high throughput workflow to optimize the decision-making process. In addition, a pairwise mutual information method is suggested as an approach to characterizing the spatial features of heterogeneity, especially in tissue-based imaging. Furthermore, metrics for temporal heterogeneity are in the early stages of development. Example studies indicate that the analysis of functional phenotypic heterogeneity can be exploited to guide decisions in the interpretation of biomedical experiments, drug discovery, diagnostics and the design of optimal therapeutic strategies for individual patients. PMID:28231035
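
    Two of the simpler indices discussed here, a Shannon entropy of a binned phenotype distribution and a pairwise mutual information between a marker and spatial position, can be computed as in the sketch below. The single-cell measurements are simulated and the binning choices are arbitrary assumptions.

    ```python
    # Simple heterogeneity metrics on simulated single-cell data
    # (illustrative; not the indices' published definitions or data).
    import numpy as np
    from scipy.stats import entropy
    from sklearn.metrics import mutual_info_score

    rng = np.random.default_rng(3)
    marker = np.concatenate([rng.normal(1, 0.2, 500), rng.normal(3, 0.4, 500)])  # bimodal population
    x_position = rng.uniform(0, 1, 1000)

    # Shannon entropy of the binned marker distribution (population heterogeneity).
    counts, _ = np.histogram(marker, bins=20)
    shannon_index = entropy(counts, base=2)
    print(f"Shannon heterogeneity index: {shannon_index:.2f} bits")

    # Pairwise mutual information between marker level and position (spatial component).
    marker_bins = np.digitize(marker, np.quantile(marker, np.linspace(0, 1, 6)[1:-1]))
    pos_bins = np.digitize(x_position, np.linspace(0, 1, 6)[1:-1])
    mi = mutual_info_score(marker_bins, pos_bins)
    print(f"Marker/position mutual information: {mi:.3f} nats")
    ```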

  8. System, apparatus and methods to implement high-speed network analyzers

    DOEpatents

    Ezick, James; Lethin, Richard; Ros-Giralt, Jordi; Szilagyi, Peter; Wohlford, David E

    2015-11-10

    Systems, apparatus and methods for the implementation of high-speed network analyzers are provided. A set of high-level specifications is used to define the behavior of the network analyzer, which is emitted by a compiler. An optimized inline workflow to process regular expressions is presented without sacrificing the semantic capabilities of the processing engine. An optimized packet dispatcher implements a subset of the functions implemented by the network analyzer, providing a fast and slow path workflow used to accelerate specific processing units. Such a dispatcher facility can also be used as a cache of policies, wherein if a policy is found, then packet manipulations associated with the policy can be quickly performed. An optimized method of generating DFA specifications for network signatures is also presented. The method accepts several optimization criteria, such as min-max allocations or optimal allocations based on the probability of occurrence of each signature input bit.
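
    To illustrate what a compiled DFA specification amounts to (a toy sketch, not the patented compiler output), a single byte signature can be reduced to a transition table that is walked once per input byte; the signature and test payloads below are hypothetical.

        # Toy table-driven DFA that flags packets containing the byte signature b"EVIL".
        # Illustrative only; real analyzers compile many signatures into one automaton.
        SIG = b"EVIL"

        def _next_state(sig: bytes, state: int, byte: int) -> int:
            """Length of the longest prefix of sig that is a suffix of the matched text + byte."""
            s = sig[:state] + bytes([byte])
            for k in range(min(len(sig), len(s)), 0, -1):
                if s.endswith(sig[:k]):
                    return k
            return 0

        # One 256-entry transition row per non-accepting state; state len(SIG) is accepting.
        DFA = [[_next_state(SIG, state, b) for b in range(256)] for state in range(len(SIG))]

        def packet_matches(payload: bytes) -> bool:
            state = 0
            for byte in payload:
                state = DFA[state][byte]
                if state == len(SIG):      # accepting state reached
                    return True
            return False

        assert packet_matches(b"xxEVILxx") and not packet_matches(b"EVIxL")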

  9. Guest Editor's Introduction

    NASA Astrophysics Data System (ADS)

    Chrysanthis, Panos K.

    1996-12-01

    Computer Science Department, University of Pittsburgh, Pittsburgh, PA 15260, USA This special issue focuses on current efforts to represent and support workflows that integrate information systems and human resources within a business or manufacturing enterprise. Workflows may also be viewed as an emerging computational paradigm for effective structuring of cooperative applications involving human users and access to diverse data types not necessarily maintained by traditional database management systems. A workflow is an automated organizational process (also called business process) which consists of a set of activities or tasks that need to be executed in a particular controlled order over a combination of heterogeneous database systems and legacy systems. Within workflows, tasks are performed cooperatively by either human or computational agents in accordance with their roles in the organizational hierarchy. The challenge in facilitating the implementation of workflows lies in developing efficient workflow management systems. A workflow management system (also called workflow server, workflow engine or workflow enactment system) provides the necessary interfaces for coordination and communication among human and computational agents to execute the tasks involved in a workflow and controls the execution orderings of tasks as well as the flow of data that these tasks manipulate. That is, the workflow management system is responsible for correctly and reliably supporting the specification, execution, and monitoring of workflows. The six papers selected (out of the twenty-seven submitted for this special issue of Distributed Systems Engineering) address different aspects of these three functional components of a workflow management system. In the first paper, `Correctness issues in workflow management', Kamath and Ramamritham discuss the important issue of correctness in workflow management that constitutes a prerequisite for the use of workflows in the automation of the critical organizational/business processes. In particular, this paper examines the issues of execution atomicity and failure atomicity, differentiating between correctness requirements of system failures and logical failures, and surveys techniques that can be used to ensure data consistency in workflow management systems. While the first paper is concerned with correctness assuming transactional workflows in which selective transactional properties are associated with individual tasks or the entire workflow, the second paper, `Scheduling workflows by enforcing intertask dependencies' by Attie et al, assumes that the tasks can be either transactions or other activities involving legacy systems. This second paper describes the modelling and specification of conditions involving events and dependencies among tasks within a workflow using temporal logic and finite state automata. It also presents a scheduling algorithm that enforces all stated dependencies by executing at any given time only those events that are allowed by all the dependency automata and in an order as specified by the dependencies. In any system with decentralized control, there is a need to effectively cope with the tension that exists between autonomy and consistency requirements. In `A three-level atomicity model for decentralized workflow management systems', Ben-Shaul and Heineman focus on the specific requirement of enforcing failure atomicity in decentralized, autonomous and interacting workflow management systems. 
Their paper describes a model in which each workflow manager must be able to specify the sequence of tasks that comprise an atomic unit for the purposes of correctness, and the degrees of local and global atomicity for the purpose of cooperation with other workflow managers. The paper also discusses a realization of this model in which treaties and summits provide an agreement mechanism, while underlying transaction managers are responsible for maintaining failure atomicity. The fourth and fifth papers are experience papers describing a workflow management system and a large scale workflow application, respectively. Schill and Mittasch, in `Workflow management systems on top of OSF DCE and OMG CORBA', describe a decentralized workflow management system and discuss its implementation using two standardized middleware platforms, namely, OSF DCE and OMG CORBA. The system supports a new approach to workflow management, introducing several new concepts such as data type management for integrating various types of data and quality of service for various services provided by servers. A problem common to both database applications and workflows is the handling of missing and incomplete information. This is particularly pervasive in an `electronic market' with a huge number of retail outlets producing and exchanging volumes of data, the application discussed in `Information flow in the DAMA project beyond database managers: information flow managers'. Motivated by the need for a method that allows a task to proceed in a timely manner if not all data produced by other tasks are available by its deadline, Russell et al propose an architectural framework and a language that can be used to detect, approximate and, later on, to adjust missing data if necessary. The final paper, `The evolution towards flexible workflow systems' by Nutt, is complementary to the other papers and is a survey of issues and of work related to both workflow and computer supported collaborative work (CSCW) areas. In particular, the paper provides a model and a categorization of the dimensions which workflow management and CSCW systems share. Besides summarizing the recent advancements towards efficient workflow management, the papers in this special issue suggest areas open to investigation and it is our hope that they will also provide the stimulus for further research and development in the area of workflow management systems.

  10. Modeling of outpatient prescribing process in Iran: a gateway toward electronic prescribing system.

    PubMed

    Ahmadi, Maryam; Samadbeik, Mahnaz; Sadoughi, Farahnaz

    2014-01-01

    Implementation of electronic prescribing system can overcome many problems of the paper prescribing system, and provide numerous opportunities of more effective and advantageous prescribing. Successful implementation of such a system requires complete and deep understanding of work content, human force, and workflow of paper prescribing. The current study was designed in order to model the current business process of outpatient prescribing in Iran and clarify different actions during this process. In order to describe the prescribing process and the system features in Iran, the methodology of business process modeling and analysis was used in the present study. The results of the process documentation were analyzed using a conceptual model of workflow elements and the technique of modeling "As-Is" business processes. Analysis of the current (as-is) prescribing process demonstrated that Iran stood at the first levels of sophistication in graduated levels of electronic prescribing, namely electronic prescription reference, and that there were problematic areas including bottlenecks, redundant and duplicated work, concentration of decision nodes, and communicative weaknesses among stakeholders of the process. Using information technology in some activities of medication prescription in Iran has not eliminated the dependence of the stakeholders on paper-based documents and prescriptions. Therefore, it is necessary to implement proper system programming in order to support change management and solve the problems in the existing prescribing process. To this end, a suitable basis should be provided for reorganization and improvement of the prescribing process for the future electronic systems.

  11. Touchscreen questionnaire patient data collection in rheumatology practice: development of a highly successful system using process redesign.

    PubMed

    Newman, Eric D; Lerch, Virginia; Jones, J B; Stewart, Walter

    2012-04-01

    While questionnaires have been developed to capture patient-reported outcomes (PROs) in rheumatology practice, these instruments are not widely used. We developed a touchscreen interface designed to provide reliable and efficient data collection. Using the touchscreen to obtain PROs, we compared 2 different workflow models implemented separately in 2 rheumatology clinics. The Plan-Do-Study-Act methodology was used in 2 cycles of workflow redesign. Cycle 1 relied on off-the-shelf questionnaire builder software, and cycle 2 relied on a custom-programmed software solution. During cycle 1, clinic 1 (private practice model, resource replete, simple flow) demonstrated a high completion rate at the start, averaging between 74% and 92% for the first 12 weeks. Clinic 2 (academic model, resource deficient, complex flow) did not achieve a consistent completion rate above 60%. The revised cycle 2 implementation protocol incorporated a 15-minute "nurse visit," an instant messaging system, and a streamlined authentication process, all of which contributed to substantial improvement in touchscreen questionnaire completion rates of ∼80% that were sustained without the need for any additional clinic staff support. Process redesign techniques and touchscreen technology were used to develop a highly successful, efficient, and effective process for the routine collection of PROs in a busy, complex, and resource-depleted academic practice and in a typical private practice. The successful implementation required a touchscreen questionnaire, human behavioral redesign, and other technical solutions. Copyright © 2012 by the American College of Rheumatology.

  12. Computer-aided dental prostheses construction using reverse engineering.

    PubMed

    Solaberrieta, E; Minguez, R; Barrenetxea, L; Sierra, E; Etxaniz, O

    2014-01-01

    The implementation of computer-aided design/computer-aided manufacturing (CAD/CAM) systems with virtual articulators, which take into account the kinematics, constitutes a breakthrough in the construction of customised dental prostheses. This paper presents a multidisciplinary protocol involving CAM techniques to produce dental prostheses. This protocol includes a step-by-step procedure using innovative reverse engineering technologies to transform completely virtual design processes into customised prostheses. A special emphasis is placed on a novel method that permits a virtual location of the models. The complete workflow includes the optical scanning of the patient, the use of reverse engineering software and, if necessary, the use of rapid prototyping to produce CAD temporary prostheses.

  13. Next-generation sequencing meets genetic diagnostics: development of a comprehensive workflow for the analysis of BRCA1 and BRCA2 genes

    PubMed Central

    Feliubadaló, Lídia; Lopez-Doriga, Adriana; Castellsagué, Ester; del Valle, Jesús; Menéndez, Mireia; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Gómez, Carolina; Campos, Olga; Pineda, Marta; González, Sara; Moreno, Victor; Brunet, Joan; Blanco, Ignacio; Serra, Eduard; Capellá, Gabriel; Lázaro, Conxi

    2013-01-01

    Next-generation sequencing (NGS) is changing genetic diagnosis due to its huge sequencing capacity and cost-effectiveness. The aim of this study was to develop an NGS-based workflow for routine diagnostics for hereditary breast and ovarian cancer syndrome (HBOCS), to improve genetic testing for BRCA1 and BRCA2. An NGS-based workflow was designed using BRCA MASTR kit amplicon libraries followed by GS Junior pyrosequencing. Data analysis combined the freely available Variant Identification Pipeline software and ad hoc R scripts, including a cascade of filters to generate coverage and variant-calling reports. A BRCA homopolymer assay was performed in parallel. A research scheme was designed in two parts. A Training Set of 28 DNA samples containing 23 unique pathogenic mutations and 213 other variants (33 unique) was used. The workflow was validated in a set of 14 samples from HBOCS families in parallel with the current diagnostic workflow (Validation Set). The NGS-based workflow developed permitted the identification of all pathogenic mutations and genetic variants, including those located in or close to homopolymers. The use of NGS for detecting copy-number alterations was also investigated. The workflow meets the sensitivity and specificity requirements for the genetic diagnosis of HBOCS and improves on the cost-effectiveness of current approaches. PMID:23249957

  14. Safety and feasibility of STAT RAD: Improvement of a novel rapid tomotherapy-based radiation therapy workflow by failure mode and effects analysis.

    PubMed

    Jones, Ryan T; Handsfield, Lydia; Read, Paul W; Wilson, David D; Van Ausdal, Ray; Schlesinger, David J; Siebers, Jeffrey V; Chen, Quan

    2015-01-01

    The clinical challenge of radiation therapy (RT) for painful bone metastases requires clinicians to consider both treatment efficacy and patient prognosis when selecting a radiation therapy regimen. The traditional RT workflow requires several weeks for common palliative RT schedules of 30 Gy in 10 fractions or 20 Gy in 5 fractions. At our institution, we have created a new RT workflow termed "STAT RAD" that allows clinicians to perform computed tomographic (CT) simulation, planning, and highly conformal single fraction treatment delivery within 2 hours. In this study, we evaluate the safety and feasibility of the STAT RAD workflow. A failure mode and effects analysis (FMEA) was performed on the STAT RAD workflow, including development of a process map, identification of potential failure modes, description of the cause and effect, temporal occurrence, and team member involvement in each failure mode, and examination of existing safety controls. A risk probability number (RPN) was calculated for each failure mode. As necessary, workflow adjustments were then made to safeguard failure modes of significant RPN values. After the workflow alterations, RPNs were recomputed. A total of 72 potential failure modes were identified in the pre-FMEA STAT RAD workflow, of which 22 met the RPN threshold for clinical significance. Workflow adjustments included the addition of a team member checklist, changing simulation from megavoltage CT to kilovoltage CT, alteration of patient-specific quality assurance testing, and allocating increased time for critical workflow steps. After these modifications, only 1 failure mode maintained RPN significance: patient motion after alignment or during treatment. Performing the FMEA for the STAT RAD workflow before clinical implementation has significantly strengthened the safety and feasibility of STAT RAD. The FMEA proved a valuable evaluation tool, identifying potential problem areas so that we could create a safer workflow. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
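
    In conventional FMEA (the abstract does not state the exact rating scales used for STAT RAD), each failure mode is scored for severity, occurrence and detectability, and the product of the three ratings gives the RPN used for ranking. A minimal bookkeeping sketch with hypothetical scores and a hypothetical significance cut-off:

        # Minimal FMEA sketch (hypothetical scores, not the STAT RAD data):
        # RPN = severity x occurrence x detectability, each rated on a 1-10 scale.
        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            description: str
            severity: int       # 1 (negligible) .. 10 (catastrophic)
            occurrence: int     # 1 (rare)       .. 10 (frequent)
            detectability: int  # 1 (always caught) .. 10 (never caught)

            @property
            def rpn(self) -> int:
                return self.severity * self.occurrence * self.detectability

        modes = [
            FailureMode("patient motion after alignment", 8, 4, 6),
            FailureMode("wrong CT dataset loaded",         9, 2, 3),
        ]
        THRESHOLD = 125  # assumed clinical-significance cut-off
        for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
            flag = "REVIEW" if fm.rpn >= THRESHOLD else "ok"
            print(f"{fm.rpn:4d}  {flag:6s} {fm.description}")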

  15. A python framework for environmental model uncertainty analysis

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a Python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment; FOSM) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example Jupyter notebooks that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of the parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
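
    A minimal sketch of the FOSM path described above, assuming a PEST-style control file and Jacobian already exist; the file names are placeholders and the exact call signatures should be checked against the pyEMU documentation.

        # Sketch of a linear (FOSM) analysis with pyEMU; "pest.pst" and "pest.jcb" are
        # placeholder names for an existing PEST control file and Jacobian matrix.
        import pyemu

        sc = pyemu.Schur(jco="pest.jcb", pst="pest.pst")   # Schur-complement FOSM machinery

        # Prior vs posterior forecast uncertainty (forecasts listed in the control file).
        print(sc.get_forecast_summary())

        # Which parameter groups contribute most to forecast uncertainty (data worth style question).
        print(sc.get_par_group_contribution())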

  16. Making the most of cloud storage - a toolkit for exploitation by WLCG experiments

    NASA Astrophysics Data System (ADS)

    Alvarez Ayllon, Alejandro; Arsuaga Rios, Maria; Bitzes, Georgios; Furano, Fabrizio; Keeble, Oliver; Manzi, Andrea

    2017-10-01

    Understanding how cloud storage can be effectively used, either standalone or in support of its associated compute, is now an important consideration for WLCG. We report on a suite of extensions to familiar tools targeted at enabling the integration of cloud object stores into traditional grid infrastructures and workflows. Notable updates include support for a number of object store flavours in FTS3, Davix and gfal2, including mitigations for lack of vector reads; the extension of Dynafed to operate as a bridge between grid and cloud domains; protocol translation in FTS3; the implementation of extensions to DPM (also implemented by the dCache project) to allow 3rd party transfers over HTTP. The result is a toolkit which facilitates data movement and access between grid and cloud infrastructures, broadening the range of workflows suitable for cloud. We report on deployment scenarios and prototype experience, explaining how, for example, an Amazon S3 or Azure allocation can be exploited by grid workflows.
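
    As a sketch of how such a toolkit is consumed from the client side, the same gfal2 API can address both grid and cloud storage URLs. The endpoint URLs are placeholders, authentication (X.509 proxy or S3 keys in the gfal2 configuration) is assumed to be set up already, and the calls should be checked against the gfal2-python documentation.

        # Sketch: using the gfal2 Python bindings to copy a file from grid storage to an
        # S3-style cloud endpoint. URLs are hypothetical placeholders.
        import gfal2

        ctx = gfal2.creat_context()

        src = "root://eosexample.cern.ch//eos/experiment/data/file.root"   # hypothetical
        dst = "s3://bucket.s3.example.com/backups/file.root"               # hypothetical

        params = ctx.transfer_parameters()
        params.overwrite = True
        params.checksum_check = True

        ctx.filecopy(params, src, dst)
        print(ctx.stat(dst).st_size, "bytes now at", dst)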

  17. Changes in medical errors after implementation of a handoff program.

    PubMed

    Starmer, Amy J; Spector, Nancy D; Srivastava, Rajendu; West, Daniel C; Rosenbluth, Glenn; Allen, April D; Noble, Elizabeth L; Tse, Lisa L; Dalal, Anuj K; Keohane, Carol A; Lipsitz, Stuart R; Rothschild, Jeffrey M; Wien, Matthew F; Yoon, Catherine S; Zigmont, Katherine R; Wilson, Karen M; O'Toole, Jennifer K; Solan, Lauren G; Aylor, Megan; Bismilla, Zia; Coffey, Maitreya; Mahant, Sanjay; Blankenburg, Rebecca L; Destino, Lauren A; Everhart, Jennifer L; Patel, Shilpa J; Bale, James F; Spackman, Jaime B; Stevenson, Adam T; Calaman, Sharon; Cole, F Sessions; Balmer, Dorene F; Hepps, Jennifer H; Lopreiato, Joseph O; Yu, Clifton E; Sectish, Theodore C; Landrigan, Christopher P

    2014-11-06

    Miscommunications are a leading cause of serious medical errors. Data from multicenter studies assessing programs designed to improve handoff of information about patient care are lacking. We conducted a prospective intervention study of a resident handoff-improvement program in nine hospitals, measuring rates of medical errors, preventable adverse events, and miscommunications, as well as resident workflow. The intervention included a mnemonic to standardize oral and written handoffs, handoff and communication training, a faculty development and observation program, and a sustainability campaign. Error rates were measured through active surveillance. Handoffs were assessed by means of evaluation of printed handoff documents and audio recordings. Workflow was assessed through time-motion observations. The primary outcome had two components: medical errors and preventable adverse events. In 10,740 patient admissions, the medical-error rate decreased by 23% from the preintervention period to the postintervention period (24.5 vs. 18.8 per 100 admissions, P<0.001), and the rate of preventable adverse events decreased by 30% (4.7 vs. 3.3 events per 100 admissions, P<0.001). The rate of nonpreventable adverse events did not change significantly (3.0 and 2.8 events per 100 admissions, P=0.79). Site-level analyses showed significant error reductions at six of nine sites. Across sites, significant increases were observed in the inclusion of all prespecified key elements in written documents and oral communication during handoff (nine written and five oral elements; P<0.001 for all 14 comparisons). There were no significant changes from the preintervention period to the postintervention period in the duration of oral handoffs (2.4 and 2.5 minutes per patient, respectively; P=0.55) or in resident workflow, including patient-family contact and computer time. Implementation of the handoff program was associated with reductions in medical errors and in preventable adverse events and with improvements in communication, without a negative effect on workflow. (Funded by the Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services, and others.).

  18. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images

    PubMed Central

    2014-01-01

    Background Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Methods Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. Results We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Conclusions Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow. PMID:25565494
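
    As a concrete illustration of a tile-level no-reference sharpness score (a common variance-of-Laplacian measure, not necessarily the specific metric implemented in the FlexMIm libraries), each WSI tile can be scored and thresholded as follows; the threshold value and tile file name are assumptions.

        # Common no-reference sharpness check for a single WSI tile: the variance of the
        # Laplacian response drops sharply on blurred tiles. Threshold is illustrative.
        import cv2

        def tile_sharpness(path: str) -> float:
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            if gray is None:
                raise FileNotFoundError(path)
            return cv2.Laplacian(gray, cv2.CV_64F).var()

        BLUR_THRESHOLD = 100.0                    # assumed cut-off; calibrate per scanner
        score = tile_sharpness("tile_0001.png")   # hypothetical tile file
        print("sharp" if score >= BLUR_THRESHOLD else "reacquire", score)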

  19. Enriching the Web Processing Service

    NASA Astrophysics Data System (ADS)

    Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer

    2014-05-01

    The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it specifies the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-granular, fundamental geospatial processes has been developed by the GIS community in the past. However, modern use cases or whole workflow processes demand specifications of lifecycle management and service orchestration. Orchestrating smaller sub-processes is a step towards interoperability; comprehensive documentation using appropriate metadata is also required. Though different approaches were tested in the past, developing complex WPS applications still requires programming skills, knowledge about the software libraries in use and a lot of integration effort. Our toolset RichWPS aims at providing a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim is to generate reproducible results; the workflow should be documented by metadata that can be integrated in Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance and publishing them as common processes. Therefore the server is oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes making use of a WPS-T interface. In order to deal with the results of these processing workflows, a server-side extension enables the RichWPS Server and its clients to use WPS presentation directives (WPS-PD), a content-related enhancement of the standardized WPS schema. We identified the essential requirements of the components of our toolset by applying two use cases. The first enables the simplified comparison of modeled and measured data, a common task in hydro-engineering to validate the accuracy of a model. An implementation of the workflow includes reading, harmonizing and comparing two datasets in NetCDF format. 2D water level data from the German Bight can be chosen, presented and evaluated in a web client with interactive plots. The second use case is motivated by the Marine Strategy Directive (MSD) of the EU, which demands monitoring, action plans and at least an evaluation of the ecological situation in the marine environment. Information techniques adapted to those of INSPIRE should be used. One of the parameters monitored and evaluated for the MSD is the expansion and quality of seagrass fields. With a view towards other evaluation parameters, we decompose the complex process of seagrass evaluation into reusable process steps and implement those packages as configurable WPS processes.
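
    For orientation, bare WPS 1.0.0 key-value-pair requests against a hypothetical endpoint look like the following; the endpoint URL, process identifier and inputs are placeholders, not part of RichWPS.

        # Sketch of WPS 1.0.0 KVP requests over plain HTTP; endpoint and process names
        # are hypothetical placeholders.
        import requests

        WPS_URL = "https://example.org/wps"   # placeholder endpoint

        caps = requests.get(WPS_URL, params={
            "service": "WPS", "request": "GetCapabilities", "AcceptVersions": "1.0.0"})
        print(caps.status_code, caps.headers.get("Content-Type"))

        execute = requests.get(WPS_URL, params={
            "service": "WPS", "version": "1.0.0", "request": "Execute",
            "identifier": "compareDatasets",                       # hypothetical process
            "DataInputs": "modelled=model.nc;measured=gauge.nc",   # hypothetical inputs
        })
        print(execute.text[:200])   # ExecuteResponse XML (truncated)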

  20. Chang'E-3 data pre-processing system based on scientific workflow

    NASA Astrophysics Data System (ADS)

    tan, xu; liu, jianjun; wang, yuanyuan; yan, wei; zhang, xiaoxia; li, chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, the Chang'E-3 data pre-processing system (CEDPS), based on scientific workflow, is proposed for the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure with the following capabilities: • describing a data processing task, including: 1) defining input and output data, 2) defining the data relationships, 3) defining the sequence of tasks, 4) defining the communication between tasks, 5) defining mathematical formulas, 6) defining the relationship between tasks and data; • automatic processing of tasks. Accordingly, describing a task is the key point that determines whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows; the three-level model for the workflow designer is discussed: 1) the data relationships are established through a product tree; 2) the process model is constructed based on a directed acyclic graph (DAG); in particular, a set of workflow constructs, including Sequence, Loop, Merge and Fork, are composable with one another; 3) to reduce the modeling complexity of the mathematical formulas using a DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
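
    A minimal sketch (not the CEDPS code) of the DAG-based process model described above: tasks declare their dependencies, and a topological sort yields an executable order while rejecting cycles. The task names are hypothetical stand-ins for CE3 processing steps.

        # Minimal DAG-based workflow sketch: tasks, dependencies, and a topological
        # execution order.
        from graphlib import TopologicalSorter   # Python >= 3.9

        dag = {
            "radiometric_correction": {"read_raw_packets"},
            "geometric_correction":   {"radiometric_correction"},
            "mosaic_product":         {"geometric_correction"},
            "read_raw_packets":       set(),
        }

        order = list(TopologicalSorter(dag).static_order())
        print("execution order:", order)   # raises CycleError if the graph is not a DAG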

  1. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    PubMed

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a cloud-enabled web interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org.

  2. RetroPath2.0: A retrosynthesis workflow for metabolic engineers.

    PubMed

    Delépine, Baudoin; Duigou, Thomas; Carbonell, Pablo; Faulon, Jean-Loup

    2018-01-01

    Synthetic biology applied to industrial biotechnology is transforming the way we produce chemicals. However, despite advances in the scale and scope of metabolic engineering, the research and development process still remains costly. In order to expand the chemical repertoire for the production of next generation compounds, a major engineering biology effort is required in the development of novel design tools that target chemical diversity through rapid and predictable protocols. Addressing that goal involves retrosynthesis approaches that explore the chemical biosynthetic space. However, the complexity associated with the large combinatorial retrosynthesis design space has often been recognized as the main challenge hindering the approach. Here, we provide RetroPath2.0, an automated open source workflow for retrosynthesis based on generalized reaction rules that performs the retrosynthesis search from chassis to target through an efficient and well-controlled protocol. Its ease of use and the versatility of its applications make this tool a valuable addition to the biological engineer's bench. We show through several examples the application of the workflow to biotechnologically relevant problems, including the identification of alternative biosynthetic routes through enzyme promiscuity or the development of biosensors. We demonstrate in that way the ability of the workflow to streamline retrosynthesis pathway design and its major role in reshaping the design, build, test and learn pipeline by driving the process toward the objective of optimizing bioproduction. The RetroPath2.0 workflow is built using tools developed by the bioinformatics and cheminformatics community; because it is open source, we anticipate that community contributions will further expand the features of the workflow. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
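
    As a toy illustration of applying a generalized (retro)reaction rule of the kind such workflows use (the SMARTS rule and target molecule below are simple placeholders, not rules shipped with RetroPath2.0), RDKit can apply a rule to a target and enumerate precursor candidates:

        # Toy retrosynthesis step with RDKit: apply a generalized amide-cleavage rule to
        # a target molecule and list precursor candidates. Rule and target are
        # illustrative placeholders only.
        from rdkit import Chem
        from rdkit.Chem import AllChem

        # Retro rule: amide -> carboxylic acid + amine
        retro_rule = AllChem.ReactionFromSmarts("[C:1](=[O:2])[N:3]>>[C:1](=[O:2])O.[N:3]")

        target = Chem.MolFromSmiles("CC(=O)NC")        # N-methylacetamide as a toy target

        for precursors in retro_rule.RunReactants((target,)):
            for mol in precursors:
                Chem.SanitizeMol(mol)
            print(" + ".join(Chem.MolToSmiles(mol) for mol in precursors))
        # -> CC(=O)O + CN   (acetic acid + methylamine)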

  3. Warpgroup: increased precision of metabolomic data processing by consensus integration bound analysis

    PubMed Central

    Mahieu, Nathaniel G.; Spalding, Jonathan L.; Patti, Gary J.

    2016-01-01

    Motivation: Current informatic techniques for processing raw chromatography/mass spectrometry data break down under several common, non-ideal conditions. Importantly, hydrophilic interaction liquid chromatography (a key separation technology for metabolomics) produces data which are especially challenging to process. We identify three critical points of failure in current informatic workflows: compound-specific drift, integration region variance, and naive missing value imputation. We implement the Warpgroup algorithm to address these challenges. Results: Warpgroup adds peak subregion detection, consensus integration bound detection, and intelligent missing value imputation steps to the conventional informatic workflow. When compared with the conventional workflow, Warpgroup made major improvements to the processed data. The coefficient of variation for peaks detected in replicate injections of a complex Escherichia coli extract was halved (a reduction of 19%). Integration regions across samples were much more robust. Additionally, many signals lost by the conventional workflow were ‘rescued’ by the Warpgroup refinement, thereby resulting in greater analyte coverage in the processed data. Availability and implementation: Warpgroup is an open source R package available on GitHub at github.com/nathaniel-mahieu/warpgroup. The package includes example data and XCMS compatibility wrappers for ease of use. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: nathaniel.mahieu@wustl.edu or gjpattij@wustl.edu PMID:26424859

  4. A strategy for systemic toxicity assessment based on non-animal approaches: The Cosmetics Europe Long Range Science Strategy programme.

    PubMed

    Desprez, Bertrand; Dent, Matt; Keller, Detlef; Klaric, Martina; Ouédraogo, Gladys; Cubberley, Richard; Duplan, Hélène; Eilstein, Joan; Ellison, Corie; Grégoire, Sébastien; Hewitt, Nicola J; Jacques-Jamin, Carine; Lange, Daniela; Roe, Amy; Rothe, Helga; Blaauboer, Bas J; Schepky, Andreas; Mahony, Catherine

    2018-08-01

    When performing safety assessment of chemicals, the evaluation of their systemic toxicity based only on non-animal approaches is a challenging objective. The Safety Evaluation Ultimately Replacing Animal Test programme (SEURAT-1) addressed this question from 2011 to 2015 and showed that further research and development of adequate tools in toxicokinetic and toxicodynamic are required for performing non-animal safety assessments. It also showed how to implement tools like thresholds of toxicological concern (TTCs) and read-across in this context. This paper shows a tiered scientific workflow and how each tier addresses the four steps of the risk assessment paradigm. Cosmetics Europe established its Long Range Science Strategy (LRSS) programme, running from 2016 to 2020, based on the outcomes of SEURAT-1 to implement this workflow. Dedicated specific projects address each step of this workflow, which is introduced here. It tackles the question of evaluating the internal dose when systemic exposure happens. The applicability of the workflow will be shown through a series of case studies, which will be published separately. Even if the LRSS puts the emphasis on safety assessment of cosmetic relevant chemicals, it remains applicable to any type of chemical. Copyright © 2018. Published by Elsevier Ltd.

  5. Workflow Optimization in Vertebrobasilar Occlusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamper, Lars, E-mail: lars.kamper@helios-kliniken.de; Meyn, Hannes; Rybacki, Konrad

    2012-06-15

    Objective: In vertebrobasilar occlusion, rapid recanalization is the only substantial means to improve the prognosis. We introduced a standard operating procedure (SOP) for interventional therapy to analyze the effects on interdisciplinary time management. Methods: Intrahospital time periods between hospital admission and neuroradiological intervention were retrospectively analyzed, together with the patients' outcome, before (n = 18) and after (n = 20) implementation of the SOP. Results: After implementation of the SOP, we observed statistically significant improvement of postinterventional patient neurological status (p = 0.017). In addition, we found a decrease of 5:33 h for the mean time period from hospital admission until neuroradiological intervention. The recanalization rate increased from 72.2% to 80% after implementation of the SOP. Conclusion: Our results underscore the relevance of SOP implementation and analysis of time management for clinical workflow optimization. Both may trigger awareness for the need of efficient interdisciplinary time management. This could be an explanation for the decreased time periods and improved postinterventional patient status after SOP implementation.

  6. User interface prototype for geospatial early warning systems - a tsunami showcase

    NASA Astrophysics Data System (ADS)

    Hammitzsch, M.; Lendholt, M.; Esbrí, M. Á.

    2012-03-01

    The command and control unit's graphical user interface (GUI) is a central part of early warning systems (EWS) for man-made and natural hazards. The GUI combines and concentrates the relevant information of the system and offers it to human operators. It has to support operators in successfully performing their tasks in complex workflows. Most notably in critical situations, when operators make important decisions in a limited amount of time, the command and control unit's GUI has to work reliably and stably, providing the relevant information and functionality with the required quality and in time. The design of the GUI application is essential in the development of any EWS to manage hazards effectively. The design and development of such a GUI is performed repeatedly for each EWS by various software architects and developers. Implementations differ based on their application in different domains. But similarities in designing and common approaches to implementing GUIs of EWS are not sufficiently harmonized with related activities and do not exploit possible synergy effects. Thus, the GUI implementation of an EWS for tsunamis is introduced step by step, providing a generic approach to be applied in each EWS for man-made and natural hazards.

  7. Eddy Covariance Method: Overview of General Guidelines and Conventional Workflow

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Anderson, D. J.; Amen, J. L.

    2007-12-01

    Atmospheric flux measurements are widely used to estimate water, heat, carbon dioxide and trace gas exchange between the ecosystem and the atmosphere. The Eddy Covariance method is one of the most direct, defensible ways to measure and calculate turbulent fluxes within the atmospheric boundary layer. However, the method is mathematically complex, and requires significant care to set up and process data. These reasons may be why the method is currently used predominantly by micrometeorologists. Modern instruments and software can potentially expand the use of this method beyond micrometeorology and prove valuable for plant physiology, hydrology, biology, ecology, entomology, and other non-micrometeorological areas of research. The main challenge of the method for a non-expert is the complexity of system design, implementation, and processing of the large volume of data. In the past several years, efforts of the flux networks (e.g., FluxNet, Ameriflux, CarboEurope, Fluxnet-Canada, Asiaflux, etc.) have led to noticeable progress in unification of the terminology and general standardization of processing steps. The methodology itself, however, is difficult to unify, because various experimental sites and different purposes of studies dictate different treatments, and site-, measurement- and purpose-specific approaches. Here we present an overview of theory and typical workflow of the Eddy Covariance method in a format specifically designed to (i) familiarize a non-expert with general principles, requirements, applications, and processing steps of the conventional Eddy Covariance technique, (ii) to assist in further understanding the method through more advanced references such as textbooks, network guidelines and journal papers, (iii) to help technicians, students and new researchers in the field deployment of the Eddy Covariance method, and (iv) to assist in its use beyond micrometeorology. The overview is based, to a large degree, on the frequently asked questions received from new users of the Eddy Covariance method and relevant instrumentation, and employs non-technical language to be of practical use to those new to this field. Information is provided on theory of the method (including state of methodology, basic derivations, practical formulations, major assumptions and sources of errors, error treatment, and use in non- traditional terrains), practical workflow (e.g., experimental design, implementation, data processing, and quality control), alternative methods and applications, and the most frequently overlooked details of the measurements. References and access to an extended 141-page Eddy Covariance Guideline in three electronic formats are also provided.
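
    For readers new to the method, the core calculation is simply the covariance between vertical wind speed and the scalar of interest over an averaging period, after Reynolds decomposition into means and fluctuations. A minimal sketch with synthetic data follows; it deliberately ignores the coordinate rotation, detrending, density and spectral corrections that a real processing workflow applies.

        # Minimal eddy-covariance sketch: flux = mean(w' * c') over the averaging period,
        # where w' and c' are fluctuations about the period means (Reynolds decomposition).
        # Synthetic 10 Hz data; real processing adds rotation, detrending and corrections.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 10 * 60 * 30                            # 30 min of 10 Hz samples
        w = rng.normal(0.0, 0.3, n)                 # vertical wind speed, m s-1
        c = 15.0 + 0.8 * w + rng.normal(0, 0.2, n)  # CO2 density, mmol m-3 (correlated with w)

        w_prime = w - w.mean()
        c_prime = c - c.mean()
        flux = np.mean(w_prime * c_prime)           # covariance = turbulent flux, mmol m-2 s-1

        print(f"F_c = {flux:.3f} mmol m-2 s-1")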

  8. Improving medical imaging report turnaround times: the role of technology.

    PubMed

    Marquez, Luis O; Stewart, Howard

    2005-01-01

    At Southern Ohio Medical Center (SOMC), the medical imaging department and the radiologists expressed a strong desire to improve workflow. The desire for improved workflow was a major motivating factor toward implementing a new RIS and speech recognition technology. The need to monitor workflow in a real-time fashion and to evaluate productivity and resources necessitated that a new solution be found. A decision was made to roll out both the new RIS product and speech recognition to maximize the resources to interface and implement the new solution. Prior to implementation of the new RIS, the medical imaging department operated in a conventional electronic-order-entry to paper request manner. The paper request followed the study through exam completion to the radiologist. SOMC entered into a contract with its PACS vendor to participate in beta testing and clinical trials for a new RIS product for the US market. Backup plans were created in the event the product failed to function as planned, either during the beta testing period or during clinical trials. The last piece of the technology puzzle to improve report turnaround time was voice recognition technology. Speech recognition enhanced the RIS technology as soon as it was implemented. The results show that the project has been a success. The new RIS, combined with speech recognition and the PACS, makes for a very effective solution to patient, exam, and results management in the medical imaging department.

  9. Impact of the digital revolution on the future of pharmaceutical formulation science.

    PubMed

    Leuenberger, Hans; Leuenberger, Michael N

    2016-05-25

    The ongoing digital revolution is no longer limited to the application of apps on the smartphone for daily needs but is starting to affect our professional life in formulation science as well. The software platform F-CAD (Formulation-Computer Aided Design) of CINCAP can be used to develop and test capsule and tablet formulations in silico. Such an approach allows the pharmaceutical industry to adopt the workflow of the automotive and aircraft industry. Thus, the first prototype of the drug delivery vehicle is prepared virtually by mimicking the composition (particle size distribution of the active drug substance and of the excipients within the tablet) and the process, such as direct compression to obtain a defined porosity. The software is based on a cellular automaton (CA) process mimicking the dissolution profile of the capsule or tablet formulation. To take account of the type of dissolution equipment and all SOPs (Standard Operating Procedures), such as a single punch press to manufacture the tablet, a calibration of the F-CAD dissolution profile of the virtual tablet is needed. Thus, the virtual tablet becomes a copy of the real tablet. This statement is valid for all tablets manufactured within the same formulation design space. For this reason, it is important to define the formulation design space already for Clinical Phase I and to work only within this formulation design space, consisting of the composition and the processes, during all the Clinical Phases. Thus, it is not recommended to start with a simple capsule formulation as a service dosage form and to change later to a market-ready tablet formulation. The availability of F-CAD is a necessary, but not a sufficient, condition to implement the workflow of the automotive and aircraft industry for developing and testing drug delivery vehicles. For a successful implementation of the new workflow, a harmonization of the equipment and the processes between the development and manufacturing departments is a must. In this context, the clinical samples for Clinical Phases I and II should be prepared with a mechanical simulator of the high-speed rotary press used for large batches for Clinical Phases III & IV. If not, the problem of working practically and virtually in different formulation design spaces will remain, causing billions of dollars in losses worldwide annually according to the study of Benson and MacCabe. The harmonization of equipment and processes needs a close cooperation between the industrial pharmacist and the pharmaceutical engineer. In addition, Virtual Equipment Simulators (VESs) of small- and large-scale equipment for training and computer-assisted scale-up would be desirable. A lean and intelligent management information and documentation system will improve the connectivity between the different workstations. Thus, in the future, it may be possible to rent F-CAD at low cost as an IT (Information Technology) platform based on a cloud computing solution. By adopting the workflow of the automotive and aircraft industry, significant savings, a reduced time to market, a lower attrition rate, and a much higher quality of the final marketed dosage form can be achieved. Copyright © 2016 Elsevier B.V. All rights reserved.
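
    A toy illustration of the cellular-automaton idea (in no way the F-CAD model, and with made-up porosity and dissolution parameters): a tablet cross-section is represented as a grid of solid and pore cells, and at each step solid cells in contact with solvent dissolve, yielding a cumulative release curve.

        # Toy 2D cellular automaton for tablet dissolution (illustrative only, not F-CAD):
        # solid cells may dissolve when a 4-neighbour cell is already solvent.
        import numpy as np

        rng = np.random.default_rng(7)
        SOLID, SOLVENT = 1, 0
        grid = np.where(rng.random((60, 60)) < 0.15, SOLVENT, SOLID)   # 15% initial porosity (assumed)
        grid[0, :] = grid[-1, :] = grid[:, 0] = grid[:, -1] = SOLVENT  # tablet surface in solvent

        initial_solid = int(grid.sum())
        release = []
        for step in range(200):
            wet = np.zeros_like(grid, dtype=bool)
            wet[1:, :]  |= grid[:-1, :] == SOLVENT
            wet[:-1, :] |= grid[1:, :]  == SOLVENT
            wet[:, 1:]  |= grid[:, :-1] == SOLVENT
            wet[:, :-1] |= grid[:, 1:]  == SOLVENT
            dissolving = (grid == SOLID) & wet & (rng.random(grid.shape) < 0.25)  # assumed rate
            grid[dissolving] = SOLVENT
            release.append(1.0 - grid.sum() / initial_solid)

        print("fraction released after 200 steps: %.2f" % release[-1])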

  10. Creating a comprehensive customer service program to help convey critical and acute results of radiology studies.

    PubMed

    Towbin, Alexander J; Hall, Seth; Moskovitz, Jay; Johnson, Neil D; Donnelly, Lane F

    2011-01-01

    Communication of acute or critical results between the radiology department and referring clinicians has been a deficiency of many radiology departments. The failure to perform or document these communications can lead to poor patient care, patient safety issues, medical-legal issues, and complaints from referring clinicians. To mitigate these factors, a communication and documentation tool was created and incorporated into our departmental customer service program. This article will describe the implementation of a comprehensive customer service program in a hospital-based radiology department. A comprehensive customer service program was created in the radiology department. Customer service representatives were hired to answer the telephone calls to the radiology reading rooms and to help convey radiology results. The radiologists, referring clinicians, and customer service representatives were then linked via a novel workflow management system. This workflow management system provided tools to help facilitate the communication needs of each group. The number of studies with results conveyed was recorded from the implementation of the workflow management system. Between the implementation of the workflow management system on August 1, 2005, and June 1, 2009, 116,844 radiology results were conveyed to the referring clinicians and documented in the system. This accounts for more than 14% of the 828,516 radiology cases performed in this time frame. We have been successful in creating a comprehensive customer service program to convey and document communication of radiology results. This program has been widely used by the ordering clinicians as well as radiologists since its inception.

  11. Design and Execution of make-like, distributed Analyses based on Spotify’s Pipelining Package Luigi

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, B.; Fischer, R.; Rieger, M.

    2017-10-01

    In high-energy particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually which is time-consuming and often leads to undocumented relations between particular workloads. We present a generic analysis design pattern that copes with the sophisticated demands of end-to-end HEP analyses and provides a make-like execution system. It is based on the open-source pipelining package Luigi which was developed at Spotify and enables the definition of arbitrary workloads, so-called Tasks, and the dependencies between them in a lightweight and scalable structure. Further features are multi-user support, automated dependency resolution and error handling, central scheduling, and status visualization in the web. In addition to already built-in features for remote jobs and file systems like Hadoop and HDFS, we added support for WLCG infrastructure such as LSF and CREAM job submission, as well as remote file access through the Grid File Access Library. Furthermore, we implemented automated resubmission functionality, software sandboxing, and a command line interface with auto-completion for a convenient working environment. For the implementation of a $t\bar{t}H$ cross section measurement, we created a generic Python interface that provides programmatic access to all external information such as datasets, physics processes, statistical models, and additional files and values. In summary, the setup enables the execution of the entire analysis in a parallelized and distributed fashion with a single command.
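
    A minimal sketch of the Task/dependency pattern described above, using plain Luigi; the task names and file targets are hypothetical, and the WLCG-specific extensions added by the authors are not shown.

        # Minimal Luigi pipeline sketch: each Task declares its dependencies (requires),
        # its output target, and the work to perform (run). Names are hypothetical.
        import luigi

        class SelectEvents(luigi.Task):
            dataset = luigi.Parameter()

            def output(self):
                return luigi.LocalTarget(f"selected_{self.dataset}.txt")

            def run(self):
                with self.output().open("w") as f:
                    f.write(f"selected events for {self.dataset}\n")

        class FitModel(luigi.Task):
            dataset = luigi.Parameter(default="ttH_2017")

            def requires(self):
                return SelectEvents(dataset=self.dataset)

            def output(self):
                return luigi.LocalTarget(f"fit_{self.dataset}.json")

            def run(self):
                with self.input().open() as fin, self.output().open("w") as fout:
                    fout.write('{"n_selected_lines": %d}\n' % len(fin.readlines()))

        if __name__ == "__main__":
            luigi.build([FitModel()], local_scheduler=True)   # resolves and runs SelectEvents first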

  12. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    We present that the automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  13. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    We present that the automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  14. Information Engineering and Workflow Design in a Clinical Decision Support System for Colorectal Cancer Screening in Iran.

    PubMed

    Maserat, Elham; Seied Farajollah, Seiede Sedigheh; Safdari, Reza; Ghazisaeedi, Marjan; Aghdaei, Hamid Asadzadeh; Zali, Mohammad Reza

    2015-01-01

    Colorectal cancer is a major cause of morbidity and mortality throughout the world. Colorectal cancer screening is an optimal way to reduce morbidity and mortality, and a clinical decision support system (CDSS) plays an important role in predicting the success of screening processes. A DSS is a computer-based information system that improves the delivery of preventive care services. The aim of this article was to detail the engineering of the information requirements and the workflow design of a CDSS for a colorectal cancer screening program. In the first stage, a screening minimum data set was determined. Developed and developing countries were analyzed to identify this data set. Then, information deficiencies and gaps were determined by checklist. The second stage was a qualitative survey with a semi-structured interview as the study tool. The perspectives of 15 users and stakeholders on the CDSS workflow were studied. Finally, the DSS workflow of the control program was designed based on standard clinical practice guidelines and these perspectives. The screening minimum data set of the national colorectal cancer screening program was defined in five sections, including colonoscopy, surgery, pathology, genetics and pedigree data sets. Deficiencies and information gaps were analyzed. Then we designed a standard screening work process. Finally, the DSS workflow and data entry stage were determined. A CDSS facilitates complex decision making for screening and has key roles in designing optimal interactions between the colonoscopy, pathology and laboratory departments. Workflow analysis is also useful to identify data reconciliation strategies to address documentation gaps. Following the recommendations of the CDSS should improve the quality of colorectal cancer screening.

  15. Implementation Recommendations for MOSAIC: A Workflow Architecture for Analytic Enrichment. Analysis and Recommendations for the Implementation of a Cohesive Method for Orchestrating Analytics in a Distributed Model

    DTIC Science & Technology

    2011-02-01

    Table-of-contents fragments only: "Process Architecture Technology Analysis: Executive"; "UIMA as Executive"; "Flow Code in UIMA".

  16. BGDMdocker: a Docker workflow for data mining and visualization of bacterial pan-genomes and biosynthetic gene clusters.

    PubMed

    Cheng, Gong; Lu, Quan; Ma, Ling; Zhang, Guocai; Xu, Liang; Zhou, Zongshan

    2017-01-01

    Recently, Docker technology has received increasing attention throughout the bioinformatics community. However, its implementation has not yet been mastered by most biologists; accordingly, its application in biological research has been limited. In order to popularize this technology in the field of bioinformatics and to promote the use of publicly available bioinformatics tools, such as Dockerfiles and Images from communities, government sources, and private owners in the Docker Hub Registry and other Docker-based resources, we introduce here a complete and accurate bioinformatics workflow based on Docker. The present workflow enables analysis and visualization of pan-genomes and biosynthetic gene clusters of bacteria. This provides a new solution for bioinformatics mining of big data from various publicly available biological databases. The present step-by-step guide creates an integrative workflow through a Dockerfile to allow researchers to build their own Image and run Container easily.
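
    A minimal sketch of driving such a Docker-based workflow from Python, assuming Docker is installed and a local directory contains the Dockerfile; the image tag, directory name, and analysis command below are hypothetical placeholders, not part of BGDMdocker.

```python
import subprocess

def build_and_run(context_dir: str, tag: str, command: list) -> str:
    """Build an Image from a Dockerfile and run one analysis step in a Container."""
    # Build the image from the Dockerfile found in context_dir (hypothetical tag).
    subprocess.run(["docker", "build", "-t", tag, context_dir], check=True)
    # Run the containerized tool, mounting the working directory for input/output files.
    result = subprocess.run(
        ["docker", "run", "--rm", "-v", f"{context_dir}:/data", tag] + command,
        check=True, capture_output=True, text=True,
    )
    return result.stdout

if __name__ == "__main__":
    # Hypothetical example: run one pan-genome analysis step inside the container.
    print(build_and_run("./bgdm-workflow", "bgdm:latest", ["pan-genome-tool", "--in", "/data/genomes"]))
```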

  17. BGDMdocker: a Docker workflow for data mining and visualization of bacterial pan-genomes and biosynthetic gene clusters

    PubMed Central

    Cheng, Gong; Zhang, Guocai; Xu, Liang

    2017-01-01

    Recently, Docker technology has received increasing attention throughout the bioinformatics community. However, its implementation has not yet been mastered by most biologists; accordingly, its application in biological research has been limited. In order to popularize this technology in the field of bioinformatics and to promote the use of publicly available bioinformatics tools, such as Dockerfiles and Images from communities, government sources, and private owners in the Docker Hub Registry and other Docker-based resources, we introduce here a complete and accurate bioinformatics workflow based on Docker. The present workflow enables analysis and visualization of pan-genomes and biosynthetic gene clusters of bacteria. This provides a new solution for bioinformatics mining of big data from various publicly available biological databases. The present step-by-step guide creates an integrative workflow through a Dockerfile to allow researchers to build their own Image and run Container easily. PMID:29204317

  18. Planning bioinformatics workflows using an expert system

    PubMed Central

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprised of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
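
    BETSY is described as a backwards-chaining rule-based system; the sketch below is a generic, toy illustration of backward chaining in Python, with invented facts and rules that are not BETSY's knowledge base or API.

```python
# Toy backward chainer: a goal holds if it is an available fact, or if some rule
# concludes it and every premise of that rule can itself be proven.
FACTS = {"fastq_reads", "reference_genome"}        # hypothetical starting data types
RULES = [                                          # (conclusion, premises)
    ("aligned_bam", ["fastq_reads", "reference_genome"]),
    ("peak_calls", ["aligned_bam"]),
    ("motif_report", ["peak_calls"]),
]

def prove(goal, depth=0):
    indent = "  " * depth
    if goal in FACTS:
        print(f"{indent}{goal}: available")
        return True
    for conclusion, premises in RULES:
        if conclusion == goal and all(prove(p, depth + 1) for p in premises):
            print(f"{indent}{goal}: derivable from {premises}")
            return True
    return False

prove("motif_report")   # chains backwards from the requested output to available inputs
```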

  19. Strategies For Clinical Implementation: Precision Oncology At Three Distinct Institutions.

    PubMed

    Nadauld, Lincoln D; Ford, James M; Pritchard, Daryl; Brown, Thomas

    2018-05-01

    Despite rapid advances in molecular diagnostics and targeted therapeutics, the adoption of precision medicine into clinical oncology workflows has been slow. Questions about clinical utility, inconsistent reimbursement for molecular diagnostics, and limited access to targeted therapies are some of the major hurdles that have hampered clinical adoption. Despite these challenges, providers have invested in precision medicine programs in an ongoing search for innovative care models to deliver improved patient outcomes and achieve economic gains. We describe the precision oncology medicine programs implemented by an integrated delivery system, a community care center, and an academic medical center, to demonstrate the approaches and challenges associated with clinical implementation efforts designed to advance this treatment paradigm. Payer policies that include coverage for broad genomic testing panels would support the broader application of precision medicine, deepen research benefits, and bring targeted therapies to more patients with advanced cancer.

  20. Machine learning for fab automated diagnostics

    NASA Astrophysics Data System (ADS)

    Giollo, Manuel; Lam, Auguste; Gkorou, Dimitra; Liu, Xing Lan; van Haren, Richard

    2017-06-01

    Process optimization depends largely on field engineers' knowledge and expertise. However, this practice turns out to be less sustainable due to the fab complexity which is continuously increasing in order to support the extreme miniaturization of Integrated Circuits. On the one hand, process optimization and root cause analysis of tools are necessary for a smooth fab operation. On the other hand, the growth in the number of wafer processing steps is adding a considerable new source of noise which may have a significant impact at the nanometer scale. This paper explores the ability of historical process data and Machine Learning to support field engineers in production analysis and monitoring. We implement an automated workflow in order to analyze a large volume of information, and build a predictive model of overlay variation. The proposed workflow addresses significant problems that are typical in fab production, like missing measurements, small number of samples, confounding effects due to heterogeneity of data, and subpopulation effects. We evaluate the proposed workflow on a real use case and show that it is able to predict overlay excursions observed in Integrated Circuits manufacturing. The chosen design focuses on linear and interpretable models of the wafer history, which highlight the process steps that are causing defective products. This is a fundamental feature for diagnostics, as it supports process engineers in the continuous improvement of the production line.
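
    A hedged sketch of the kind of interpretable linear overlay model described above, using scikit-learn; the wafer-history feature matrix, the overlay response, and the injected missing values are synthetic stand-ins, not fab data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))                    # synthetic wafer-history features (tools, chucks, steps)
X[rng.random(X.shape) < 0.05] = np.nan            # simulate missing metrology measurements
y = 0.8 * np.nan_to_num(X[:, 3]) + rng.normal(scale=0.1, size=200)   # synthetic overlay response

model = make_pipeline(
    SimpleImputer(strategy="median"),             # handle missing measurements
    StandardScaler(),
    LassoCV(cv=5),                                # sparse linear model keeps the result interpretable
)
model.fit(X, y)
coefs = model.named_steps["lassocv"].coef_
print("process steps flagged as influential:", np.flatnonzero(np.abs(coefs) > 1e-3))
```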

  1. Flexible End2End Workflow Automation of Hit-Discovery Research.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. Combining all relevant subprocesses, whether automated or manually performed, independently of the organizational unit in which they occur, results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, connecting control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). This approach is based on the new standardization of the process-modeling notation Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  2. Yadage and Packtivity - analysis preservation using parametrized workflows

    NASA Astrophysics Data System (ADS)

    Cranmer, Kyle; Heinrich, Lukas

    2017-10-01

    Preserving data analyses produced by the collaborations at LHC in a parametrized fashion is crucial in order to maintain reproducibility and re-usability. We argue for a declarative description in terms of individual processing steps - “packtivities” - linked through a dynamic directed acyclic graph (DAG) and present an initial set of JSON schemas for such a description and an implementation - “yadage” - capable of executing workflows of analysis preserved via Linux containers.
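
    The declarative idea can be illustrated with a small Python sketch that is not the actual yadage/packtivity JSON schema: each step is plain data (container image, command, dependencies), and a tiny executor walks the resulting DAG in dependency order. All field names and commands are hypothetical.

```python
import subprocess

# Hypothetical declarative description of two chained analysis steps.
WORKFLOW = {
    "select_events": {"image": "analysis:latest", "cmd": ["select", "--out", "/data/sel"], "needs": []},
    "fit_model": {"image": "analysis:latest", "cmd": ["fit", "--in", "/data/sel"], "needs": ["select_events"]},
}

def run_step(name, done):
    step = WORKFLOW[name]
    for dep in step["needs"]:                 # resolve upstream steps first (DAG walk)
        if dep not in done:
            run_step(dep, done)
    # Execute the packaged activity inside its Linux container (illustrative docker invocation).
    subprocess.run(["docker", "run", "--rm", step["image"]] + step["cmd"], check=True)
    done.add(name)

done = set()
for target in WORKFLOW:
    run_step(target, done)
```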

  3. Lessons from implementing a combined workflow-informatics system for diabetes management.

    PubMed

    Zai, Adrian H; Grant, Richard W; Estey, Greg; Lester, William T; Andrews, Carl T; Yee, Ronnie; Mort, Elizabeth; Chueh, Henry C

    2008-01-01

    Shortcomings surrounding the care of patients with diabetes have been attributed largely to a fragmented, disorganized, and duplicative health care system that focuses more on acute conditions and complications than on managing chronic disease. To address these shortcomings, we developed a diabetes registry population management application to change the way our staff manages patients with diabetes. Use of this new application has helped us coordinate the responsibilities for intervening and monitoring patients in the registry among different users. Our experiences using this combined workflow-informatics intervention system suggest that integrating a chronic disease registry into clinical workflow for the treatment of chronic conditions creates a useful and efficient tool for managing disease.

  4. PACS-Graz, 1985-2000: from a scientific pilot to a state-wide multimedia radiological information system

    NASA Astrophysics Data System (ADS)

    Gell, Guenther

    2000-05-01

    In 1971/72, implementation of a computerized radiological documentation system began at the Department of Radiology of the University of Graz, which developed over the years into a full RIS. In 1985, a scientific cooperation with SIEMENS to develop a PACS started. The two systems were linked and evolved into a highly integrated RIS-PACS for the state-wide hospital system in Styria. During its lifetime the RIS, originally implemented in FORTRAN on a UNIVAC 494 mainframe, migrated to a PDP15, on to a PDP11, then VAX and Alphas. The flexible original record structure with variable-length fields and the powerful retrieval language were retained. The data acquisition part with the user interface was rewritten several times and many service programs have been added. During our PACS cooperation many ideas, like the folder concept or functionalities of the GUI, were designed and tested and were then implemented in the SIENET product. The current RIS/PACS supports the whole workflow in the Radiology Department. It is installed in a 2,300-bed university hospital and the smaller hospitals of the State of Styria. Modalities from different vendors are connected via DICOM to the RIS (modality worklist) and to the PACS. PACS subsystems from other vendors have been integrated. Images are distributed to referring clinics and for teleconsultation and image processing, and reports are available online to all connected hospitals. We spent great effort to guarantee optimal support of the workflow and to ensure an enhanced cost/benefit ratio for each user (class). Another special feature is selective image distribution: using the high-level retrieval language, individual filters can be constructed easily to implement any image distribution policy agreed upon by radiologists and referring clinicians.

  5. Glocal clinical registries: pacemaker registry design and implementation for global and local integration--methodology and case study.

    PubMed

    da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V

    2013-01-01

    The ability to apply standard and interoperable solutions for implementing and managing medical registries, as well as to aggregate, reproduce, and access data sets from legacy formats and platforms to advanced standard formats and operating systems, is crucial for both clinical healthcare and biomedical research settings. Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. We developed a device registry framework involving the following steps: (1) data standards definition and representation of the research workflow, (2) development of electronic case report forms using REDCap (Research Electronic Data Capture), (3) data collection according to the clinical research workflow, (4) data augmentation by enriching the registry database with local electronic health records, governmental databases and linked open data collections, (5) data quality control and (6) data dissemination through the registry Web site. Our registry adopted all applicable standardized data elements proposed by the American College of Cardiology/American Heart Association Clinical Data Standards, as well as variables derived from cardiac device randomized trials and the Clinical Data Interchange Standards Consortium. Local interoperability was established between REDCap and data derived from the Electronic Health Record system. The original data set was also augmented by incorporating the reimbursed values paid by the Brazilian government during a hospitalization for pacemaker implantation. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT) we found 130 clinical trials that are potentially correlated with our pacemaker registry. This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a re-usable framework. Such an approach has the potential to facilitate data integration between healthcare and research settings and provides a useful framework for other biomedical registries.
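
    Records in a REDCap-backed registry such as this one can be moved programmatically through REDCap's HTTP API; the sketch below only illustrates the general call pattern, with a hypothetical instance URL, token, and field names.

```python
import json
import requests

REDCAP_URL = "https://redcap.example.org/api/"   # hypothetical REDCap instance
API_TOKEN = "REPLACE_WITH_PROJECT_TOKEN"         # hypothetical project token

def import_registry_record(record):
    """Send one registry record to REDCap as a flat JSON import."""
    payload = {
        "token": API_TOKEN,
        "content": "record",
        "format": "json",
        "type": "flat",
        "data": json.dumps([record]),
    }
    response = requests.post(REDCAP_URL, data=payload, timeout=30)
    response.raise_for_status()
    return response.text

# Hypothetical pacemaker record using invented field names.
print(import_registry_record({"record_id": "001", "implant_date": "2013-05-20", "device_mode": "DDD"}))
```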

  6. An open source software for analysis of dynamic contrast enhanced magnetic resonance images: UMMPerfusion revisited.

    PubMed

    Zöllner, Frank G; Daab, Markus; Sourbron, Steven P; Schad, Lothar R; Schoenberg, Stefan O; Weisser, Gerald

    2016-01-14

    Perfusion imaging has become an important image-based tool to derive physiological information in various applications, like tumor diagnostics and therapy, stroke, (cardio-)vascular diseases, or functional assessment of organs. However, even after 20 years of intense research in this field, perfusion imaging still remains a research tool without broad clinical usage. One problem is the lack of standardization in technical aspects which have to be considered for successful quantitative evaluation; the second problem is a lack of tools that allow a direct integration into the diagnostic workflow in radiology. Five compartment models, namely a one compartment model (1CP), a two compartment exchange model (2CXM), a two compartment uptake model (2CUM), a two compartment filtration model (2FM) and the extended Tofts model (ETM), were implemented as a plugin for the DICOM workstation OsiriX. Moreover, the plugin has a clean graphical user interface and provides means for quality management during the perfusion data analysis. Based on reference test data, the implementation was validated against a reference implementation. No differences were found in the calculated parameters. We developed open source software to analyse DCE-MRI perfusion data. The software is designed as a plugin for the DICOM workstation OsiriX. It features a clean GUI and provides a simple workflow for data analysis, while it can also be seen as a toolbox providing an implementation of several recent compartment models to be applied in research tasks. Integration into the infrastructure of a radiology department is given via OsiriX. Results can be saved automatically, and reports generated during data analysis ensure quality control.
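
    As a worked illustration of one of the listed models, the extended Tofts model gives the tissue concentration as C_t(t) = v_p * C_p(t) + K_trans * integral of C_p(tau) * exp(-k_ep * (t - tau)) d tau, with k_ep = K_trans / v_e. The NumPy sketch below is a generic implementation of that formula with synthetic inputs, not the OsiriX plugin code.

```python
import numpy as np

def extended_tofts(t, cp, ktrans, ve, vp):
    """Tissue concentration under the extended Tofts model via discrete convolution.

    t: uniformly spaced time points (min); cp: arterial input function C_p(t);
    ktrans: transfer constant (1/min); ve, vp: interstitial and plasma volume fractions.
    """
    dt = t[1] - t[0]
    kep = ktrans / ve
    kernel = np.exp(-kep * t)                            # impulse response of the EES compartment
    conv = np.convolve(cp, kernel)[: len(t)] * dt        # integral of C_p(tau) exp(-kep (t - tau)) d tau
    return vp * cp + ktrans * conv

# Synthetic arterial input function and illustrative parameter values (not patient data).
t = np.arange(0, 5, 0.02)                                # 5 minutes sampled every 1.2 s
cp = 5.0 * t * np.exp(-t / 0.5)                          # simple gamma-variate-like AIF
ct = extended_tofts(t, cp, ktrans=0.25, ve=0.3, vp=0.05)
print(ct[:5])
```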

  7. Applying the Karma Provenance tool to NASA's AMSR-E Data Production Stream

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Conover, H.; Regner, K.; Movva, S.; Goodman, H. M.; Pale, B.; Purohit, P.; Sun, Y.

    2010-12-01

    Current procedures for capturing and disseminating provenance, or data product lineage, are limited in both what is captured and how it is disseminated to the science community. For example, the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) Science Investigator-led Processing System (SIPS) generates Level 2 and Level 3 data products for a variety of geophysical parameters. Data provenance and quality information for these data sets is either very general (e.g., user guides, a list of anomalous data receipt and processing conditions over the life of the missions) or difficult to access or interpret (e.g., quality flags embedded in the data, production history files not easily available to users). Karma is a provenance collection and representation tool designed and developed for data driven workflows such as the production streams used to produce EOS standard products. Karma records uniform and usable provenance metadata independent of the processing system while minimizing both the modification burden on the processing system and the overall performance overhead. Karma collects both the process and data provenance. The process provenance contains information about the workflow execution and the associated algorithm invocations. The data provenance captures metadata about the derivation history of the data product, including algorithms used and input data sources transformed to generate it. As part of an ongoing NASA funded project, Karma is being integrated into the AMSR-E SIPS data production streams. Metadata gathered by the tool will be presented to the data consumers as provenance graphs, which are useful in validating the workflows and determining the quality of the data product. This presentation will discuss design and implementation issues faced while incorporating a provenance tool into a structured data production flow. Prototype results will also be presented in this talk.
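
    A generic sketch of capturing process and data provenance around a workflow step, with an invented record structure that is not Karma's API or metadata model: each invocation logs the algorithm name, inputs, outputs, and timestamps so a derivation graph can be reconstructed later.

```python
import datetime
import functools

PROVENANCE_LOG = []   # a real system would send these records to a provenance store

def record_provenance(step_name):
    """Decorator logging process provenance (invocation) and data provenance (inputs -> outputs)."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*inputs):
            started = datetime.datetime.utcnow().isoformat()
            outputs = func(*inputs)
            PROVENANCE_LOG.append({
                "step": step_name,
                "inputs": list(inputs),
                "outputs": outputs,
                "started": started,
                "finished": datetime.datetime.utcnow().isoformat(),
            })
            return outputs
        return wrapper
    return decorator

@record_provenance("l1b_to_l2_swath")                     # hypothetical AMSR-E-style step name
def make_level2(granule_path):
    return granule_path.replace("L1B", "L2")              # placeholder transformation

make_level2("AMSR_E_L1B_granule_001.hdf")
print(PROVENANCE_LOG)
```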

  8. Options in virtual 3D, optical-impression-based planning of dental implants.

    PubMed

    Reich, Sven; Kern, Thomas; Ritter, Lutz

    2014-01-01

    If a 3D radiograph, which in today's dentistry often consists of a CBCT dataset, is available for computerized implant planning, the 3D planning should also consider functional prosthetic aspects. In a conventional workflow, the CBCT is done with a specially produced radiopaque prosthetic setup that makes the desired prosthetic situation visible during virtual implant planning. If an exclusively digital workflow is chosen, intraoral digital impressions are taken. On these digital models, the desired prosthetic suprastructures are designed. The entire datasets are virtually superimposed by a "registration" process on the corresponding structures (teeth) in the CBCTs. Thus, both the osseous and prosthetic structures are visible in one single 3D application and make it possible to consider surgical and prosthetic aspects. After having determined the implant positions on the computer screen, a drilling template is designed digitally. According to this design (CAD), a template is printed or milled in CAM process. This template is the first physically extant product in the entire workflow. The article discusses the options and limitations of this workflow.

  9. SU-F-T-617: Remotely Pre-Planned Stereotactic Ablative Radiation Therapy: Validation of Treatment Plan Quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Juang, T; Bush, K; Loo, B

    Purpose: We propose a workflow to improve access to stereotactic ablative radiation therapy (SABR) for rural patients. When implemented, a separate trip to the central facility for simulation can be eliminated. Two elements are required: (1) Fabrication of custom immobilization devices to match positioning on prior diagnostic CT (dxCT). (2) Remote radiation pre-planning on dxCT, with transfer of contours/plan to simulation CT (simCT) and initiation of treatment same-day or next day. In this retrospective study, we validated part 2 of the workflow using patients already treated with SABR for upper lobe lung tumors. Methods: Target/normal structures were contoured on dxCT; a plan was created and approved by the physician. Structures were transferred to simCT using deformable image registration and the plan was re-optimized on simCT. Plan quality was evaluated through comparison to gold-standard structures contoured on simCT and a gold-standard plan based on these structures. Workflow-generated plan quality in this study represents a worst-case scenario as these patients were not treated using custom immobilization to match dxCT position as would be done when the workflow is implemented clinically. Results: 5/6 plans created through the pre-planning workflow were clinically acceptable. For all six plans, the gold-standard GTV received full prescription dose, along with median PTV V95%=95.2% and median PTV D95%=95.4%. Median GTV DSC=0.80, indicating high degree of similarity between the deformed and gold-standard GTV contours despite small GTV sizes (mean=3.0cc). One outlier (DSC=0.49) resulted in inadequate PTV coverage (V95%=62.9%) in the workflow plan; in clinical practice, this mismatch between deformed/gold-standard GTV would be revised by the physician after deformable registration. For all patients, normal tissue doses were comparable to the gold-standard plan and well within constraints. Conclusion: Pre-planning SABR cases on diagnostic imaging generated clinically acceptable plans. Coupled with rapid-prototyped custom immobilization, this workflow may improve treatment access for rural patients.
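
    The contour agreement quoted above is the Dice similarity coefficient, DSC = 2|A intersect B| / (|A| + |B|); the NumPy sketch below computes it on toy binary masks and is illustrative only, not the study's registration pipeline.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two boolean volumes."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    overlap = np.logical_and(a, b).sum()
    return 2.0 * overlap / (a.sum() + b.sum())

# Toy 3D masks standing in for deformed vs. gold-standard GTV contours.
gold = np.zeros((20, 20, 20), dtype=bool)
gold[5:12, 5:12, 5:12] = True
deformed = np.zeros_like(gold)
deformed[6:13, 6:13, 6:13] = True
print(round(dice(gold, deformed), 2))
```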

  10. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    PubMed Central

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a cloud-enabled web-interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org. PMID:22559942

  11. An Implementation-Focused Bio/Algorithmic Workflow for Synthetic Biology.

    PubMed

    Goñi-Moreno, Angel; Carcajona, Marta; Kim, Juhyun; Martínez-García, Esteban; Amos, Martyn; de Lorenzo, Víctor

    2016-10-21

    As synthetic biology moves away from trial and error and embraces more formal processes, workflows have emerged that cover the roadmap from conceptualization of a genetic device to its construction and measurement. This latter aspect (i.e., characterization and measurement of synthetic genetic constructs) has received relatively little attention to date, but it is crucial for their outcome. An end-to-end use case for engineering a simple synthetic device is presented, which is supported by information standards and computational methods and focuses on such characterization/measurement. This workflow captures the main stages of genetic device design and description and offers standardized tools for both population-based measurement and single-cell analysis. To this end, three separate aspects are addressed. First, the specific vector features are discussed. Although device/circuit design has been successfully automated, important structural information is usually overlooked, as in the case of plasmid vectors. The use of the Standard European Vector Architecture (SEVA) is advocated for selecting the optimal carrier of a design and its thorough description in order to unequivocally correlate digital definitions and molecular devices. A digital version of this plasmid format was developed with the Synthetic Biology Open Language (SBOL) along with a software tool that allows users to embed genetic parts in vector cargoes. This enables annotation of a mathematical model of the device's kinetic reactions formatted with the Systems Biology Markup Language (SBML). From that point onward, the experimental results and their in silico counterparts proceed alongside, with constant feedback to preserve consistency between them. A second aspect involves a framework for the calibration of fluorescence-based measurements. One of the most challenging endeavors in standardization, metrology, is tackled by reinterpreting the experimental output in light of simulation results, allowing us to turn arbitrary fluorescence units into relative measurements. Finally, integration of single-cell methods into a framework for multicellular simulation and measurement is addressed, allowing standardized inspection of the interplay between the carrier chassis and the culture conditions.
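
    The calibration step of turning arbitrary fluorescence units into relative measurements can be illustrated generically (this is not the paper's exact framework): subtract the autofluorescence background and divide by a reference construct measured under identical settings, so readings become dimensionless and comparable across instruments.

```python
import numpy as np

def relative_fluorescence(sample, autofluorescence, reference):
    """Convert arbitrary fluorescence readings into units relative to a reference construct."""
    background = np.mean(autofluorescence)
    corrected_sample = np.asarray(sample, dtype=float) - background
    corrected_reference = np.mean(reference) - background
    return corrected_sample / corrected_reference     # dimensionless relative units

# Hypothetical plate-reader readings in arbitrary units.
print(relative_fluorescence([1200, 1350, 1280], autofluorescence=[200, 210], reference=[800, 820, 790]))
```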

  12. The impact of using an intravenous workflow management system (IVWMS) on cost and patient safety.

    PubMed

    Lin, Alex C; Deng, Yihong; Thaibah, Hilal; Hingl, John; Penm, Jonathan; Ivey, Marianne F; Thomas, Mark

    2018-07-01

    The aim of this study was to determine the financial costs associated with wasted and missing doses before and after the implementation of an intravenous workflow management system (IVWMS) and to quantify the number and the rate of detected intravenous (IV) preparation errors. A retrospective analysis of the sample hospital information system database was conducted using three months of data before and after the implementation of an IVWMS System (DoseEdge®) which uses barcode scanning and photographic technologies to track and verify each step of the preparation process. The financial impact associated with wasted and missing IV doses was determined by combining drug acquisition, labor, accessory, and disposal costs. The intercepted error reports and pharmacist detected error reports were drawn from the IVWMS to quantify the number of errors by defined error categories. The total numbers of IV doses prepared before and after the implementation of the IVWMS system were 110,963 and 101,765 doses, respectively. The adoption of the IVWMS significantly reduced the amount of wasted and missing IV doses by 14,176 and 2268 doses, respectively (p < 0.001). The overall cost savings of using the system was $144,019 over 3 months. The total number of errors detected was 1160 (1.14%) after using the IVWMS. The implementation of the IVWMS facilitated workflow changes that led to a positive impact on cost and patient safety. The implementation of the IVWMS increased patient safety by enforcing standard operating procedures and bar code verifications. Published by Elsevier B.V.
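
    A simplified illustration of the barcode-verification step such systems enforce (generic logic, not the DoseEdge® implementation): each scanned ingredient is checked against the components expected for the dose order before compounding proceeds.

```python
# Hypothetical dose order with the ingredient barcodes expected during preparation.
EXPECTED = {
    "ORDER-42": {"BARCODE-VANCO-1G": "vancomycin 1 g vial", "BARCODE-NS-250": "0.9% NaCl 250 mL bag"},
}

def verify_scan(order_id, scanned_barcode):
    expected = EXPECTED.get(order_id, {})
    if scanned_barcode in expected:
        return f"OK: {expected[scanned_barcode]} verified for {order_id}"
    return f"STOP: {scanned_barcode} is not a component of {order_id}"   # error intercepted before compounding

print(verify_scan("ORDER-42", "BARCODE-VANCO-1G"))
print(verify_scan("ORDER-42", "BARCODE-UNKNOWN"))
```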

  13. A near miss: the importance of context in a public health informatics project in a New Zealand case study.

    PubMed

    Wells, Stewart; Bullen, Chris

    2008-01-01

    This article describes the near failure of an information technology (IT) system designed to support a government-funded, primary care-based hepatitis B screening program in New Zealand. Qualitative methods were used to collect data and construct an explanatory model. Multiple incorrect assumptions were made about participants, primary care workflows and IT capacity, software vendor user knowledge, and the health IT infrastructure. Political factors delayed system development and it was implemented untested, almost failing. An intensive rescue strategy included system modifications, relaxation of data validity rules, close engagement with software vendors, and provision of intensive on-site user support. This case study demonstrates that consideration of the social, political, technological, and health care contexts is important for successful implementation of public health informatics projects.

  14. Py4CAtS - Python tools for line-by-line modelling of infrared atmospheric radiative transfer

    NASA Astrophysics Data System (ADS)

    Schreier, Franz; García, Sebastián Gimeno

    2013-05-01

    Py4CAtS — Python scripts for Computational ATmospheric Spectroscopy is a Python re-implementation of the Fortran infrared radiative transfer code GARLIC, where compute-intensive code sections utilize the Numeric/Scientific Python modules for highly optimized array-processing. The individual steps of an infrared or microwave radiative transfer computation are implemented in separate scripts to extract lines of relevant molecules in the spectral range of interest, to compute line-by-line cross sections for given pressure(s) and temperature(s), to combine cross sections to absorption coefficients and optical depths, and to integrate along the line-of-sight to transmission and radiance/intensity. The basic design of the package, numerical and computational aspects relevant for optimization, and a sketch of the typical workflow are presented.
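
    The final steps of such a line-by-line workflow (summing cross sections into optical depth and converting to transmission) reduce to Beer-Lambert attenuation, T(nu) = exp(-sum_m n_m * sigma_m(nu)). The NumPy sketch below is a generic illustration with made-up absorbers and numbers, not the Py4CAtS scripts themselves.

```python
import numpy as np

# Toy monochromatic cross sections for two absorbers on a common wavenumber grid.
nu = np.linspace(2100.0, 2110.0, 500)                      # wavenumber grid [cm^-1]
sigma = {
    "CO": 1e-19 * np.exp(-((nu - 2103.0) / 0.5) ** 2),      # Gaussian stand-ins for real line shapes [cm^2]
    "H2O": 4e-23 * np.exp(-((nu - 2107.0) / 0.8) ** 2),
}
columns = {"CO": 3e18, "H2O": 5e22}                          # vertical column densities [cm^-2]

# Optical depth is the sum over absorbers; transmission follows from Beer-Lambert.
tau = sum(columns[m] * sigma[m] for m in sigma)
transmission = np.exp(-tau)
print(float(transmission.min()), float(transmission.max()))
```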

  15. Meta-manager: a requirements analysis.

    PubMed

    Cook, J F; Rozenblit, J W; Chacko, A K; Martinez, R; Timboe, H L

    1999-05-01

    The digital imaging network-picture-archiving and communications system (DIN-PACS) will be implemented in ten sites within the Great Plains Regional Medical Command (GPRMC). This network of PACS and teleradiology technology over a shared T1 network has opened the door for round-the-clock radiology coverage of all sites. However, the concept of a virtual radiology environment poses new issues for military medicine. A new workflow management system must be developed. This workflow management system will allow us to efficiently resolve these issues, including quality of care, availability, severe capitation, and quality of the workforce. The design process of this management system must employ existing technology, operate over various telecommunication networks and protocols, be independent of platform operating systems, be flexible and scalable, and involve the end user at the outset of the design process. Using the unified modeling language (UML), the specifications for this new business management system were created in concert between the University of Arizona and the GPRMC. These specifications detail a management system operating through a common object request brokered architecture (CORBA) environment. In this presentation, we characterize the Meta-Manager management system including aspects of intelligence, interfacility routing, fail-safe operations, and expected improvements in patient care and efficiency.

  16. A computational workflow for designing silicon donor qubits

    DOE PAGES

    Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; ...

    2016-09-19

    Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. In conclusion, the resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.

  17. Implementation of a single sign-on system between practice, research and learning systems.

    PubMed

    Purkayastha, Saptarshi; Gichoya, Judy W; Addepally, Siva Abhishek

    2017-03-29

    Multiple specialized electronic medical systems are utilized in the health enterprise. Each of these systems has its own user management, authentication and authorization process, which makes it a complex web for navigation and use without a coherent process workflow. Users often have to remember multiple passwords and log in and out between systems, which disrupts their clinical workflow. Challenges exist in managing permissions for various cadres of health care providers. This case report describes our experience of implementing a single sign-on system, used between an electronic medical records system and a learning management system at a large academic institution with an informatics department responsible for student education and a medical school affiliated with a hospital system caring for patients and conducting research. At our institution, we use OpenMRS for research registry tracking of interventional radiology patients as well as to provide access to medical records to students studying health informatics. To provide authentication across different users of the system with different permissions, we developed a Central Authentication Service (CAS) module for OpenMRS, released under the Mozilla Public License, and deployed it for single sign-on across the academic enterprise. The module has been in use from August 2015 to the present, and we assessed usability of the registry and education system before and after implementation of the CAS module. 54 students and 3 researchers were interviewed. The module authenticates users with appropriate privileges in the medical records system, providing secure access with minimal disruption to their workflow. No password requests were sent, and users reported ease of use, with streamlined workflow. The project demonstrates that enterprise-wide single sign-on systems should be used in healthcare to reduce complexity like "password hell", improve usability and user navigation. We plan to extend this to work with other systems used in the health care enterprise.
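
    The CAS protocol flow behind such a module can be sketched generically: after the CAS server redirects the browser back with a service ticket, the application validates that ticket against the server's serviceValidate endpoint and extracts the authenticated username. The server and service URLs below are hypothetical, and this is not the OpenMRS CAS module's code.

```python
import xml.etree.ElementTree as ET
import requests

CAS_SERVER = "https://cas.example.edu/cas"               # hypothetical CAS server
SERVICE_URL = "https://openmrs.example.edu/openmrs/"     # hypothetical protected application
CAS_NS = {"cas": "http://www.yale.edu/tp/cas"}           # standard CAS response namespace

def validate_ticket(ticket):
    """Validate a CAS 2.0 service ticket and return the authenticated username, or None."""
    resp = requests.get(
        f"{CAS_SERVER}/serviceValidate",
        params={"service": SERVICE_URL, "ticket": ticket},
        timeout=10,
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.text)
    user = root.find(".//cas:authenticationSuccess/cas:user", CAS_NS)
    return user.text if user is not None else None

# After login, CAS redirects to SERVICE_URL?ticket=ST-..., which the application validates:
print(validate_ticket("ST-12345-abcdef"))
```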

  18. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; (4) setting up data access mechanisms; (5) setting up the data in data tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate, and realize efficiencies in, the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss the future steps towards creation of a common software framework.

  19. Exploiting on-node heterogeneity for in-situ analytics of climate simulations via a functional partitioning framework

    NASA Astrophysics Data System (ADS)

    Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan

    2016-04-01

    Efficient resource utilization is critical for improved end-to-end computing and workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to the main memory. On the other hand, in applications that do not exploit GPUs, CPU usage is dominant while the GPUs idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analyses in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources therein. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in-situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, when the CPUs would otherwise be idle during portions of the runtime. In our implementation results, we demonstrate that it is more efficient to use the HFP framework to offload the tasks to GPUs instead of doing it in the main application. We observe increased resource utilization and overall productivity in this approach by using the HFP framework for the end-to-end workflow.

  20. Measuring the impact of a health information exchange intervention on provider-based notifiable disease reporting using mixed methods: a study protocol.

    PubMed

    Dixon, Brian E; Grannis, Shaun J; Revere, Debra

    2013-10-30

    Health information exchange (HIE) is the electronic sharing of data and information between clinical care and public health entities. Previous research has shown that using HIE to electronically report laboratory results to public health can improve surveillance practice, yet there has been little utilization of HIE for improving provider-based disease reporting. This article describes a study protocol that uses mixed methods to evaluate an intervention to electronically pre-populate provider-based notifiable disease case reporting forms with clinical, laboratory and patient data available through an operational HIE. The evaluation seeks to: (1) identify barriers and facilitators to implementation, adoption and utilization of the intervention; (2) measure impacts on workflow, provider awareness, and end-user satisfaction; and (3) describe the contextual factors that impact the effectiveness of the intervention within heterogeneous clinical settings and the HIE. The intervention will be implemented over a staggered schedule in one of the largest and oldest HIE infrastructures in the U.S., the Indiana Network for Patient Care. Evaluation will be conducted utilizing a concurrent design mixed methods framework in which qualitative methods are embedded within the quantitative methods. Quantitative data will include reporting rates, timeliness and burden and report completeness and accuracy, analyzed using interrupted time-series and other pre-post comparisons. Qualitative data regarding pre-post provider perceptions of report completeness, accuracy, and timeliness, reporting burden, data quality, benefits, utility, adoption, utilization and impact on reporting workflow will be collected using semi-structured interviews and open-ended survey items. Data will be triangulated to find convergence or agreement by cross-validating results to produce a contextualized portrayal of the facilitators and barriers to implementation and use of the intervention. By applying mixed research methods and measuring context, facilitators and barriers, and individual, organizational and data quality factors that may impact adoption and utilization of the intervention, we will document whether and how the intervention streamlines provider-based manual reporting workflows, lowers barriers to reporting, increases data completeness, improves reporting timeliness and captures a greater portion of communicable disease burden in the community.

  1. Reusable, extensible, and modifiable R scripts and Kepler workflows for comprehensive single set ChIP-seq analysis.

    PubMed

    Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark

    2016-07-05

    There has been an enormous expansion of use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules and do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible to allow rapid alteration in response to new analysis developments in this growing area. Furthermore, it is advantageous if these pipelines allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized use of common R packages and widely-used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peakfinding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others. These pipelines range from those performing a single task to those performing full analyses of ChIP-seq data. The pipelines are supplied as both Kepler workflows, which allow data provenance tracking, and, in the majority of cases, as standalone R scripts. These pipelines are designed for ease of modification and repurposing.

  2. Context-aware workflow management of mobile health applications.

    PubMed

    Salden, Alfons; Poortinga, Remco

    2006-01-01

    We propose a medical application management architecture that allows medical (IT) experts readily designing, developing and deploying context-aware mobile health (m-health) applications or services. In particular, we elaborate on how our application workflow management architecture enables chaining, coordinating, composing, and adapting context-sensitive medical application components such that critical Quality of Service (QoS) and Quality of Context (QoC) requirements typical for m-health applications or services can be met. This functional architectural support requires learning modules for distilling application-critical selection of attention and anticipation models. These models will help medical experts constructing and adjusting on-the-fly m-health application workflows and workflow strategies. We illustrate our context-aware workflow management paradigm for a m-health data delivery problem, in which optimal communication network configurations have to be determined.

  3. Streamlining the Change Management with Business Rules

    NASA Technical Reports Server (NTRS)

    Savela, Christopher

    2015-01-01

    This presentation discusses how the organization is trying to streamline workflows and the change management process with business rules. In looking for ways to make things more efficient and save money, one approach is to reduce the work that workflow task approvers have to do when reviewing affected items. The presentation shares the technical details of the business rules, how to implement them, and how to speed up the development process by using the API to demonstrate the rules in action.

  4. REMORA: a pilot in the ocean of BioMoby web-services.

    PubMed

    Carrere, Sébastien; Gouzy, Jérôme

    2006-04-01

    Emerging web-services technology allows interoperability between multiple distributed architectures. Here, we present REMORA, a web server implemented according to the BioMoby web-service specifications, providing life science researchers with an easy-to-use workflow generator and launcher, a repository of predefined workflows and a survey system. Jerome.Gouzy@toulouse.inra.fr The REMORA web server is freely available at http://bioinfo.genopole-toulouse.prd.fr/remora, sources are available upon request from the authors.

  5. Research on a dynamic workflow access control model

    NASA Astrophysics Data System (ADS)

    Liu, Yiliang; Deng, Jinxia

    2007-12-01

    In recent years, access control technology has been researched widely in workflow systems; two typical approaches are the RBAC (Role-Based Access Control) and TBAC (Task-Based Access Control) models, which have been used successfully for role authorization and assignment to a certain extent. However, as system structures become more complex, these two technologies cannot enforce minimal privileges and separation of duties, and they are inapplicable when users frequently request changes to the workflow's process. To avoid these weaknesses in practice, a variable-flow dynamic role_task_view fine-grained access control model (briefly, DRTVBAC) is constructed on the basis of the existing models. During the application of this model, an algorithm is constructed to satisfy users' application requirements and security needs under the fine-grained principles of minimum privilege and dynamic separation of duties. The DRTVBAC model has been implemented in an actual system; the results show that the dynamic management of roles associated with tasks and the assignment of roles are more flexible with respect to granting and revoking authority, that the principle of least privilege is met by activating permissions only for the role implementing a specific task, that authority is separated across the duties completed in the workflow, that sensitive information is prevented from being disclosed through a concise and dynamic view interface, and that the requirement of frequently changing task flows is satisfied.
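
    A highly simplified sketch of the kind of role-task permission check such a model implies (generic RBAC/TBAC-style logic, not the paper's DRTVBAC algorithm): a permission is granted only while a role is executing a task assigned to it, approximating least privilege and dynamic separation of duties.

```python
# Hypothetical assignments: which role may perform which task, and which permissions
# are activated only while that task is running.
ROLE_TASKS = {"reviewer": {"approve_order"}, "clerk": {"enter_order"}}
TASK_PERMISSIONS = {"enter_order": {"write_order"}, "approve_order": {"read_order", "sign_order"}}

def authorize(role, active_task, permission):
    """Grant a permission only if the role owns the active task and the task activates that permission."""
    if active_task not in ROLE_TASKS.get(role, set()):
        return False                                   # separation of duties: wrong role for this task
    return permission in TASK_PERMISSIONS.get(active_task, set())   # least privilege: task-scoped rights

print(authorize("clerk", "enter_order", "write_order"))     # True
print(authorize("clerk", "approve_order", "sign_order"))    # False: a clerk cannot also approve
```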

  6. First installation of a dual-room IVR-CT system in the emergency room.

    PubMed

    Wada, Daiki; Nakamori, Yasushi; Kanayama, Shuji; Maruyama, Shuhei; Kawada, Masahiro; Iwamura, Hiromu; Hayakawa, Koichi; Saito, Fukuki; Kuwagata, Yasuyuki

    2018-03-05

    Computed tomography (CT) embedded in the emergency room has gained importance in the early diagnostic phase of trauma care. In 2011, we implemented a new trauma workflow concept with a sliding CT scanner system with interventional radiology features (IVR-CT) that allows CT examination and emergency therapeutic intervention without relocating the patient, which we call the Hybrid emergency room (Hybrid ER). In the Hybrid ER, all life-saving procedures, CT examination, damage control surgery, and transcatheter arterial embolisation can be performed on the same table. Although the trauma workflow realized in the Hybrid ER may improve mortality in severe trauma, the Hybrid ER can potentially affect the efficacy of other in/outpatient diagnostic workflow because one room is occupied by one severely injured patient undergoing both emergency trauma care and CT scanning for long periods. In July 2017, we implemented a new trauma workflow concept with a dual-room sliding CT scanner system with interventional radiology features (dual-room IVR-CT) to increase patient throughput. When we perform emergency surgery or interventional radiology for a severely injured or ill patient in the Hybrid ER, the sliding CT scanner moves to the adjacent CT suite, and we can perform CT scanning of another in/outpatient. We believe that dual-room IVR-CT can contribute to the improvement of both the survival of severely injured or ill patients and patient throughput.

  7. SQL Collaborative Learning Framework Based on SOA

    NASA Astrophysics Data System (ADS)

    Armiati, S.; Awangga, RM

    2018-04-01

    The research is focused on designing a collaborative-learning-oriented framework for fulfilment services in teaching SQL Oracle 10g. The framework builds a foundation for academic fulfilment services performed by a working-unit layer in collaboration with Program Studi Manajemen Informatika. The design phase defined which collaboration models and information technology are proposed for Program Studi Manajemen Informatika, using a collaboration framework inspired by the modelling stages of a Service Oriented Architecture (SOA). The stages begin with analyzing subsystems; this activity is used to determine the subsystems involved, their dependencies, and the workflow between them. After the services have been identified, the second phase is designing the component specifications, which detail the components implemented in each service, including the data, rules, services, configurable profiles, and variations. The third stage is allocating services, assigning each service to the subsystems that have been identified and to its components. The implementation of the framework contributes teaching guides and an application architecture that can be used as a basis for improving services by applying information technology.

  8. The development and implementation of MOSAIQ Integration Platform (MIP) based on the radiotherapy workflow

    NASA Astrophysics Data System (ADS)

    Yang, Xin; He, Zhen-yu; Jiang, Xiao-bo; Lin, Mao-sheng; Zhong, Ning-shan; Hu, Jiang; Qi, Zhen-yu; Bao, Yong; Li, Qiao-qiao; Li, Bao-yue; Hu, Lian-ying; Lin, Cheng-guang; Gao, Yuan-hong; Liu, Hui; Huang, Xiao-yan; Deng, Xiao-wu; Xia, Yun-fei; Liu, Meng-zhong; Sun, Ying

    2017-03-01

    To meet the special demands in China and the particular needs of the radiotherapy department, a MOSAIQ Integration Platform CHN (MIP) based on the workflow of radiation therapy (RT) has been developed as a supplement system to the Elekta MOSAIQ. The MIP adopts a C/S (client-server) structure, and its database is based on the Treatment Planning System (TPS) and MOSAIQ SQL Server 2008, running on the hospital local network. Five network servers, as the core hardware, supply data storage and network services based on cloud services. The core software, written in C#, is developed on the Microsoft Visual Studio platform. The MIP server can offer network services, including entry, query, statistics and printing of information, for about 200 workstations at the same time. The MIP was implemented over the past one and a half years, and some practical patient-oriented functions were developed; the MIP now covers almost the whole workflow of radiation therapy. There are 15 function modules, such as Notice, Appointment, Billing, Document Management (application/execution), and System Management. By June of 2016, the data recorded in the MIP were as follows: 13546 patients, 13533 plan applications, 15475 RT records, 14656 RT summaries, 567048 billing records and 506612 workload records. The MIP based on the RT workflow has been successfully developed and clinically implemented with real-time performance, data security and stable operation, and it is demonstrated to be user-friendly and proven to significantly improve the efficiency of the department. It is key to facilitating information sharing and department management. More functions can be added or modified to further enhance its potential in research and clinical practice.

  9. Patient-Centered Collaborative Care: The Impact of a New Approach to Postpartum Rounds on Residents' Perception of Their Work Environment

    PubMed Central

    Baldwin, Maureen; Hashima, Jason; Guise, Jeanne-Marie; Gregory, William Thomas; Edelman, Alison; Segel, Sally

    2010-01-01

    Objective At our institution, traditional postpartum rounds consisted of separate visits from all members of the obstetric team. This led to patient care inefficiencies and miscommunication. In an effort to improve patient care, patient-centered collaborative care (PCCC) was established, whereby physicians, residents, medical students, nurses, case managers, and social workers conduct rounds as a team. The goal of this observational study was to evaluate how PCCC rounds affected resident physicians' assessment of their work environment. Methods Obstetrics and gynecology residents completed a 13-question written survey designed to assess their sense of workflow, education, and workplace cohesion. Surveys were completed before and 6 months after the implementation of PCCC. Responses were compared in aggregate for preintervention and postintervention with the Pearson χ2 test. Results Ninety-two percent of the obstetrics residents (n = 23) completed the preintervention survey, and 79% (n = 19) completed the postintervention survey. For most measures, there was no difference in resident perception between the 2 time points. After implementation of PCCC rounds, fewer residents felt that rounds were educational (preintervention = 39%, postintervention = 7%; P = .03). Conclusion Residents did not report negative impacts on workflow, cohesion, or general well-being after the implementation of PCCC rounds. However, there was a perception that PCCC rounds negatively impacted the educational value of postpartum rounds. This information will help identify ways to improve the resident physician experience in the obstetric service while optimizing patient care. PMID:21975886
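
    The pre/post comparison of proportions reported above can be reproduced in outline with a Pearson chi-square test on a 2x2 table. The counts below are approximate reconstructions from the reported percentages and respondent numbers (39% of 23, 7% of 19) and are shown purely to illustrate the calculation.

        from scipy.stats import chi2_contingency

        #                educational   not educational
        observed = [[9, 23 - 9],   # preintervention survey (n = 23)
                    [1, 19 - 1]]   # postintervention survey (n = 19)

        # correction=False gives the plain Pearson chi-square statistic.
        chi2, p, dof, expected = chi2_contingency(observed, correction=False)
        print(f"chi2 = {chi2:.2f}, p = {p:.3f}")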

  10. Capacity building in safe nanotechnologies

    NASA Astrophysics Data System (ADS)

    Keller, Markus; Gommel, Udo

    2011-07-01

    In all places where engineered nanoparticles (ENPs) are produced, used, or handled, adequate workplace safety precautions should be implemented to protect workers and the surrounding environment. Any possible accidental release of ENPs should be evaluated, and the potential risks detected in this way have to be eliminated as far as possible. A reasonable safety culture implemented in each ENP-related company will help to meet this challenge. Different infrastructures and workplace designs can help to reduce the risk of workers accidentally coming into contact with ENPs; transferable examples will be shown from the semiconductor and life-science industries. Systems such as clean rooms, glove boxes, fume cupboards, filter and suction systems, and other restricted access barrier systems (RABS) were mainly developed to protect sensitive products, but they can also be used to protect working personnel. Clean environments with respect to airborne particulate contamination can be classified according to ISO 14644-1, and a short insight into this ISO classification will be given. Overall, however, a simple and reasonable workplace and workflow organization will largely reduce the risk of an accidental release of ENPs; this may require adapting existing workflow patterns. Workers have to be made aware of the potential risks. This can be done with appropriate education materials, leaflets, posters, and brochures. These are some of the later outcomes of the NanoDevice dissemination and handbook work package.

  11. Destination bedside: using research findings to visualize optimal unit layouts and health information technology in support of bedside care.

    PubMed

    Watkins, Nicholas; Kennedy, Mary; Lee, Nelson; O'Neill, Michael; Peavey, Erin; Ducharme, Maria; Padula, Cynthia

    2012-05-01

    This study explored the impact of unit design and healthcare information technology (HIT) on nursing workflow and patient-centered care (PCC). Healthcare information technology and unit layout-related predictors of nursing workflow and PCC were measured during a 3-phase study involving questionnaires and work sampling methods. Stepwise multiple linear regressions demonstrated several HIT and unit layout-related factors that impact nursing workflow and PCC.

  12. Using location tracking data to assess efficiency in established clinical workflows.

    PubMed

    Meyer, Mark; Fairbrother, Pamela; Egan, Marie; Chueh, Henry; Sandberg, Warren S

    2006-01-01

    Location tracking systems are becoming more prevalent in clinical settings, yet applications are still not common. We have designed a system to aid in the assessment of clinical workflow efficiency. Location data are captured from active RFID tags and processed into usable form. These data are stored and presented visually, with trending capability over time. The system allows quick assessment of the impact of process changes on workflow and isolates areas for improvement.
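
    As a rough illustration of how raw tag reads might be turned into workflow metrics, the snippet below aggregates hypothetical RFID location events into per-area dwell times and a weekly trend. The event schema, area names, and timestamps are assumptions for the example, not the data model of the system described.

        import pandas as pd

        # Hypothetical RFID read log: one row per (tag, area) visit.
        events = pd.DataFrame({
            "tag_id": ["A17", "A17", "B02", "B02", "A17"],
            "area":   ["prep", "procedure", "prep", "recovery", "recovery"],
            "enter":  pd.to_datetime(["2006-03-01 08:00", "2006-03-01 08:25",
                                      "2006-03-01 08:10", "2006-03-01 09:00",
                                      "2006-03-08 09:05"]),
            "exit":   pd.to_datetime(["2006-03-01 08:20", "2006-03-01 09:10",
                                      "2006-03-01 08:55", "2006-03-01 09:40",
                                      "2006-03-08 09:30"]),
        })

        events["dwell_min"] = (events["exit"] - events["enter"]).dt.total_seconds() / 60

        # Average dwell time per clinical area: a simple efficiency indicator.
        print(events.groupby("area")["dwell_min"].mean())

        # Weekly trend, e.g. to check whether a process change shortened visits.
        weekly = events.set_index("enter")["dwell_min"].resample("W").mean()
        print(weekly)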

  13. Design-based stereology: Planning, volumetry and sampling are crucial steps for a successful study.

    PubMed

    Tschanz, Stefan; Schneider, Jan Philipp; Knudsen, Lars

    2014-01-01

    Quantitative data obtained by means of design-based stereology can add valuable information to studies performed on a diversity of organs, in particular when correlated to functional/physiological and biochemical data. Design-based stereology is based on a sound statistical background and can be used to generate accurate data which are in line with principles of good laboratory practice. In addition, by adjusting the study design an appropriate precision can be achieved to find relevant differences between groups. For the success of the stereological assessment detailed planning is necessary. In this review we focus on common pitfalls encountered during stereological assessment. An exemplary workflow is included, and based on authentic examples, we illustrate a number of sampling principles which can be implemented to obtain properly sampled tissue blocks for various purposes. Copyright © 2013 Elsevier GmbH. All rights reserved.
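
    One of the classic sampling principles used in design-based stereology, systematic uniform random sampling of tissue slabs, can be sketched as follows. The review does not prescribe this exact code; the slab count and sampling period are arbitrary example values.

        import random

        def surs_indices(n_items, period):
            """Systematic uniform random sampling: take every `period`-th item
            after a random start, so every item has the same chance of selection."""
            start = random.randrange(period)
            return list(range(start, n_items, period))

        # Example: an organ cut exhaustively into 60 slabs, sampling fraction 1/6.
        random.seed(1)
        sampled_slabs = surs_indices(n_items=60, period=6)
        print(sampled_slabs)  # ten slabs, evenly spaced after a random offset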

  14. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    NASA Astrophysics Data System (ADS)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.

  15. Workflows and Provenance: Toward Information Science Solutions for the Natural Sciences.

    PubMed

    Gryk, Michael R; Ludäscher, Bertram

    2017-01-01

    The era of big data and ubiquitous computation has brought with it concerns about ensuring reproducibility in this new research environment. It is easy to assume computational methods self-document by their very nature of being exact, deterministic processes. However, similar to laboratory experiments, ensuring reproducibility in the computational realm requires the documentation of both the protocols used (workflows) as well as a detailed description of the computational environment: algorithms, implementations, software environments as well as the data ingested and execution logs of the computation. These two aspects of computational reproducibility (workflows and execution details) are discussed in the context of biomolecular Nuclear Magnetic Resonance spectroscopy (bioNMR) as well as the PRIMAD model for computational reproducibility.

  16. How Can We Better Detect Unauthorized GMOs in Food and Feed Chains?

    PubMed

    Fraiture, Marie-Alice; Herman, Philippe; De Loose, Marc; Debode, Frédéric; Roosens, Nancy H

    2017-06-01

    Current GMO detection systems have limited abilities to detect unauthorized genetically modified organisms (GMOs). Here, we propose a new workflow, based on next-generation sequencing (NGS) technology, to overcome this problem. In providing information about DNA sequences, this high-throughput workflow can distinguish authorized and unauthorized GMOs by strengthening the tools commonly used by enforcement laboratories with the help of NGS technology. In addition, thanks to its massive sequencing capacity, this workflow could be used to monitor GMOs present in the food and feed chain. In view of its potential implementation by enforcement laboratories, we discuss this innovative approach, its current limitations, and its sustainability of use over time. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Implementation and Challenges of Direct Acoustic Dosing into Cell-Based Assays.

    PubMed

    Roberts, Karen; Callis, Rowena; Ikeda, Tim; Paunovic, Amalia; Simpson, Carly; Tang, Eric; Turton, Nick; Walker, Graeme

    2016-02-01

    Since the adoption of Labcyte Echo Acoustic Droplet Ejection (ADE) technology by AstraZeneca in 2005, ADE has become the preferred method for compound dosing into both biochemical and cell-based assays across AstraZeneca research and development globally. The initial implementation of Echos and the direct dosing workflow provided AstraZeneca with a unique set of challenges. In this article, we outline how direct Echo dosing has evolved over the past decade in AstraZeneca. We describe the practical challenges of applying ADE technology to 96-well, 384-well, and 1536-well assays and how AstraZeneca developed and applied software and robotic solutions to generate fully automated and effective cell-based assay workflows. © 2015 Society for Laboratory Automation and Screening.

  18. Dashboard visualizations: Supporting real-time throughput decision-making.

    PubMed

    Franklin, Amy; Gantela, Swaroop; Shifarraw, Salsawit; Johnson, Todd R; Robinson, David J; King, Brent R; Mehta, Amit M; Maddow, Charles L; Hoot, Nathan R; Nguyen, Vickie; Rubio, Adriana; Zhang, Jiajie; Okafor, Nnaemeka G

    2017-07-01

    Providing timely and effective care in the emergency department (ED) requires the management of individual patients as well as the flow and demands of the entire department. Strategic changes to work processes, such as adding a flow coordination nurse or a physician in triage, have demonstrated improvements in throughput times. However, such global strategic changes do not address the real-time, often opportunistic workflow decisions of individual clinicians in the ED. We believe that real-time representation of the status of the entire emergency department and each patient within it through information visualizations will better support clinical decision-making in-the-moment and provide for rapid intervention to improve ED flow. This notion is based on previous work where we found that clinicians' workflow decisions were often based on an in-the-moment local perspective, rather than a global perspective. Here, we discuss the challenges of designing and implementing visualizations for ED through a discussion of the development of our prototype Throughput Dashboard and the potential it holds for supporting real-time decision-making. Copyright © 2017. Published by Elsevier Inc.

  19. Targeted proteomics coming of age - SRM, PRM and DIA performance evaluated from a core facility perspective.

    PubMed

    Kockmann, Tobias; Trachsel, Christian; Panse, Christian; Wahlander, Asa; Selevsek, Nathalie; Grossmann, Jonas; Wolski, Witold E; Schlapbach, Ralph

    2016-08-01

    Quantitative mass spectrometry is a rapidly evolving methodology applied in a large number of omics-type research projects. During the past years, new designs of mass spectrometers have been developed and launched as commercial systems, while in parallel new data acquisition schemes and data analysis paradigms have been introduced. Core facilities provide access to such technologies, but also actively support researchers in finding and applying the best-suited analytical approach. In order to establish a solid foundation for this decision-making process, core facilities need to constantly compare and benchmark the various approaches. In this article we compare the quantitative accuracy and precision of the current state-of-the-art targeted proteomics approaches, single reaction monitoring (SRM), parallel reaction monitoring (PRM), and data-independent acquisition (DIA), across multiple liquid chromatography mass spectrometry (LC-MS) platforms, using a readily available commercial standard sample. All workflows are able to reproducibly generate accurate quantitative data. However, SRM and PRM workflows show higher accuracy and precision compared to DIA approaches, especially when analyzing analytes at low concentrations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
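
    Benchmarking accuracy and precision across acquisition schemes essentially reduces to comparing measured against expected ratios and computing a coefficient of variation per workflow. The replicate values below are invented solely to show the calculation and are not data from the study.

        import statistics

        # Hypothetical measured ratios for one low-abundance peptide,
        # three replicates per acquisition scheme; expected (spiked) ratio = 2.0.
        expected = 2.0
        measured = {
            "SRM": [1.95, 2.02, 1.98],
            "PRM": [2.05, 1.97, 2.01],
            "DIA": [1.70, 2.40, 2.10],
        }

        for workflow, values in measured.items():
            mean = statistics.mean(values)
            accuracy_err = 100 * abs(mean - expected) / expected  # % deviation from truth
            cv = 100 * statistics.stdev(values) / mean            # precision as %CV
            print(f"{workflow}: accuracy error {accuracy_err:.1f}%, CV {cv:.1f}%")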

  20. The GridEcon Platform: A Business Scenario Testbed for Commercial Cloud Services

    NASA Astrophysics Data System (ADS)

    Risch, Marcel; Altmann, Jörn; Guo, Li; Fleming, Alan; Courcoubetis, Costas

    In this paper, we present the GridEcon Platform, a testbed for designing and evaluating economics-aware services in a commercial Cloud computing setting. The Platform is based on the idea that the exact behavior of such services is difficult to predict in the context of a market, and that an environment for evaluating their behavior in an emulated market is therefore needed. To identify the components of the GridEcon Platform, a number of economics-aware services and their interactions were envisioned. The two most important components of the platform are the Marketplace and the Workflow Engine. The Workflow Engine allows the simple composition of a market environment by describing the service interactions between economics-aware services. The Marketplace allows trading goods using different market mechanisms. The capabilities of these components, in conjunction with the economics-aware services, are described in detail. The validation of an implemented market mechanism and a capacity planning service using the GridEcon Platform also demonstrated its usefulness.

  1. Experts speak: advice from key informants to small, rural hospitals on implementing the electronic health record system.

    PubMed

    Craven, Catherine K; Sievert, MaryEllen C; Hicks, Lanis L; Alexander, Gregory L; Hearne, Leonard B; Holmes, John H

    2013-01-01

    The US government has allocated $30 billion to implement Electronic Health Records (EHRs) in hospitals and provider practices through a policy called Meaningful Use. Small, rural hospitals, particularly those designated as Critical Access Hospitals (CAHs), which comprise nearly a quarter of US hospitals, had not implemented EHRs before, and little is known about implementation in this setting. We interviewed a spectrum of 31 experts in the domain. The interviews were then analyzed qualitatively to ascertain the experts' recommendations, and nineteen themes emerged. The pool of experts included staff from CAHs that had recently implemented EHRs, which allowed us to compare their answers with those of other experts and make recommendations for stakeholders. CAH peer experts focused less on issues such as physician buy-in, communication, and the EHR team, and none of them indicated concern with or focus on clinical decision support systems, leadership, or governance. They were especially concerned with system selection, technology, preparatory work, and a need to know more about workflow and optimization. These differences were explained by the size and nature of these small hospitals.

  2. Scenarios, personas and user stories from design ethnography: Evidence-based design representations of communicable disease investigations

    PubMed Central

    Turner, Anne M; Reeder, Blaine; Ramey, Judith

    2014-01-01

    Purpose Despite years of effort and millions of dollars spent to create unified electronic communicable disease reporting systems, the goal remains elusive. A major barrier has been a lack of understanding by system designers of communicable disease (CD) work and the public health workers who perform this work. This study reports on the application of user-centered design representations, traditionally used for improving interface design, to translate the complex CD work identified through ethnographic studies to guide designers and developers of CD systems. The purpose of this work is to: (1) better understand public health practitioners and their information workflow with respect to communicable disease (CD) monitoring and control at a local health department, and (2) develop evidence-based design representations that model this CD work to inform the design of future disease surveillance systems. Methods We performed extensive onsite semi-structured interviews, targeted work shadowing and a focus group to characterize local health department communicable disease workflow. Informed by principles of design ethnography and user-centered design (UCD), we created persona, scenarios and user stories to accurately represent the user to system designers. Results We sought to convey to designers the key findings from the ethnographic studies: (1) public health CD work is mobile and episodic, in contrast to current CD reporting systems, which are stationary and fixed, (2) health department efforts are focused on CD investigation and response rather than reporting, and (3) current CD information systems must conform to PH workflow to ensure their usefulness. In an effort to illustrate our findings to designers, we developed three contemporary design-support representations: persona, scenario, and user story. Conclusions Through application of user-centered design principles, we were able to create design representations that illustrate complex public health communicable disease workflow and key user characteristics to inform the design of CD information systems for public health. PMID:23618996

  3. Applications of process improvement techniques to improve workflow in abdominal imaging.

    PubMed

    Tamm, Eric Peter

    2016-03-01

    Major changes in the management and funding of healthcare are underway that will markedly change the way radiology studies will be reimbursed. The result will be the need to deliver radiology services in a highly efficient manner while maintaining quality. The science of process improvement provides a practical approach to improve the processes utilized in radiology. This article will address in a step-by-step manner how to implement process improvement techniques to improve workflow in abdominal imaging.

  4. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.
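
    Conceptually, the final step of such a pipeline merges per-sample quantification results into one gene-by-sample matrix. The sketch below shows that merge with made-up count tables; it is not the actual OSG-GEM/Pegasus code.

        import pandas as pd

        # Hypothetical per-sample outputs of the upstream alignment/quantification
        # steps: one (gene -> count) table per sequenced sample.
        sample_counts = {
            "sample_A": pd.Series({"geneX": 120, "geneY": 15, "geneZ": 0}),
            "sample_B": pd.Series({"geneX": 98,  "geneY": 40, "geneZ": 7}),
            "sample_C": pd.Series({"geneX": 110, "geneY": 22, "geneZ": 3}),
        }

        # Assemble the gene expression matrix (GEM): rows = genes, columns = samples.
        gem = pd.DataFrame(sample_counts).fillna(0).astype(int)
        print(gem)

        # A typical downstream normalization, e.g. counts per million per sample.
        cpm = gem.div(gem.sum(axis=0), axis=1) * 1e6
        print(cpm.round(1))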

  5. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  6. Scientific workflows as productivity tools for drug discovery.

    PubMed

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  7. Understanding critical barriers to implementing a clinical information system in a nursing home through the lens of a socio-technical perspective.

    PubMed

    Or, Calvin; Dohan, Michael; Tan, Joseph

    2014-09-01

    This paper addresses key barriers to implementing a clinical information system (CIS) in a Hong Kong nursing home setting, from a healthcare specific socio-technical perspective. Data was collected through field observations (n = 12) and semi-structured individual interviews (n = 18) of CIS stakeholders in a Hong Kong nursing home, and analyzed using the immersion/crystallization approach. Complex interactions relevant to our case were contextualized and interpreted within the perspective of the Sittig-Singh Healthcare Socio-Technical Framework (HSTF). Three broad clusters of implementation barriers from the eight HSTF dimensions were identified: (a) Infrastructure-based barriers, which relate to conflict between government regulations and system functional needs of users; lack of financial support; inconsistency between workflow, work policy, and procedures; and inadequacy of hardware-software infrastructural and technical support; (b) Process-based barriers, which relate to mismatch between the technology, existing work practice and workflow, and communication; low system speed, accessibility, and stability; deficient computer literacy; more experience in health care profession; clinical content inadequacy and unavailability; as well as poor system usefulness and user interface design; and (c) Outcome-based barriers, which relate to the lack of measurement and monitoring of system effectiveness. Two additional dimensions underlining the importance of the ability of a CIS to change are proposed to extend the Sittig-Singh HSTF. First, advocacy would promote the articulation and influence of changes in the system and subsequent outcomes by CIS stakeholders, and second, adaptability would ensure the ability of the system to adjust to emerging needs. The broad set of discovered implementation shortcomings expands prior research on why CIS can fail in nursing home settings. Moreover, our investigation offers a knowledge base and recommendations that can serve as a guide for future implementation strategies and policies in CIS initiatives.

  8. Integrating data from an online diabetes prevention program into an electronic health record and clinical workflow, a design phase usability study.

    PubMed

    Mishuris, Rebecca Grochow; Yoder, Jordan; Wilson, Dan; Mann, Devin

    2016-07-11

    Health information is increasingly being digitally stored and exchanged. The public is regularly collecting and storing health-related data on their own electronic devices and in the cloud. Diabetes prevention is an increasingly important preventive health measure, and diet and exercise are key components of this. Patients are turning to online programs to help them lose weight. Despite primary care physicians being important in patients' weight loss success, there is no exchange of information between the primary care provider (PCP) and these online weight loss programs. There is an emerging opportunity to integrate this data directly into the electronic health record (EHR), but little is known about what information to share or how to share it most effectively. This study aims to characterize the preferences of providers concerning the integration of externally generated lifestyle modification data into a primary care EHR workflow. We performed a qualitative study using two rounds of semi-structured interviews with primary care providers. We used an iterative design process involving primary care providers, health information technology software developers and health services researchers to develop the interface. Using grounded-theory thematic analysis 4 themes emerged from the interviews: 1) barriers to establishing healthy lifestyles, 2) features of a lifestyle modification program, 3) reporting of outcomes to the primary care provider, and 4) integration with primary care. These themes guided the rapid-cycle agile design process of an interface of data from an online diabetes prevention program into the primary care EHR workflow. The integration of external health-related data into the EHR must be embedded into the provider workflow in order to be useful to the provider and beneficial for the patient. Accomplishing this requires evaluation of that clinical workflow during software design. The development of this novel interface used rapid cycle iterative design, early involvement by providers, and usability testing methodology. This provides a framework for how to integrate external data into provider workflow in efficient and effective ways. There is now the potential to realize the importance of having this data available in the clinical setting for patient engagement and health outcomes.

  9. Design and implementation of GRID-based PACS in a hospital with multiple imaging departments

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Jin, Jin; Sun, Jianyong; Zhang, Jianguo

    2008-03-01

    In an enterprise healthcare environment there are usually multiple clinical departments providing imaging-enabled healthcare services, such as radiology, oncology, pathology, and cardiology. The picture archiving and communication system (PACS) is therefore required not only to support radiology-based image display and workflow and data flow management, but also to provide more specialized image processing and management tools for other departments offering imaging-guided diagnosis and therapy, and there is an urgent demand to integrate the multiple PACSs together to provide patient-oriented imaging services for enterprise collaborative healthcare. In this paper, we present the design method and implementation strategy for developing a grid-based PACS (Grid-PACS) for a hospital with multiple imaging departments or centers. The Grid-PACS functions as a middleware layer between the traditional PACS archiving servers and the workstations or image viewing clients, and provides DICOM image communication and WADO services to end users. Images can be stored in multiple distributed archiving servers but managed in a centralized manner. The grid-based PACS has automatic image backup and disaster recovery services and can provide the best image retrieval path to image requesters based on optimization algorithms. The designed grid-based PACS has been implemented in Shanghai Huadong Hospital and has been running smoothly for two years.
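
    The WADO service mentioned above retrieves individual DICOM objects over HTTP by their UIDs. A minimal client-side sketch is shown below; the gateway host and all UIDs are placeholders, not identifiers from the deployed system.

        import urllib.parse

        # Hypothetical Grid-PACS gateway endpoint, for illustration only.
        WADO_ENDPOINT = "http://pacs-gateway.example.org/wado"

        params = {
            "requestType": "WADO",                     # WADO-URI request marker
            "studyUID": "1.2.840.113619.2.55.1",       # placeholder study instance UID
            "seriesUID": "1.2.840.113619.2.55.1.1",    # placeholder series instance UID
            "objectUID": "1.2.840.113619.2.55.1.1.1",  # placeholder SOP instance UID
            "contentType": "application/dicom",        # request the raw DICOM object
        }

        url = WADO_ENDPOINT + "?" + urllib.parse.urlencode(params)
        print(url)

        # In a real deployment the gateway would resolve the optimal archive replica
        # and stream the object back, e.g. with urllib.request.urlopen(url).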

  10. Case study: design and implementation of training for scientists deploying to Ebola diagnostic field laboratories in Sierra Leone: October 2014 to February 2016

    PubMed Central

    Lewis, Suzanna M.; Lansley, Amber; Fraser, Sara; Shieber, Clare; Shah, Sonal; Semper, Amanda; Bailey, Daniel; Busuttil, Jason; Evans, Liz; Carroll, Miles W.; Silman, Nigel J.; Brooks, Tim; Shallcross, Jane A.

    2017-01-01

    As part of the UK response to the 2013–2016 Ebola virus disease (EVD) epidemic in West Africa, Public Health England (PHE) were tasked with establishing three field Ebola virus (EBOV) diagnostic laboratories in Sierra Leone by the UK Department for International Development (DFID). These provided diagnostic support to the Ebola Treatment Centre (ETC) facilities located in Kerry Town, Makeni and Port Loko. The Novel and Dangerous Pathogens (NADP) Training group at PHE, Porton Down, designed and implemented a pre-deployment Ebola diagnostic laboratory training programme for UK volunteer scientists being deployed to the PHE EVD laboratories. Here, we describe the training, workflow and capabilities of these field laboratories for use in response to disease epidemics and in epidemiological surveillance. We discuss the training outcomes, the laboratory outputs, lessons learned and the legacy value of the support provided. We hope this information will assist in the recruitment and training of staff for future responses and in the design and implementation of rapid deployment diagnostic field laboratories for future outbreaks of high consequence pathogens. This article is part of the themed issue ‘The 2013–2016 West African Ebola epidemic: data, decision-making and disease control’. PMID:28396470

  11. Scenarios, personas and user stories: user-centered evidence-based design representations of communicable disease investigations.

    PubMed

    Turner, Anne M; Reeder, Blaine; Ramey, Judith

    2013-08-01

    Despite years of effort and millions of dollars spent to create unified electronic communicable disease reporting systems, the goal remains elusive. A major barrier has been a lack of understanding by system designers of communicable disease (CD) work and the public health workers who perform this work. This study reports on the application of user-centered design representations, traditionally used for improving interface design, to translate the complex CD work identified through ethnographic studies to guide designers and developers of CD systems. The purpose of this work is to: (1) better understand public health practitioners and their information workflow with respect to CD monitoring and control at a local health agency, and (2) to develop evidence-based design representations that model this CD work to inform the design of future disease surveillance systems. We performed extensive onsite semi-structured interviews, targeted work shadowing and a focus group to characterize local health agency CD workflow. Informed by principles of design ethnography and user-centered design we created persona, scenarios and user stories to accurately represent the user to system designers. We sought to convey to designers the key findings from ethnographic studies: (1) public health CD work is mobile and episodic, in contrast to current CD reporting systems, which are stationary and fixed, (2) health agency efforts are focused on CD investigation and response rather than reporting and (3) current CD information systems must conform to public health workflow to ensure their usefulness. In an effort to illustrate our findings to designers, we developed three contemporary design-support representations: persona, scenario, and user story. Through application of user-centered design principles, we were able to create design representations that illustrate complex public health communicable disease workflow and key user characteristics to inform the design of CD information systems for public health. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Implementing standards for the interoperability among healthcare providers in the public regionalized Healthcare Information System of the Lombardy Region.

    PubMed

    Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano

    2012-08-01

    Information technologies (ITs) have now entered the everyday workflow in a variety of healthcare providers with a certain degree of independence. This independence may be the cause of difficulty in interoperability between information systems and it can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, that has been able, in the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region, and providing full access to clinical and health-related documents independently from the healthcare organization that generated the document itself. This goal, in a region with almost 10 millions citizens, was achieved through a twofold approach: first, the political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, providing a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of such regional interoperability specifications enabled the communication among heterogeneous systems placed in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles which refer to HL7 standards are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE patient administration management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions, and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180,000 first aid medical reports, and 58,000 discharge summaries. Hence, despite there being still work in progress, the Lombardy Region healthcare system is a fully interoperable social healthcare system connecting patients, healthcare providers, healthcare organizations, and healthcare professionals in a large and heterogeneous territory through the implementation of international health standards. Copyright © 2012 Elsevier Inc. All rights reserved.
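
    For context, HL7 v2 messages of the kind exchanged in such an infrastructure are pipe-delimited segment lists. The snippet below assembles a minimal, purely illustrative ADT (patient administration) message; all applications, facilities, and identifiers are invented, and the region's actual interoperability specifications and IHE PAM transactions carry many more segments and fields.

        # Build a minimal, illustrative HL7 v2 ADT^A01 (admission) message.
        segments = [
            "MSH|^~\\&|HIS|HOSP_A|REGIONAL_HUB|LOMBARDY|201205031015||ADT^A01|MSG0001|P|2.5",
            "EVN|A01|201205031015",
            "PID|1||123456^^^HOSP_A^MR||ROSSI^MARIO||19600101|M",
            "PV1|1|I|MED^101^A",
        ]

        hl7_message = "\r".join(segments)  # HL7 v2 separates segments with carriage returns
        print(hl7_message.replace("\r", "\n"))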

  13. nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.

    PubMed

    Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia

    2017-12-01

    Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have been limited also by time-consuming modeling workflows and high skilled expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely-used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create supporting objects necessary to create models, e.g. surfaces, anatomical landmarks, reference systems; and a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application. We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this would help promote personalized applications in musculoskeletal biomechanics, including larger sample size studies, and might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. The medical simulation markup language - simplifying the biomechanical modeling workflow.

    PubMed

    Suwelack, Stefan; Stoll, Markus; Schalck, Sebastian; Schoch, Nicolai; Dillmann, Rüdiger; Bendl, Rolf; Heuveline, Vincent; Speidel, Stefanie

    2014-01-01

    Modeling and simulation of the human body by means of continuum mechanics has become an important tool in diagnostics, computer-assisted interventions and training. This modeling approach seeks to construct patient-specific biomechanical models from tomographic data. Usually many different tools such as segmentation and meshing algorithms are involved in this workflow. In this paper we present a generalized and flexible description for biomechanical models. The unique feature of the new modeling language is that it not only describes the final biomechanical simulation, but also the workflow how the biomechanical model is constructed from tomographic data. In this way, the MSML can act as a middleware between all tools used in the modeling pipeline. The MSML thus greatly facilitates the prototyping of medical simulation workflows for clinical and research purposes. In this paper, we not only detail the XML-based modeling scheme, but also present a concrete implementation. Different examples highlight the flexibility, robustness and ease-of-use of the approach.

  15. Automation of lidar-based hydrologic feature extraction workflows using GIS

    NASA Astrophysics Data System (ADS)

    Borlongan, Noel Jerome B.; de la Cruz, Roel M.; Olfindo, Nestor T.; Perez, Anjillyn Mae C.

    2016-10-01

    With the advent of LiDAR technology, higher resolution datasets have become available for use in different remote sensing and GIS applications. One significant application of LiDAR datasets in the Philippines is in resource feature extraction. Feature extraction using LiDAR datasets requires complex and repetitive workflows, which can consume a great deal of researchers' time when executed and supervised manually. The Development of the Philippine Hydrologic Dataset for Watersheds from LiDAR Surveys (PHD), a project under the Nationwide Detailed Resources Assessment Using LiDAR (Phil-LiDAR 2) program, created a set of scripts, the PHD Toolkit, to automate its processes and workflows necessary for hydrologic feature extraction, specifically Streams and Drainages, Irrigation Network, and Inland Wetlands, using LiDAR datasets. These scripts are written in Python and can be added to the ArcGIS® environment as a toolbox. The toolkit is currently being used as an aid for researchers in hydrologic feature extraction by simplifying the workflows, eliminating human errors when providing the inputs, and providing quick and easy-to-use tools for repetitive tasks. This paper discusses the actual implementation of the different workflows developed by Phil-LiDAR 2 Project 4 for Streams, Irrigation Network and Inland Wetlands extraction.
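
    The abstract notes that the scripts are written in Python and exposed as an ArcGIS toolbox. A skeletal Python toolbox (.pyt) of the kind that could wrap such a workflow is sketched below; the tool name, parameters, and processing steps are hypothetical and do not reproduce the actual PHD Toolkit code, and running it requires an ArcGIS installation providing arcpy.

        import arcpy

        class Toolbox(object):
            def __init__(self):
                self.label = "Hydrologic Feature Extraction (example)"
                self.alias = "hydro"
                self.tools = [ExtractStreams]

        class ExtractStreams(object):
            def __init__(self):
                self.label = "Extract Streams from LiDAR DTM"
                self.description = "Hypothetical example of an automated extraction step."

            def getParameterInfo(self):
                dtm = arcpy.Parameter(displayName="Input LiDAR DTM", name="in_dtm",
                                      datatype="DERasterDataset",
                                      parameterType="Required", direction="Input")
                out_fc = arcpy.Parameter(displayName="Output stream features", name="out_streams",
                                         datatype="DEFeatureClass",
                                         parameterType="Required", direction="Output")
                return [dtm, out_fc]

            def execute(self, parameters, messages):
                dtm = parameters[0].valueAsText
                out_fc = parameters[1].valueAsText
                # The real workflow would fill sinks, derive flow direction and
                # accumulation, threshold the accumulation grid, and vectorize streams.
                messages.addMessage("Extracting streams from {} into {}".format(dtm, out_fc))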

  16. Open Targets: a platform for therapeutic target identification and validation

    PubMed Central

    Koscielny, Gautier; An, Peter; Carvalho-Silva, Denise; Cham, Jennifer A.; Fumis, Luca; Gasparyan, Rippa; Hasan, Samiul; Karamanis, Nikiforos; Maguire, Michael; Papa, Eliseo; Pierleoni, Andrea; Pignatelli, Miguel; Platt, Theo; Rowland, Francis; Wankar, Priyanka; Bento, A. Patrícia; Burdett, Tony; Fabregat, Antonio; Forbes, Simon; Gaulton, Anna; Gonzalez, Cristina Yenyxe; Hermjakob, Henning; Hersey, Anne; Jupe, Steven; Kafkas, Şenay; Keays, Maria; Leroy, Catherine; Lopez, Francisco-Javier; Magarinos, Maria Paula; Malone, James; McEntyre, Johanna; Munoz-Pomer Fuentes, Alfonso; O'Donovan, Claire; Papatheodorou, Irene; Parkinson, Helen; Palka, Barbara; Paschall, Justin; Petryszak, Robert; Pratanwanich, Naruemon; Sarntivijal, Sirarat; Saunders, Gary; Sidiropoulos, Konstantinos; Smith, Thomas; Sondka, Zbyslaw; Stegle, Oliver; Tang, Y. Amy; Turner, Edward; Vaughan, Brendan; Vrousgou, Olga; Watkins, Xavier; Martin, Maria-Jesus; Sanseau, Philippe; Vamathevan, Jessica; Birney, Ewan; Barrett, Jeffrey; Dunham, Ian

    2017-01-01

    We have designed and developed a data integration and visualization platform that provides evidence about the association of known and potential drug targets with diseases. The platform is designed to support identification and prioritization of biological targets for follow-up. Each drug target is linked to a disease using integrated genome-wide data from a broad range of data sources. The platform provides either a target-centric workflow to identify diseases that may be associated with a specific target, or a disease-centric workflow to identify targets that may be associated with a specific disease. Users can easily transition between these target- and disease-centric workflows. The Open Targets Validation Platform is accessible at https://www.targetvalidation.org. PMID:27899665

  17. Opportunistic Computing with Lobster: Lessons Learned from Scaling up to 25k Non-Dedicated Cores

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Yannakopoulos, Anna; Tovar, Benjamin; Donnelly, Patrick; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

    We previously described Lobster, a workflow management tool for exploiting volatile opportunistic computing resources for computation in HEP. We will discuss the various challenges that have been encountered while scaling up the simultaneous CPU core utilization and the software improvements required to overcome these challenges. Categories: Workflows can now be divided into categories based on their required system resources. This allows the batch queueing system to optimize assignment of tasks to nodes with the appropriate capabilities. Within each category, limits can be specified for the number of running jobs to regulate the utilization of communication bandwidth. System resource specifications for a task category can now be modified while a project is running, avoiding the need to restart the project if resource requirements differ from the initial estimates. Lobster now implements time limits on each task category to voluntarily terminate tasks. This allows partially completed work to be recovered. Workflow dependency specification: One workflow often requires data from other workflows as input. Rather than waiting for earlier workflows to be completed before beginning later ones, Lobster now allows dependent tasks to begin as soon as sufficient input data has accumulated. Resource monitoring: Lobster utilizes a new capability in Work Queue to monitor the system resources each task requires in order to identify bottlenecks and optimally assign tasks. The capability of the Lobster opportunistic workflow management system for HEP computation has been significantly increased. We have demonstrated efficient utilization of 25 000 non-dedicated cores and achieved a data input rate of 30 Gb/s and an output rate of 500GB/h. This has required new capabilities in task categorization, workflow dependency specification, and resource monitoring.
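
    The task-category mechanism described (per-category resource requests, concurrency ceilings, and voluntary time limits) can be pictured with a small configuration sketch. The category names, fields, and values below are hypothetical and do not reproduce Lobster's actual configuration syntax.

        from dataclasses import dataclass

        @dataclass
        class TaskCategory:
            """Hypothetical per-category resource policy."""
            name: str
            cores: int         # cores requested per task
            memory_mb: int     # memory requested per task
            max_running: int   # concurrency ceiling, e.g. to regulate bandwidth use
            wall_time_s: int   # voluntary time limit so partial work can be recovered

        categories = [
            TaskCategory("simulation", cores=4, memory_mb=8000,
                         max_running=2000, wall_time_s=6 * 3600),
            TaskCategory("merge_outputs", cores=1, memory_mb=2000,
                         max_running=200, wall_time_s=1 * 3600),
        ]

        def schedulable(category, running_now):
            """Admit a new task only while the category is under its concurrency limit."""
            return running_now < category.max_running

        for cat in categories:
            print(cat.name, "admit another task:", schedulable(cat, running_now=150))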

  18. Redesign of a computerized clinical reminder for colorectal cancer screening: a human-computer interaction evaluation

    PubMed Central

    2011-01-01

    Background Based on barriers to the use of computerized clinical decision support (CDS) learned in an earlier field study, we prototyped design enhancements to the Veterans Health Administration's (VHA's) colorectal cancer (CRC) screening clinical reminder to compare against the VHA's current CRC reminder. Methods In a controlled simulation experiment, 12 primary care providers (PCPs) used prototypes of the current and redesigned CRC screening reminder in a within-subject comparison. Quantitative measurements were based on a usability survey, workload assessment instrument, and workflow integration survey. We also collected qualitative data on both designs. Results Design enhancements to the VHA's existing CRC screening clinical reminder positively impacted aspects of usability and workflow integration but not workload. The qualitative analysis revealed broad support across participants for the design enhancements with specific suggestions for improving the reminder further. Conclusions This study demonstrates the value of a human-computer interaction evaluation in informing the redesign of information tools to foster uptake, integration into workflow, and use in clinical practice. PMID:22126324

  19. Incorporating medication indications into the prescribing process.

    PubMed

    Kron, Kevin; Myers, Sara; Volk, Lynn; Nathan, Aaron; Neri, Pamela; Salazar, Alejandra; Amato, Mary G; Wright, Adam; Karmiy, Sam; McCord, Sarah; Seoane-Vazquez, Enrique; Eguale, Tewodros; Rodriguez-Monguio, Rosa; Bates, David W; Schiff, Gordon

    2018-04-19

    The incorporation of medication indications into the prescribing process to improve patient safety is discussed. Currently, most prescriptions lack a key piece of information needed for safe medication use: the patient-specific drug indication. Integrating indications could pave the way for safer prescribing in multiple ways, including avoiding look-alike/sound-alike errors, facilitating selection of drugs of choice, aiding in communication among the healthcare team, bolstering patient understanding and adherence, and organizing medication lists to facilitate medication reconciliation. Although strongly supported by pharmacists, multiple prior attempts to encourage prescribers to include the indication on prescriptions have not been successful. We convened 6 expert panels to consult high-level stakeholders on system design considerations and requirements necessary for building and implementing an indications-based computerized prescriber order-entry (CPOE) system. We summarize our findings from the 6 expert stakeholder panels, including rationale, literature findings, potential benefits, and challenges of incorporating indications into the prescribing process. Based on this stakeholder input, design requirements for a new CPOE interface and workflow have been identified. The emergence of universal electronic prescribing and content knowledge vendors has laid the groundwork for incorporating indications into the CPOE prescribing process. As medication prescribing moves in the direction of inclusion of the indication, it is imperative to design CPOE systems to efficiently and effectively incorporate indications into prescriber workflows and optimize ways this can best be accomplished. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  20. Informing the design of clinical decision support services for evaluation of children with minor blunt head trauma in the emergency department: a sociotechnical analysis.

    PubMed

    Sheehan, Barbara; Nigrovic, Lise E; Dayan, Peter S; Kuppermann, Nathan; Ballard, Dustin W; Alessandrini, Evaline; Bajaj, Lalit; Goldberg, Howard; Hoffman, Jeffrey; Offerman, Steven R; Mark, Dustin G; Swietlik, Marguerite; Tham, Eric; Tzimenatos, Leah; Vinson, David R; Jones, Grant S; Bakken, Suzanne

    2013-10-01

    Integration of clinical decision support services (CDSS) into electronic health records (EHRs) may be integral to widespread dissemination and use of clinical prediction rules in the emergency department (ED). However, the best way to design such services to maximize their usefulness in such a complex setting is poorly understood. We conducted a multi-site cross-sectional qualitative study whose aim was to describe the sociotechnical environment in the ED to inform the design of a CDSS intervention to implement the Pediatric Emergency Care Applied Research Network (PECARN) clinical prediction rules for children with minor blunt head trauma. Informed by a sociotechnical model consisting of eight dimensions, we conducted focus groups, individual interviews and workflow observations in 11 EDs, of which 5 were located in academic medical centers and 6 were in community hospitals. A total of 126 ED clinicians, information technology specialists, and administrators participated. We clustered data into 19 categories of sociotechnical factors through a process of thematic analysis and subsequently organized the categories into a sociotechnical matrix consisting of three high-level sociotechnical dimensions (workflow and communication, organizational factors, human factors) and three themes (interdisciplinary assessment processes, clinical practices related to prediction rules, EHR as a decision support tool). Design challenges that emerged from the analysis included the need to use structured data fields to support data capture and re-use while maintaining efficient care processes, supporting interdisciplinary communication, and facilitating family-clinician interaction for decision-making. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Interprofessional Health Team Communication About Hospital Discharge: An Implementation Science Evaluation Study.

    PubMed

    Bahr, Sarah J; Siclovan, Danielle M; Opper, Kristi; Beiler, Joseph; Bobay, Kathleen L; Weiss, Marianne E

    The Consolidated Framework for Implementation Research guided formative evaluation of the implementation of a redesigned interprofessional team rounding process. The purpose of the redesigned process was to improve health team communication about hospital discharge. Themes emerging from interviews of patients, nurses, and providers revealed the inherent value and positive characteristics of the new process, but also workflow, team hierarchy, and process challenges to successful implementation. The evaluation identified actionable recommendations for modifying the implementation process.

  2. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure

    PubMed Central

    2016-01-01

    Background Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. Objective The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. Methods We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. Results We identified 5 high-level macrocognitive processes affecting medication management—sensemaking, planning, coordination, monitoring, and decision making—and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Conclusions Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation. PMID:27733331

  3. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure.

    PubMed

    Mickelson, Robin S; Unertl, Kim M; Holden, Richard J

    2016-10-12

    Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. We identified 5 high-level macrocognitive processes affecting medication management-sensemaking, planning, coordination, monitoring, and decision making-and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation.

  4. MALDI-TOF-MS with PLS Modeling Enables Strain Typing of the Bacterial Plant Pathogen Xanthomonas axonopodis

    NASA Astrophysics Data System (ADS)

    Sindt, Nathan M.; Robison, Faith; Brick, Mark A.; Schwartz, Howard F.; Heuberger, Adam L.; Prenni, Jessica E.

    2018-02-01

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a fast and effective tool for microbial species identification. However, current approaches are limited to species-level identification even when genetic differences are known. Here, we present a novel workflow that applies the statistical method of partial least squares discriminant analysis (PLS-DA) to MALDI-TOF-MS protein fingerprint data of Xanthomonas axonopodis, an important bacterial plant pathogen of fruit and vegetable crops. Mass spectra of 32 X. axonopodis strains were used to create a mass spectral library and PLS-DA was employed to model the closely related strains. A robust workflow was designed to optimize the PLS-DA model by assessing the model performance over a range of signal-to-noise ratios (s/n) and mass filter (MF) thresholds. The optimized parameters were observed to be s/n = 3 and MF = 0.7. The model correctly classified 83% of spectra withheld from the model as a test set. A new decision rule was developed, termed the rolled-up Maximum Decision Rule (ruMDR), and this method improved identification rates to 92%. These results demonstrate that MALDI-TOF-MS protein fingerprints of bacterial isolates can be utilized to enable identification at the strain level. Furthermore, the open-source framework of this workflow allows for broad implementation across various instrument platforms as well as integration with alternative modeling and classification algorithms.
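
    A minimal sketch of the kind of analysis described above, assuming peak intensities have already been extracted into a feature matrix: PLS-DA implemented as PLS regression on one-hot class labels (scikit-learn), with a small grid over signal-to-noise and mass-filter thresholds. The filter rule, synthetic spectra, and parameter values are placeholders, not the authors' pipeline.

```python
# Illustrative only: synthetic fingerprints and a hypothetical mass-filter rule.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

def preprocess(spectra, noise, s_n=3.0, mass_filter=0.7):
    """Keep peaks above s_n * noise, then drop m/z features present in
    fewer than `mass_filter` of the spectra (hypothetical filter rule)."""
    peaks = np.where(spectra >= s_n * noise, spectra, 0.0)
    presence = (peaks > 0).mean(axis=0)
    return peaks[:, presence >= mass_filter]

def plsda_accuracy(X, y, n_components=5):
    """PLS-DA as PLS regression on one-hot labels; report held-out accuracy."""
    classes = np.unique(y)
    Y = (y[:, None] == classes[None, :]).astype(float)   # one-hot encoding
    Xtr, Xte, Ytr, Yte = train_test_split(X, Y, test_size=0.3, random_state=0)
    model = PLSRegression(n_components=n_components).fit(Xtr, Ytr)
    pred = model.predict(Xte).argmax(axis=1)
    return (pred == Yte.argmax(axis=1)).mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    spectra = rng.gamma(2.0, 1.0, size=(64, 500))   # placeholder peak intensities
    labels = rng.integers(0, 4, size=64)            # placeholder strain labels
    noise = 0.3                                     # placeholder noise estimate
    for s_n in (2.0, 3.0):
        for mf in (0.3, 0.5, 0.7):
            X = preprocess(spectra, noise, s_n, mf)
            if X.shape[1] < 5:                      # skip overly aggressive settings
                continue
            print(f"s/n={s_n} MF={mf} features={X.shape[1]} "
                  f"accuracy={plsda_accuracy(X, labels):.2f}")
```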

  5. DietPal: A Web-Based Dietary Menu-Generating and Management System

    PubMed Central

    Abdullah, Siti Norulhuda; Shahar, Suzana; Abdul-Hamid, Helmi; Khairudin, Nurkahirizan; Yusoff, Mohamed; Ghazali, Rafidah; Mohd-Yusoff, Nooraini; Shafii, Nik Shanita; Abdul-Manaf, Zaharah

    2004-01-01

    Background Attempts in current health care practice to make health care more accessible, effective, and efficient through the use of information technology could include implementation of computer-based dietary menu generation. While several of such systems already exist, their focus is mainly to assist healthy individuals calculate their calorie intake and to help monitor the selection of menus based upon a prespecified calorie value. Although these prove to be helpful in some ways, they are not suitable for monitoring, planning, and managing patients' dietary needs and requirements. This paper presents a Web-based application that simulates the process of menu suggestions according to a standard practice employed by dietitians. Objective To model the workflow of dietitians and to develop, based on this workflow, a Web-based system for dietary menu generation and management. The system is aimed to be used by dietitians or by medical professionals of health centers in rural areas where there are no designated qualified dietitians. Methods First, a user-needs study was conducted among dietitians in Malaysia. The first survey of 93 dietitians (with 52 responding) was an assessment of information needed for dietary management and evaluation of compliance towards a dietary regime. The second study consisted of ethnographic observation and semi-structured interviews with 14 dietitians in order to identify the workflow of a menu-suggestion process. We subsequently designed and developed a Web-based dietary menu generation and management system called DietPal. DietPal has the capability of automatically calculating the nutrient and calorie intake of each patient based on the dietary recall as well as generating suitable diet and menu plans according to the calorie and nutrient requirement of the patient, calculated from anthropometric measurements. The system also allows reusing stored or predefined menus for other patients with similar health and nutrient requirements. Results We modeled the workflow of menu-suggestion activity currently adhered to by dietitians in Malaysia. Based on this workflow, a Web-based system was developed. Initial post evaluation among 10 dietitians indicates that they are comfortable with the organization of the modules and information. Conclusions The system has the potential of enhancing the quality of services with the provision of standard and healthy menu plans and at the same time increasing outreach, particularly to rural areas. With its potential capability of optimizing the time spent by dietitians to plan suitable menus, more quality time could be spent delivering nutrition education to the patients. PMID:15111270

  6. DietPal: a Web-based dietary menu-generating and management system.

    PubMed

    Noah, Shahrul A; Abdullah, Siti Norulhuda; Shahar, Suzana; Abdul-Hamid, Helmi; Khairudin, Nurkahirizan; Yusoff, Mohamed; Ghazali, Rafidah; Mohd-Yusoff, Nooraini; Shafii, Nik Shanita; Abdul-Manaf, Zaharah

    2004-01-30

    Attempts in current health care practice to make health care more accessible, effective, and efficient through the use of information technology could include implementation of computer-based dietary menu generation. While several of such systems already exist, their focus is mainly to assist healthy individuals calculate their calorie intake and to help monitor the selection of menus based upon a prespecified calorie value. Although these prove to be helpful in some ways, they are not suitable for monitoring, planning, and managing patients' dietary needs and requirements. This paper presents a Web-based application that simulates the process of menu suggestions according to a standard practice employed by dietitians. To model the workflow of dietitians and to develop, based on this workflow, a Web-based system for dietary menu generation and management. The system is aimed to be used by dietitians or by medical professionals of health centers in rural areas where there are no designated qualified dietitians. First, a user-needs study was conducted among dietitians in Malaysia. The first survey of 93 dietitians (with 52 responding) was an assessment of information needed for dietary management and evaluation of compliance towards a dietary regime. The second study consisted of ethnographic observation and semi-structured interviews with 14 dietitians in order to identify the workflow of a menu-suggestion process. We subsequently designed and developed a Web-based dietary menu generation and management system called DietPal. DietPal has the capability of automatically calculating the nutrient and calorie intake of each patient based on the dietary recall as well as generating suitable diet and menu plans according to the calorie and nutrient requirement of the patient, calculated from anthropometric measurements. The system also allows reusing stored or predefined menus for other patients with similar health and nutrient requirements. We modeled the workflow of menu-suggestion activity currently adhered to by dietitians in Malaysia. Based on this workflow, a Web-based system was developed. Initial post evaluation among 10 dietitians indicates that they are comfortable with the organization of the modules and information. The system has the potential of enhancing the quality of services with the provision of standard and healthy menu plans and at the same time increasing outreach, particularly to rural areas. With its potential capability of optimizing the time spent by dietitians to plan suitable menus, more quality time could be spent delivering nutrition education to the patients.
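
    The menu-generation step can be pictured with a short sketch. This is not DietPal's logic, which follows dietitians' practice in Malaysia and the patient's dietary recall; the Mifflin-St Jeor equation, activity factors, menu names, and tolerance below are illustrative stand-ins.

```python
# Illustrative sketch: estimate energy requirement from anthropometric data,
# then pick stored menus that fit it. All values are placeholders.
from dataclasses import dataclass

@dataclass
class Patient:
    weight_kg: float
    height_cm: float
    age_years: int
    sex: str            # "M" or "F"
    activity: float     # e.g. 1.2 sedentary .. 1.9 very active

def energy_requirement_kcal(p: Patient) -> float:
    """Resting energy via the Mifflin-St Jeor equation, scaled by activity."""
    bmr = 10 * p.weight_kg + 6.25 * p.height_cm - 5 * p.age_years
    bmr += 5 if p.sex == "M" else -161
    return bmr * p.activity

def suggest_menus(menus, target_kcal, tolerance=0.10):
    """Return stored menus whose calories fall within +/- tolerance of target."""
    lo, hi = target_kcal * (1 - tolerance), target_kcal * (1 + tolerance)
    return [name for name, kcal in menus.items() if lo <= kcal <= hi]

if __name__ == "__main__":
    patient = Patient(weight_kg=68, height_cm=160, age_years=55, sex="F", activity=1.3)
    target = energy_requirement_kcal(patient)
    stored = {"menu A": 1500, "menu B": 1800, "menu C": 2100}   # placeholder library
    print(round(target), suggest_menus(stored, target))
```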

  7. Medical information, communication, and archiving system (MICAS): Phase II integration and acceptance testing

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wandtke, John; Robinson, Arvin E.

    1999-07-01

    The Medical Information, Communication and Archive System (MICAS) is a multi-modality integrated image management system that is seamlessly integrated with the Radiology Information System (RIS). This project was initiated in the summer of 1995, with the first phase installed during the first half of 1997 and the second phase installed during the summer of 1998. Phase II enhancements include a permanent archive, automated workflow including modality worklist, study caches, and NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. This multi-vendor phased approach to PACS implementation is designed as an enterprise-wide PACS to provide images and reports throughout our healthcare network. MICAS demonstrates that a multi-vendor, open-system, phased approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.

  8. Performing statistical analyses on quantitative data in Taverna workflows: an example using R and maxdBrowse to identify differentially-expressed genes from microarray data.

    PubMed

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-08-07

    There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data.

  9. Performing statistical analyses on quantitative data in Taverna workflows: An example using R and maxdBrowse to identify differentially-expressed genes from microarray data

    PubMed Central

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-01-01

    Background There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Results Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Conclusion Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data. PMID:18687127
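
    The statistical core of such a workflow, identifying differentially-expressed genes, can be sketched outside Taverna. The snippet below is a conceptual stand-in in Python (not the authors' R/RShell step), using a per-gene t-test with Benjamini-Hochberg correction on placeholder expression matrices.

```python
# Conceptual stand-in for the differential-expression step of the workflow.
import numpy as np
from scipy import stats

def differentially_expressed(expr_a, expr_b, alpha=0.05):
    """expr_a, expr_b: arrays of shape (genes, replicates) for two conditions.
    Returns indices of genes whose BH-adjusted p-value is below alpha."""
    _, pvals = stats.ttest_ind(expr_a, expr_b, axis=1)
    order = np.argsort(pvals)
    m = len(pvals)
    # Benjamini-Hochberg step-up: largest k with p_(k) <= (k/m) * alpha.
    thresholds = (np.arange(1, m + 1) / m) * alpha
    passed = pvals[order] <= thresholds
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    return order[:k]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    a = rng.normal(0, 1, size=(200, 4))   # placeholder expression matrix, condition A
    b = rng.normal(0, 1, size=(200, 4))   # condition B
    b[:10] += 3.0                         # spike in 10 truly changed genes
    print(differentially_expressed(a, b))
```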

  10. Towards a Brokering Framework for Business Process Execution

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Bigagli, Lorenzo; Roncella, Roberto; Mazzetti, Paolo; Nativi, Stefano

    2013-04-01

    Advancing our knowledge of environmental phenomena and their interconnections requires an intensive use of environmental models. Due to the complexity of the Earth system, the representation of complex environmental processes often requires the use of more than one model (often from different disciplines). The Group on Earth Observation (GEO) launched the Model Web initiative to increase present accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al.): (i) Open access, (ii) Minimal entry-barriers, (iii) Service-driven approach, and (iv) Scalability. This work proposes an architectural solution, based on the Brokering approach for multidisciplinary interoperability, aiming to contribute to the Model Web vision. The Brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as was presented at the last GEO Plenary meeting in Istanbul, November 2011. We designed and prototyped a component called BP Broker. The high-level functionalities provided by the BP Broker are: • Discover the needed model implementations in an open, distributed and heterogeneous environment; • Check I/O consistency of BPs and provide suggestions for mismatch resolution; • Publish the executable BP (EBP) as a standard model resource for re-use; • Submit the compiled EBP to a WF-engine for execution. A BP Broker has the following features: • Support multiple abstract BP specifications; • Support encoding in multiple WF-engine languages. According to the Brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous Web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different workflow engines. The present implementation makes use of BPMN 2.0 notation for BP design and the jBPM workflow engine for EBP execution; however, the strong decoupling which characterizes the design of the BP Broker easily allows supporting other technologies. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from technicalities of workflow definitions, (iii) support of incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC-funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project also provided the use scenarios that were used to test the framework: the eHabitat scenario (calculation of habitat similarity likelihood) and the FERA scenario (impact of climate change on land-use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. References: Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative". Environmental Modelling & Software, 1-15
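
    A toy sketch of the broker's compile step, assuming a hypothetical registry of model services with declared inputs and outputs; the service names and endpoints are invented for illustration, and the real BP Broker works on BPMN 2.0 documents and WPS/WSDL interfaces rather than Python objects.

```python
# Minimal conceptual sketch: bind abstract steps to registered model services
# and check that each step's inputs are produced by earlier steps or datasets.
from dataclasses import dataclass

@dataclass
class ModelService:          # hypothetical registry entry (e.g. an OGC WPS process)
    name: str
    inputs: set
    outputs: set
    endpoint: str

REGISTRY = [
    ModelService("habitat-similarity", {"land_cover"}, {"similarity_map"},
                 "http://example.org/wps/ehabitat"),
    ModelService("crop-yield", {"similarity_map", "climate"}, {"yield"},
                 "http://example.org/wps/fera"),
]

def compile_bp(abstract_steps, available_inputs):
    """Turn an abstract BP (ordered step names) into an executable plan,
    reporting any input that no earlier step or initial dataset provides."""
    plan, pool = [], set(available_inputs)
    for step in abstract_steps:
        service = next((s for s in REGISTRY if s.name == step), None)
        if service is None:
            raise KeyError(f"no service registered for step '{step}'")
        missing = service.inputs - pool
        if missing:
            raise ValueError(f"step '{step}' is missing inputs: {missing}")
        plan.append(service.endpoint)
        pool |= service.outputs
    return plan

if __name__ == "__main__":
    print(compile_bp(["habitat-similarity", "crop-yield"], {"land_cover", "climate"}))
```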

  11. A framework for streamlining research workflow in neuroscience and psychology

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis, and manuscript preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration for researchers. PMID:24478691

  12. A Foundation for Enterprise Imaging: HIMSS-SIIM Collaborative White Paper.

    PubMed

    Roth, Christopher J; Lannum, Louis M; Persons, Kenneth R

    2016-10-01

    Care providers today routinely obtain valuable clinical multimedia with mobile devices, scope cameras, ultrasound, and many other modalities at the point of care. Image capture and storage workflows may be heterogeneous across an enterprise, and as a result, they often are not well incorporated in the electronic health record. Enterprise Imaging refers to a set of strategies, initiatives, and workflows implemented across a healthcare enterprise to consistently and optimally capture, index, manage, store, distribute, view, exchange, and analyze all clinical imaging and multimedia content to enhance the electronic health record. This paper is intended to introduce Enterprise Imaging as an important initiative to clinical and informatics leadership, and outline its key elements of governance, strategy, infrastructure, common multimedia content, acquisition workflows, enterprise image viewers, and image exchange services.

  13. Spherical: an iterative workflow for assembling metagenomic datasets.

    PubMed

    Hitch, Thomas C A; Creevey, Christopher J

    2018-01-24

    The consensus emerging from the study of microbiomes is that they are far more complex than previously thought, requiring better assemblies and increasingly deeper sequencing. However, current metagenomic assembly techniques regularly fail to incorporate all, or even the majority in some cases, of the sequence information generated for many microbiomes, negating this effort. This can especially bias the information gathered and the perceived importance of the minor taxa in a microbiome. We propose a simple but effective approach, implemented in Python, to address this problem. Based on an iterative methodology, our workflow (called Spherical) carries out successive rounds of assemblies with the sequencing reads not yet utilised. This approach also allows the user to reduce the resources required for very large datasets, by assembling random subsets of the whole in a "divide and conquer" manner. We demonstrate the accuracy of Spherical using simulated data based on completely sequenced genomes and the effectiveness of the workflow at retrieving lost information for taxa in three published metagenomics studies of varying sizes. Our results show that Spherical increased the amount of reads utilized in the assembly by up to 109% compared to the base assembly. The additional contigs assembled by the Spherical workflow resulted in significant (P < 0.05) changes in the predicted taxonomic profile of all datasets analysed. Spherical is implemented in Python 2.7 and freely available for use under the MIT license. Source code and documentation is hosted publicly at https://github.com/thh32/Spherical .
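
    The iterative idea can be sketched as a loop in which only reads not yet represented in an assembly are carried into the next round. In the sketch below the assembler and read mapper are passed in as callables, and the toy stand-ins do not reproduce Spherical's actual use of external assemblers and aligners.

```python
# Toy sketch of an iterative "assemble, then reuse unassembled reads" loop.
import random

def spherical_like(reads, assemble, unmapped, rounds=5, subset_size=None):
    """Repeatedly assemble, then carry only the reads not yet represented
    in any assembly into the next round (optionally on random subsets)."""
    contigs, remaining = [], list(reads)
    for _ in range(rounds):
        if not remaining:
            break
        batch = (random.sample(remaining, min(subset_size, len(remaining)))
                 if subset_size else remaining)
        new_contigs = assemble(batch)                 # would wrap an external assembler
        contigs.extend(new_contigs)
        remaining = unmapped(remaining, new_contigs)  # reads absent from the new contigs
    return contigs, remaining

if __name__ == "__main__":
    # Stand-ins: "assembly" keeps long reads, "mapping" is substring containment.
    reads = ["ACGTACGT", "TTTT", "ACGT", "GGGGCCCC", "CCCC"]
    assemble = lambda batch: [r for r in batch if len(r) >= 8]
    unmapped = lambda rs, cs: [r for r in rs if not any(r in c for c in cs)]
    print(spherical_like(reads, assemble, unmapped, rounds=3))
```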

  14. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    NASA Astrophysics Data System (ADS)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations through orchestration of semi-structured analysis pipelines, which involves the execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case, a workflow, which requires the execution of a continuum ice flow model and a discrete element-based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of ice flow and calving by means of transition through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks, including data management and transfer, of the workflow model becomes more complex. From the implementation perspective, this workflow model was initially developed on a set of scripts using static data input and output references. In the course of application usage, as more scripts or modifications were introduced to meet user requirements, debugging and validation of results became more cumbersome. To address these problems, we identified a need for a high-level scientific workflow tool through which all the above-mentioned processes can be achieved in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high-level scientific workflow middleware makes reproducibility of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
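
    The coupling pattern itself reduces to a loop that alternates the two models with conversion steps in between. In the sketch below the Elmer/Ice and HiDEM runs and the converters are placeholder callables; the real workflow submits them as HPC jobs through UNICORE.

```python
# Highly simplified sketch of the ice-flow / calving coupling loop.

def couple(ice_flow_step, to_particles, calving_step, to_mesh, state, cycles=3):
    """One workflow iteration = ice flow -> convert -> calving -> convert back."""
    for cycle in range(cycles):
        geometry = ice_flow_step(state)          # continuum model advances the glacier
        particles = to_particles(geometry)       # data-format conversion for the DEM code
        calved = calving_step(particles)         # discrete model removes calved ice
        state = to_mesh(calved)                  # convert back for the next flow step
        print(f"cycle {cycle}: front at {state['front_position']:.1f} m")
    return state

if __name__ == "__main__":
    # Placeholder physics: flow pushes the front forward, calving trims it back.
    state = {"front_position": 1000.0}
    couple(
        ice_flow_step=lambda s: {"front_position": s["front_position"] + 50.0},
        to_particles=lambda g: g,
        calving_step=lambda p: {"front_position": p["front_position"] - 20.0},
        to_mesh=lambda c: c,
        state=state,
    )
```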

  15. Design and implementation of disaster recovery and business continuity solution for radiology PACS.

    PubMed

    Mansoori, Bahar; Rosipko, Beverly; Erhard, Karen K; Sunshine, Jeffrey L

    2014-02-01

    In the digital era of radiology, the picture archiving and communication system (PACS) has a pivotal role in retrieving and storing images. Integration of PACS with all the health care information systems (e.g., health information system, radiology information system, and electronic medical record) has greatly improved access to patient data anytime and anywhere throughout the entire enterprise. In such an integrated setting, seamless operation depends critically on maintaining data integrity and continuous access for all. Any failure in hardware or software could interrupt workflow or data access and, consequently, would risk serious impact to patient care. Thus, any large-scale PACS now has an indispensable requirement to deploy a disaster recovery plan to ensure secure sources of data. This paper presents our experience with designing and implementing a disaster recovery and business continuity plan. The selected architecture, with two servers at each site (local and disaster recovery (DR) site), provides four different scenarios to continue running and maintain end-user service. The implemented DR at University Hospitals Health System now permits continuous access to the PACS application and its contained images for radiologists, other clinicians, and patients alike.

  16. An analysis of mobile whole blood collection labor efficiency.

    PubMed

    Rose, William N; Dayton, Paula J; Raife, Thomas J

    2011-07-01

    Labor efficiency is desirable in mobile blood collection. There are few published data on labor efficiency. The variability in the labor efficiency of mobile whole blood collections was analyzed. We sought to improve our labor efficiency using lean manufacturing principles. Workflow changes in mobile collections were implemented with the goal of minimizing labor expenditures. To measure success, data on labor efficiency measured by units/hour/full-time equivalent (FTE) were collected. The labor efficiency in a 6-month period before the implementation of changes and in months 1 to 6 and 7 to 12 after implementation was analyzed and compared. Labor efficiency in the 6-month period preceding implementation was 1.06 ± 0.4 units collected/hour/FTE. In months 1 to 6, labor efficiency declined slightly to 0.92 ± 0.4 units collected/hour/FTE (p = 0.016 vs. preimplementation). In months 7 to 12, the mean labor efficiency returned to preimplementation levels of 1.09 ± 0.4 units collected/hour/FTE. Regression analysis correlating labor efficiency with total units collected per drive revealed a strong correlation (R² = 0.48 for the aggregate data from all three periods), indicating that nearly half of the variability in labor efficiency was associated with drive size. The lean-based changes in workflow were subjectively favored by employees and donors. The labor efficiency of our mobile whole blood drives is strongly influenced by size. Larger drives are more efficient, with diminishing returns above 40 units collected. Lean-based workflow changes were positively received by employees and donors. © 2011 American Association of Blood Banks.

  17. Integrating Population Health Data on Violence Into the Emergency Department: A Feasibility and Implementation Study.

    PubMed

    Levas, Michael N; Hernandez-Meier, Jennifer L; Kohlbeck, Sara; Piotrowski, Nancy; Hargarten, Stephen

    Geocoded emergency department (ED) data have allowed for the development and evaluation of novel interventions for the prevention of violence in cities outside of the United States. First implemented in Cardiff, United Kingdom, collection of these data provides public health agencies, community organizations, and law enforcement with place-based information on assaults. The purpose of this study was to assess the feasibility of translating this model within the electronic medical record (EMR) in the United States. A new EMR module based on the Cardiff Model was developed and integrated into the existing ED EMR. Data were collected for all patients reporting an assaultive injury upon arrival to the ED. Emergency department nurses were subsequently recruited to participate in 2 surveys and a focus group to evaluate the implementation and to provide qualitative feedback to enhance integration. Nurses completed EMR questions in 98.2% of patients reporting to the ED over the study period. More than 90% of survey respondents were satisfied with their participation, and most felt that the questions were useful for clinical care (79/70%), were integrated well into workflow (89/90%), and were congruent with the ED and hospital goals and mission (93/98%). Focus group themes centered on ED culture, external factors, and internal workflow. It is feasible to implement place-based, assault-related injury-specific questions into the EMR with minimal disruption of workflow and triage times. Nurses, as key members of the ED team, are receptive to participating in the collection of population health data that may inform community violence prevention activities.

  18. MO-E-BRC-01: Online Adaptive MR-Guided RT: Workflow and Clinical Implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kashani, R.

    Online adaptive radiation therapy has the potential to ensure delivery of optimal treatment to the patient by accounting for anatomical and potentially functional changes that occur from one fraction to the next and over the course of treatment. While on-line adaptive RT (ART) has been a topic of many publications, discussions, and research, it has until very recently remained largely a concept and not a practical implementation. However, recent advances in on-table imaging, use of deformable image registration for contour generation and dose tracking, faster and more efficient plan optimization, as well as fast quality assurance methods, have enabled the implementation of ART in the clinic in the past couple of years. The introduction of these tools into routine clinical use requires many considerations and progressive knowledge to understand how processes that have historically taken hours or days to complete can now be done in less than 30 minutes. This session will discuss considerations for performing real-time contouring, planning, and patient-specific QA, as well as a practical workflow and the required resources. Learning Objectives: To understand the difficulties, challenges, and available technologies for online adaptive RT. To understand how to implement online adaptive therapy in a clinical environment and to understand the workflow and resources required. To understand the limitations and sources of uncertainty in the online adaptive process. Disclosures: R. Kashani, research funding from ViewRay Inc. and Philips Medical Systems; X. Li, research supported by Elekta Inc.

  19. Using lean principles to improve outpatient adult infusion clinic chemotherapy preparation turnaround times.

    PubMed

    Lamm, Matthew H; Eckel, Stephen; Daniels, Rowell; Amerine, Lindsey B

    2015-07-01

    The workflow and chemotherapy preparation turnaround times at an adult infusion clinic were evaluated to identify opportunities to optimize workflow and efficiency. A three-phase study using Lean Six Sigma methodology was conducted. In phase 1, chemotherapy turnaround times in the adult infusion clinic were examined one year after the interim goal of a 45-minute turnaround time was established. Phase 2 implemented various experiments including a five-day Kaizen event, using lean principles in an effort to decrease chemotherapy preparation turnaround times in a controlled setting. Phase 3 included the implementation of process-improvement strategies identified during the Kaizen event, coupled with a final refinement of operational processes. In phase 1, the mean turnaround time for all chemotherapy preparations decreased from 60 to 44 minutes, and a mean of 52 orders for adult outpatient chemotherapy infusions was received each day. After installing new processes, the mean turnaround time had improved to 37 minutes for each chemotherapy preparation in phase 2. In phase 3, the mean turnaround time decreased from 37 to 26 minutes. The overall mean turnaround time was reduced by 26 minutes, representing a 57% decrease in turnaround times in 19 months through the elimination of waste and the implementation of lean principles. This reduction was accomplished through increased efficiencies in the workplace, with no addition of human resources. Implementation of Lean Six Sigma principles improved workflow and efficiency at an adult infusion clinic and reduced the overall chemotherapy turnaround times from 60 to 26 minutes. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  20. Distributed Data Integration Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, T; Ludaescher, B; Vouk, M

    The Internet is becoming the preferred method for disseminating scientific data from a variety of disciplines. This can result in information overload on the part of the scientists, who are unable to query all of the relevant sources, even if they knew where to find them, what they contained, how to interact with them, and how to interpret the results. A related issue is that keeping up with current trends in information technology often taxes the end-user's expertise and time. Thus, instead of benefiting from this information-rich environment, scientists become experts on a small number of sources and technologies, use them almost exclusively, and develop a resistance to innovations that can enhance their productivity. Enabling information-based scientific advances, in domains such as functional genomics, requires fully utilizing all available information and the latest technologies. In order to address this problem we are developing an end-user-centric, domain-sensitive, workflow-based infrastructure, shown in Figure 1, that will allow scientists to design complex scientific workflows that reflect the data manipulation required to perform their research without an undue burden. We are taking a three-tiered approach to designing this infrastructure utilizing (1) abstract workflow definition, construction, and automatic deployment, (2) complex agent-based workflow execution, and (3) automatic wrapper generation. In order to construct a workflow, the scientist defines an abstract workflow (AWF) in terminology (semantics and context) that is familiar to him/her. This AWF includes all of the data transformations, selections, and analyses required by the scientist, but does not necessarily specify particular data sources. This abstract workflow is then compiled into an executable workflow (EWF, in our case XPDL) that is then evaluated and executed by the workflow engine. This EWF contains references to specific data sources and interfaces capable of performing the desired actions. In order to provide access to the largest number of resources possible, our lowest level utilizes automatic wrapper generation techniques to create information and data wrappers capable of interacting with the complex interfaces typical in scientific analysis. The remainder of this document outlines our work in these three areas, the impact our work has made, and our plans for the future.
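
    A minimal sketch of the abstract-to-executable compilation idea, assuming a hypothetical catalogue that binds domain-level step names to concrete resources and wrappers; the real system emits XPDL for a workflow engine and generates wrappers automatically, neither of which is shown here.

```python
# Sketch only: an abstract step names a transformation in domain terms, and
# "compilation" binds it to a concrete source and wrapper from a catalogue.

# Hypothetical catalogue mapping domain-level step names to concrete resources.
CATALOGUE = {
    "fetch-sequences":  {"resource": "http://example.org/genome-db", "wrapper": "http_wrapper"},
    "filter-by-length": {"resource": "local",                        "wrapper": "python_wrapper"},
    "run-analysis":     {"resource": "http://example.org/analysis",  "wrapper": "http_wrapper"},
}

def compile_awf(abstract_workflow):
    """Turn an abstract workflow (ordered step names) into an executable plan
    with bound resources; a real system would emit XPDL for a workflow engine."""
    executable = []
    for step in abstract_workflow:
        binding = CATALOGUE.get(step)
        if binding is None:
            raise KeyError(f"no resource known for abstract step '{step}'")
        executable.append({"step": step, **binding})
    return executable

if __name__ == "__main__":
    awf = ["fetch-sequences", "filter-by-length", "run-analysis"]   # scientist's view
    for task in compile_awf(awf):
        print(task)
```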

  1. Structuring clinical workflows for diabetes care: an overview of the OntoHealth approach.

    PubMed

    Schweitzer, M; Lasierra, N; Oberbichler, S; Toma, I; Fensel, A; Hoerbst, A

    2014-01-01

    Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study lead us to identify basic building blocks, named as actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view.

  2. Structuring Clinical Workflows for Diabetes Care

    PubMed Central

    Lasierra, N.; Oberbichler, S.; Toma, I.; Fensel, A.; Hoerbst, A.

    2014-01-01

    Summary Background Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. Objectives The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. Methods A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. Results This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study lead us to identify basic building blocks, named as actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. Conclusions The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view. PMID:25024765
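
    The building-block view can be illustrated with a small data model composing actions, decisions, and data elements into a workflow for one context; the class names, fields, and the example context below are illustrative, not the OntoHealth schema.

```python
# Illustrative data model for workflow building blocks (not the OntoHealth schema).
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class DataElement:
    name: str                      # e.g. an HbA1c value drawn from the EHR

@dataclass
class Action:
    name: str                      # e.g. "retrieve latest lab results"
    uses: List[DataElement] = field(default_factory=list)

@dataclass
class Decision:
    question: str                  # e.g. "HbA1c above target?"
    if_true: "Step" = None
    if_false: "Step" = None

Step = Union[Action, Decision]

@dataclass
class ClinicalWorkflow:
    context: str                   # placeholder for one of the five identified contexts
    steps: List[Step] = field(default_factory=list)

if __name__ == "__main__":
    hba1c = DataElement("HbA1c value")
    wf = ClinicalWorkflow(
        context="therapy monitoring",          # illustrative context name
        steps=[
            Action("retrieve latest lab results", uses=[hba1c]),
            Decision("HbA1c above target?",
                     if_true=Action("schedule follow-up consultation"),
                     if_false=Action("document stable control")),
        ],
    )
    print(wf.context, len(wf.steps), "steps")
```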

  3. Optimizing insulin pump therapy: the potential advantages of using a structured diabetes management program.

    PubMed

    Lange, Karin; Ziegler, Ralph; Neu, Andreas; Reinehr, Thomas; Daab, Iris; Walz, Marion; Maraun, Michael; Schnell, Oliver; Kulzer, Bernhard; Reichel, Andreas; Heinemann, Lutz; Parkin, Christopher G; Haak, Thomas

    2015-03-01

    Use of continuous subcutaneous insulin infusion (CSII) therapy improves glycemic control, reduces hypoglycemia and increases treatment satisfaction in individuals with diabetes. As a number of patient- and clinician-related factors can hinder the effectiveness and optimal usage of CSII therapy, new approaches are needed to address these obstacles. Ceriello and colleagues recently proposed a model of care that incorporates the collaborative use of structured SMBG into a formal approach to personalized diabetes management within all diabetes populations. We adapted this model for use in CSII-treated patients in order to enable the implementation of a workflow structure that enhances patient-physician communication and supports patients' diabetes self-management skills. We recognize that time constraints and current reimbursement policies pose significant challenges to healthcare providers integrating the Personalised Diabetes Management (PDM) process into clinical practice. We believe, however, that the time invested in modifying practice workflow and learning to apply the various steps of the PDM process will be offset by improved workflow and more effective patient consultations. This article describes how to implement PDM into clinical practice as a systematic, standardized process that can optimize CSII therapy.

  4. An automated workflow for patient-specific quality control of contour propagation

    NASA Astrophysics Data System (ADS)

    Beasley, William J.; McWilliam, Alan; Slevin, Nicholas J.; Mackay, Ranald I.; van Herk, Marcel

    2016-12-01

    Contour propagation is an essential component of adaptive radiotherapy, but current contour propagation algorithms are not yet sufficiently accurate to be used without manual supervision. Manual review of propagated contours is time-consuming, making routine implementation of real-time adaptive radiotherapy unrealistic. Automated methods of monitoring the performance of contour propagation algorithms are therefore required. We have developed an automated workflow for patient-specific quality control of contour propagation and validated it on a cohort of head and neck patients, on which parotids were outlined by two observers. Two types of error were simulated—mislabelling of contours and introducing noise in the scans before propagation. The ability of the workflow to correctly predict the occurrence of errors was tested, taking both sets of observer contours as ground truth, using receiver operator characteristic analysis. The area under the curve was 0.90 and 0.85 for the observers, indicating good ability to predict the occurrence of errors. This tool could potentially be used to identify propagated contours that are likely to be incorrect, acting as a flag for manual review of these contours. This would make contour propagation more efficient, facilitating the routine implementation of adaptive radiotherapy.
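
    The evaluation step can be sketched directly: given a per-contour QC score and a label indicating whether an error was simulated, the area under the ROC curve measures how well the score predicts errors. The scores and labels below are synthetic placeholders, not the study data.

```python
# Illustrative ROC evaluation of a per-contour QC score (synthetic data).
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(42)

# Placeholder labels: 1 = an error (mislabelled contour / noisy scan) was simulated.
error_introduced = rng.integers(0, 2, size=200)
# Placeholder QC score, made higher on average when an error is present.
qc_score = rng.normal(0, 1, size=200) + 1.5 * error_introduced

auc = roc_auc_score(error_introduced, qc_score)
fpr, tpr, thresholds = roc_curve(error_introduced, qc_score)
# A flagging threshold could be chosen from the curve, e.g. maximising tpr - fpr.
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.2f}, flag contours with score > {thresholds[best]:.2f}")
```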

  5. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
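
    The parallelisation pattern these tools automate is essentially running the same per-subject pipeline concurrently; a toy sketch with Python's multiprocessing module is shown below, with the processing step left as a placeholder rather than a real FSL or FreeSurfer call.

```python
# Toy sketch: run the same post-processing pipeline over many subjects in parallel.
from multiprocessing import Pool

def process_subject(subject_id: str) -> str:
    # Placeholder for a chain of post-processing steps on one subject's MRI data.
    return f"{subject_id}: done"

if __name__ == "__main__":
    subjects = [f"sub-{i:03d}" for i in range(1, 9)]
    with Pool(processes=4) as pool:             # subjects processed concurrently
        for result in pool.imap_unordered(process_subject, subjects):
            print(result)
```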

  6. The EU-project United4Health: User-Centred Design and Evaluation of a Collaborative Information System for a Norwegian Telehealth Service.

    PubMed

    Smaradottir, Berglind; Gerdes, Martin; Martinez, Santiago; Fensli, Rune

    2015-01-01

    This study presents the user-centred design and evaluation process of a Collaborative Information System (CIS), developed for a new telehealth service for remote monitoring of chronic obstructive pulmonary disease patients after hospital discharge. The CIS was designed based on the information gathered in a workshop, where target end-users described the context of use, a telehealth workflow and their preferred ways of interaction with the solution. Evaluation of the iterative refinements were made through user tests, semi-structured interviews and a questionnaire. A field trial reported results on the ease of use and user satisfaction during the interaction with the fully developed system. The implemented CIS was successfully deployed within the secured Norwegian Health Network. The research was a result of cooperation between international partners within the EU FP7 project United4Health.

  7. Shiny FHIR: An Integrated Framework Leveraging Shiny R and HL7 FHIR to Empower Standards-Based Clinical Data Applications.

    PubMed

    Hong, Na; Prodduturi, Naresh; Wang, Chen; Jiang, Guoqian

    2017-01-01

    In this study, we describe our efforts in building a clinical statistics and analysis application platform using an emerging clinical data standard, HL7 FHIR, and an open source web application framework, Shiny. We designed two primary workflows that integrate a series of R packages to enable both patient-centered and cohort-based interactive analyses. We leveraged Shiny with R to develop interactive interfaces on FHIR-based data and used ovarian cancer study datasets as a use case to implement a prototype. Specifically, we implemented patient index, patient-centered data report and analysis, and cohort analysis. The evaluation of our study was performed by testing the adaptability of the framework on two public FHIR servers. We identify common research requirements and current outstanding issues, and discuss future enhancement work of the current studies. Overall, our study demonstrated that it is feasible to use Shiny for implementing interactive analysis on FHIR-based standardized clinical data.
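
    The data-access side can be sketched in Python rather than the paper's Shiny/R stack: FHIR's REST API returns searchset Bundles of resources. The base URL below is a placeholder for any FHIR endpoint, such as the public test servers used in the evaluation.

```python
# Minimal sketch of pulling Patient resources from a FHIR server's REST API.
import requests

FHIR_BASE = "https://example.org/fhir"   # placeholder FHIR server base URL

def fetch_patients(count=5):
    """GET [base]/Patient returns a Bundle; entries hold Patient resources."""
    resp = requests.get(f"{FHIR_BASE}/Patient",
                        params={"_count": count},
                        headers={"Accept": "application/fhir+json"})
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for patient in fetch_patients():
        print(patient.get("id"), patient.get("gender"), patient.get("birthDate"))
```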

  8. Asterism: an integrated, complete, and open-source approach for running seismologist continuous data-intensive analysis on heterogeneous systems

    NASA Astrophysics Data System (ADS)

    Ferreira da Silva, R.; Filgueira, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present Asterism, an open source data-intensive framework, which combines the Pegasus and dispel4py workflow systems. Asterism aims to simplify the effort required to develop data-intensive applications that run across multiple heterogeneous resources, without users having to: re-formulate their methods according to different enactment systems; manage the data distribution across systems; parallelize their methods; co-place and schedule their methods with computing resources; and store and transfer large/small volumes of data. Asterism's key element is to leverage the strengths of each workflow system: dispel4py allows developing scientific applications locally and then automatically parallelize and scale them on a wide range of HPC infrastructures with no changes to the application's code; Pegasus orchestrates the distributed execution of applications while providing portability, automated data management, recovery, debugging, and monitoring, without users needing to worry about the particulars of the target execution systems. Asterism leverages the level of abstractions provided by each workflow system to describe hybrid workflows where no information about the underlying infrastructure is required beforehand. The feasibility of Asterism has been evaluated using the seismic ambient noise cross-correlation application, a common data-intensive analysis pattern used by many seismologists. The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The Asterism workflow is implemented as a Pegasus workflow composed of two tasks (Phase1 and Phase2), where each phase represents a dispel4py workflow. Pegasus tasks describe the in/output data at a logical level, the data dependency between tasks, and the e-Infrastructures and the execution engine to run each dispel4py workflow. We have instantiated the workflow using data from 1000 stations from the IRIS services, and run it across two heterogeneous resources described as Docker containers: MPI (Container2) and Storm (Container3) clusters (Figure 1). Each dispel4py workflow is mapped to a particular execution engine, and data transfers between resources are automatically handled by Pegasus. Asterism is freely available online at http://github.com/dispel4py/pegasus_dispel4py.
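
    The analysis pattern that Asterism runs at scale, preprocess then cross-correlate station pairs, can be sketched with NumPy/SciPy; the traces below are synthetic placeholders and the snippet is not the dispel4py/Pegasus implementation.

```python
# Sketch of the two-phase pattern: preprocess traces, then cross-correlate a pair.
import numpy as np
from scipy.signal import correlate, detrend

def preprocess(trace):
    """Phase 1 (simplified): remove linear trend, normalise peak amplitude."""
    t = detrend(trace)
    peak = np.max(np.abs(t))
    return t / peak if peak else t

def cross_correlate(trace_a, trace_b):
    """Phase 2: full cross-correlation of two preprocessed traces."""
    return correlate(preprocess(trace_a), preprocess(trace_b), mode="full")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.normal(size=2000)                      # placeholder ambient-noise traces
    b = np.roll(a, 37) + 0.3 * rng.normal(size=2000)
    cc = cross_correlate(a, b)
    lag = np.argmax(cc) - (len(a) - 1)             # lag of the correlation peak
    print(f"peak at lag {lag} samples")
```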

  9. Development of electronic medical record charting for hospital-based transfusion and apheresis medicine services: Early adoption perspectives

    PubMed Central

    Levy, Rebecca; Pantanowitz, Liron; Cloutier, Darlene; Provencher, Jean; McGirr, Joan; Stebbins, Jennifer; Cronin, Suzanne; Wherry, Josh; Fenton, Joseph; Donelan, Eileen; Johari, Vandita; Andrzejewski, Chester

    2010-01-01

    Background: Electronic medical records (EMRs) provide universal access to health care information across multidisciplinary lines. In pathology departments, transfusion and apheresis medicine services (TAMS) involved in direct patient care activities produce data and documentation that typically do not enter the EMR. Taking advantage of our institution's initiative for implementation of a paperless medical record, our TAMS division set out to develop an electronic charting (e-charting) strategy within the EMR. Methods: A focus group of our hospital's transfusion committee consisting of transfusion medicine specialists, pathologists, residents, nurses, hemapheresis specialists, and information technologists was constituted and charged with the project. The group met periodically to implement e-charting TAMS workflow and produced electronic documents within the EMR (Cerner Millennium) for various service line functions. Results: The interdisciplinary working group developed and implemented electronic versions of various paper-based clinical documentation used by these services. All electronic notes collectively gather and reside within a unique Transfusion Medicine Folder tab in the EMR, available to staff with access to patient charts. E-charting eliminated illegible handwritten notes, resulted in more consistent clinical documentation among staff, and provided greater real-time review/access of hemotherapy practices. No major impediments to workflow or inefficiencies have been encountered. However, minor updates and corrections to documents as well as select work re-designs were required for optimal use of e-charting by these services. Conclusion: Documentation of pathology subspecialty activities such as TAMS can be successfully incorporated into the EMR. E-charting by staff enhances communication and helps promote standardized documentation of patient care within and across service lines. Well-constructed electronic documents in the EMR may also enhance data mining, quality improvement, and biovigilance monitoring activities. PMID:20805955

  10. A Brokering Solution for Business Process Execution

    NASA Astrophysics Data System (ADS)

    Santoro, M.; Bigagli, L.; Roncella, R.; Mazzetti, P.; Nativi, S.

    2012-12-01

    Predicting the climate change impact on biodiversity and ecosystems, advancing our knowledge of environmental phenomena interconnection, assessing the validity of simulations, and other key challenges of Earth Sciences require intensive use of environmental modeling. The complexity of the Earth system requires the use of more than one model (often from different disciplines) to represent complex processes. The identification of appropriate mechanisms for reuse, chaining and composition of environmental models is considered a key enabler for an effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. The Group on Earth Observation (GEO) Model Web initiative aims to increase present accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al.): 1. Open access; 2. Minimal entry-barriers; 3. Service-driven approach; 4. Scalability. In this work we propose an architectural solution aiming to contribute to the Model Web vision. This solution applies the Brokering approach for facilitating complex multidisciplinary interoperability. The Brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as was presented at the last GEO Plenary meeting in Istanbul, November 2011. According to the Brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous Web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different workflow engines. We designed and prototyped a component called BP Broker that is able to: (i) read an abstract BP, (ii) "compile" the abstract BP into an executable one (eBP) - in this phase the BP Broker might also provide recommendations for incomplete BPs and parameter mismatch resolution - and (iii) finally execute the eBP using a workflow engine. The present implementation makes use of BPMN 2.0 notation for BP design and the jBPM workflow engine for eBP execution; however, the strong decoupling which characterizes the design of the BP Broker easily allows supporting other technologies. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from technicalities of workflow definitions, (iii) support of incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC-funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project also provided the use scenarios that were used to test the framework: the eHabitat scenario (calculation of habitat similarity likelihood) and the FERA scenario (impact of climate change on land-use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. References: Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative". Environmental Modelling & Software, 1-15

  11. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    PubMed

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
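
    The per-allele case-control test can be sketched as a 2x2 comparison of carriers versus non-carriers; the counts below are placeholders, and the confidence interval uses the Woolf (log) method as one reasonable choice, not necessarily the exact method of the bigdawg package.

```python
# Toy sketch of a per-allele 2x2 test with odds ratio and 95% CI (placeholder counts).
import math
from scipy.stats import chi2_contingency

def allele_test(case_pos, case_neg, ctrl_pos, ctrl_neg):
    table = [[case_pos, case_neg], [ctrl_pos, ctrl_neg]]
    chi2, p, _, _ = chi2_contingency(table)
    odds_ratio = (case_pos * ctrl_neg) / (case_neg * ctrl_pos)
    # Woolf method: 95% CI on log(OR) from the sum of reciprocal counts.
    se = math.sqrt(1/case_pos + 1/case_neg + 1/ctrl_pos + 1/ctrl_neg)
    ci = (math.exp(math.log(odds_ratio) - 1.96 * se),
          math.exp(math.log(odds_ratio) + 1.96 * se))
    return odds_ratio, ci, p

if __name__ == "__main__":
    # Placeholder counts: carriers vs. non-carriers of one allele in cases and controls.
    oratio, ci, p = allele_test(case_pos=40, case_neg=60, ctrl_pos=20, ctrl_neg=80)
    print(f"OR={oratio:.2f} 95% CI=({ci[0]:.2f}, {ci[1]:.2f}) p={p:.3g}")
```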

  12. The MaxQuant computational platform for mass spectrometry-based shotgun proteomics.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Cox, Juergen

    2016-12-01

    MaxQuant is one of the most frequently used platforms for mass-spectrometry (MS)-based proteomics data analysis. Since its first release in 2008, it has grown substantially in functionality and can be used in conjunction with more MS platforms. Here we present an updated protocol covering the most important basic computational workflows, including those designed for quantitative label-free proteomics, MS1-level labeling and isobaric labeling techniques. This protocol presents a complete description of the parameters used in MaxQuant, as well as of the configuration options of its integrated search engine, Andromeda. This protocol update describes an adaptation of an existing protocol that substantially modifies the technique. Important concepts of shotgun proteomics and their implementation in MaxQuant are briefly reviewed, including different quantification strategies and the control of false-discovery rates (FDRs), as well as the analysis of post-translational modifications (PTMs). The MaxQuant output tables, which contain information about quantification of proteins and PTMs, are explained in detail. Furthermore, we provide a short version of the workflow that is applicable to data sets with simple and standard experimental designs. The MaxQuant algorithms are efficiently parallelized on multiple processors and scale well from desktop computers to servers with many cores. The software is written in C# and is freely available at http://www.maxquant.org.
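
    MaxQuant's own FDR machinery is part of its C# code base; as a conceptual illustration only, the sketch below shows how target-decoy FDR control can be computed for a ranked list of peptide-spectrum matches (synthetic scores, simple decoys/targets estimate). It is not MaxQuant's algorithm.

```python
# Conceptual target-decoy FDR sketch (not MaxQuant's implementation):
# estimate FDR as decoys/targets among matches above a score threshold and
# derive q-values by enforcing monotonicity from the bottom of the ranking.
def q_values(psms):
    """psms: list of (score, is_decoy) tuples; returns (score, q) sorted by score desc."""
    ranked = sorted(psms, key=lambda x: x[0], reverse=True)
    targets = decoys = 0
    fdrs = []
    for score, is_decoy in ranked:
        decoys += is_decoy
        targets += not is_decoy
        fdrs.append(decoys / max(targets, 1))
    # q-value = minimal FDR achievable at or below this score threshold
    qs, running_min = [], float("inf")
    for fdr in reversed(fdrs):
        running_min = min(running_min, fdr)
        qs.append(running_min)
    qs.reverse()
    return [(ranked[i][0], qs[i]) for i in range(len(ranked))]


if __name__ == "__main__":
    demo = [(9.1, False), (8.7, False), (8.2, True), (7.9, False), (7.5, True)]
    for score, q in q_values(demo):
        print(f"score={score:.1f}  q={q:.2f}")
```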

  13. Design and validation of general biology learning program based on scientific inquiry skills

    NASA Astrophysics Data System (ADS)

    Cahyani, R.; Mardiana, D.; Noviantoro, N.

    2018-03-01

    Scientific inquiry is highly recommended to teach science. The reality in the schools and colleges is that many educators still have not implemented inquiry learning because of their lack of understanding. The study aims to 1) analyze students’ difficulties in learning General Biology, 2) design a General Biology learning program based on multimedia-assisted scientific inquiry learning, and 3) validate the proposed design. The method used was Research and Development. The subjects of the study were 27 pre-service students of general elementary school/Islamic elementary schools. The workflow of program design includes identifying learning difficulties of General Biology, designing course programs, and designing instruments and assessment rubrics. The program design is made for four lecture sessions. Validation of all learning tools was performed by expert judges. The results showed that: 1) there are some problems identified in General Biology lectures; 2) the designed products include learning programs, multimedia characteristics, worksheet characteristics, and scientific attitudes; and 3) expert validation shows that all program designs are valid and can be used with minor revisions.

  14. Implementation of an Anesthesia Information Management System in an Ambulatory Surgery Center.

    PubMed

    Mudumbai, Seshadri C

    2016-01-01

    Anesthesia information management systems (AIMS) are increasingly being implemented throughout the United States. However, little information exists on the implementation process for AIMS within ambulatory surgery centers (ASC). The objectives of this descriptive study are to document: 1) the phases of implementation of an AIMS at an ASC; and 2) lessons learnt from a socio-technical perspective. The ASC, within the Veterans Health Administration (VHA), has hosted an AIMS since 2008. As a quality improvement effort, we implemented a new version of the AIMS. This new version involved fundamental software changes to enhance clinical care such as real-time importing of laboratory data and total hardware exchange. The pre-implementation phase involved coordinated preparation over six months between multiple informatics teams along with local leadership. During this time, we conducted component, integration, and validation testing to ensure correct data flow from medical devices to AIMS and centralized databases. The implementation phase occurred in September 2014 over three days and was successful. Over the next several months, during post-implementation phase, we addressed residual items like latency of the application. Important lessons learnt from the implementation included the utility of partnering early with executive leadership; ensuring end user acceptance of new clinical workflow; continuous testing of data flow; use of a staged rollout; and providing additional personnel throughout implementation. Implementation of an AIMS at an ASC can utilize methods developed for large hospitals. However, issues unique to an ASC such as limited number of support personnel and distinctive workflows must be considered.

  15. Large field of view quantitative phase imaging of induced pluripotent stem cells and optical pathlength reference materials

    NASA Astrophysics Data System (ADS)

    Kwee, Edward; Peterson, Alexander; Stinson, Jeffrey; Halter, Michael; Yu, Liya; Majurski, Michael; Chalfoun, Joe; Bajcsy, Peter; Elliott, John

    2018-02-01

    Induced pluripotent stem cells (iPSCs) are reprogrammed cells that can have heterogeneous biological potential. Quality assurance metrics of reprogrammed iPSCs will be critical to ensure reliable use in cell therapies and personalized diagnostic tests. We present a quantitative phase imaging (QPI) workflow which includes acquisition, processing, and stitching of multiple adjacent image tiles across a large field of view (LFOV) of a culture vessel. Low magnification image tiles (10x) were acquired with a Phasics SID4BIO camera on a Zeiss microscope. iPSC cultures were maintained using a custom stage incubator on an automated stage. We implemented an image acquisition strategy that compensates for non-flat illumination wavefronts to enable imaging of an entire well plate, including the meniscus region normally obscured in Zernike phase contrast imaging. Polynomial fitting and background mode correction were implemented to enable comparability and stitching between multiple tiles. LFOV imaging of reference materials indicated that image acquisition and processing strategies did not affect quantitative phase measurements across the LFOV. Analysis of iPSC colony images demonstrated that mass doubling time was significantly different from area doubling time. These measurements were benchmarked with prototype microsphere beads and etched-glass gratings with specified spatial dimensions designed to be QPI reference materials with optical pathlength shifts suitable for cell microscopy. This QPI workflow and the use of reference materials can provide a non-destructive, traceable imaging method for novel iPSC heterogeneity characterization.
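
    As a rough illustration of the background-flattening step described here (polynomial fitting before tile comparison and stitching), the sketch below fits and subtracts a second-order 2D polynomial from a synthetic phase image using numpy. It is a minimal sketch of the idea, not the authors' processing code.

```python
# Least-squares fit of a 2nd-order 2D polynomial background to a phase image
# and subtraction of that background (a common flattening step before stitching).
import numpy as np

def flatten_phase(img):
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx]
    x = x.ravel() / nx
    y = y.ravel() / ny
    # Design matrix for the terms 1, x, y, x^2, x*y, y^2
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    background = (A @ coeffs).reshape(ny, nx)
    return img - background

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[0:64, 0:64]
    tilt = 0.01 * xx + 0.02 * yy           # synthetic non-flat wavefront
    cell = np.zeros((64, 64)); cell[28:36, 28:36] = 1.0
    img = tilt + cell + 0.01 * rng.standard_normal((64, 64))
    flat = flatten_phase(img)
    print(f"residual background peak-to-peak in a corner: {np.ptp(flat[:10, :10]):.3f}")
```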

  16. Autonomous driving in NMR.

    PubMed

    Perez, Manuel

    2017-01-01

    The automatic analysis of NMR data has been a much-desired endeavour for the last six decades, as is the case with any other analytical technique. This need for automation has only grown as advances in hardware, pulse sequences and automation have opened new research areas to NMR and increased the throughput of data. Fully automatic analysis is a worthy, albeit hard, challenge, but in a world of artificial intelligence, instant communication and big data, it seems that this particular fight is happening with only one technique at a time (be it NMR, MS, IR, UV or any other), when the reality of most laboratories is that there are several types of analytical instrumentation present. Data aggregation, verification and elucidation by using complementary techniques (e.g. MS and NMR) is a desirable outcome to pursue, although a time-consuming one if performed manually; hence, the use of automation to perform the heavy lifting for users is required to make the approach attractive for scientists. Many of the decisions and workflows that could be implemented under automation will depend on the two-way communication with databases that understand analytical data, because it is desirable not only to query these databases but also to grow them in as much of an automatic manner as possible. How these databases are designed and set up, and how the data inside are classified, will determine what workflows can be implemented. Copyright © 2016 John Wiley & Sons, Ltd.

  17. MEvoLib v1.0: the first molecular evolution library for Python.

    PubMed

    Álvarez-Jarreta, Jorge; Ruiz-Pesini, Eduardo

    2016-10-28

    Molecular evolution studies involve many different hard computational problems that are solved, in most cases, with heuristic algorithms that provide a nearly optimal solution. Hence, diverse software tools exist for the different stages involved in a molecular evolution workflow. We present MEvoLib, the first molecular evolution library for Python, providing a framework to work with different tools and methods involved in the common tasks of molecular evolution workflows. In contrast with already existing bioinformatics libraries, MEvoLib is focused on the stages involved in molecular evolution studies, enclosing the set of tools with a common purpose in a single high-level interface with fast access to their frequent parameterizations. Gene clustering from partial or complete sequences has been improved with a new method that integrates accessible external information (e.g. GenBank's features data). Moreover, MEvoLib adjusts the fetching process from NCBI databases to optimize the download bandwidth usage. In addition, it has been implemented using parallelization techniques to cope with even large-case scenarios. MEvoLib is the first library for Python designed to facilitate molecular evolution research for both expert and novice users. Its unique interface for each common task comprises several tools with their most used parameterizations. It also includes a method to take advantage of biological knowledge to improve the gene partition of sequence datasets. Additionally, its implementation incorporates parallelization techniques to reduce computational cost when handling very large input datasets.
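
    MEvoLib's own interface is documented with the library itself; to avoid misquoting its API, the sketch below only illustrates the general design idea the abstract describes, a single high-level call that dispatches to external tools with preset parameterizations. The registry keys are invented, and MAFFT is just one plausible backend.

```python
# Generic sketch of a single high-level interface that wraps several external
# tools with preset parameterizations (the design idea described above); the
# registry keys are invented and MAFFT is just one plausible backend.
import shutil
import subprocess

ALIGNERS = {
    "default": ("mafft", ["--auto"]),
    "accurate": ("mafft", ["--maxiterate", "1000", "--localpair"]),
}

def get_alignment(keyword, fasta_in, fasta_out):
    """Run the aligner registered under `keyword` on `fasta_in`."""
    exe, args = ALIGNERS[keyword]
    if shutil.which(exe) is None:
        raise RuntimeError(f"{exe} not found on PATH")
    with open(fasta_out, "w") as out:
        subprocess.run([exe, *args, fasta_in], stdout=out, check=True)

# Usage (assuming MAFFT is installed and seqs.fasta exists):
#   get_alignment("accurate", "seqs.fasta", "aligned.fasta")
```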

  18. SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.

    PubMed

    Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia

    2017-01-01

    When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate for artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an example of application, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line) use and for a new reconstruction of past archived data at the user's home institution, where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.
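
    The two essential computational steps the abstract names, flat fielding and filtered back projection, can be sketched on synthetic data with numpy and scikit-image as below. This is an illustration of those steps under stated assumptions, not STP's implementation.

```python
# Flat-field correction followed by filtered back projection on synthetic data.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import iradon, radon, rescale

def flat_field(proj, flat, dark):
    # Standard absorption normalisation: -ln((I - dark) / (I0 - dark))
    return -np.log(np.clip((proj - dark) / (flat - dark), 1e-6, None))

phantom = rescale(shepp_logan_phantom(), 0.25, mode="reflect")
theta = np.linspace(0.0, 180.0, 90, endpoint=False)
sino = radon(phantom, theta=theta)

# Fake raw projections: attenuate a flat beam by the sinogram, add a dark offset.
flat, dark = 1000.0, 5.0
raw = (flat - dark) * np.exp(-sino / sino.max()) + dark
corrected = flat_field(raw, flat, dark) * sino.max()

reco = iradon(corrected, theta=theta)
print("mean absolute reconstruction error:", float(np.abs(reco - phantom).mean()))
```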

  19. Building Geophysics Talent and Opportunity in Africa: Experience from the AfricaArray/Wits Geophysics Field School

    NASA Astrophysics Data System (ADS)

    Webb, S. J.; Manzi, M.; Scheiber-Enslin, S. E.; Durrheim, R. J.; Jones, M. Q. W.; Nyblade, A.

    2015-12-01

    There are many challenges faced by geophysics students and academic staff in Africa that make it difficult to develop effective field and research programs. Challenges to conducting field work that have been identified, and that can be tackled are: lack of training on geophysical equipment and lack of exposure to field program design and implementation. To address these challenges, the AfricaArray/Wits Geophysics field school is designed to expose participants to a wide variety of geophysical instruments and the entire workflow of a geophysical project. The AA field school was initially developed for the geophysics students at the University of the Witwatersrand. However, by increasing the number of participants, we are able to make more effective use of a large pool of equipment, while addressing challenging geophysical problems at a remote field site. These additional participants are selected partially based on the likely hood of being able start a field school at their home institution. A good candidate would have access to geophysical equipment, but may not have knowledge of how to use it or how to effectively design surveys. These are frequently junior staff members or graduate students in leadership roles. The three week program introduces participants to the full geophysical field workflow. The first week is spent designing a geophysical survey, including determining the cost. The second week is spent collecting data to address a real geophysical challenge, such as determining overburden thickness, loss of ground features due to dykes in a mine, or finding water. The third week is spent interpreting and integrating the various data sets culminating in a final presentation. Participants are given all lecture material and much of the software is open access; this is done to encourage using the material at the home institution. One innovation has been to use graduate students as instructors, thus building a pool of talent that has developed the logistic and training skills necessary to implement field programs. Several geophysics field schools are being developed in Madagascar, Zimbabwe, Nigeria, Kenya, Uganda, Tanzania and Cameroon. We hope to enable some of our graduate students to help with these budding programs.

  20. PyGOLD: a python based API for docking based virtual screening workflow generation.

    PubMed

    Patel, Hitesh; Brinkjost, Tobias; Koch, Oliver

    2017-08-15

    Molecular docking is one of the successful approaches in structure-based discovery and development of bioactive molecules in chemical biology and medicinal chemistry. Due to the huge amount of computational time that is still required, docking is often the last step in a virtual screening approach. Such screenings are set up as workflows spanning many steps, each aimed at a different filtering task. These workflows can be largely automated using Python-based toolkits, except for docking with the docking software GOLD. However, within an automated virtual screening workflow it is not feasible to use the GUI in between every step to change the GOLD configuration file. Thus, a Python module called PyGOLD was developed to parse, edit and write the GOLD configuration file and to automate docking-based virtual screening workflows. The latest version of PyGOLD, its documentation and example scripts are available at: http://www.ccb.tu-dortmund.de/koch or http://www.agkoch.de. PyGOLD is implemented in Python and can be imported as a standard Python module without any further dependencies. oliver.koch@agkoch.de, oliver.koch@tu-dortmund.de. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
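
    Because reproducing GOLD's actual configuration keywords here would risk getting them wrong, the sketch below edits a generic key = value configuration file inside a screening loop, which is the kind of automation PyGOLD provides for the real GOLD file; the keys, values and file names are hypothetical.

```python
# Generic sketch of programmatically editing a plain "key = value" configuration
# file inside a screening loop; the keys, values and file name are hypothetical
# and do not reproduce GOLD's actual configuration format.
from pathlib import Path

def set_option(conf_path, key, value):
    lines, found = [], False
    for line in Path(conf_path).read_text().splitlines():
        if line.strip().startswith(key):
            line, found = f"{key} = {value}", True
        lines.append(line)
    if not found:
        lines.append(f"{key} = {value}")
    Path(conf_path).write_text("\n".join(lines) + "\n")

if __name__ == "__main__":
    conf = Path("demo.conf")
    conf.write_text("ligand_file = lig_001.mol2\nn_poses = 10\n")
    for ligand in ["lig_002.mol2", "lig_003.mol2"]:
        set_option(conf, "ligand_file", ligand)  # re-point the next docking run
        # ...a docking job would be launched here with the updated file...
    print(conf.read_text())
```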

  1. Evidence-based practice implementation in community mental health settings: the relative importance of key domains of implementation activity.

    PubMed

    Torrey, William C; Bond, Gary R; McHugo, Gregory J; Swain, Karin

    2012-09-01

    Implementation research has examined practice prioritization, implementation leadership, workforce development, workflow re-engineering, and practice reinforcement, but has not addressed their relative importance as implementation drivers. This study investigated domains of implementation activities and correlated them with implementation success during a large national evidence-based practice implementation project. Implementation success was correlated with active leadership strategically devoted to redesigning the flow of work and reinforcing implementation through measurement and feedback. Relative attention to workforce development was negatively correlated with implementation. Active leaders should focus on redesigning the flow of work to support the implementation and on reinforcing program improvements.

  2. Design and implementation of a web-based patient portal linked to an ambulatory care electronic health record: patient gateway for diabetes collaborative care.

    PubMed

    Grant, Richard W; Wald, Jonathan S; Poon, Eric G; Schnipper, Jeffrey L; Gandhi, Tejal K; Volk, Lynn A; Middleton, Blackford

    2006-10-01

    Despite the availability of expert guidelines and widespread diabetes quality improvement efforts, care of patients with diabetes remains suboptimal. Two key barriers to care that may be amenable to informatics-based interventions include (1) lack of patient engagement with therapeutic care plans and (2) lack of medication adjustment by physicians ("clinical inertia") during clinical encounters. The authors describe the conceptual framework, design, implementation, and analysis plan for a diabetes patient web-portal linked directly to the electronic health record (EHR) of a large academic medical center via secure Internet access designed to overcome barriers to effective diabetes care. Partners HealthCare System (Boston, MA), a multi-hospital health care network comprising several thousand physicians caring for over 1 million individual patients, has developed a comprehensive patient web-portal called Patient Gateway that allows patients to interact directly with their EHR via secure Internet access. Using this portal, a specific diabetes interface was designed to maximize patient engagement by importing the patient's current clinical data in an educational format, providing patient-tailored decision support, and enabling the patient to author a "Diabetes Care Plan." The physician view of the patient's Diabetes Care Plan was designed to be concise and to fit into typical EHR clinical workflow. We successfully designed and implemented a Diabetes Patient portal that allows direct interaction with our system's EHR. We are assessing the impact of this advanced informatics tool for collaborative diabetes care in a clinic-randomized controlled trial among 14 primary care practices within our integrated health care system.

  3. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short and long term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
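
    A daily aggregation over job-report documents could, for instance, look like the hedged PySpark sketch below; the HDFS path and the field names (site, exit_code) are placeholders rather than the actual WMArchive document schema.

```python
# Minimal PySpark sketch of a daily aggregation over JSON job-report documents
# stored on HDFS; the path and the field names ("site", "exit_code") are
# placeholders, not the actual WMArchive schema.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("wmarchive-aggregation-sketch").getOrCreate()

reports = spark.read.json("hdfs:///path/to/fwjr/2018/03/*.json")  # placeholder path

daily = (
    reports.groupBy("site")
    .agg(
        F.count("*").alias("n_reports"),
        F.avg((F.col("exit_code") == 0).cast("int")).alias("success_rate"),
    )
)
daily.show()
spark.stop()
```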

  4. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short and long term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  5. A service oriented approach for guidelines-based clinical decision support using BPMN.

    PubMed

    Rodriguez-Loya, Salvador; Aziz, Ayesha; Chatwin, Chris

    2014-01-01

    Evidence-based medical practice requires that clinical guidelines be documented in such a way that they represent a clinical workflow in its most accessible form. In order to optimize clinical processes to improve clinical outcomes, we propose a Service Oriented Architecture (SOA) based approach for implementing clinical guidelines that can be accessed from an Electronic Health Record (EHR) application through a Web Services-enabled communication mechanism with the Enterprise Service Bus. We have used Business Process Modelling Notation (BPMN) for modelling and presenting the clinical pathway in the form of a workflow. The aim of this study is to produce spontaneous alerts in the healthcare workflow in the diagnosis of Chronic Obstructive Pulmonary Disease (COPD). The use of BPMN as a tool to automate clinical guidelines has not been previously employed for providing Clinical Decision Support (CDS).
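
    One guideline step of such a workflow can be exposed as a web service that the BPMN engine invokes through the ESB. The minimal Flask sketch below does this for a deliberately simplified spirometry rule (FEV1/FVC < 0.70 suggesting airflow obstruction); the endpoint name and the rule are illustrative, not the study's implementation.

```python
# Minimal sketch of one guideline step exposed as a web service that a BPMN
# engine could invoke; the endpoint name and the simplified spirometry rule
# are illustrative only.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/cds/copd-alert", methods=["POST"])
def copd_alert():
    data = request.get_json(force=True)
    fev1, fvc = float(data["fev1"]), float(data["fvc"])
    obstructed = fvc > 0 and (fev1 / fvc) < 0.70
    return jsonify({
        "alert": obstructed,
        "message": "Airflow obstruction pattern; consider COPD work-up."
                   if obstructed else "No obstruction pattern detected.",
    })

if __name__ == "__main__":
    app.run(port=5000)  # POST {"fev1": 1.8, "fvc": 3.0} to /cds/copd-alert
```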

  6. The provider perspective: investigating the effect of the Electronic Patient-Reported Outcome (ePRO) mobile application and portal on primary care provider workflow.

    PubMed

    Hans, Parminder K; Gray, Carolyn Steele; Gill, Ashlinder; Tiessen, James

    2018-03-01

    Aim: This qualitative study investigates how the Electronic Patient-Reported Outcome (ePRO) mobile application and portal system, designed to capture patient-reported measures to support self-management, affected primary care provider workflows. The Canadian health system is facing an ageing population that is living with chronic disease. Disruptive innovations like mobile health technologies can help to support health system transformation needed to better meet the multifaceted needs of the complex care patient. However, there are challenges with implementing these technologies in primary care settings, in particular the effect on primary care provider workflows. Over a six-week period interdisciplinary primary care providers (n=6) and their complex care patients (n=12) used the ePRO mobile application and portal to collaboratively goal-set, manage care plans, and support self-management using patient-reported measures. Secondary thematic analysis of focus groups, training sessions, and issue tracker reports captured user experiences at a Toronto area Family Health Team from October 2014 to January 2015. Findings: Key issues raised by providers included: liability concerns associated with remote monitoring, increased documentation activities due to a lack of interoperability between the app and the electronic patient record, increased provider anxiety with regard to the potential for the app to disrupt and infringe upon appointment time, and increased demands for patient engagement. Primary care providers reported the app helped to focus care plans and to begin a collaborative conversation on goal-setting. However, throughout our investigation we found a high level of provider resistance evidenced by consistent attempts to shift the app towards fitting with existing workflows rather than adapting much of their behaviour. As health systems seek innovative and disruptive models to better serve this complex patient population, provider change resistance will need to be addressed. New models and technologies cannot be disruptive in an environment that is resisting change.

  7. Canine neuroanatomy: Development of a 3D reconstruction and interactive application for undergraduate veterinary education

    PubMed Central

    Raffan, Hazel; Guevar, Julien; Poyade, Matthieu; Rea, Paul M.

    2017-01-01

    Current methods used to communicate and present the complex arrangement of vasculature related to the brain and spinal cord are limited in undergraduate veterinary neuroanatomy training. Traditionally it is taught with 2-dimensional (2D) diagrams, photographs and medical imaging scans which show a fixed viewpoint. 2D representations of 3-dimensional (3D) objects however lead to loss of spatial information, which can present problems when translating this to the patient. Computer-assisted learning packages with interactive 3D anatomical models have become established in medical training, yet equivalent resources are scarce in veterinary education. For this reason, we set out to develop a workflow methodology for creating an interactive model depicting the vasculature of the canine brain that could be used in undergraduate education. Using MR images of a dog and several commonly available software programs, we set out to show how combining image editing, segmentation and surface generation, 3D modeling and texturing can result in the creation of a fully interactive application for veterinary training. In addition to clearly identifying a workflow methodology for the creation of this dataset, we have also demonstrated how an interactive tutorial and self-assessment tool can be incorporated into this. In conclusion, we present a workflow which has been successful in developing a 3D reconstruction of the canine brain and associated vasculature through segmentation, surface generation and post-processing of readily available medical imaging data. The reconstructed model was implemented into an interactive application for veterinary education that has been designed to target the problems associated with learning neuroanatomy, primarily the inability to visualise complex spatial arrangements from 2D resources. The lack of similar resources in this field suggests this workflow is original within a veterinary context. There is great potential to explore this method, and introduce a new dimension into veterinary education and training. PMID:28192461

  8. Imaging industry expectations for compressed sensing in MRI

    NASA Astrophysics Data System (ADS)

    King, Kevin F.; Kanwischer, Adriana; Peters, Rob

    2015-09-01

    Compressed sensing requires compressible data, incoherent acquisition and a nonlinear reconstruction algorithm to force creation of a compressible image consistent with the acquired data. MRI images are compressible using various transforms (commonly total variation or wavelets). Incoherent acquisition of MRI data by appropriate selection of pseudo-random or non-Cartesian locations in k-space is straightforward. Increasingly, commercial scanners are sold with enough computing power to enable iterative reconstruction in reasonable times. Therefore integration of compressed sensing into commercial MRI products and clinical practice is beginning. MRI frequently requires the tradeoff of spatial resolution, temporal resolution and volume of spatial coverage to obtain reasonable scan times. Compressed sensing improves scan efficiency and reduces the need for this tradeoff. Benefits to the user will include shorter scans, greater patient comfort, better image quality, more contrast types per patient slot, the enabling of previously impractical applications, and higher throughput. Challenges to vendors include deciding which applications to prioritize, guaranteeing diagnostic image quality, maintaining acceptable usability and workflow, and acquisition and reconstruction algorithm details. Application choice depends on which customer needs the vendor wants to address. The changing healthcare environment is putting cost and productivity pressure on healthcare providers. The improved scan efficiency of compressed sensing can help alleviate some of this pressure. Image quality is strongly influenced by image compressibility and acceleration factor, which must be appropriately limited. Usability and workflow concerns include reconstruction time and user interface friendliness and response. Reconstruction times are limited to about one minute for acceptable workflow. The user interface should be designed to optimize workflow and minimize additional customer training. Algorithm concerns include the decision of which algorithms to implement as well as the problem of optimal setting of adjustable parameters. It will take imaging vendors several years to work through these challenges and provide solutions for a wide range of applications.
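
    The nonlinear reconstruction ingredient can be illustrated with a toy example: recovering a sparse signal from random, incoherent measurements by iterative soft-thresholding (ISTA). The sketch below is generic teaching code under those assumptions, not a vendor reconstruction algorithm.

```python
# Toy compressed-sensing reconstruction: recover a sparse signal from random
# incoherent measurements with iterative soft-thresholding (ISTA).
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 256, 96, 8                      # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # incoherent sensing matrix
y = A @ x_true                                  # undersampled acquisition

def ista(A, y, lam=0.02, n_iter=500):
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                # gradient of the data-fit term
        x = x - step * grad
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft threshold
    return x

x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```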

  9. Integrating text mining into the MGI biocuration workflow

    PubMed Central

    Dowell, K.G.; McAndrews-Hill, M.S.; Hill, D.P.; Drabkin, H.J.; Blake, J.A.

    2009-01-01

    A major challenge for functional and comparative genomics resource development is the extraction of data from the biomedical literature. Although text mining for biological data is an active research field, few applications have been integrated into production literature curation systems such as those of the model organism databases (MODs). Not only are most available biological natural language (bioNLP) and information retrieval and extraction solutions difficult to adapt to existing MOD curation workflows, but many also have high error rates or are unable to process documents available in those formats preferred by scientific journals. In September 2008, Mouse Genome Informatics (MGI) at The Jackson Laboratory initiated a search for dictionary-based text mining tools that we could integrate into our biocuration workflow. MGI has rigorous document triage and annotation procedures designed to identify appropriate articles about mouse genetics and genome biology. We currently screen ∼1000 journal articles a month for Gene Ontology terms, gene mapping, gene expression, phenotype data and other key biological information. Although we do not foresee that curation tasks will ever be fully automated, we are eager to implement named entity recognition (NER) tools for gene tagging that can help streamline our curation workflow and simplify gene indexing tasks within the MGI system. Gene indexing is an MGI-specific curation function that involves identifying which mouse genes are being studied in an article, then associating the appropriate gene symbols with the article reference number in the MGI database. Here, we discuss our search process, performance metrics and success criteria, and how we identified a short list of potential text mining tools for further evaluation. We provide an overview of our pilot projects with NCBO's Open Biomedical Annotator and Fraunhofer SCAI's ProMiner. In doing so, we prove the potential for the further incorporation of semi-automated processes into the curation of the biomedical literature. PMID:20157492
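
    A dictionary-based gene tagger of the kind evaluated here can be reduced to a few lines: match a symbol/synonym dictionary against article text and report candidate genes for indexing. The two-entry dictionary below is illustrative only, not MGI's vocabulary or the evaluated tools.

```python
# Sketch of dictionary-based gene tagging: match a synonym dictionary against
# article text and emit candidate gene symbols for indexing (tiny demo dictionary).
import re

GENE_SYNONYMS = {
    "Pax6": {"Pax6", "paired box 6", "Sey"},
    "Trp53": {"Trp53", "p53", "transformation related protein 53"},
}

def tag_genes(text):
    hits = {}
    for symbol, synonyms in GENE_SYNONYMS.items():
        for syn in synonyms:
            for match in re.finditer(rf"\b{re.escape(syn)}\b", text, re.IGNORECASE):
                hits.setdefault(symbol, []).append((match.start(), match.group()))
    return hits

abstract = "Mice heterozygous for Sey show eye defects; p53 status was unchanged."
for symbol, mentions in tag_genes(abstract).items():
    print(symbol, mentions)
```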

  10. Integrating text mining into the MGI biocuration workflow.

    PubMed

    Dowell, K G; McAndrews-Hill, M S; Hill, D P; Drabkin, H J; Blake, J A

    2009-01-01

    A major challenge for functional and comparative genomics resource development is the extraction of data from the biomedical literature. Although text mining for biological data is an active research field, few applications have been integrated into production literature curation systems such as those of the model organism databases (MODs). Not only are most available biological natural language (bioNLP) and information retrieval and extraction solutions difficult to adapt to existing MOD curation workflows, but many also have high error rates or are unable to process documents available in those formats preferred by scientific journals. In September 2008, Mouse Genome Informatics (MGI) at The Jackson Laboratory initiated a search for dictionary-based text mining tools that we could integrate into our biocuration workflow. MGI has rigorous document triage and annotation procedures designed to identify appropriate articles about mouse genetics and genome biology. We currently screen approximately 1000 journal articles a month for Gene Ontology terms, gene mapping, gene expression, phenotype data and other key biological information. Although we do not foresee that curation tasks will ever be fully automated, we are eager to implement named entity recognition (NER) tools for gene tagging that can help streamline our curation workflow and simplify gene indexing tasks within the MGI system. Gene indexing is an MGI-specific curation function that involves identifying which mouse genes are being studied in an article, then associating the appropriate gene symbols with the article reference number in the MGI database. Here, we discuss our search process, performance metrics and success criteria, and how we identified a short list of potential text mining tools for further evaluation. We provide an overview of our pilot projects with NCBO's Open Biomedical Annotator and Fraunhofer SCAI's ProMiner. In doing so, we prove the potential for the further incorporation of semi-automated processes into the curation of the biomedical literature.

  11. Authentication systems for securing clinical documentation workflows. A systematic literature review.

    PubMed

    Schwartze, J; Haarbrandt, B; Fortmeier, D; Haux, R; Seidel, C

    2014-01-01

    Integration of electronic signatures embedded in health care processes in Germany challenges health care service and supply facilities. The suitability of the signature level of an eligible authentication procedure is confirmed for a large part of documents in clinical practice. However, the concrete design of such a procedure remains unclear. The aim was to create a summary of usable user authentication systems suitable for clinical workflows. A systematic literature review was conducted based on nine online bibliographic databases. Search keywords included authentication, access control, information systems, information security and biometrics with terms user authentication, user identification and login in title or abstract. Searches were run between 7 and 12 September 2011. Relevant conference proceedings were searched manually in February 2013. Backward reference search of selected results was done. Only publications fully describing authentication systems used or usable were included. Algorithms or purely theoretical concepts were excluded. Three authors performed the selection independently. Data extraction and assessment: Semi-structured extraction of system characteristics was done by the main author. Identified procedures were assessed for security and fulfillment of relevant laws and guidelines as well as for applicability. Suitability for clinical workflows was derived from the assessments using a weighted sum proposed by Bonneau. Of 7575 citations retrieved, 55 publications met our inclusion criteria. They describe 48 different authentication systems; 39 were biometric systems and nine were graphical password systems. Assessment of authentication systems showed high error rates above European CENELEC standards and a lack of applicability of biometric systems. Graphical passwords did not add overall value compared to conventional passwords. Continuous authentication can add an additional layer of safety. Only a few systems are partially or entirely suitable for use in clinical processes. Suitability strongly depends on national or institutional requirements. Four authentication systems seem to fulfill requirements of authentication procedures for clinical workflows. Research is needed in the area of continuous authentication with biometric methods. A proper authentication system should combine all factors of authentication, implementing and connecting secure individual measures.
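
    The suitability score derived "using a weighted sum proposed by Bonneau" can be illustrated with made-up numbers: each candidate system gets criterion scores, and the weights encode how much each criterion matters for clinical workflows. The criteria, weights and scores below are invented for illustration, not the review's values.

```python
# Made-up example of the weighted-sum suitability scoring idea: each candidate
# authentication system gets criterion scores in [0, 1]; weights reflect how
# much each criterion matters for clinical workflows (all values illustrative).
WEIGHTS = {"security": 0.4, "usability": 0.35, "deployability": 0.25}

CANDIDATES = {
    "fingerprint": {"security": 0.6, "usability": 0.8, "deployability": 0.5},
    "smart card + PIN": {"security": 0.8, "usability": 0.6, "deployability": 0.7},
    "graphical password": {"security": 0.4, "usability": 0.7, "deployability": 0.9},
}

def suitability(scores):
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

for name, scores in sorted(CANDIDATES.items(), key=lambda kv: -suitability(kv[1])):
    print(f"{name:20s} {suitability(scores):.2f}")
```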

  12. Canine neuroanatomy: Development of a 3D reconstruction and interactive application for undergraduate veterinary education.

    PubMed

    Raffan, Hazel; Guevar, Julien; Poyade, Matthieu; Rea, Paul M

    2017-01-01

    Current methods used to communicate and present the complex arrangement of vasculature related to the brain and spinal cord are limited in undergraduate veterinary neuroanatomy training. Traditionally it is taught with 2-dimensional (2D) diagrams, photographs and medical imaging scans which show a fixed viewpoint. 2D representations of 3-dimensional (3D) objects however lead to loss of spatial information, which can present problems when translating this to the patient. Computer-assisted learning packages with interactive 3D anatomical models have become established in medical training, yet equivalent resources are scarce in veterinary education. For this reason, we set out to develop a workflow methodology for creating an interactive model depicting the vasculature of the canine brain that could be used in undergraduate education. Using MR images of a dog and several commonly available software programs, we set out to show how combining image editing, segmentation and surface generation, 3D modeling and texturing can result in the creation of a fully interactive application for veterinary training. In addition to clearly identifying a workflow methodology for the creation of this dataset, we have also demonstrated how an interactive tutorial and self-assessment tool can be incorporated into this. In conclusion, we present a workflow which has been successful in developing a 3D reconstruction of the canine brain and associated vasculature through segmentation, surface generation and post-processing of readily available medical imaging data. The reconstructed model was implemented into an interactive application for veterinary education that has been designed to target the problems associated with learning neuroanatomy, primarily the inability to visualise complex spatial arrangements from 2D resources. The lack of similar resources in this field suggests this workflow is original within a veterinary context. There is great potential to explore this method, and introduce a new dimension into veterinary education and training.

  13. Provenance-Powered Automatic Workflow Generation and Composition

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior, supporting the adaptability and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance to further facilitate collaboration in the science community. We have also established a Petri net-based verification instrument for provenance-based automatic workflow generation and recommendation.

  14. Sample data processing in an additive and reproducible taxonomic workflow by using character data persistently linked to preserved individual specimens

    PubMed Central

    Kilian, Norbert; Henning, Tilo; Plitzner, Patrick; Müller, Andreas; Güntsch, Anton; Stöver, Ben C.; Müller, Kai F.; Berendsohn, Walter G.; Borsch, Thomas

    2015-01-01

    We present the model and implementation of a workflow that blazes a trail in systematic biology for the re-usability of character data (data on any kind of characters of pheno- and genotypes of organisms) and their additivity from specimen to taxon level. We take into account that any taxon characterization is based on a limited set of sampled individuals and characters, and that consequently any new individual and any new character may affect the recognition of biological entities and/or the subsequent delimitation and characterization of a taxon. Taxon concepts thus frequently change during the knowledge generation process in systematic biology. Structured character data are therefore not only needed for the knowledge generation process but also for easily adapting characterizations of taxa. We aim to facilitate the construction and reproducibility of taxon characterizations from structured character data of changing sample sets by establishing a stable and unambiguous association between each sampled individual and the data processed from it. Our workflow implementation uses the European Distributed Institute of Taxonomy Platform, a comprehensive taxonomic data management and publication environment to: (i) establish a reproducible connection between sampled individuals and all samples derived from them; (ii) stably link sample-based character data with the metadata of the respective samples; (iii) record and store structured specimen-based character data in formats allowing data exchange; (iv) reversibly assign sample metadata and character datasets to taxa in an editable classification and display them and (v) organize data exchange via standard exchange formats and enable the link between the character datasets and samples in research collections, ensuring high visibility and instant re-usability of the data. The workflow implemented will contribute to organizing the interface between phylogenetic analysis and revisionary taxonomic or monographic work. Database URL: http://campanula.e-taxonomy.net/ PMID:26424081

  15. Genarris: Random generation of molecular crystal structures and fast screening with a Harris approximation

    NASA Astrophysics Data System (ADS)

    Li, Xiayue; Curtis, Farren S.; Rose, Timothy; Schober, Christoph; Vazquez-Mayagoitia, Alvaro; Reuter, Karsten; Oberhofer, Harald; Marom, Noa

    2018-06-01

    We present Genarris, a Python package that performs configuration space screening for molecular crystals of rigid molecules by random sampling with physical constraints. For fast energy evaluations, Genarris employs a Harris approximation, whereby the total density of a molecular crystal is constructed via superposition of single molecule densities. Dispersion-inclusive density functional theory is then used for the Harris density without performing a self-consistency cycle. Genarris uses machine learning for clustering, based on a relative coordinate descriptor developed specifically for molecular crystals, which is shown to be robust in identifying packing motif similarity. In addition to random structure generation, Genarris offers three workflows based on different sequences of successive clustering and selection steps: the "Rigorous" workflow is an exhaustive exploration of the potential energy landscape, the "Energy" workflow produces a set of low energy structures, and the "Diverse" workflow produces a maximally diverse set of structures. The latter is recommended for generating initial populations for genetic algorithms. Here, the implementation of Genarris is reported and its application is demonstrated for three test cases.
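
    The "Diverse" workflow's goal, a maximally diverse subset of generated structures, can be approximated by clustering structure descriptors and keeping one representative per cluster. The scikit-learn sketch below shows that idea on random stand-in descriptors; it is not Genarris's clustering code or its relative-coordinate descriptor.

```python
# Illustration of the "maximally diverse subset" idea via clustering: cluster
# structure descriptors with k-means and keep the structure closest to each
# cluster centre. Descriptors here are random stand-ins, not Genarris output.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
descriptors = rng.standard_normal((500, 16))   # e.g. per-structure feature vectors

n_select = 10
km = KMeans(n_clusters=n_select, n_init=10, random_state=0).fit(descriptors)

selected = []
for c in range(n_select):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(descriptors[members] - km.cluster_centers_[c], axis=1)
    selected.append(int(members[np.argmin(dists)]))

print("diverse subset indices:", sorted(selected))
```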

  16. IT-benchmarking of clinical workflows: concept, implementation, and evaluation.

    PubMed

    Thye, Johannes; Straede, Matthias-Christopher; Liebe, Jan-David; Hübner, Ursula

    2014-01-01

    Due to the emerging evidence of health IT as opportunity and risk for clinical workflows, health IT must undergo a continuous measurement of its efficacy and efficiency. IT-benchmarks are a proven means for providing this information. The aim of this study was to enhance the methodology of an existing benchmarking procedure by including, in particular, new indicators of clinical workflows and by proposing new types of visualisation. Drawing on the concept of information logistics, we propose four workflow descriptors that were applied to four clinical processes. General and specific indicators were derived from these descriptors and processes. 199 chief information officers (CIOs) took part in the benchmarking. These hospitals were assigned to reference groups of a similar size and ownership from a total of 259 hospitals. Stepwise and comprehensive feedback was given to the CIOs. Most participants who evaluated the benchmark rated the procedure as very good, good, or rather good (98.4%). Benchmark information was used by CIOs for getting a general overview, advancing IT, preparing negotiations with board members, and arguing for a new IT project.

  17. Connecting proteins with drug-like compounds: Open source drug discovery workflows with BindingDB and KNIME

    PubMed Central

    Berthold, Michael R.; Hedrick, Michael P.; Gilson, Michael K.

    2015-01-01

    Today’s large, public databases of protein–small molecule interaction data are creating important new opportunities for data mining and integration. At the same time, new graphical user interface-based workflow tools offer facile alternatives to custom scripting for informatics and data analysis. Here, we illustrate how the large protein-ligand database BindingDB may be incorporated into KNIME workflows as a step toward the integration of pharmacological data with broader biomolecular analyses. Thus, we describe a collection of KNIME workflows that access BindingDB data via RESTful webservices and, for more intensive queries, via a local distillation of the full BindingDB dataset. We focus in particular on the KNIME implementation of knowledge-based tools to generate informed hypotheses regarding protein targets of bioactive compounds, based on notions of chemical similarity. A number of variants of this basic approach are tested for seven existing drugs with relatively ill-defined therapeutic targets, leading to replication of some previously confirmed results and discovery of new, high-quality hits. Implications for future development are discussed. Database URL: www.bindingdb.org PMID:26384374
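
    The similarity-based target-hypothesis step rests on fingerprint comparison; assuming RDKit is available, the sketch below ranks a few hand-picked ligand SMILES by Tanimoto similarity of Morgan fingerprints to a query compound. BindingDB's actual web services and the KNIME nodes are not reproduced here.

```python
# Sketch of the chemical-similarity step: rank a few hand-picked ligand SMILES
# by Tanimoto similarity of Morgan fingerprints to a query compound. Requires
# RDKit; the compounds and the query are arbitrary examples.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

LIGANDS = {
    "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
    "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
    "paracetamol": "CC(=O)Nc1ccc(O)cc1",
}

def fingerprint(smiles):
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

query = fingerprint("CC(=O)Oc1ccccc1C(=O)OC")   # methyl ester analogue of aspirin
ranked = sorted(
    ((name, DataStructs.TanimotoSimilarity(query, fingerprint(smi)))
     for name, smi in LIGANDS.items()),
    key=lambda t: -t[1],
)
for name, sim in ranked:
    print(f"{name:12s} {sim:.2f}")
```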

  18. Schedule-Aware Workflow Management Systems

    NASA Astrophysics Data System (ADS)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.

  19. Supporting Community Pharmacies with Implementation of a Web-Based Medication Management Application.

    PubMed

    Turner, Kea; Renfro, Chelsea; Ferreri, Stefanie; Roberts, Kim; Pfeiffenberger, Trista; Shea, Christopher M

    2018-04-01

    Community pharmacists' role in clinical care is expanding in the United States and information systems are needed that extend beyond a dispensing workflow. As pharmacies adopt new systems, implementation support will be needed. This study identifies the barriers and facilitators experienced by community pharmacies in implementing a Web-based medication management application and describes the implementation strategies used to support these pharmacies. Semistructured interviews were conducted with 28 program and research staff that provide support to community pharmacies participating in a statewide pharmacy network. Interviews were recorded, transcribed verbatim, and analyzed for themes using the Expert Recommendations for Implementing Change (ERIC). Findings suggest that leadership support, clinical training, and computer literacy facilitated implementation, while lack of system integration, staff resistance to change, and provider reluctance to share data served as barriers. To overcome the barriers, implementation support was provided, such as assessing readiness for implementation, developing a standardized and interoperable care plan, and audit and feedback of documentation quality. Participants used a wide array of strategies to support community pharmacies with implementation and tailored approaches to accommodate pharmacy-specific preferences. Most of the support was delivered preimplementation or in the early phase of implementation and by program or research staff rather than peer-to-peer. Implementing a new pharmacy information system requires a significant amount of implementation support to help end-users learn about program features, how to integrate the software into workflow, and how to optimize the software to improve patient care. Future research should identify which implementation strategies are associated with program performance. Schattauer.

  20. Work Flow Analysis Report Action Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PETERMANN, M.L.

    The Work Flow Analysis Report will be used to facilitate the requirements for implementing the further deployment of the Action Tracking module of Passport. The report consists of workflow integration processes for Action Tracking.

  1. Time analysis for optimization of radiology workflow in conventional radiology during RIS-PACS-integration

    NASA Astrophysics Data System (ADS)

    Falkensammer, Peter; Soegner, Peter I.; zur Nedden, Dieter

    2002-05-01

    The integration of RIS-PACS systems in radiology units is intended to reduce time consumption in radiology workflow and thus to increase radiologist productivity. Along with the RIS-PACS integration at the University Hospital Innsbruck we analyzed workflow from patient admission to release of final reports before implementation. The follow-up study after six months of the implementation is currently in progress. In this study we compared chest to skeletal x-ray examinations in 969 patients before the implementation. Plotting the admission-to-release-of-final-report interval showed a two-peak distribution, with the first peak corresponding to a release of final results on the same day and the second peak to a release on the following day. In the chest x-ray group, 57% were released the same day (mean value 4:02 hours) and 43% the next day (mean value 21:47 hours). Looking at the skeletal x-rays, 40% were released the same day (mean value 3:58 hours) and 60% were released the next day (mean value 21:05 hours). Summarizing the results, the average chest x-ray requires less time than a skeletal x-ray, due to the fact that a greater percentage of reports is released the same day. The most important result is that the most time-consuming work step is the exchange of data media between radiologist and secretary, which takes at least 5 hours.
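
    The turnaround-time analysis itself is straightforward to reproduce on a RIS export; the pandas sketch below computes per-modality mean turnaround and the same-day fraction from invented admission and report timestamps, purely as an illustration of the calculation.

```python
# Sketch of the turnaround-time analysis with pandas: time from admission to
# final report and the fraction of reports released the same day (made-up data).
import pandas as pd

df = pd.DataFrame({
    "modality": ["chest", "chest", "skeletal", "skeletal"],
    "admission": pd.to_datetime([
        "2002-03-01 08:10", "2002-03-01 14:30",
        "2002-03-01 09:05", "2002-03-01 16:20",
    ]),
    "final_report": pd.to_datetime([
        "2002-03-01 11:55", "2002-03-02 10:05",
        "2002-03-01 13:40", "2002-03-02 12:15",
    ]),
})

df["turnaround_h"] = (df["final_report"] - df["admission"]).dt.total_seconds() / 3600
df["same_day"] = df["final_report"].dt.date == df["admission"].dt.date

summary = df.groupby("modality").agg(
    mean_turnaround_h=("turnaround_h", "mean"),
    same_day_fraction=("same_day", "mean"),
)
print(summary)
```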

  2. Electronic workflow for imaging in clinical research.

    PubMed

    Hedges, Rebecca A; Goodman, Danielle; Sachs, Peter B

    2014-08-01

    In the transition from paper to electronic workflow, the University of Colorado Health System's implementation of a new electronic health record system (EHR) forced all clinical groups to reevaluate their practices including the infrastructure surrounding clinical trials. Radiological imaging is an important piece of many clinical trials and requires a high level of consistency and standardization. With EHR implementation, paper orders were manually transcribed into the EHR, digitizing an inefficient work flow. A team of schedulers, radiologists, technologists, research personnel, and EHR analysts worked together to optimize the EHR to accommodate the needs of research imaging protocols. The transition to electronic workflow posed several problems: (1) there needed to be effective communication throughout the imaging process from scheduling to radiologist interpretation. (2) The exam ordering process needed to be automated to allow scheduling of specific research studies on specific equipment. (3) The billing process needed to be controlled to accommodate radiologists already supported by grants. (4) There needed to be functionality allowing exams to finalize automatically skipping the PACS and interpretation process. (5) There needed to be a way to alert radiologists that a specialized research interpretation was needed on a given exam. These issues were resolved through the optimization of the "visit type," allowing a high-level control of an exam at the time of scheduling. Additionally, we added columns and fields to work queues displaying grant identification numbers. The build solutions we implemented reduced the mistakes made and increased imaging quality and compliance.

  3. Glocal Clinical Registries: Pacemaker Registry Design and Implementation for Global and Local Integration – Methodology and Case Study

    PubMed Central

    da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V.

    2013-01-01

    Background: The ability to apply standard and interoperable solutions for implementing and managing medical registries as well as to aggregate, reproduce, and access data sets from legacy formats and platforms to advanced standard formats and operating systems is crucial for both clinical healthcare and biomedical research settings. Purpose: Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. Methods and Results: We developed a device registry framework involving the following steps: (1) data standards definition and representation of the research workflow, (2) development of electronic case report forms using REDCap (Research Electronic Data Capture), (3) data collection according to the clinical research workflow, (4) data augmentation by enriching the registry database with local electronic health records, governmental databases and linked open data collections, (5) data quality control and (6) data dissemination through the registry Web site. Our registry adopted all applicable standardized data elements proposed by the American College of Cardiology/American Heart Association Clinical Data Standards, as well as variables derived from cardiac device randomized trials and the Clinical Data Interchange Standards Consortium. Local interoperability was performed between REDCap and data derived from the Electronic Health Record system. The original data set was also augmented by incorporating the reimbursed values paid by the Brazilian government during a hospitalization for pacemaker implantation. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT) we found 130 clinical trials which are potentially correlated with our pacemaker registry. Conclusion: This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a re-usable framework. Such an approach has the potential to facilitate data integration between healthcare and research settings, also being a useful framework to be used in other biomedical registries. PMID:23936257
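
    Data collection through REDCap is scriptable via its API; as a hedged sketch, the snippet below exports registry records with requests, where the URL, token and field names are placeholders that would have to be replaced with project-specific values.

```python
# Sketch of exporting registry records through the REDCap API with `requests`;
# the URL, token and field names are placeholders, not real project values.
import requests

REDCAP_URL = "https://redcap.example.org/api/"   # placeholder
API_TOKEN = "REPLACE_WITH_PROJECT_TOKEN"         # placeholder

payload = {
    "token": API_TOKEN,
    "content": "record",
    "format": "json",
    "type": "flat",
    "fields[0]": "record_id",
    "fields[1]": "implant_date",   # hypothetical field name
}

response = requests.post(REDCAP_URL, data=payload, timeout=30)
response.raise_for_status()
records = response.json()
print(f"exported {len(records)} records")
```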

  4. rCAD: A Novel Database Schema for the Comparative Analysis of RNA.

    PubMed

    Ozer, Stuart; Doshi, Kishore J; Xu, Weijia; Gutell, Robin R

    2011-12-31

    Beyond its direct involvement in protein synthesis with mRNA, tRNA, and rRNA, RNA is now being appreciated for its significance in the overall metabolism and regulation of the cell. Comparative analysis has been very effective in the identification and characterization of RNA molecules, including the accurate prediction of their secondary structure. We are developing an integrative scalable data management and analysis system, the RNA Comparative Analysis Database (rCAD), implemented with SQL Server to support RNA comparative analysis. The platform-agnostic database schema of rCAD captures the essential relationships between the different dimensions of information for RNA comparative analysis datasets. The rCAD implementation enables a variety of comparative analysis manipulations with multiple integrated data dimensions for advanced RNA comparative analysis workflows. In this paper, we describe details of the rCAD schema design and illustrate its usefulness with two usage scenarios.
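
    The rCAD paper's actual schema is not reproduced in this abstract, but the relational decomposition it alludes to, with sequences, alignment columns, and secondary-structure base pairs as separate joinable dimensions, can be sketched roughly as below. The table and column names are illustrative assumptions, and SQLite stands in for SQL Server purely to keep the sketch self-contained.

        import sqlite3

        # Illustrative, simplified schema (not the actual rCAD DDL).
        ddl = """
        CREATE TABLE sequence (
            seq_id    INTEGER PRIMARY KEY,
            accession TEXT NOT NULL,
            organism  TEXT,
            residues  TEXT NOT NULL
        );
        CREATE TABLE alignment_column (
            aln_id     INTEGER NOT NULL,
            seq_id     INTEGER NOT NULL REFERENCES sequence(seq_id),
            column_pos INTEGER NOT NULL,   -- position in the alignment
            seq_pos    INTEGER,            -- position in the sequence (NULL = gap)
            PRIMARY KEY (aln_id, seq_id, column_pos)
        );
        CREATE TABLE base_pair (
            seq_id INTEGER NOT NULL REFERENCES sequence(seq_id),
            pos_5p INTEGER NOT NULL,       -- 5' partner position in the sequence
            pos_3p INTEGER NOT NULL        -- 3' partner position in the sequence
        );
        """

        con = sqlite3.connect(":memory:")
        con.executescript(ddl)

        # A comparative query joining dimensions: alignment columns whose residue
        # participates in a base pair.
        query = """
        SELECT ac.aln_id, ac.column_pos, s.accession
        FROM alignment_column AS ac
        JOIN sequence  AS s  ON s.seq_id = ac.seq_id
        JOIN base_pair AS bp ON bp.seq_id = ac.seq_id AND bp.pos_5p = ac.seq_pos;
        """
        print(con.execute(query).fetchall())   # empty until data is loaded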

  5. rCAD: A Novel Database Schema for the Comparative Analysis of RNA

    PubMed Central

    Ozer, Stuart; Doshi, Kishore J.; Xu, Weijia; Gutell, Robin R.

    2013-01-01

    Beyond its direct involvement in protein synthesis with mRNA, tRNA, and rRNA, RNA is now being appreciated for its significance in the overall metabolism and regulation of the cell. Comparative analysis has been very effective in the identification and characterization of RNA molecules, including the accurate prediction of their secondary structure. We are developing an integrative scalable data management and analysis system, the RNA Comparative Analysis Database (rCAD), implemented with SQL Server to support RNA comparative analysis. The platform-agnostic database schema of rCAD captures the essential relationships between the different dimensions of information for RNA comparative analysis datasets. The rCAD implementation enables a variety of comparative analysis manipulations with multiple integrated data dimensions for advanced RNA comparative analysis workflows. In this paper, we describe details of the rCAD schema design and illustrate its usefulness with two usage scenarios. PMID:24772454

  6. RINGMesh: A programming library for developing mesh-based geomodeling applications

    NASA Astrophysics Data System (ADS)

    Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume

    2017-07-01

    RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor meshing software. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows developers to implement new geomodeling methods and to plug in external software. The goal of RINGMesh is to help researchers to focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.

  7. MO-D-213-01: Workflow Monitoring for a High Volume Radiation Oncology Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laub, S; Dunn, M; Galbreath, G

    2015-06-15

    Purpose: Implement a center-wide communication system that increases interdepartmental transparency and accountability while decreasing redundant work and treatment delays by actively monitoring treatment planning workflow. Methods: Intake Management System (IMS), a program developed by ProCure Treatment Centers Inc., is a multi-function database that stores treatment planning process information. It was devised to work with the oncology information system (Mosaiq) to streamline interdepartmental workflow. Each step in the treatment planning process is visually represented and timelines for completion of individual tasks are established within the software. The currently active step of each patient’s planning process is highlighted either red or green according to whether the initially allocated amount of time has passed for the given process. This information is displayed as a Treatment Planning Process Monitor (TPPM), which is shown on screens in the relevant departments throughout the center. This display also includes the individuals who are responsible for each task. IMS is driven by Mosaiq’s quality checklist (QCL) functionality. Each step in the workflow is initiated by a Mosaiq user sending the responsible party a QCL assignment. IMS is connected to Mosaiq and the sending or completing of a QCL updates the associated field in the TPPM to the appropriate status. Results: Approximately one patient a week is identified during the workflow process as needing to have his/her treatment start date modified or resources re-allocated to address the most urgent cases. Being able to identify a realistic timeline for planning each patient and having multiple departments communicate their limitations and time constraints allows for quality plans to be developed and implemented without overburdening any one department. Conclusion: Monitoring the progression of the treatment planning process has increased transparency between departments, which enables efficient communication. Having built-in timelines allows easy prioritization of tasks and resources and facilitates effective time management.
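
    The red/green highlighting described above is essentially a comparison of elapsed time against the time allocated for the currently active step. A minimal sketch of that check follows; the step names and allocations are hypothetical, not IMS's actual configuration.

        from datetime import datetime, timedelta

        # Hypothetical time allocations for treatment-planning steps.
        ALLOCATED = {
            "simulation": timedelta(days=1),
            "contouring": timedelta(days=2),
            "planning": timedelta(days=3),
            "plan_review": timedelta(days=1),
        }

        def step_status(step, started, now=None):
            """Return 'green' while the active step is within its allocated time, else 'red'."""
            now = now or datetime.now()
            return "green" if now - started <= ALLOCATED[step] else "red"

        started = datetime(2015, 6, 1, 9, 0)
        print(step_status("contouring", started, now=datetime(2015, 6, 2, 9, 0)))  # green
        print(step_status("contouring", started, now=datetime(2015, 6, 4, 9, 0)))  # red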

  8. Nexus: A modular workflow management system for quantum simulation codes

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
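
    Nexus's actual input syntax is not shown in the abstract, so the sketch below only illustrates the general idea of a modular, declarative workflow description in which simulation jobs and their dependencies are composed in a short script. The class and code names are hypothetical placeholders, not the Nexus API.

        from dataclasses import dataclass, field

        @dataclass
        class Job:
            """Hypothetical job description: a code, its inputs, and its dependencies."""
            name: str
            code: str                        # e.g. a DFT or QMC code (placeholder labels)
            inputs: dict = field(default_factory=dict)
            depends_on: list = field(default_factory=list)

        def run_order(jobs):
            """Resolve dependencies into a simple serial execution order (acyclic input assumed)."""
            done, order = set(), []
            while len(order) < len(jobs):
                for job in jobs:
                    if job.name not in done and all(d in done for d in job.depends_on):
                        order.append(job.name)
                        done.add(job.name)
            return order

        # Compose a small workflow: relax a structure, run an SCF step, then a QMC run.
        relax = Job("relax", "dft_code", inputs={"ecut": 80})
        scf = Job("scf", "dft_code", inputs={"ecut": 80}, depends_on=["relax"])
        qmc = Job("qmc", "qmc_code", inputs={"blocks": 200}, depends_on=["scf"])
        print(run_order([relax, scf, qmc]))   # ['relax', 'scf', 'qmc']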

  9. SimITK: visual programming of the ITK image-processing library within Simulink.

    PubMed

    Dickinson, Andrew W L; Abolmaesumi, Purang; Gobbi, David G; Mousavi, Parvin

    2014-04-01

    The Insight Segmentation and Registration Toolkit (ITK) is a software library used for image analysis, visualization, and image-guided surgery applications. ITK is a collection of C++ classes that poses the challenge of a steep learning curve should the user not have appropriate C++ programming experience. To remove the programming complexities and facilitate rapid prototyping, an implementation of ITK within a higher-level visual programming environment is presented: SimITK. ITK functionalities are automatically wrapped into "blocks" within Simulink, the visual programming environment of MATLAB, where these blocks can be connected to form workflows: visual schematics that closely represent the structure of a C++ program. The heavily templated C++ nature of ITK does not facilitate direct interaction between Simulink and ITK; an intermediary is required to convert respective data types and allow intercommunication. As such, a SimITK "Virtual Block" has been developed that serves as a wrapper around an ITK class which is capable of resolving the ITK data types to native Simulink data types. Part of the challenge surrounding this implementation involves automatically capturing and storing the pertinent class information that needs to be refined from an initial state prior to being reflected within the final block representation. The primary result from the SimITK wrapping procedure is multiple Simulink block libraries. From these libraries, blocks are selected and interconnected to demonstrate two examples: a 3D segmentation workflow and a 3D multimodal registration workflow. Compared to their pure-code equivalents, the workflows highlight ITK usability through an alternative visual interpretation of the code that abstracts away potentially confusing technicalities.
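
    The "Virtual Block" described above is, at its core, an adapter that converts between the host environment's data types and the wrapped class's expected types around a single processing step. A language-neutral sketch of that pattern in Python follows (SimITK itself is C++/Simulink; the class names here are illustrative only).

        import numpy as np

        class WrappedFilter:
            """Stand-in for a wrapped processing class (e.g., an ITK-style threshold filter)."""
            def __init__(self, threshold):
                self.threshold = threshold

            def execute(self, image):
                return (image > self.threshold).astype(np.uint8)

        class VirtualBlock:
            """Adapter: converts the host data type to the wrapped class's native type,
            runs the wrapped step, and converts the result back for the host."""
            def __init__(self, inner):
                self.inner = inner

            def run(self, host_data):
                native = np.asarray(host_data, dtype=float)   # host type -> native type
                result = self.inner.execute(native)           # wrapped processing step
                return result.tolist()                        # native type -> host type

        block = VirtualBlock(WrappedFilter(threshold=0.5))
        print(block.run([[0.1, 0.9], [0.6, 0.2]]))   # [[0, 1], [1, 0]]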

  10. The Contextualized Technology Adaptation Process (CTAP): Optimizing Health Information Technology to Improve Mental Health Systems.

    PubMed

    Lyon, Aaron R; Wasse, Jessica Knaster; Ludwig, Kristy; Zachry, Mark; Bruns, Eric J; Unützer, Jürgen; McCauley, Elizabeth

    2016-05-01

    Health information technologies have become a central fixture in the mental healthcare landscape, but few frameworks exist to guide their adaptation to novel settings. This paper introduces the contextualized technology adaptation process (CTAP) and presents data collected during Phase 1 of its application to measurement feedback system development in school mental health. The CTAP is built on models of human-centered design and implementation science and incorporates repeated mixed methods assessments to guide the design of technologies to ensure high compatibility with a destination setting. CTAP phases include: (1) Contextual evaluation, (2) Evaluation of the unadapted technology, (3) Trialing and evaluation of the adapted technology, (4) Refinement and larger-scale implementation, and (5) Sustainment through ongoing evaluation and system revision. Qualitative findings from school-based practitioner focus groups are presented, which provided information for CTAP Phase 1, contextual evaluation, surrounding education sector clinicians' workflows, types of technologies currently available, and influences on technology use. Discussion focuses on how findings will inform subsequent CTAP phases, as well as their implications for future technology adaptation across content domains and service sectors.

  11. Acceleration for 2D time-domain elastic full waveform inversion using a single GPU card

    NASA Astrophysics Data System (ADS)

    Jiang, Jinpeng; Zhu, Peimin

    2018-05-01

    Full waveform inversion (FWI) is a challenging procedure due to the high computational cost related to the modeling, especially for the elastic case. The graphics processing unit (GPU) has become a popular device for high-performance computing (HPC). To reduce the long computation time, we design and implement the GPU-based 2D elastic FWI (EFWI) in the time domain using a single GPU card. We parallelize the forward modeling and gradient calculations using the CUDA programming language. To overcome the limitation of relatively small global memory on the GPU, the boundary saving strategy is exploited to reconstruct the forward wavefield. Moreover, the L-BFGS optimization method used in the inversion improves the convergence of the misfit function. A multiscale inversion strategy is performed in the workflow to obtain accurate inversion results. In our tests, the GPU-based implementations using a single GPU device achieve >15 times speedup in forward modeling, and about 12 times speedup in gradient calculation, compared with the eight-core CPU implementations optimized by OpenMP. The test results from the GPU implementations are verified to have sufficient accuracy by comparison with the results obtained from the CPU implementations.
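
    The inversion loop described above (L-BFGS nested inside a coarse-to-fine multiscale strategy) can be illustrated structurally on a toy least-squares problem. This is only a sketch of the loop structure under simplified assumptions, not the paper's GPU elastic-FWI code.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        true_model = np.sin(np.linspace(0, 2 * np.pi, 50))
        data = true_model + 0.05 * rng.standard_normal(true_model.size)

        def misfit(m, weight):
            """Toy misfit: data residual plus a smoothness penalty whose weight plays
            the role of the low-frequency (smooth) stage in a multiscale scheme."""
            residual = m - data
            roughness = np.diff(m)
            return 0.5 * residual @ residual + 0.5 * weight * roughness @ roughness

        model = np.zeros_like(data)
        for weight in (10.0, 1.0, 0.1):       # coarse-to-fine stages
            result = minimize(misfit, model, args=(weight,), method="L-BFGS-B")
            model = result.x                  # warm-start the next, finer stage
        print("final data misfit:", misfit(model, 0.0))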

  12. Medical imaging informatics based solutions for human performance analytics

    NASA Astrophysics Data System (ADS)

    Verma, Sneha; McNitt-Gray, Jill; Liu, Brent J.

    2018-03-01

    For human performance analysis, extensive experimental trials are often conducted to identify the underlying cause or long-term consequences of certain pathologies and to improve motor functions by examining the movement patterns of affected individuals. Data collected for human performance analysis include high-speed video, surveys, spreadsheets, force data recordings from instrumented surfaces, etc. These datasets are recorded from various standalone sources and are therefore captured in different folder structures as well as in varying formats depending on the hardware configurations. Data integration and synchronization therefore present a major challenge when handling these multimedia datasets, especially for large datasets. Another challenge faced by researchers is querying large quantities of unstructured data and designing feedback/reporting tools for users who need to use datasets at various levels. In the past, database server storage solutions have been introduced to securely store these datasets. However, to automate the process of uploading raw files, various file manipulation steps are required. In the current workflow, this file manipulation and structuring is done manually and is not feasible for large amounts of data. By attaching metadata files and data dictionaries to these raw datasets, however, they can provide the information and structure needed for automated server upload. We introduce one such system for metadata creation for unstructured multimedia data based on the DICOM data model design. We discuss the design and implementation of this system and evaluate it with a data set collected for a movement analysis study. The broader aim of this paper is to present a solution space, based on medical imaging informatics design and methods, for improving the workflow of human performance analysis in a biomechanics research lab.
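
    The metadata-attachment step described above can be sketched as writing a small, DICOM-inspired sidecar file next to each raw capture so that an upload service can structure the data without manual file manipulation. The field names below are illustrative assumptions, not the system's actual data model.

        import hashlib
        import json
        from pathlib import Path

        def write_sidecar(raw_file, subject_id, trial, modality):
            """Write a JSON sidecar describing one raw capture file."""
            payload = raw_file.read_bytes()
            sidecar = {
                "SubjectID": subject_id,      # DICOM-inspired, hypothetical keys
                "TrialID": trial,
                "Modality": modality,         # e.g. 'video', 'force_plate', 'survey'
                "SourceFile": raw_file.name,
                "FileSizeBytes": len(payload),
                "SHA256": hashlib.sha256(payload).hexdigest(),
            }
            out = raw_file.with_name(raw_file.name + ".json")
            out.write_text(json.dumps(sidecar, indent=2))
            return out

        demo = Path("trial_01_forceplate.csv")
        demo.write_text("t,fx,fy,fz\n0.0,1.2,0.4,812.5\n")
        print(write_sidecar(demo, subject_id="S001", trial="jump_01", modality="force_plate"))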

  13. Beyond health information technology: critical factors necessary for effective diabetes disease management.

    PubMed

    Ciemins, Elizabeth L; Coon, Patricia J; Fowles, Jinnet Briggs; Min, Sung-joon

    2009-05-01

    Electronic health records (EHRs) have been implemented throughout the United States with varying degrees of success. Past EHR implementation experiences can inform health systems planning to initiate new or expand existing EHR systems. Key "critical success factors," e.g., use of disease registries, workflow integration, and real-time clinical guideline support, have been identified but not fully tested in practice. A pre/postintervention cohort analysis was conducted on 495 adult patients selected randomly from a diabetes registry and followed for 6 years. Two intervention phases were evaluated: a "low-dose" period targeting primary care provider (PCP) and patient education followed by a "high-dose" EHR diabetes management implementation period, including a diabetes disease registry and office workflow changes, e.g., diabetes patient preidentification to facilitate real-time diabetes preventive care, disease management, and patient education. Across baseline, "low-dose," and "high-dose" postintervention periods, a significantly greater proportion of patients (a) achieved American Diabetes Association (ADA) guidelines for control of blood pressure (26.9 to 33.1 to 43.9%), glycosylated hemoglobin (48.5 to 57.5 to 66.8%), and low-density lipoprotein cholesterol (33.1 to 44.4 to 56.6%) and (b) received recommended preventive eye (26.2 to 36.4 to 58%), foot (23.4 to 40.3 to 66.9%), and renal (38.5 to 53.9 to 71%) examinations or screens. Implementation of a fully functional, specialized EHR combined with tailored office workflow process changes was associated with increased adherence to ADA guidelines, including risk factor control, by PCPs and their patients with diabetes. Incorporation of previously identified "critical success factors" potentially contributed to the success of the program, as did use of a two-phase approach. 2009 Diabetes Technology Society.

  14. Production experience with the ATLAS Event Service

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Calafiura, P.; Childers, T.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The ATLAS Event Service (AES) has been designed and implemented for efficient running of ATLAS production workflows on a variety of computing platforms, ranging from conventional Grid sites to opportunistic, often short-lived resources, such as spot market commercial clouds, supercomputers and volunteer computing. The Event Service architecture allows real time delivery of fine grained workloads to running payload applications which process dispatched events or event ranges and immediately stream the outputs to highly scalable Object Stores. Thanks to its agile and flexible architecture the AES is currently being used by grid sites for assigning low priority workloads to otherwise idle computing resources; similarly harvesting HPC resources in an efficient back-fill mode; and massively scaling out to the 50-100k concurrent core level on the Amazon spot market to efficiently utilize those transient resources for peak production needs. Platform ports in development include ATLAS@Home (BOINC) and the Google Compute Engine, and a growing number of HPC platforms. After briefly reviewing the concept and the architecture of the Event Service, we will report the status and experience gained in AES commissioning and production operations on supercomputers, and our plans for extending ES application beyond Geant4 simulation to other workflows, such as reconstruction and data analysis.
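
    The fine-grained delivery model described above, in which small event ranges are dispatched to running workers and outputs are streamed immediately to an object store, can be illustrated with standard-library tools. This is a conceptual sketch only, not the AES implementation.

        from concurrent.futures import ThreadPoolExecutor

        object_store = {}   # stand-in for a scalable remote object store

        def event_ranges(n_events, chunk):
            """Split an event workload into fine-grained ranges."""
            for start in range(0, n_events, chunk):
                yield (start, min(start + chunk, n_events))

        def process_range(rng):
            """Process one event range and immediately 'stream' its output."""
            start, stop = rng
            result = sum(e * e for e in range(start, stop))    # placeholder payload work
            object_store[f"out_{start}_{stop}"] = result       # upload as a small object
            return rng

        with ThreadPoolExecutor(max_workers=4) as pool:
            list(pool.map(process_range, event_ranges(n_events=1000, chunk=100)))

        print(len(object_store), "output objects written")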

  15. The effects of hands-free communication device systems: communication changes in hospital organizations.

    PubMed

    Richardson, Joshua E; Ash, Joan S

    2010-01-01

    To analyze the effects that hands-free communication device (HCD) systems have on healthcare organizations from multiple user perspectives. This exploratory qualitative study recruited 26 subjects from multiple departments in two research sites located in Portland, Oregon: an academic medical center and a community hospital. Interview and observation data were gathered January through March, 2007. Data were analyzed using a grounded theory approach. Because this study was exploratory, data were coded and patterns identified until overall themes 'emerged'. Five themes arose: (1) Communication access-the perception that HCD systems provide fast and efficient communication that supports workflow; (2) Control-social and technical considerations associated with use of an HCD system; (3) Training-processes that should be used to improve use of the HCD system; (4) Organizational change-changes to organizational design and behavior caused by HCD system implementation; and (5) Environment and infrastructure-HCD system use within the context of physical workspaces. HCD systems improve communication access but users experience challenges integrating the system into workflow. Effective HCD use depends on how well organizations train users, adapt to changes brought about by HCD systems, and integrate HCD systems into physical surroundings.

  16. Requirements for Workflow-Based EHR Systems - Results of a Qualitative Study.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2016-01-01

    Today's high-quality healthcare delivery strongly relies on efficient electronic health records (EHR). These EHR systems, or healthcare IT systems in general, are usually developed in a static manner according to a given workflow. Hence, they are not flexible enough to enable access to EHR data and to execute individual actions within a consultation. This paper reports on requirements identified by experts in the domain of diabetes mellitus for designing a system that supports dynamic workflows to serve personalization within a medical activity. Requirements were collected by means of expert interviews. These interviews completed a triangulation approach aimed at gathering requirements for workflow-based EHR interactions. The data from the interviews were analyzed through a qualitative approach, resulting in a set of requirements enhancing EHR functionality from the user's perspective. Requirements were classified according to four different categorizations: (1) process-related requirements, (2) information needs, (3) required functions, (4) non-functional requirements. Workflow-related requirements were identified which should be considered when developing and deploying EHR systems.

  17. Camera-augmented mobile C-arm (CamC): A feasibility study of augmented reality imaging in the operating room.

    PubMed

    von der Heide, Anna Maria; Fallavollita, Pascal; Wang, Lejing; Sandner, Philipp; Navab, Nassir; Weidert, Simon; Euler, Ekkehard

    2018-04-01

    In orthopaedic trauma surgery, image-guided procedures are mostly based on fluoroscopy. The reduction of radiation exposure is an important goal. The purpose of this work was to investigate the impact of a camera-augmented mobile C-arm (CamC) on radiation exposure and the surgical workflow during a first clinical trial. Applying a workflow-oriented approach, 10 general workflow steps were defined to compare the CamC to traditional C-arms. The surgeries included were arbitrarily identified and assigned to the study. The evaluation criteria were radiation exposure and operation time for each workflow step and the entire surgery. The evaluation protocol was designed and conducted in a single-centre study. The radiation exposure was remarkably reduced, by 18 X-ray shots (46%), using the CamC while keeping similar surgery times. The intuitiveness of the system, its easy integration into the surgical workflow, and its great potential to reduce radiation have been demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.

  18. An ontology-based framework for bioinformatics workflows.

    PubMed

    Digiampietri, Luciano A; Perez-Alcazar, Jose de J; Medeiros, Claudia Bauzer

    2007-01-01

    The proliferation of bioinformatics activities brings new challenges - how to understand and organise these resources, how to exchange and reuse successful experimental procedures, and how to provide interoperability among data and tools. This paper describes an effort toward these directions. It is based on combining research on ontology management, AI and scientific workflows to design, reuse and annotate bioinformatics experiments. The resulting framework supports automatic or interactive composition of tasks based on AI planning techniques and takes advantage of ontologies to support the specification and annotation of bioinformatics workflows. We validate our proposal with a prototype running on real data.

  19. Fast Updating National Geo-Spatial Databases with High Resolution Imagery: China's Methodology and Experience

    NASA Astrophysics Data System (ADS)

    Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.

    2014-04-01

    Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, that is, their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at the 1:50,000 scale is a massive and difficult task for larger countries covering more than several million square kilometers. This paper presents the research and technological development supporting national map updating at the 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of the continuous updating workflow. The use of many data sources and the integration of these data into a high-accuracy, quality-checked product were required. This in turn required up-to-date techniques for image matching, semantic integration, generalization, database management and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control, a series of technical production specifications, and a network of updating production units located in different parts of the country.

  20. Design Recommendations for Pharmacogenomics Clinical Decision Support Systems

    PubMed Central

    Khelifi, Maher; Tarczy-Hornoch, Peter; Devine, Emily B.; Pratt, Wanda

    2017-01-01

    The use of pharmacogenomics (PGx) in clinical practice still faces challenges to fully adopt genetic information in targeting drug therapy. To incorporate genetics into clinical practice, many support the use of Pharmacogenomics Clinical Decision Support Systems (PGx-CDS) for medication prescriptions. This support was fueled by new guidelines to incorporate genetics for optimizing drug dosage and reducing adverse events. In addition, the complexity of PGx led to exploring CDS outside the paradigm of the basic CDS tools embedded in commercial electronic health records. Therefore, designing the right CDS is key to unleashing the full potential of pharmacogenomics and making it a part of clinicians’ daily workflow. In this work, we 1) identify challenges and barriers of the implementation of PGx-CDS in clinical settings, 2) develop a new design approach to CDS with functional characteristics that can improve the adoption of pharmacogenomics guidelines and thus patient safety, and 3) create design guidelines and recommendations for such PGx-CDS tools. PMID:28815136

  1. Barriers and facilitators to exchanging health information: a systematic review.

    PubMed

    Eden, Karen B; Totten, Annette M; Kassakian, Steven Z; Gorman, Paul N; McDonagh, Marian S; Devine, Beth; Pappas, Miranda; Daeges, Monica; Woods, Susan; Hersh, William R

    2016-04-01

    We conducted a systematic review of studies assessing facilitators and barriers to use of health information exchange (HIE). We searched MEDLINE, PsycINFO, CINAHL, and the Cochrane Library databases between January 1990 and February 2015 using terms related to HIE. English-language studies that identified barriers and facilitators of actual HIE were included. Data on study design, risk of bias, setting, geographic location, characteristics of the HIE, and perceived barriers and facilitators to use were extracted and confirmed. Ten cross-sectional studies, seven multiple-site case studies, and two before-after studies that included data from several sources (surveys, interviews, focus groups, and observations of users) evaluated perceived barriers and facilitators to HIE use. The most commonly cited barriers to HIE use were incomplete information, inefficient workflow, and reports that the exchanged information did not meet the needs of users. The review identified several facilitators to use. Incomplete patient information was consistently mentioned in the studies conducted in the US but not in the few studies conducted outside of the US that take a collective approach toward healthcare. Individual patients and practices in the US may exercise the right to participate (or not) in HIE, which affects the completeness of the patient information available to be exchanged. Workflow structure and user roles are key but understudied. We identified several facilitators in the studies that showed promise in promoting electronic health data exchange: obtaining more complete patient information; thoughtful workflow that folds in HIE; and inclusion of users early in implementation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. The architecture of enterprise hospital information system.

    PubMed

    Lu, Xudong; Duan, Huilong; Li, Haomin; Zhao, Chenhui; An, Jiye

    2005-01-01

    Because of the complexity of the hospital environment, there exist many medical information systems from different vendors with incompatible structures. In order to establish an enterprise hospital information system, the integration among these heterogeneous systems must be considered. Complete integration should cover three aspects: data integration, function integration and workflow integration. However, most previous architecture designs did not accomplish such complete integration. This article offers an architecture design of the enterprise hospital information system based on the concept of a digital neural network system in the hospital. It covers all three aspects of integration, and eventually achieves the target of one virtual data center with an Enterprise Viewer for users of different roles. The initial implementation of the architecture in the 5-year Digital Hospital Project in Huzhou Central Hospital of Zhejiang Province is also described.

  3. Exploring Dental Providers’ Workflow in an Electronic Dental Record Environment

    PubMed Central

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N.; Ye, Zhan

    2016-01-01

    Summary Background A workflow is defined as a predefined set of work steps and partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at point of care to assess breakdown in the workflow which could contribute to better technology designs. Objective The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. Methods A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Results Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). Conclusions On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR. PMID:27437058

  4. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    PubMed

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

    With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and the Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version on GitHub (https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB) (http://soykb.org/Pegasus/index.php). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage. 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects, totaling 500+ soybean germplasm lines, have been integrated. These SNPs are being utilized for trait improvement using genotype-to-phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser (http://soykb.org/NGS_Resequence/NGS_index.php) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. The PGen workflow has been optimized for the most efficient analysis of soybean data through thorough testing and validation. This research serves as an example of best practices for the development of genomics data analysis workflows by integrating remote HPC resources and efficient data management with ease of use for biological users. The PGen workflow can also be easily customized for the analysis of data from other species.
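
    PGen's Pegasus workflow definition is not shown in the abstract; the sketch below only illustrates how the analysis stages it lists (alignment, SNP/indel calling, annotation, CNV analysis) form a dependency graph that a workflow manager executes in order. Stage names and commands are placeholders, not PGen's actual jobs.

        from graphlib import TopologicalSorter

        # Placeholder stage -> dependencies mapping for a resequencing analysis.
        stages = {
            "align_reads": set(),
            "call_snps_indels": {"align_reads"},
            "annotate_snps": {"call_snps_indels"},
            "cnv_analysis": {"align_reads"},
            "summarize": {"annotate_snps", "cnv_analysis"},
        }

        commands = {name: f"echo running {name}" for name in stages}   # placeholder commands

        for stage in TopologicalSorter(stages).static_order():
            # A real workflow manager would submit these jobs to HPC resources;
            # here we only print the command each stage would run.
            print(f"{stage}: {commands[stage]}")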

  5. SU-E-T-419: Workflow and FMEA in a New Proton Therapy (PT) Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, C; Wessels, B; Hamilton, H

    2014-06-01

    Purpose: Workflow is an important component in the operational planning of a new proton facility. By integrating the concept of failure mode and effect analysis (FMEA) and traditional QA requirements, a workflow for a proton therapy treatment course is set up. This workflow serves as the blueprint for the planning of computer hardware/software requirements and network flow. A slight modification of the workflow generates a process map (PM) for FMEA and the planning of the QA program in PT. Methods: A flowchart is first developed outlining the sequence of processes involved in a PT treatment course. Each process consists of a number of sub-processes to encompass a broad scope of treatment and QA procedures. For each sub-process, the personnel involved, the equipment needed and the computer hardware/software as well as network requirements are defined by a team of clinical staff, administrators and IT personnel. Results: Eleven intermediate processes with a total of 70 sub-processes involved in a PT treatment course are identified. The number of sub-processes varies, ranging from 2-12. The sub-processes within each process are used for the operational planning. For example, in the CT-Sim process, there are 12 sub-processes: three involve data entry/retrieval from a record-and-verify system, two are controlled by the CT computer, two require the department/hospital network, and the other five are setup procedures. IT then decides the number of computers needed and the software and network requirements. By removing the traditional QA procedures from the workflow, a PM is generated for FMEA analysis to design a QA program for PT. Conclusion: Significant efforts are involved in the development of the workflow in a PT treatment course. Our hybrid model combining FMEA and a traditional QA program serves the dual purpose of efficient operational planning and design of a QA program in PT.

  6. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement but the quality of outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that represents a workflow [2]. Management tools dealing with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required for the modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing, for example, one to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocesses metadata. We propose to fill this gap (i) first through the provision of a metadata profile for the quality of processes, and (ii) through providing a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on the visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow once run is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with the visualization pointing out the need to improve the workflow with better data or better processes on the workflow graph itself. [1] Leibovici, DG, Hobona, G Stock, K Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG Pourabdollah, A Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A Leibovici, DG Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK
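
    As a toy illustration of the meta-propagation idea (estimating the quality of workflow outputs from the declared quality of inputs and processes, without executing the workflow), a scalar quality score can be propagated node by node through the workflow graph. The combination rule below is a deliberately simple assumption for the sketch, not the method of [5].

        from graphlib import TopologicalSorter

        # Workflow graph: node -> inputs it consumes (source datasets have no inputs).
        graph = {
            "land_cover": set(),
            "rainfall": set(),
            "overlay": {"land_cover", "rainfall"},
            "runoff_model": {"overlay"},
        }

        # Declared metadata quality in [0, 1] for source data and process steps.
        source_quality = {"land_cover": 0.9, "rainfall": 0.7}
        process_quality = {"overlay": 0.95, "runoff_model": 0.8}

        propagated = {}
        for node in TopologicalSorter(graph).static_order():
            if not graph[node]:                 # source dataset: use its declared quality
                propagated[node] = source_quality[node]
            else:                               # process: worst input quality x own quality
                worst_input = min(propagated[i] for i in graph[node])
                propagated[node] = worst_input * process_quality[node]

        print(propagated)   # a priori quality estimate for every node, no execution needed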

  7. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    PubMed Central

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906

  8. A DICOM Based Collaborative Platform for Real-Time Medical Teleconsultation on Medical Images.

    PubMed

    Maglogiannis, Ilias; Andrikos, Christos; Rassias, Georgios; Tsanakas, Panayiotis

    2017-01-01

    The paper deals with the design of a Web-based platform for real-time medical teleconsultation on medical images. The proposed platform combines the principles of heterogeneous Workflow Management Systems (WfMSs), the peer-to-peer networking architecture and the SPA (Single-Page Application) concept, to facilitate medical collaboration among healthcare professionals geographically distributed. The presented work leverages state-of-the-art features of the web to support peer-to-peer communication using the WebRTC (Web Real Time Communication) protocol and client-side data processing for creating an integrated collaboration environment. The paper discusses the technical details of implementation and presents the operation of the platform in practice along with some initial results.

  9. Secure encapsulation and publication of biological services in the cloud computing environment.

    PubMed

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  10. RefPrimeCouch—a reference gene primer CouchApp

    PubMed Central

    Silbermann, Jascha; Wernicke, Catrin; Pospisil, Heike; Frohme, Marcus

    2013-01-01

    To support a quantitative real-time polymerase chain reaction standardization project, a new reference gene database application was required. The new database application was built with the explicit goal of simplifying not only the development process but also making the user interface more responsive and intuitive. To this end, CouchDB was used as the backend with a lightweight dynamic user interface implemented client-side as a one-page web application. Data entry and curation processes were streamlined using an OpenRefine-based workflow. The new RefPrimeCouch database application provides its data online under an Open Database License. Database URL: http://hpclife.th-wildau.de:5984/rpc/_design/rpc/view.html PMID:24368831

  11. RefPrimeCouch--a reference gene primer CouchApp.

    PubMed

    Silbermann, Jascha; Wernicke, Catrin; Pospisil, Heike; Frohme, Marcus

    2013-01-01

    To support a quantitative real-time polymerase chain reaction standardization project, a new reference gene database application was required. The new database application was built with the explicit goal of simplifying not only the development process but also making the user interface more responsive and intuitive. To this end, CouchDB was used as the backend with a lightweight dynamic user interface implemented client-side as a one-page web application. Data entry and curation processes were streamlined using an OpenRefine-based workflow. The new RefPrimeCouch database application provides its data online under an Open Database License. Database URL: http://hpclife.th-wildau.de:5984/rpc/_design/rpc/view.html.

  12. 3D-printed Bioanalytical Devices

    PubMed Central

    Bishop, Gregory W; Satterwhite-Warden, Jennifer E; Kadimisetty, Karteek; Rusling, James F

    2016-01-01

    While 3D printing technologies first appeared in the 1980s, prohibitive costs, limited materials, and the relatively small number of commercially available printers confined applications mainly to prototyping for manufacturing purposes. As technologies, printer cost, materials, and accessibility continue to improve, 3D printing has found widespread implementation in research and development in many disciplines due to ease-of-use and relatively fast design-to-object workflow. Several 3D printing techniques have been used to prepare devices such as milli- and microfluidic flow cells for analyses of cells and biomolecules as well as interfaces that enable bioanalytical measurements using cellphones. This review focuses on preparation and applications of 3D-printed bioanalytical devices. PMID:27250897

  13. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    PubMed

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.

  14. HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis

    PubMed Central

    David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057

  15. Design and implementation of a general main axis controller for the ESO telescopes

    NASA Astrophysics Data System (ADS)

    Sandrock, Stefan; Di Lieto, Nicola; Pettazzi, Lorenzo; Erm, Toomas

    2012-09-01

    Most of the real-time control systems at the existing ESO telescopes were developed with "traditional" methods, using general-purpose VMEbus electronics, and running applications that were coded by hand, mostly using the C programming language under VxWorks. As we are moving towards more modern design methods, we have explored a model-based design approach for real-time applications in the telescope area, and used the control algorithm of a standard telescope main axis as a first example. We wanted to have a clear workflow that follows the "correct-by-construction" paradigm, where the implementation is testable in simulation on the development host, and where the time spent debugging on the target is minimized. It should respect the domains of control, electronics, and software engineers in the choice of tools. It should be a target-independent approach so that the result can be deployed on various platforms. We have selected the MathWorks tools Simulink, Stateflow, and Embedded Coder for design and implementation, and LabVIEW with NI hardware for hardware-in-the-loop testing, all of which are widely used in industry. We describe how these tools have been used in order to model, simulate, and test the application. We also evaluate the benefits of this approach compared to the traditional method with respect to testing effort and maintainability. For a specific axis controller application we have successfully integrated the result into the legacy platform of the existing VLT software, as well as demonstrated how to use the same design for a new development with a completely different environment.
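
    A flavour of the "testable in simulation on the development host" step can be given in a few lines: a toy discrete-time axis model driven by a simple PD position controller, exercised entirely off-target. This is a generic sketch, not the ESO controller design or its MathWorks model.

        import numpy as np

        def simulate_axis(kp, kd, setpoint=1.0, dt=0.001, steps=5000):
            """Host-side simulation of a PD position controller on a toy inertial axis."""
            inertia, friction = 2.0, 0.5        # toy plant parameters
            pos, vel = 0.0, 0.0
            history = np.empty(steps)
            for k in range(steps):
                error = setpoint - pos
                torque = kp * error - kd * vel          # PD control law
                acc = (torque - friction * vel) / inertia
                vel += acc * dt                         # explicit Euler integration
                pos += vel * dt
                history[k] = pos
            return history

        trace = simulate_axis(kp=40.0, kd=12.0)
        print(f"final position after 5 s: {trace[-1]:.3f} (setpoint 1.0)")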

  16. Knowledge Annotations in Scientific Workflows: An Implementation in Kepler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gandara, Aida G.; Chin, George; Pinheiro Da Silva, Paulo

    2011-07-20

    Scientific research products are the result of long-term collaborations between teams. Scientific workflows are capable of helping scientists in many ways, including the collection of information as to how research was conducted; e.g., scientific workflow tools often collect and manage information about datasets used and data transformations. However, knowledge about why data was collected is rarely documented in scientific workflows. In this paper we describe a prototype system built to support the collection of scientific expertise that influences scientific analysis. Through evaluating a scientific research effort underway at Pacific Northwest National Laboratory, we identified features that would most benefit PNNL scientists in documenting how and why they conduct their research, making this information available to the entire team. The prototype system was built by enhancing the Kepler Scientific Workflow System to create knowledge-annotated scientific workflows and to publish them as semantic annotations.

  17. Automatic system testing of a decision support system for insulin dosing using Google Android.

    PubMed

    Spat, Stephan; Höll, Bernhard; Petritsch, Georg; Schaupp, Lukas; Beck, Peter; Pieber, Thomas R

    2013-01-01

    Hyperglycaemia in hospitalized patients is a common and costly health care problem. The GlucoTab system is a mobile workflow and decision support system aiming to facilitate efficient and safe glycemic control of non-critically ill patients. Being a medical device, the GlucoTab requires extensive and reproducible testing. A framework for high-volume, reproducible and automated system testing of the GlucoTab system was set up, applying several open-source tools for test automation and system-time handling. The REACTION insulin titration protocol was investigated in a paper-based clinical trial (PBCT). In order to validate the GlucoTab system, data from this trial were used for simulation and system tests. In total, 1190 decision support action points were identified and simulated. Four data points (0.3%) resulted in a GlucoTab system error caused by a defective implementation. In 144 data points (12.1%), calculation errors made by physicians and nurses in the PBCT were detected. The test framework was able to verify the manual calculation of insulin doses and to detect a relatively large number of user errors and workflow anomalies in the PBCT data. This shows the high potential of the electronic decision support application to improve the safety of implementing an insulin titration protocol and workflow management system in clinical wards.
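
    The replay-style testing described above, feeding recorded trial data through the decision-support logic and flagging any mismatch with what was actually documented, can be sketched generically. The dosing rule below is a deliberately invented placeholder used only to show the harness structure; it is not the REACTION protocol.

        def suggest_dose(blood_glucose, previous_dose):
            """Placeholder titration rule, used only to illustrate the test harness."""
            if blood_glucose > 180:
                return previous_dose + 2.0
            if blood_glucose < 80:
                return max(previous_dose - 2.0, 0.0)
            return previous_dose

        # Recorded decisions: (glucose mg/dL, previous dose, dose actually documented).
        recorded = [
            (200.0, 10.0, 12.0),   # matches the placeholder rule
            (150.0, 10.0, 10.0),   # matches the placeholder rule
            (210.0, 10.0, 10.0),   # a manual calculation error under the placeholder rule
        ]

        discrepancies = []
        for i, (bg, prev, given) in enumerate(recorded):
            expected = suggest_dose(bg, prev)
            if expected != given:
                discrepancies.append((i, expected, given))
                print(f"data point {i}: expected {expected} U, recorded {given} U")
        print(f"{len(discrepancies)} discrepancies in {len(recorded)} data points")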

  18. DNAseq Workflow in a Diagnostic Context and an Example of a User Friendly Implementation.

    PubMed

    Wolf, Beat; Kuonen, Pierre; Dandekar, Thomas; Atlan, David

    2015-01-01

    Over recent years, next-generation sequencing (NGS) technologies have evolved from costly tools used by very few into a much more accessible and economically viable technology. Through this recently gained popularity, its use cases have expanded from research environments into clinical settings. But the technical know-how and infrastructure required to analyze the data remain an obstacle for a wider adoption of this technology, especially in smaller laboratories. We present GensearchNGS, a commercial DNAseq software suite distributed by Phenosystems SA. The focus of GensearchNGS is the optimal usage of already existing infrastructure, while keeping its use simple. This is achieved through the integration of existing tools in a comprehensive software environment, as well as custom algorithms developed with the restrictions of limited infrastructures in mind. This includes the possibility to connect multiple computers to speed up computing-intensive parts of the analysis, such as sequence alignments. We present a typical DNAseq workflow for NGS data analysis and the approach GensearchNGS takes to implement it. The presented workflow goes from raw data quality control to the final variant report. This includes features such as gene panels and the integration of online databases, like Ensembl for annotations or Cafe Variome for variant sharing.

  19. WE-H-BRC-04: Implement Lean Methodology to Make Our Current Process of CT Simulation to Treatment More Efficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boddu, S; Morrow, A; Krishnamurthy, N

    Purpose: Our goal is to implement lean methodology to make our current process of CT simulation to treatment more efficient. Methods: In this study, we implemented lean methodology and tools and employed flowcharts in Excel for process mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed Gemba walks and observed current processes from scheduling patient CT simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first time quality, undesirable effects (UDEs), and wait times from relevant members of each task. UDEs were binned by frequency of occurrence. We huddled to map the future state and to find solutions to high-frequency UDEs. We implemented visual controls, hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We have identified approximately 64 UDEs in our current workflow that could cause delays or re-work, compromise the quality and safety of patient treatments, or cause wait times of 1-6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/RapidArc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed, solving 20% of the issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment-site-specific workflows to identify bottlenecks, potential breakdowns and personnel allocation, and we employ tools like failure mode effects analysis to mitigate risk factors and make this process efficient.

  20. Proteomic Workflows for Biomarker Identification Using Mass Spectrometry — Technical and Statistical Considerations during Initial Discovery

    PubMed Central

    Orton, Dennis J.; Doucette, Alan A.

    2013-01-01

    Identification of biomarkers capable of differentiating between pathophysiological states of an individual is a laudable goal in the field of proteomics. Protein biomarker discovery generally employs high throughput sample characterization by mass spectrometry (MS), being capable of identifying and quantifying thousands of proteins per sample. While MS-based technologies have rapidly matured, the identification of truly informative biomarkers remains elusive, with only a handful of clinically applicable tests stemming from proteomic workflows. This underlying lack of progress is attributed in large part to erroneous experimental design, biased sample handling, as well as improper statistical analysis of the resulting data. This review will discuss in detail the importance of experimental design and provide some insight into the overall workflow required for biomarker identification experiments. Proper balance between the degree of biological vs. technical replication is required for confident biomarker identification. PMID:28250400

  1. Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation

    PubMed Central

    Campbell, Robert James; Gantt, Laura; Congdon, Tamara

    2009-01-01

    This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533

  2. Understanding dental CAD/CAM for restorations--the digital workflow from a mechanical engineering viewpoint.

    PubMed

    Tapie, L; Lebon, N; Mawussi, B; Fron Chabouis, H; Duret, F; Attal, J-P

    2015-01-01

    As digital technology infiltrates every area of daily life, including the field of medicine, it is increasingly being introduced into dental practice. Apart from chairside practice, computer-aided design/computer-aided manufacturing (CAD/CAM) solutions are available for creating inlays, crowns, fixed partial dentures (FPDs), implant abutments, and other dental prostheses. CAD/CAM dental solutions can be considered a chain of digital devices and software for the almost automatic design and creation of dental restorations. However, dentists who want to use the technology often do not have the time or knowledge to understand it. A basic knowledge of the CAD/CAM digital workflow for dental restorations can help dentists to grasp the technology and purchase a CAD/CAM system that meets the needs of their office. This article provides a computer-science and mechanical-engineering approach to the CAD/CAM digital workflow to help dentists understand the technology.

  3. Digital Workflow for Computer-Guided Implant Surgery in Edentulous Patients: A Case Report.

    PubMed

    Oh, Ji-Hyeon; An, Xueyin; Jeong, Seung-Mi; Choi, Byung-Ho

    2017-12-01

    The purpose of this article was to describe a fully digital workflow used to perform computer-guided flapless implant placement in an edentulous patient without the use of conventional impressions, models, or a radiographic guide. Digital data for the workflow were acquired using an intraoral scanner and cone-beam computed tomography (CBCT). The image fusion of the intraoral scan data and CBCT data was performed by matching resin markers placed in the patient's mouth. The definitive digital data were used to design a prosthetically driven implant position, surgical template, and computer-aided design and computer-aided manufacturing fabricated fixed dental prosthesis. The authors believe this is the first published case describing such a technique in computer-guided flapless implant surgery for edentulous patients. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  4. Nexus: a modular workflow management system for quantum simulation codes

    DOE PAGES

    Krogel, Jaron T.

    2015-08-24

    The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
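
    To make the idea of composing a workflow through a transparent, input-file-like interface concrete, here is a minimal, hypothetical sketch (not Nexus's actual API) of jobs chained by dependencies, in the spirit of a DFT run feeding a QMC run.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Job:
        # Hypothetical stand-in for one simulation task (DFT, QMC, ...).
        name: str
        command: str
        depends_on: list = field(default_factory=list)

    def run_workflow(jobs):
        """Submit jobs in dependency order. Illustrative only: a real manager
        would also talk to a batch scheduler, track state and restart failures."""
        done = set()
        while len(done) < len(jobs):
            for job in jobs:
                if job.name not in done and all(d in done for d in job.depends_on):
                    print(f"submitting {job.name}: {job.command}")
                    done.add(job.name)

    # Composed much like an input file: an SCF step feeding a QMC step.
    scf = Job("scf", "pw.x -in scf.in")
    qmc = Job("qmc", "qmcpack qmc.xml", depends_on=["scf"])
    run_workflow([scf, qmc])
    ```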

  5. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902
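
    As an illustration of what a declarative, hierarchically organized specification can look like (the keys below are invented for this sketch and are not Mozaik's actual parameter names), a single configuration structure can carry model, stimulus, recording and simulation settings so that downstream analysis keeps the metadata attached to the data.

    ```python
    import json

    # Hypothetical hierarchical configuration in the spirit of a declarative
    # model/experiment specification.
    config_text = """
    {
      "model":      {"name": "V1_layer4", "exc_neurons": 4000, "inh_neurons": 1000},
      "stimulus":   {"type": "drifting_grating", "contrast": 0.8, "duration_ms": 2000},
      "recording":  {"variables": ["spikes", "v_m"], "sample_size": 200},
      "simulation": {"backend": "nest", "dt_ms": 0.1}
    }
    """

    config = json.loads(config_text)

    # Simulation, analysis and visualization stages can all read the same
    # structure, so the experimental context travels with the recorded data.
    print(config["stimulus"]["type"], config["recording"]["variables"])
    ```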

  6. Create, run, share, publish, and reference your LC-MS, FIA-MS, GC-MS, and NMR data analysis workflows with the Workflow4Metabolomics 3.0 Galaxy online infrastructure for metabolomics.

    PubMed

    Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A

    2017-12-01

    Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high-value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Integrated workflows for spiking neuronal network simulations.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.

  8. Streamlining Workflow for Endovascular Mechanical Thrombectomy: Lessons Learned from a Comprehensive Stroke Center.

    PubMed

    Wang, Hongjin; Thevathasan, Arthur; Dowling, Richard; Bush, Steven; Mitchell, Peter; Yan, Bernard

    2017-08-01

    Recently, 5 randomized controlled trials confirmed the superiority of endovascular mechanical thrombectomy (EMT) to intravenous thrombolysis in acute ischemic stroke with large-vessel occlusion. The implication is that our health systems would witness an increasing number of patients treated with EMT. However, in-hospital delays, leading to increased time to reperfusion, are associated with poor clinical outcomes. This review outlines the in-hospital workflow of the treatment of acute ischemic stroke at a comprehensive stroke center and the lessons learned in reduction of in-hospital delays. The in-hospital workflow for acute ischemic stroke was described from prehospital notification to femoral arterial puncture in preparation for EMT. Systematic review of literature was also performed with PubMed. The implementation of workflow streamlining could result in reduction of in-hospital time delays for patients who were eligible for EMT. In particular, time-critical measures, including prehospital notification, the transfer of patients from door to computed tomography (CT) room, initiation of intravenous thrombolysis in the CT room, and the mobilization of neurointervention team in parallel with thrombolysis, all contributed to reduction in time delays. We have identified issues resulting in in-hospital time delays and have reported possible solutions to improve workflow efficiencies. We believe that these measures may help stroke centers initiate an EMT service for eligible patients. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  9. Design and implementation of an integrated, continuous evaluation, and quality improvement system for a state-based home-visiting program.

    PubMed

    McCabe, Bridget K; Potash, Dru; Omohundro, Ellen; Taylor, Cathy R

    2012-10-01

    To describe the design and implementation of an evaluation system to facilitate continuous quality improvement (CQI) and scientific evaluation in a statewide home visiting program, and to provide a summary of the system's progress in meeting intended outputs and short-term outcomes. Help Us Grow Successfully (HUGS) is a statewide home visiting program that provides services to at-risk pregnant/post-partum women, children (0-5 years), and their families. The program goals are to improve parenting skills and connect families to needed services and thus improve the health of the service population. The evaluation system is designed to: (1) integrate evaluation into daily workflow; (2) utilize standardized screening and evaluation tools; (3) facilitate a culture of CQI in program management; and (4) facilitate scientifically rigorous evaluations. The review of the system's design and implementation occurred through a formative evaluation process (reach, dose, and fidelity). Data were collected through electronic and paper surveys, administrative records, notes from management meetings, and medical chart review. In the design phase, four process and forty outcome measures were selected and are tracked using standardized screening and monitoring tools. During implementation, the reach and dose of training were adequate to successfully launch the evaluation/CQI system. All staff (n = 165) use the system for management of families; the supervisors (n = 18) use the system to track routine program activities. Data quality and availability are sufficient to support periodic program reviews at the region and state level. In the first 7 months, the HUGS evaluation system tracked 3,794 families (7,937 individuals). System use and acceptance are high. A successful implementation of a structured evaluation system with a strong CQI component is feasible in an existing, large statewide program. The evaluation/CQI system is an effective mechanism to drive modest change in management of the program.

  10. Global and local waveform simulations using the VERCE platform

    NASA Astrophysics Data System (ADS)

    Garth, Thomas; Saleh, Rafiq; Spinuso, Alessandro; Gemund, Andre; Casarotti, Emanuele; Magnoni, Federica; Krischner, Lion; Igel, Heiner; Schlichtweg, Horst; Frank, Anton; Michelini, Alberto; Vilotte, Jean-Pierre; Rietbrock, Andreas

    2017-04-01

    In recent years the potential to increase the resolution of seismic imaging by full waveform inversion has been demonstrated on a range of scales, from basin to continental. These techniques rely on harnessing the computational power of large supercomputers and running large parallel codes to simulate the seismic wave field in a three-dimensional geological setting. The VERCE platform is designed to make these full waveform techniques accessible to a far wider spectrum of the seismological community. The platform supports the two widely used spectral element simulation programs SPECFEM3D Cartesian and SPECFEM3D globe, allowing users to run a wide range of simulations. In the SPECFEM3D Cartesian implementation the user can run waveform simulations on a range of pre-loaded meshes and velocity models for specific areas, or upload their own velocity model and mesh. In the new SPECFEM3D globe implementation, the user will be able to select from a number of continent-scale model regions, or perform waveform simulations for the whole earth. Earthquake focal mechanisms can be downloaded within the platform, for example from the GCMT catalogue, or users can upload their own focal mechanism catalogue through the platform. The simulations can be run on a range of European supercomputers in the PRACE network. Once a job has been submitted and run through the platform, the simulated waveforms can be manipulated or downloaded for further analysis. The misfit between the simulated and recorded waveforms can then be calculated within the platform through three interoperable workflows: raw-data access (FDSN) and caching, pre-processing, and finally misfit computation. The last workflow makes use of the Pyflex analysis software. In addition, the VERCE platform can be used to produce animations of waveform propagation through the velocity model, and synthetic shakemaps. All these data-products are made discoverable and re-usable thanks to the VERCE data and metadata management layer. We demonstrate the functionality of the VERCE platform with two use cases, one using the pre-loaded velocity model and mesh for the Maule area of Chile using the SPECFEM3D Cartesian workflow, and one showing the output of a global simulation using the SPECFEM3D globe workflow. It is envisioned that this tool will allow a much greater range of seismologists to access these full waveform inversion tools, and aid full waveform tomographic and source inversion, synthetic shakemap production and other full waveform applications, in a wide range of tectonic settings.
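
    Outside the VERCE platform itself, the basic shape of the misfit workflow (fetch an observed trace over FDSN, compare it with a synthetic) can be sketched with ObsPy; the event time, station codes and the zero-valued synthetic below are placeholders, and real adjoint/Pyflex measures are considerably more elaborate than this plain least-squares misfit.

    ```python
    import numpy as np
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    # Placeholder event time and station; any FDSN data centre can be used.
    t0 = UTCDateTime("2010-02-27T06:34:00")
    client = Client("IRIS")
    observed = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 600)
    observed.detrend("demean").filter("bandpass", freqmin=0.01, freqmax=0.1)

    obs = observed[0].data.astype(float)

    # Stand-in for a SPECFEM3D synthetic resampled onto the same time axis.
    syn = np.zeros_like(obs)

    # Simple normalised least-squares waveform misfit.
    misfit = 0.5 * np.sum((syn - obs) ** 2) / np.sum(obs ** 2)
    print(f"L2 waveform misfit: {misfit:.3f}")
    ```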

  11. Perceived Barriers and Facilitators of Using a Web-Based Interactive Decision Aid for Colorectal Cancer Screening in Community Practice Settings: Findings From Focus Groups With Primary Care Clinicians and Medical Office Staff

    PubMed Central

    2013-01-01

    Background Information is lacking about the capacity of those working in community practice settings to utilize health information technology for colorectal cancer screening. Objective To address this gap we asked those working in community practice settings to share their perspectives about how the implementation of a Web-based patient-led decision aid might affect patient-clinician conversations about colorectal cancer screening and the day-to-day clinical workflow. Methods Five focus groups in five community practice settings were conducted with 8 physicians, 1 physician assistant, and 18 clinic staff. Focus groups were organized using a semistructured discussion guide designed to identify factors that mediate and impede the use of a Web-based decision aid intended to clarify patient preferences for colorectal cancer screening and to trigger shared decision making during the clinical encounter. Results All physicians, the physician assistant, and 8 of the 18 clinic staff were active participants in the focus groups. Clinician and staff participants from each setting reported a belief that the Web-based patient-led decision aid could be an informative and educational tool; in all but one setting participants reported a readiness to recommend the tool to patients. The exception related to clinicians from one clinic who described a preference for patients having fewer screening choices, noting that a colonoscopy was the preferred screening modality for patients in their clinic. Perceived barriers to utilizing the Web-based decision aid included patients’ lack of Internet access or low computer literacy, and potential impediments to the clinics’ daily workflow. Expanding patients’ use of an online decision aid that is both easy to access and understand and that is utilized by patients outside of the office visit was described as a potentially efficient means for soliciting patients’ screening preferences. Participants described that a system to link the online decision aid to a computerized reminder system could promote a better understanding of patients’ screening preferences, though some expressed concern that such a system could be difficult to keep up and running. Conclusions Community practice clinicians and staff perceived the Web-based decision aid technology as promising but raised questions as to how the technology and resultant information would be integrated into their daily practice workflow. Additional research investigating how to best implement online decision aids should be conducted prior to the widespread adoption of such technology so as to maximize the benefits of the technology while minimizing workflow disruptions. PMID:24351420

  12. From MODFLOW-96 to MODFLOW-2005, ParFlow and Others: Updates and a Workflow for Up- and Out- Conversion

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Hardesty Lewis, D.

    2017-12-01

    MODFLOW (MF) has served for decades as a de facto standard for groundwater modelling. Despite successive versions, legacy MF-96 simulations are still commonly encountered. Such is the case for many of the groundwater availability models of the State of Texas. Unfortunately, even the existence of converters to MF's newer versions has not necessarily stimulated their adoption, let alone re-creation of legacy models. This state of affairs may be due to the unfamiliarity of the modeller with the terminal or the FORTRAN programming language, resulting in an inability to address the minor or major bugs, nuances, or limitations in compilation or execution of the conversion programs. Here, we present a workflow that addresses the above intricacies while attempting to maintain portability in implementation. This workflow is constructed in the form of a Bash script and - with geoscience-oriented users in mind - re-presented as a Jupyter notebook. First, one may choose whether this executable will run with POSIX compliance or with a preference for Bash facilities, both widely adopted by operating systems. In the same vein, it attempts to function within minimal command environments, which reduces any dependencies. Finally, it is designed to offer parallelism across as many cores and nodes as necessary or as few as desired, whether on a personal computer or a supercomputer. Underlying this workflow are patches such that antiquated tools may compile and execute on modern hardware. Also, fixes to long-standing bugs and limitations in the existing MF converters have been prepared. Specifically, support for the conversion of MF-96 and Horizontal Flow Barrier-coupled simulations has been added. More radically, we have laid the foundations of a conversion utility between MF and a similar modeller, ParFlow. Furthermore, the modular approach followed may extend to an application which inter-operates between arbitrary groundwater simulators. In short, an accessible and portable workflow for up-conversion between MODFLOW versions is now available to geoscientists. Updated programs within it may allow legacy simulations to be re-used, in whole or in part. Lastly, a generic inter-operator has been established, opening the possibility of significantly easier recycling of groundwater data in the future.

  13. Seamless Provenance Representation and Use in Collaborative Science Scenarios

    NASA Astrophysics Data System (ADS)

    Missier, P.; Ludaescher, B.; Bowers, S.; Altintas, I.; Anand, M. K.; Dey, S.; Sarkar, A.; Shrestha, B.; Goble, C.

    2010-12-01

    The notion of sharing scientific data has only recently begun to gain ground in science, where data is still considered a private asset. There is growing evidence, however, that the benefits of scientific collaboration through early data sharing during the course of a science project may outgrow the risk of losing exclusive ownership of the data. As exemplar success stories are making the headlines[1], principles of effective information sharing have become the subject of e-science research. In particular, any piece of published data should be self-describing, to the extent necessary for consumers to determine its suitability for reuse in their own projects. This is accomplished by associating a body of formally specified and machine-processable metadata to the data. When data is produced and reused by independent groups, however, metadata interoperability issues emerge. This is the case for provenance, a form of metadata that describes the history of a data product, Y. Provenance is typically expressed as a graph-structured set of dependencies that account for the sequence of computational or interactive steps that led to Y, often starting from some primary, observational data. Traversing dependency graphs is one of the mechanisms used to answer questions on data reliability. In the context of the NSF DataONE project[2], we have been studying issues of provenance interoperability in scientific collaboration scenarios. Consider a first scientist, Alice, who publishes a data product X along with its provenance, and a second scientist who further transforms X into a new product Y, also along with its provenance. A third scientist, who is interested in Y, expects to be able to trace Y's history up to the inputs used by Alice. This is only possible, however, if provenance accumulates into a single, uniform graph that can be seamlessly traversed. This becomes problematic when provenance is captured using different tools and computational models (i.e. workflow systems), as well as when data is published and reused using mechanisms that are not provenance-aware. In this presentation we discuss requirements for ensuring provenance-aware data publishing and reuse, and describe the design and implementation of a prototype toolkit that involves two specific, and broadly used, workflow models, Kepler [3] and Taverna [4]. The implementation is expected to be adopted as part of DataONE's investigators' toolkit, in support of its mission of large-scale data preservation. Refs. [1]Sharing of Data Leads to Progress on Alzheimer’s, G. Kolata, NYT, 8/12/2010 [2]http://www.dataone.org [3]Ludaescher B., Altintas I. et al. Scientific Workflow Management and the Kepler System. Special Issue: Workflow in Grid Systems. Concurrency and Computation: Practice & Experience 18(10): 1039-1065, 2006 [4]D. Hull, K. Wolstencroft, R. Stevens, C. Goble, M. R. Pocock, P. Li, T. Oinn. Taverna: a tool for building and running workflows of services. Nucl. Acids Res. 34: W729-W732, 2006
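
    The central operation described here, tracing a derived product Y back through an accumulated provenance graph to the original inputs, is a dependency-graph traversal; the sketch below illustrates it with networkx and invented artifact names, not the Kepler or Taverna provenance vocabularies themselves.

    ```python
    import networkx as nx

    # Directed edges point from an input artifact to what was derived from it.
    prov = nx.DiGraph()
    prov.add_edge("observations.csv", "X")   # Alice's workflow: inputs -> X
    prov.add_edge("calibration.cfg", "X")
    prov.add_edge("X", "Y")                  # second workflow: X -> Y

    # Everything Y ultimately depends on, across both workflows' provenance,
    # provided the two graphs were merged into one uniform graph.
    print(nx.ancestors(prov, "Y"))
    # -> {'observations.csv', 'calibration.cfg', 'X'}
    ```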

  14. Highlights of X-Stack ExM Deliverable: MosaStore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ripeanu, Matei

    2016-07-20

    This brief report highlights the experience gained with MosaStore, an exploratory part of the X-Stack project “ExM: System support for extreme-scale, many-task applications”. The ExM project proposed to use concurrent workflows supported by the Swift language and runtime as an innovative programming model to exploit parallelism in exascale computers. MosaStore aims to support this endeavor by improving storage support for workflow-based applications, more precisely by exploring the gains that can be obtained from co-designing the storage system and the workflow runtime engine. MosaStore has been developed primarily at the University of British Columbia.

  15. Experimenting with semantic web services to understand the role of NLP technologies in healthcare.

    PubMed

    Jagannathan, V

    2006-01-01

    NLP technologies can play a significant role in healthcare, where a predominant segment of the clinical documentation is in text form. In a graduate course focused on understanding semantic web services at West Virginia University, a class project was designed with the purpose of exploring the potential use of NLP-based abstraction of clinical documentation. The role of NLP technology was simulated using human abstractors, and various workflows were investigated using public domain workflow and semantic web service technologies. This poster explores the potential use of NLP and the role of workflow and semantic web technologies in developing healthcare IT environments.

  16. Repeat rates in digital chest radiography and strategies for improvement.

    PubMed

    Fintelmann, Florian; Pulli, Benjamin; Abedi-Tari, Faezeh; Trombley, Maureen; Shore, Mary-Theresa; Shepard, Jo-Anne; Rosenthal, Daniel I

    2012-05-01

    To determine the repeat rate (RR) of chest radiographs acquired with portable computed radiography (CR) and installed direct radiography (DR) and to develop and assess strategies designed to decrease the RR. The RR and reasons for repeated digital chest radiographs were documented over the course of 16 months while a task force of thoracic radiologists, technologist supervisors, technologists, and information technology specialists continued to examine the workflow for underlying causes. Interventions decreasing the RR were designed and implemented. The initial RR of digital chest radiographs was 3.6% (138/3818) for portable CR and 13.3% (476/3575) for installed DR systems. By combining RR measurement with workflow analysis, targets for technical and teaching interventions were identified. The interventions decreased the RR to 1.8% (81/4476) for portable CR and to 8.2% (306/3748) for installed DR. We found the RR of direct digital chest radiography to be significantly higher than that of computed chest radiography. We believe this is due to the ease with which repeat images can be obtained and discarded, and it suggests the need for ongoing surveillance of RR. We were able to demonstrate that strategies to lower the RR, which had been developed in the era of film-based imaging, can be adapted to the digital environment. On the basis of our findings, we encourage radiologists to assess their own departmental RRs for direct digital chest radiography and to consider similar interventions if necessary to achieve acceptable RRs for this modality.
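
    The repeat rate is simply repeated exposures divided by total exposures; the before/after figures quoted in the abstract can be reproduced directly (a worked check, not code from the study).

    ```python
    # (repeats, total) pairs reported before and after the interventions.
    counts = {
        "portable CR, before": (138, 3818),
        "installed DR, before": (476, 3575),
        "portable CR, after": (81, 4476),
        "installed DR, after": (306, 3748),
    }

    for label, (repeats, total) in counts.items():
        rr = 100.0 * repeats / total
        print(f"{label:22s} RR = {rr:.1f}%")
    # Matches the quoted 3.6%, 13.3%, 1.8% and 8.2%.
    ```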

  17. Structuring research methods and data with the research object model: genomics workflows as a case study.

    PubMed

    Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco

    2014-01-01

    One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide necessary meta-data for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?", and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428. The Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.
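
    Questions like "which data was input to which workflow to test which hypothesis" are typically answered by querying the RO's RDF annotations; the sketch below uses rdflib with a simplified, invented predicate vocabulary (not the exact Wf4Ever terms) purely to show the pattern.

    ```python
    from rdflib import Graph, Namespace, URIRef

    # Illustrative namespace; the real RO/Wf4Ever models define richer terms.
    EX = Namespace("http://example.org/ro#")

    g = Graph()
    wf = URIRef("http://example.org/ro#metabolite_workflow")
    data = URIRef("http://example.org/ro#snp_dataset")
    hypothesis = URIRef("http://example.org/ro#variation_hypothesis")

    g.add((data, EX.isInputTo, wf))
    g.add((wf, EX.tests, hypothesis))

    # "Which particular data was input to a particular workflow to test a
    # particular hypothesis?"
    q = """
    SELECT ?data WHERE {
      ?data <http://example.org/ro#isInputTo> ?wf .
      ?wf   <http://example.org/ro#tests> <http://example.org/ro#variation_hypothesis> .
    }
    """
    for row in g.query(q):
        print(row.data)
    ```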

  18. Implementation of a software for REmote COMparison of PARticlE and photon treatment plans: ReCompare.

    PubMed

    Löck, Steffen; Roth, Klaus; Skripcak, Tomas; Worbs, Mario; Helmbrecht, Stephan; Jakobi, Annika; Just, Uwe; Krause, Mechthild; Baumann, Michael; Enghardt, Wolfgang; Lühr, Armin

    2015-09-01

    To guarantee equal access to optimal radiotherapy, a concept of patient assignment to photon or particle radiotherapy using remote treatment plan exchange and comparison - ReCompare - was proposed. We demonstrate the implementation of this concept and present its clinical applicability. The ReCompare concept was implemented using a client-server based software solution. A clinical workflow for the remote treatment plan exchange and comparison was defined. The steps required by the user and performed by the software for a complete plan transfer were described, and an additional module for dose-response modeling was added. The ReCompare software was successfully tested in cooperation with three external partner clinics and met all required specifications. It was compatible with several standard treatment planning systems, ensured patient data protection, and integrated into the clinical workflow. The ReCompare software can be applied to support non-particle radiotherapy institutions with the patient-specific treatment decision on the optimal irradiation modality by remote treatment plan exchange and comparison. Copyright © 2015. Published by Elsevier GmbH.

  19. SWS: accessing SRS sites contents through Web Services.

    PubMed

    Romano, Paolo; Marra, Domenico

    2008-03-26

    Web Services and Workflow Management Systems can support the creation and deployment of network systems able to automate data analysis and retrieval processes in biomedical research. Web Services have been implemented at bioinformatics centres and workflow systems have been proposed for biological data analysis. New databanks are often developed with these technologies in mind, but many existing databases do not allow programmatic access. Only a fraction of available databanks can thus be queried through programmatic interfaces. SRS is a well-known indexing and search engine for biomedical databanks offering public access to many databanks and analysis tools. Unfortunately, these data are not easily and efficiently accessible through Web Services. We have developed 'SRS by WS' (SWS), a tool that makes information available in SRS sites accessible through Web Services. Information on known sites is maintained in a database, srsdb. SWS consists of a suite of Web Services that can query both srsdb, for information on sites and databases, and SRS sites. SWS returns results in a text-only format and can be accessed through a WSDL-compliant client. SWS enables interoperability between workflow systems and SRS implementations by also managing access to alternative sites, in order to cope with network and maintenance problems, and by selecting the most up-to-date among available systems. The development and implementation of Web Services that allow programmatic access to an exhaustive set of biomedical databases can significantly improve the automation of in-silico analysis. SWS supports this activity by making biological databanks that are managed in public SRS sites available through a programmatic interface.
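
    Since SWS is exposed through WSDL, a workflow system or script can call it with any SOAP client; the sketch below uses the zeep library, with the WSDL URL and the operation name (`getEntry`) as placeholders rather than the documented SWS interface.

    ```python
    from zeep import Client

    # Placeholder endpoint; substitute the WSDL published by an actual SWS site.
    client = Client("http://example.org/sws/sws.wsdl")

    # Hypothetical operation: query a databank on an SRS site and retrieve the
    # text-only result.
    result = client.service.getEntry(databank="UNIPROT", entry_id="P01308")
    print(result)
    ```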

  20. Dynamic Data Citation through Provenance - new approach for reproducible science in Geoscience Australia.

    NASA Astrophysics Data System (ADS)

    Bastrakova, I.; Car, N.

    2017-12-01

    Geoscience Australia (GA) is recognised and respected as the National Repository and steward of multiple nationally significant data collections that provide geoscience information, services and capability to the Australian Government, industry and stakeholders. Internally, this brings the challenge of managing a large volume (11 PB) of diverse and highly complex data distributed across a significant number of catalogues, applications, portals, virtual laboratories, and direct downloads from multiple locations. Externally, GA is facing constant change in government regulations (e.g. open data and archival laws), growing stakeholder demands for high-quality and near real-time delivery of data and products, and rapid technological advances enabling dynamic data access. The traditional approach to citing static data and products cannot satisfy increasing demands for the results of scientific workflows, or items within those workflows, to be open, discoverable, trusted and reproducible. Thus, citation of data, products, code and applications through provenance records is being implemented. This approach involves capturing the provenance of many GA processes according to a standardised data model and storing it, as well as metadata for the elements it references, in a searchable set of systems. This provides GA with the ability to cite workflows unambiguously, as well as each item within each workflow, including inputs and outputs and many other registered components. Dynamic objects can therefore be referenced flexibly in relation to their generation process - a dataset's metadata indicates where to obtain its provenance - meaning the relevant facts of its dynamism need not be crammed into a single citation object with a single set of attributes. This allows simple citations, similar to traditional static document citations such as references in journals, to be used for complex dynamic data and other objects such as software code.

  1. Wireless-PDA-controlled image workflow from PACS: the next trend in the health care enterprise?

    NASA Astrophysics Data System (ADS)

    Erberich, Stephan G.; Documet, Jorge; Zhou, Michael Z.; Cao, Fei; Liu, Brent J.; Mogel, Greg T.; Huang, H. K.

    2003-05-01

    Image workflow in today's Picture Archiving and Communication Systems (PACS) is controlled from fixed Display Workstations (DW) using proprietary control interfaces. Remote access to the Hospital Information System (HIS) and Radiology Information System (RIS) for urgent patient information retrieval either does not exist or is only gradually becoming available. The lack of remote access and workflow control for HIS and RIS is especially apparent when it comes to medical images in a PACS at the department or hospital level. As images become more complex and data sizes expand rapidly with new imaging techniques such as functional MRI, mammography or routine spiral CT, to name a few, access and manageability become an important issue. Long image downloads or incomplete work lists cannot be tolerated in a busy health care environment. In addition, the domain of the PACS is no longer limited to the imaging department; PACS is also being used in the ER and emergency care units. Thus prompt and secure access and manageability, not only by the radiologist but also by the physician, become crucial to optimally utilize the PACS in the health care enterprise of the new millennium. The purpose of this paper is to introduce a concept, and its implementation, for remote access to and workflow control of the PACS combining wireless, Internet and Internet2 technologies. A wireless device, the Personal Digital Assistant (PDA), is used to communicate with a PACS web server that acts as a gateway controlling the commands for which the user has access to the PACS server. The commands implemented for this test-bed are query/retrieve of the patient list and study list, including modality, examination, series and image selection, and pushing any list items to a selected DW on the PACS network.
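
    The query/retrieve step described here maps onto the DICOM C-FIND service; as an illustration only (not the paper's gateway code), a study-level query with pynetdicom might look like the sketch below, where the PACS host, port, AE title and patient name are placeholders.

    ```python
    from pydicom.dataset import Dataset
    from pynetdicom import AE
    from pynetdicom.sop_class import PatientRootQueryRetrieveInformationModelFind

    ae = AE(ae_title="PDA_GATEWAY")        # placeholder calling AE title
    ae.add_requested_context(PatientRootQueryRetrieveInformationModelFind)

    # Study-level query: which studies exist for this patient?
    query = Dataset()
    query.QueryRetrieveLevel = "STUDY"
    query.PatientName = "DOE^JANE"
    query.StudyInstanceUID = ""
    query.ModalitiesInStudy = ""

    assoc = ae.associate("pacs.example.org", 11112)   # placeholder PACS archive
    if assoc.is_established:
        for status, identifier in assoc.send_c_find(
                query, PatientRootQueryRetrieveInformationModelFind):
            # 0xFF00/0xFF01 are "pending" statuses carrying a matching study.
            if status and status.Status in (0xFF00, 0xFF01) and identifier:
                print(identifier.StudyInstanceUID, identifier.ModalitiesInStudy)
        assoc.release()
    ```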

  2. An Efficient Workflow Environment to Support the Collaborative Development of Actionable Climate Information Using the NCAR Climate Risk Management Engine (CRMe)

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Vigh, J. L.; Lee, J. A.

    2016-12-01

    Society's growing needs for robust and relevant climate information have fostered an explosion in tools and frameworks for processing climate projections. Many top-down workflows might be employed to generate sets of pre-computed data and plots, frequently served in a "loading-dock style" through a metadata-enabled search and discovery engine. Despite these increasing resources, the diverse needs of applications-driven projects often result in data processing workflow requirements that cannot be fully satisfied using past approaches. In parallel to the data processing challenges, the provision of climate information to users in a form that is also usable represents a formidable challenge of its own. Finally, many users do not have the time or the desire to synthesize and distill massive volumes of climate information to find the relevant information for their particular application. All of these considerations call for new approaches to developing actionable climate information. CRMe seeks to bridge the gap between the diverse, rich bottom-up needs of practitioners and the discrete, structured top-down workflows typically implemented for rapid delivery. Additionally, CRMe has implemented web-based data services capable of providing focused climate information in usable form for a given location, or as spatially aggregated information for entire regions or countries, following the needs of users and sectors. Making climate data actionable also involves summarizing and presenting it in concise and approachable ways. CRMe is developing the concept of dashboards, co-developed with the users, to condense the key information into a quick summary of the most relevant, curated climate data for a given discipline, application, or location, while still enabling users to efficiently conduct deeper discovery into rich datasets on an as-needed basis.

  3. MG-RAST version 4-lessons learned from a decade of low-budget ultra-high-throughput metagenome analysis.

    PubMed

    Meyer, Folker; Bagchi, Saurabh; Chaterji, Somali; Gerlach, Wolfgang; Grama, Ananth; Harrison, Travis; Paczian, Tobias; Trimble, William L; Wilke, Andreas

    2017-09-26

    As technologies change, MG-RAST is adapting. Newly available software is being included to improve accuracy and performance. As a computational service constantly running large-volume scientific workflows, MG-RAST is the right location to perform benchmarking and implement algorithmic or platform improvements, in many cases involving trade-offs between specificity, sensitivity and run-time cost. The work in [Glass EM, Dribinsky Y, Yilmaz P, et al. ISME J 2014;8:1-3] is an example; we use existing well-studied data sets as gold standards representing different environments and different technologies to evaluate any changes to the pipeline. Currently, we use well-understood data sets in MG-RAST as a platform for benchmarking. The use of artificial data sets for pipeline performance optimization has not added value, as these data sets do not present the same challenges as real-world data sets. In addition, the MG-RAST team welcomes suggestions for improvements of the workflow. We are currently working on versions 4.02 and 4.1, both of which contain significant input from the community and our partners that will enable double barcoding, stronger inferences supported by longer-read technologies, and will increase throughput while maintaining sensitivity by using Diamond and SortMeRNA. On the technical platform side, the MG-RAST team intends to support the Common Workflow Language as a standard to specify bioinformatics workflows, both to facilitate development and to enable efficient high-performance implementation of the community's data analysis tasks. Published by Oxford University Press on behalf of Entomological Society of America 2017. This work is written by US Government employees and is in the public domain in the US.

  4. Chronic care coordination by integrating care through a team-based, population-driven approach: a case study.

    PubMed

    van Eeghen, Constance O; Littenberg, Benjamin; Kessler, Rodger

    2018-05-23

    Patients with chronic conditions frequently experience behavioral comorbidities to which primary care cannot easily respond. This study observed a Vermont family medicine practice with integrated medical and behavioral health services that use a structured approach to implement a chronic care management system with Lean. The practice chose to pilot a population-based approach to improve outcomes for patients with poorly controlled Type 2 diabetes using a stepped-care model with an interprofessional team including a community health nurse. This case study observed the team's use of Lean, with which it designed and piloted a clinical algorithm composed of patient self-assessment, endorsement of behavioral goals, shared documentation of goals and plans, and follow-up. The team redesigned workflows and measured reach (patients who engaged to the end of the pilot), outcomes (HbA1c results), and process (days between HbA1c tests). The researchers evaluated practice member self-reports about the use of Lean and facilitators and barriers to move from pilot to larger scale applications. Of 20 eligible patients recruited over 3 months, 10 agreed to participate and 9 engaged fully (45%); 106 patients were controls. Relative to controls, outcomes and process measures improved but lacked significance. Practice members identified barriers that prevented implementation of all changes needed but were in agreement that the pilot produced useful outcomes. A systematized, population-based, chronic care management service is feasible in a busy primary care practice. To test at scale, practice leadership will need to allocate staffing, invest in shared documentation, and standardize workflows to streamline office practice responsibilities.

  5. High‐resolution trench photomosaics from image‐based modeling: Workflow and error analysis

    USGS Publications Warehouse

    Reitman, Nadine G.; Bennett, Scott E. K.; Gold, Ryan D.; Briggs, Richard; Duross, Christopher

    2015-01-01

    Photomosaics are commonly used to construct maps of paleoseismic trench exposures, but the conventional process of manually using image‐editing software is time consuming and produces undesirable artifacts and distortions. Herein, we document and evaluate the application of image‐based modeling (IBM) for creating photomosaics and 3D models of paleoseismic trench exposures, illustrated with a case‐study trench across the Wasatch fault in Alpine, Utah. Our results include a structure‐from‐motion workflow for the semiautomated creation of seamless, high‐resolution photomosaics designed for rapid implementation in a field setting. Compared with conventional manual methods, the IBM photomosaic method provides a more accurate, continuous, and detailed record of paleoseismic trench exposures in approximately half the processing time and 15%–20% of the user input time. Our error analysis quantifies the effect of the number and spatial distribution of control points on model accuracy. For this case study, an ∼87 m² exposure of a benched trench photographed at viewing distances of 1.5–7 m yields a model with <2 cm root mean square error (RMSE) with as few as six control points. RMSE decreases as more control points are implemented, but the gains in accuracy are minimal beyond 12 control points. Spreading control points throughout the target area helps to minimize error. We propose that 3D digital models and corresponding photomosaics should be standard practice in paleoseismic exposure archiving. The error analysis serves as a guide for future investigations that seek balance between speed and accuracy during photomosaic and 3D model construction.
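
    The accuracy metric used here is a root-mean-square error over control-point residuals; a small numpy sketch of that computation with invented coordinates (not the study's data) is shown below.

    ```python
    import numpy as np

    def rmse(predicted, surveyed):
        """Root mean square error between model-derived and surveyed
        control-point coordinates (same units, e.g. metres)."""
        residuals = np.asarray(predicted) - np.asarray(surveyed)
        return np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))

    # Illustrative 3D coordinates for six control points (metres).
    surveyed = np.array([[0.0, 0.0, 0.0], [2.0, 0.1, 0.0], [4.1, 0.0, 0.2],
                         [0.1, 1.5, 0.0], [2.0, 1.6, 0.1], [4.0, 1.5, 0.0]])
    rng = np.random.default_rng(0)
    predicted = surveyed + rng.normal(scale=0.01, size=surveyed.shape)

    print(f"RMSE = {100 * rmse(predicted, surveyed):.1f} cm")
    ```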

  6. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have only recently started clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success through synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.
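
    LTC itself is built on MeVisLab with C++ modules, but the registration step it automates can be illustrated in Python with SimpleITK; the file names and the T2w/DWI pairing below are placeholders, and the parameters are generic defaults rather than the project's settings.

    ```python
    import SimpleITK as sitk

    # Placeholder inputs: register a DWI b0 volume onto the T2w reference.
    fixed = sitk.ReadImage("t2w.nii.gz", sitk.sitkFloat32)
    moving = sitk.ReadImage("dwi_b0.nii.gz", sitk.sitkFloat32)

    # Rough initial alignment by image geometry, refined by a rigid transform.
    initial = sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)

    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                                 minStep=1e-4,
                                                 numberOfIterations=200)
    reg.SetInitialTransform(initial, inPlace=False)
    reg.SetInterpolator(sitk.sitkLinear)
    transform = reg.Execute(fixed, moving)

    # Resample the moving image into the reference space for fused viewing.
    aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0,
                            moving.GetPixelID())
    sitk.WriteImage(aligned, "dwi_b0_in_t2w_space.nii.gz")
    ```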

  7. Selection and implementation of a distributed phased archive for a multivendor incremental approach to PACS

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wandtke, John; Robinson, Arvin E.

    1999-07-01

    The selection criteria for the archive were based on the objectives of the Medical Information, Communication and Archive System (MICAS), a multi-vendor incremental approach to PACS. These objectives include interoperability between all components, seamless integration of the Radiology Information System (RIS) with MICAS and eventually other hospital databases, demonstration of DICOM compliance by all components prior to acceptance, and automated workflow that can be programmed to meet changes in the healthcare environment. The long-term multi-modality archive is being implemented in three or more phases, with the first phase designed to provide a 12- to 18-month storage solution. This decision was made because the cost per GB of storage is rapidly decreasing and the speed at which data can be retrieved is increasing with time. The open solution selected allows incorporation of leading-edge, 'best of breed' hardware and software and provides maximum flexibility of workflow both within and outside of radiology. The selected solution is media independent, supports multiple jukeboxes, provides expandable storage capacity, and will provide redundancy and fault tolerance at minimal cost. Some of the required attributes of the archive include a scalable archive strategy, a virtual image database with global query, and an object-oriented database. The selection process took approximately 10 months, with Cemax-Icon being the vendor selected. Prior to signing a purchase order, Cemax-Icon performed a site survey, agreed upon the acceptance test protocol and provided a written guarantee of connectivity between their archive and the imaging modalities and other MICAS components.

  8. REPRODUCIBLE RESEARCH WORKFLOW IN R FOR THE ANALYSIS OF PERSONALIZED HUMAN MICROBIOME DATA.

    PubMed

    Callahan, Benjamin; Proctor, Diana; Relman, David; Fukuyama, Julia; Holmes, Susan

    2016-01-01

    This article presents a reproducible research workflow for amplicon-based microbiome studies in personalized medicine created using Bioconductor packages and the knitr markdown interface. We show that sometimes a multiplicity of choices and lack of consistent documentation at each stage of the sequential processing pipeline used for the analysis of microbiome data can lead to spurious results. We propose its replacement with reproducible and documented analysis using R packages dada2, knitr, and phyloseq. This workflow implements both key stages of amplicon analysis: the initial filtering and denoising steps needed to construct taxonomic feature tables from error-containing sequencing reads (dada2), and the exploratory and inferential analysis of those feature tables and associated sample metadata (phyloseq). This workflow facilitates reproducible interrogation of the full set of choices required in microbiome studies. We present several examples in which we leverage existing packages for analysis in a way that allows easy sharing and modification by others, and give pointers to articles that depend on this reproducible workflow for the study of longitudinal and spatial series analyses of the vaginal microbiome in pregnancy and the oral microbiome in humans with healthy dentition and intra-oral tissues.

  9. An automated workflow for parallel processing of large multiview SPIM recordings

    PubMed Central

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-01-01

    Summary: Selective Plane Illumination Microscopy (SPIM) allows imaging of developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive interactive processing via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585
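
    Because each time point can be processed independently, the parallelization exploited by the pipeline can be illustrated with a plain Python sketch; this is a stand-in for the actual snakemake rules, and `process_timepoint` and the `spim_data/TP*` layout are hypothetical.

    ```python
    from multiprocessing import Pool
    from pathlib import Path

    def process_timepoint(tp_dir: Path) -> str:
        """Stand-in for one per-time-point step (e.g. registration or fusion);
        in the real pipeline this would be a snakemake rule."""
        # ... run the actual processing for this time point here ...
        return f"processed {tp_dir.name}"

    if __name__ == "__main__":
        timepoints = sorted(Path("spim_data").glob("TP*"))   # hypothetical layout
        # On a cluster, a scheduler submits one job per time point instead of a Pool.
        with Pool(processes=8) as pool:
            for message in pool.map(process_timepoint, timepoints):
                print(message)
    ```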

  10. Oqtans: the RNA-seq workbench in the cloud for complete and reproducible quantitative transcriptome analysis.

    PubMed

    Sreedharan, Vipin T; Schultheiss, Sebastian J; Jean, Géraldine; Kahles, André; Bohnert, Regina; Drewe, Philipp; Mudrakarta, Pramod; Görnitz, Nico; Zeller, Georg; Rätsch, Gunnar

    2014-05-01

    We present Oqtans, an open-source workbench for quantitative transcriptome analysis that is integrated into Galaxy. Its distinguishing features include customizable computational workflows and a modular pipeline architecture that facilitates comparative assessment of tool and data quality. Oqtans integrates an assortment of machine learning-powered tools into Galaxy, which show superior or equal performance to state-of-the-art tools. Implemented tools comprise a complete transcriptome analysis workflow: short-read alignment, transcript identification/quantification and differential expression analysis. Oqtans and Galaxy facilitate persistent storage, data exchange and documentation of intermediate results and analysis workflows. We illustrate how Oqtans aids the interpretation of data from different experiments in easy-to-understand use cases. Users can easily create their own workflows and extend Oqtans by integrating specific tools. Oqtans is available as (i) a cloud machine image with a demo instance at cloud.oqtans.org, (ii) a public Galaxy instance at galaxy.cbio.mskcc.org, (iii) a git repository containing all installed software (oqtans.org/git); most of which is also available from (iv) the Galaxy Toolshed and (v) a share string to use along with Galaxy CloudMan.

  11. The Behavioral Intervention Technology Model: An Integrated Conceptual and Technological Framework for eHealth and mHealth Interventions

    PubMed Central

    Schueller, Stephen M; Montague, Enid; Burns, Michelle Nicole; Rashidi, Parisa

    2014-01-01

    A growing number of investigators have commented on the lack of models to inform the design of behavioral intervention technologies (BITs). BITs, which include a subset of mHealth and eHealth interventions, employ a broad range of technologies, such as mobile phones, the Web, and sensors, to support users in changing behaviors and cognitions related to health, mental health, and wellness. We propose a model that conceptually defines BITs, from the clinical aim to the technological delivery framework. The BIT model defines both the conceptual and technological architecture of a BIT. Conceptually, a BIT model should answer the questions why, what, how (conceptual and technical), and when. While BITs generally have a larger treatment goal, such goals generally consist of smaller intervention aims (the "why") such as promotion or reduction of specific behaviors, and behavior change strategies (the conceptual "how"), such as education, goal setting, and monitoring. Behavior change strategies are instantiated with specific intervention components or “elements” (the "what"). The characteristics of intervention elements may be further defined or modified (the technical "how") to meet the needs, capabilities, and preferences of a user. Finally, many BITs require specification of a workflow that defines when an intervention component will be delivered. The BIT model includes a technological framework (BIT-Tech) that can integrate and implement the intervention elements, characteristics, and workflow to deliver the entire BIT to users over time. This implementation may be either predefined or include adaptive systems that can tailor the intervention based on data from the user and the user’s environment. The BIT model provides a step towards formalizing the translation of developer aims into intervention components, larger treatments, and methods of delivery in a manner that supports research and communication between investigators on how to design, develop, and deploy BITs. PMID:24905070

  12. The behavioral intervention technology model: an integrated conceptual and technological framework for eHealth and mHealth interventions.

    PubMed

    Mohr, David C; Schueller, Stephen M; Montague, Enid; Burns, Michelle Nicole; Rashidi, Parisa

    2014-06-05

    A growing number of investigators have commented on the lack of models to inform the design of behavioral intervention technologies (BITs). BITs, which include a subset of mHealth and eHealth interventions, employ a broad range of technologies, such as mobile phones, the Web, and sensors, to support users in changing behaviors and cognitions related to health, mental health, and wellness. We propose a model that conceptually defines BITs, from the clinical aim to the technological delivery framework. The BIT model defines both the conceptual and technological architecture of a BIT. Conceptually, a BIT model should answer the questions why, what, how (conceptual and technical), and when. While BITs generally have a larger treatment goal, such goals generally consist of smaller intervention aims (the "why") such as promotion or reduction of specific behaviors, and behavior change strategies (the conceptual "how"), such as education, goal setting, and monitoring. Behavior change strategies are instantiated with specific intervention components or "elements" (the "what"). The characteristics of intervention elements may be further defined or modified (the technical "how") to meet the needs, capabilities, and preferences of a user. Finally, many BITs require specification of a workflow that defines when an intervention component will be delivered. The BIT model includes a technological framework (BIT-Tech) that can integrate and implement the intervention elements, characteristics, and workflow to deliver the entire BIT to users over time. This implementation may be either predefined or include adaptive systems that can tailor the intervention based on data from the user and the user's environment. The BIT model provides a step towards formalizing the translation of developer aims into intervention components, larger treatments, and methods of delivery in a manner that supports research and communication between investigators on how to design, develop, and deploy BITs.
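
    The layered why/what/how/when structure of the BIT model lends itself to a simple machine-readable representation; the sketch below is not from the article, merely one hypothetical way a developer might encode a BIT specification so that a delivery framework (such as the BIT-Tech layer described above) could consume it.

        from dataclasses import dataclass, field
        from typing import Dict, List

        @dataclass
        class InterventionElement:
            name: str  # the "what", e.g. a reminder message
            characteristics: Dict[str, str] = field(default_factory=dict)  # the technical "how"

        @dataclass
        class WorkflowRule:
            trigger: str  # the "when", e.g. "daily_09:00" or "sensor_inactivity"
            element: InterventionElement

        @dataclass
        class BITSpecification:
            aim: str  # the "why", e.g. promotion or reduction of a specific behavior
            strategies: List[str]  # the conceptual "how", e.g. education, goal setting
            workflow: List[WorkflowRule]  # delivery schedule consumed by a BIT-Tech framework

        # Hypothetical example: a daily goal-setting prompt delivered by phone.
        prompt = InterventionElement("goal_prompt", {"channel": "sms", "tone": "supportive"})
        spec = BITSpecification(
            aim="increase daily step count",
            strategies=["goal setting", "self-monitoring"],
            workflow=[WorkflowRule("daily_09:00", prompt)],
        )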

  13. How Memorial Hermann’s Online Payments Are Boosting Patient Loyalty and Revenue.

    PubMed

    Ramos Hegwer, Laura

    2016-01-01

    The Houston-based health system has implemented new workflows and technology in 14 of its hospitals and across its care delivery network to make the payment process more patient-friendly and build consumer loyalty.

  14. A Comprehensive Solution of the Problems of Ensuring the Strength of Gas Turbine Engine Compressor at the Design Stage

    NASA Astrophysics Data System (ADS)

    Vedeneev, V. V.; Kolotnikov, M. E.; Mossakovskii, P. A.; Kostyreva, L. A.; Abdukhakimov, F. A.; Makarov, P. V.; Pyhalov, A. A.; Dudaev, M. A.

    2018-01-01

    In this paper we present a comprehensive numerical workflow for the analysis of blade flutter and high-amplitude resonant oscillations, of the impenetrability of the casing if a blade breaks off, and of the rotor response to blade detachment and the resulting imbalance, including an assessment of whether safe flight is possible in the auto-rotation regime. All the methods used are carefully verified through numerical convergence studies and correlation with experiments. The workflow significantly improves the efficiency of the design process for modern jet-engine compressors, ensuring a substantial reduction in the time and cost of designing a compressor with the required level of strength and durability.

  15. Human Systems Integration Design Environment (HSIDE)

    DTIC Science & Technology

    2012-04-09

    quality of the resulting HSI products. SUBJECT TERMS: HSI, Manning Estimation and Validation, Risk Assessment, IPOE, PLM, BPMN, Workflow...business process model in Business Process Modeling Notation (BPMN) or the actual workflow template associated with the specific functional area, again...as filtered by the user settings in the high-level interface. Figure 3 shows the initial screen, which allows the user to select either the BPMN or

  16. The View from a Few Hundred Feet: A New Transparent and Integrated Workflow for UAV-collected Data

    NASA Astrophysics Data System (ADS)

    Peterson, F. S.; Barbieri, L.; Wyngaard, J.

    2015-12-01

    Unmanned Aerial Vehicles (UAVs) allow scientists and civilians to monitor earth and atmospheric conditions in remote locations. To keep up with the rapid evolution of UAV technology, data workflows must also be flexible, integrated, and introspective. Here, we present our data workflow for a project to assess the feasibility of detecting threshold levels of methane, carbon dioxide, and other aerosols by mounting consumer-grade gas analysis sensors on UAVs. In particular, we highlight our use of Project Jupyter, a set of open-source software tools and documentation designed for developing "collaborative narratives" around scientific workflows. By embracing the GitHub-backed, multi-language systems available in Project Jupyter, we enable interaction and exploratory computation while simultaneously embracing distributed version control. Additionally, the transparency of this method builds trust with civilians and decision-makers and leverages collaboration and communication to resolve problems. The goal of this presentation is to provide a generic data workflow for scientific inquiries involving UAVs and to invite the participation of the AGU community in its improvement and curation.

  17. Utilization of design data on conventional system to building information modeling (BIM)

    NASA Astrophysics Data System (ADS)

    Akbar, Boyke M.; Z. R., Dewi Larasati

    2017-11-01

    Infrastructure development has become one of the main priorities in developing countries such as Indonesia. The conventional design system is considered no longer effective in supporting infrastructure projects, especially for highly complex building designs, because of its fragmented-system issues. BIM offers one solution for managing projects in an integrated manner. Despite the well-known benefits of BIM, there are obstacles to the migration process. The two main obstacles are the unpreparedness of some project parties for BIM implementation, and a reluctance to leave behind the existing database and create a new one in the BIM system. This paper discusses the possibilities of utilizing the existing CAD data from the conventional design system in a BIM system. The existing conventional CAD data and the output of the BIM design system were studied to examine compatibility issues between the two, followed by possible utilization scheme strategies. The goal of this study is to increase project parties' willingness to migrate to BIM by maximizing the utilization of existing data, which could also improve the quality of BIM-based project workflows.

  18. Parametric Workflow (BIM) for the Repair Construction of Traditional Historic Architecture in Taiwan

    NASA Astrophysics Data System (ADS)

    Ma, Y.-P.; Hsu, C. C.; Lin, M.-C.; Tsai, Z.-W.; Chen, J.-Y.

    2015-08-01

    In Taiwan, numerous existing traditional buildings are constructed with wooden, brick, and stone structures. This paper focuses on Taiwanese traditional historic architecture, targets traditional wooden-structure buildings as the design proposition, and develops a BIM workflow for modeling complex wooden combination geometry, integrating it with traditional 2D documents, and visualizing repair-construction assumptions within the 3D model representation. The goal of this article is to explore the current problems to overcome in wooden historic building conservation and to introduce BIM technology for conserving, documenting, and managing historic buildings and for creating full engineering drawings and information to effectively support historic conservation. Although BIM is mostly oriented to current construction practice, there have been some attempts to investigate its applicability in historic conservation projects. This article also illustrates the importance and advantages of using a BIM workflow in the repair-construction process, compared with a generic workflow.

  19. Investigation into the need for ingesting foreign imaging exams into local systems and evaluation of the design challenges of Foreign Exam Management (FEM)

    NASA Astrophysics Data System (ADS)

    Milovanovic, Lazar; Agrawal, Arun; Bak, Peter; Bender, Duane; Koff, David

    2015-03-01

    The deployment of regional and national Electronic Health Record solutions has been a focus of many countries throughout the past decade. Most of these deployments have taken the approach of "sharing" imaging exams via portals and web-based viewers. The motivation for portal/web-based access is driven by (a) the perception that review of imaging exams via portal methods is satisfactory to all users and (b) the perceived complexity of ingesting foreign exams into local systems. This research project set out to objectively evaluate who really needs foreign exams within their local systems, what those systems might be, and how often this access is required. Working on the belief that Foreign Exam Management (FEM) is required to support clinical workflow, the project implemented a FEM capability within an XDS-I.b domain to identify the design challenges and nuances associated with FEM.

  20. Evaluating the data completeness in the Electronic Health Record after the Implementation of an Outpatient Electronic Health Record.

    PubMed

    Soto, Mauricio; Capurro, Daniel; Catalán, Silvia

    2015-01-01

    Electronic health records (EHRs) present an opportunity for quality improvement in health organizations, particularly at the primary care level. However, EHR implementation impacts clinical workflows, and physicians frequently prefer to document in a non-structured way, which ultimately hinders the ability to measure quality indicators. We present an assessment of data completeness, a key data quality indicator, during the first 12 months after the implementation of an EHR at a teaching outpatient center in Santiago, Chile.
