Sample records for workflow navigation system

  1. Needle and catheter navigation using electromagnetic tracking for computer-assisted C-arm CT interventions

    NASA Astrophysics Data System (ADS)

    Nagel, Markus; Hoheisel, Martin; Petzold, Ralf; Kalender, Willi A.; Krause, Ulrich H. W.

    2007-03-01

    Integrated solutions combining navigation systems with CT, MR, or US scanners are becoming increasingly popular in medical products. Such solutions improve the medical workflow and reduce hardware, space, and cost requirements. The purpose of our project was to develop a new electromagnetic navigation system for interventional radiology which is integrated into C-arm CT systems. The application is focused on minimally invasive percutaneous interventions performed under local anaesthesia. Together with a vacuum-based patient immobilization device and newly developed navigation tools (needles, panels), we developed a safe and fully automatic navigation system. The radiologist can directly start with navigated interventions after loading images, without any prior user interaction. The complete system is adapted to the requirements of the radiologist and to the clinical workflow. For evaluation of the navigation system we performed different phantom studies and achieved an average accuracy of better than 2.0 mm.
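
    The accuracy figure reported here is typically the mean Euclidean distance between planned target points and the positions actually reached by the navigated needle across phantom trials. A minimal sketch of that computation (Python with NumPy; the coordinate values are hypothetical, not the study's data):

      # Mean Euclidean error between planned targets and reached positions.
      import numpy as np

      planned = np.array([[10.0, 22.5, 31.0], [48.2, 15.1, 60.3]])  # mm, phantom frame
      reached = np.array([[11.2, 23.1, 30.2], [47.5, 16.0, 61.1]])  # mm, measured

      errors = np.linalg.norm(reached - planned, axis=1)  # per-trial error in mm
      print(f"mean error: {errors.mean():.2f} mm, max: {errors.max():.2f} mm")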

  2. Total Navigation in Spine Surgery: A Concise Guide to Eliminate Fluoroscopy Using a Portable Intraoperative Computed Tomography 3-Dimensional Navigation System.

    PubMed

    Navarro-Ramirez, Rodrigo; Lang, Gernot; Lian, Xiaofeng; Berlin, Connor; Janssen, Insa; Jada, Ajit; Alimi, Marjan; Härtl, Roger

    2017-04-01

    Portable intraoperative computed tomography (iCT) with integrated 3-dimensional navigation (NAV) offers new opportunities for more precise navigation in spinal surgery, eliminates radiation exposure for the surgical team, and accelerates surgical workflows. We present the concept of "total navigation" using iCT NAV in spinal surgery. To this end, we propose a step-by-step guideline demonstrating how total navigation can eliminate fluoroscopy, with time-efficient workflows integrating iCT NAV into daily practice. A prospective study was conducted on collected data from patients undergoing iCT NAV-guided spine surgery. Number of scans, radiation exposure, and workflow of iCT NAV (e.g., instrumentation, cage placement, localization) were documented. Finally, the accuracy of pedicle screws and time for instrumentation were determined. iCT NAV was successfully performed in 117 cases for various indications and in all regions of the spine. More than half (61%) of cases were performed in a minimally invasive manner. Navigation was used for skin incision, localization of the index level, and verification of implant position. iCT NAV was used to evaluate the neural decompression achieved in spinal fusion surgeries. Total navigation eliminated fluoroscopy in 75% of cases, thus eliminating staff radiation exposure entirely in those cases. The average times for iCT NAV setup and pedicle screw insertion were 12.1 and 3.1 minutes, respectively, achieving a pedicle screw accuracy of 99%. Total navigation makes spine surgery safer and more accurate, and it enhances efficient and reproducible workflows. Fluoroscopy and radiation exposure for the surgical staff can be eliminated in the majority of cases. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Navigated MRI-guided liver biopsies in a closed-bore scanner: experience in 52 patients.

    PubMed

    Moche, Michael; Heinig, Susann; Garnov, Nikita; Fuchs, Jochen; Petersen, Tim-Ole; Seider, Daniel; Brandmaier, Philipp; Kahn, Thomas; Busse, Harald

    2016-08-01

    To evaluate the clinical effectiveness and diagnostic efficiency of a navigation device for MR-guided biopsies of focal liver lesions in a closed-bore scanner. In 52 patients, 55 biopsies were performed. An add-on MR navigation system with optical instrument tracking was used for image guidance and biopsy device insertion outside the bore. Fast control imaging allowed visualization of the true needle position at any time. The biopsy workflow and procedure duration were recorded. Histological analysis and clinical course/outcome were used to calculate sensitivity, specificity and diagnostic accuracy. Fifty-four of 55 liver biopsies were performed successfully with the system. No major and four minor complications occurred. Mean tumour size was 23 ± 14 mm and the skin-to-target length ranged from 22 to 177 mm. In 39 cases, the access path was double oblique. Sensitivity, specificity and diagnostic accuracy were 88 %, 100 % and 92 %, respectively. The mean procedure time was 51 ± 12 min, whereas the puncture itself lasted 16 ± 6 min. On average, four control scans were taken. Using this navigation device, biopsies of poorly visible and difficult-to-access liver lesions could be performed safely and reliably in a closed-bore MRI scanner. The system can be easily implemented in the clinical routine workflow. • Targeted liver biopsies could be reliably performed in a closed-bore MRI. • The navigation system allows for image guidance outside of the scanner bore. • Assisted MRI-guided biopsies are helpful for focal lesions with difficult access. • Successful integration of the method in the clinical workflow was shown. • Subsequent system installation in an existing MRI environment is feasible.
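
    The three diagnostic figures follow from the standard 2x2 confusion counts. A quick sketch (the counts below are hypothetical, chosen only to approximately reproduce the reported 88 %/100 %/92 %):

      # Sensitivity, specificity and diagnostic accuracy from confusion counts.
      tp, fn = 29, 4   # malignant lesions correctly / incorrectly classified
      tn, fp = 22, 0   # benign lesions correctly / incorrectly classified

      sensitivity = tp / (tp + fn)                  # ~0.88
      specificity = tn / (tn + fp)                  # 1.00
      accuracy = (tp + tn) / (tp + fn + tn + fp)    # ~0.92
      print(f"Se={sensitivity:.2f}, Sp={specificity:.2f}, Acc={accuracy:.2f}")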

  4. Towards An Understanding of Mobile Touch Navigation in a Stereoscopic Viewing Environment for 3D Data Exploration.

    PubMed

    López, David; Oehlberg, Lora; Doger, Candemir; Isenberg, Tobias

    2016-05-01

    We discuss touch-based navigation of 3D visualizations in a combined monoscopic and stereoscopic viewing environment. We identify a set of interaction modes, and a workflow that helps users transition between these modes to improve their interaction experience. In our discussion we analyze, in particular, the control-display space mapping between the different reference frames of the stereoscopic and monoscopic displays. We show how this mapping supports interactive data exploration, but may also lead to conflicts between the stereoscopic and monoscopic views due to users' movement in space; we resolve these problems through synchronization. To support our discussion, we present results from an exploratory observational evaluation with domain experts in fluid mechanics and structural biology. These experts explored domain-specific datasets using variations of a system that embodies the interaction modes and workflows; we report on their interactions and qualitative feedback on the system and its workflow.

  5. A navigation system for percutaneous needle interventions based on PET/CT images: design, workflow and error analysis of soft tissue and bone punctures.

    PubMed

    Oliveira-Santos, Thiago; Klaeser, Bernd; Weitzel, Thilo; Krause, Thomas; Nolte, Lutz-Peter; Peterhans, Matthias; Weber, Stefan

    2011-01-01

    Percutaneous needle intervention based on PET/CT images is effective, but exposes the patient to unnecessary radiation due to the increased number of CT scans required. Computer assisted intervention can reduce the number of scans, but requires handling, matching and visualization of two different datasets. While one dataset is used for target definition according to metabolism, the other is used for instrument guidance according to anatomical structures. No navigation systems capable of handling such data and performing PET/CT image-based procedures while following clinically approved protocols for oncologic percutaneous interventions are available. The need for such systems is emphasized in scenarios where the target can be located in different types of tissue such as bone and soft tissue. These two tissues require different clinical protocols for puncturing and may therefore give rise to different problems during the navigated intervention. Studies comparing the performance of navigated needle interventions targeting lesions located in these two types of tissue are not often found in the literature. Hence, this paper presents an optical navigation system for percutaneous needle interventions based on PET/CT images. The system provides viewers for guiding the physician to the target with real-time visualization of PET/CT datasets, and is able to handle targets located in both bone and soft tissue. The navigation system and the required clinical workflow were designed taking into consideration clinical protocols and requirements, and the system is thus operable by a single person, even during transition to the sterile phase. Both the system and the workflow were evaluated in an initial set of experiments simulating 41 lesions (23 located in bone tissue and 18 in soft tissue) in swine cadavers. We also measured and decomposed the overall system error into distinct error sources, which allowed for the identification of particularities involved in the process as well as highlighting the differences between bone and soft tissue punctures. An overall average error of 4.23 mm and 3.07 mm for bone and soft tissue punctures, respectively, demonstrated the feasibility of using this system for such interventions. The proposed system workflow was shown to be effective in separating the preparation from the sterile phase, as well as in keeping the system manageable by a single operator. Among the distinct sources of error, the user error based on the system accuracy (defined as the distance from the planned target to the actual needle tip) appeared to be the most significant. Bone punctures showed higher user error, whereas soft tissue punctures showed higher tissue deformation error.
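
    Decomposing an overall error into distinct sources is often done with a root-sum-of-squares budget, assuming the sources are independent; the paper's exact decomposition method is not given here, so the sketch below is a generic illustration with made-up values:

      # Root-sum-of-squares error budget under an independence assumption.
      import math

      sources_mm = {"tracking": 1.1, "registration": 1.5,
                    "tissue_shift": 1.8, "user": 2.9}   # illustrative values
      overall = math.sqrt(sum(e ** 2 for e in sources_mm.values()))
      print(f"predicted overall error: {overall:.2f} mm")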

  6. Implementation of a single sign-on system between practice, research and learning systems.

    PubMed

    Purkayastha, Saptarshi; Gichoya, Judy W; Addepally, Siva Abhishek

    2017-03-29

    Multiple specialized electronic medical systems are utilized in the health enterprise. Each of these systems has its own user management, authentication, and authorization process, which makes for a complex web to navigate and use without a coherent process workflow. Users often have to remember multiple passwords and log in and out between systems, which disrupts their clinical workflow. Challenges exist in managing permissions for various cadres of health care providers. This case report describes our experience of implementing a single sign-on system used between an electronic medical records system and a learning management system at a large academic institution with an informatics department responsible for student education and a medical school affiliated with a hospital system caring for patients and conducting research. At our institution, we use OpenMRS for research registry tracking of interventional radiology patients as well as to provide access to medical records to students studying health informatics. To provide authentication across different users of the system with different permissions, we developed a Central Authentication Service (CAS) module for OpenMRS, released under the Mozilla Public License, and deployed it for single sign-on across the academic enterprise. The module has been in use since August 2015, and we assessed the usability of the registry and education system before and after implementation of the CAS module. 54 students and 3 researchers were interviewed. The module authenticates users with appropriate privileges in the medical records system, providing secure access with minimal disruption to their workflow. No password requests were submitted, and users reported ease of use with a streamlined workflow. The project demonstrates that enterprise-wide single sign-on systems should be used in healthcare to reduce complexity such as "password hell" and to improve usability and user navigation. We plan to extend this work to other systems used in the health care enterprise.
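
    For readers unfamiliar with CAS, single sign-on works by redirecting the user to the CAS server for login and then validating the returned service ticket against the server. A minimal sketch of CAS 2.0 ticket validation (standalone Python; the server and application URLs are hypothetical, and the actual OpenMRS module is implemented differently):

      # Validate a CAS 2.0 service ticket; the response XML contains
      # <cas:user> on success and an error block otherwise.
      import urllib.parse
      import urllib.request

      CAS_BASE = "https://cas.example.edu/cas"         # hypothetical CAS server
      SERVICE = "https://openmrs.example.edu/openmrs"  # hypothetical protected app

      def validate(ticket: str) -> str:
          query = urllib.parse.urlencode({"service": SERVICE, "ticket": ticket})
          with urllib.request.urlopen(f"{CAS_BASE}/serviceValidate?{query}") as resp:
              return resp.read().decode("utf-8")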

  7. Next Generation Global Navigation Satellite Systems (GNSS) Processing at NASA CDDIS

    NASA Astrophysics Data System (ADS)

    Michael, B. P.; Noll, C. E.

    2016-12-01

    The Crustal Dynamics Data Information System (CDDIS) has been providing access to space geodesy and related data sets since 1982, and in particular to Global Navigation Satellite Systems (GNSS) data and derived products since 1992. The CDDIS became one of the Earth Observing System Data and Information System (EOSDIS) archive centers in 2007. As such, CDDIS has evolved to offer a broad range of data ingest services, including data upload, quality control, documentation, metadata extraction, and ancillary information. With a growing understanding of the needs and goals of its science users, CDDIS continues to improve these services. Owing to the importance of GNSS data and derived products in scientific studies over the last decade, CDDIS has seen its ingest volume explode to over 30 million files per year, or more than one file per second, from hundreds of simultaneous data providers. In order to accommodate this increase, streamline operations, and fully automate the workflow, CDDIS has recently updated the data submission process and GNSS processing. This poster will cover this new ingest infrastructure and workflow, and the agile techniques applied in its development and current operations.
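
    The "more than one file per second" figure is a straightforward average of the annual volume; a back-of-envelope check:

      # 30 million files per year averages to roughly one file per second.
      files_per_year = 30_000_000
      seconds_per_year = 365 * 24 * 3600        # 31,536,000 s
      print(files_per_year / seconds_per_year)  # ~0.95 files/s on average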

  8. Augmented Reality Based Navigation for Computer Assisted Hip Resurfacing: A Proof of Concept Study.

    PubMed

    Liu, He; Auvinet, Edouard; Giles, Joshua; Rodriguez Y Baena, Ferdinando

    2018-05-23

    Implantation accuracy has a great impact on the outcomes of hip resurfacing such as recovery of hip function. Computer assisted orthopedic surgery has demonstrated clear advantages for the patients, with improved placement accuracy and fewer outliers, but the intrusiveness, cost, and added complexity have limited its widespread adoption. To provide seamless computer assistance with improved immersion and a more natural surgical workflow, we propose an augmented-reality (AR) based navigation system for hip resurfacing. The operative femur is registered by processing depth information from the surgical site with a commercial depth camera. By coupling depth data with robotic assistance, obstacles that may obstruct the femur can be tracked and avoided automatically to reduce the chance of disruption to the surgical workflow. Using the registration result and the pre-operative plan, intra-operative surgical guidance is provided through a commercial AR headset so that the user can perform the operation without additional physical guides. To assess the accuracy of the navigation system, experiments of guide hole drilling were performed on femur phantoms. The position and orientation of the drilled holes were compared with the pre-operative plan, and the mean errors were found to be approximately 2 mm and 2°, results which are in line with commercial computer assisted orthopedic systems today.
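
    The reported 2 mm / 2° accuracy reduces to a point-to-point distance for the hole entry and an angle between axis direction vectors for orientation. A minimal sketch with illustrative vectors (not the study's data):

      # Position and orientation error of a drilled hole vs. the plan.
      import numpy as np

      planned_entry = np.array([5.0, 2.0, 1.0])   # mm, femur frame
      drilled_entry = np.array([6.4, 2.9, 0.4])
      planned_axis = np.array([0.0, 0.0, 1.0])    # unit direction vectors
      drilled_axis = np.array([0.02, 0.03, 0.999])
      drilled_axis /= np.linalg.norm(drilled_axis)

      pos_err = np.linalg.norm(drilled_entry - planned_entry)  # mm
      ang_err = np.degrees(np.arccos(np.clip(planned_axis @ drilled_axis, -1.0, 1.0)))
      print(f"position error {pos_err:.1f} mm, angular error {ang_err:.1f} deg")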

  9. A knowledge-based decision support system in bioinformatics: an application to protein complex extraction

    PubMed Central

    2013-01-01

    Background We introduce a Knowledge-based Decision Support System (KDSS) to address the problem of protein complex extraction. Using a Knowledge Base (KB) encoding expertise about the proposed scenario, our KDSS is able to suggest both strategies and tools according to the features of the input dataset. Our system provides a navigable workflow for the current experiment and furthermore offers support in the configuration and running of every processing component of that workflow. This last feature makes our system a crossover between classical DSSs and Workflow Management Systems. Results We briefly present the KDSS' architecture and the basic concepts used in the design of the knowledge base and the reasoning component. The system is then tested using a subset of a Saccharomyces cerevisiae protein-protein interaction dataset. We used this subset because it has been well studied in the literature by several research groups in the field of complex extraction: in this way we could easily compare the results obtained through our KDSS with theirs. Our system suggests both a preprocessing and a clustering strategy, and for each of them it proposes, and eventually runs, suitable algorithms. Our system's final results are then composed of a workflow of tasks, which can be reused for other experiments, and the specific numerical results for that particular trial. Conclusions The proposed approach, using the KDSS' knowledge base, provides a novel workflow that gives the best results with respect to the other workflows produced by the system. This workflow and its numerical results have been compared with other approaches to PPI network analysis found in the literature, yielding similar results. PMID:23368995

  10. Tinnitus Patient Navigator

    MedlinePlus

    ... Tinnitus Patient Navigator Want to get started on the ... unique and may require a different treatment workflow. Tinnitus Health-Care Providers If you, or someone you ...

  11. Workspace definition for navigated control functional endoscopic sinus surgery

    NASA Astrophysics Data System (ADS)

    Gessat, Michael; Hofer, Mathias; Audette, Michael; Dietz, Andreas; Meixensberger, Jürgen; Stauß, Gero; Burgert, Oliver

    2007-03-01

    For the pre-operative definition of a surgical workspace for Navigated Control® Functional Endoscopic Sinus Surgery (FESS), we developed a semi-automatic image processing system. Based on observations of surgeons using a manual system, we implemented a workflow-based engineering process that led us to the development of a system reducing the time and workload spent during workspace definition. The system uses a feature based on local curvature to align the vertices of a polygonal outline along the bone structures defining the cavities of the inner nose. An anisotropic morphological operator was developed to solve problems arising from noise artifacts and partial volume effects. We used time measurements and NASA's TLX questionnaire to evaluate our system.

  12. Applying operations research to optimize a novel population management system for cancer screening.

    PubMed

    Zai, Adrian H; Kim, Seokjin; Kamis, Arnold; Hung, Ken; Ronquillo, Jeremiah G; Chueh, Henry C; Atlas, Steven J

    2014-02-01

    To optimize a new visit-independent, population-based cancer screening system (TopCare) by using operations research techniques to simulate changes in patient outreach staffing levels (delegates, navigators), modifications to user workflow within the information technology (IT) system, and changes in cancer screening recommendations. TopCare was modeled as a multiserver, multiphase queueing system. Simulation experiments implemented the queueing network model following a next-event time-advance mechanism, in which systematic adjustments were made to staffing levels, IT workflow settings, and cancer screening frequency in order to assess their impact on overdue screenings per patient. TopCare reduced the average number of overdue screenings per patient from 1.17 at inception to 0.86 during simulation to 0.23 at steady state. Increases in the workforce improved the effectiveness of TopCare. In particular, increasing the delegate or navigator staff level by one person improved screening completion rates by 1.3% or 12.2%, respectively. In contrast, changes in the amount of time a patient entry stays on delegate and navigator lists had little impact on overdue screenings. Finally, lengthening the screening interval increased efficiency within TopCare by decreasing overdue screenings at the patient level, resulting in a smaller number of overdue patients needing delegates for screening and a higher fraction of screenings completed by delegates. Simulating the impact of changes in staffing, system parameters, and clinical inputs on the effectiveness and efficiency of care can inform the allocation of limited resources in population management.
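
    A next-event time-advance simulation keeps a simulated clock and a priority queue of pending events, always jumping to the earliest one. The sketch below is a toy single-server version of the idea (the study's model is multiserver and multiphase, with TopCare-specific routing):

      # Minimal next-event time-advance simulation of a single-server queue.
      import heapq
      import random

      random.seed(1)
      t, busy, waiting, served = 0.0, False, 0, 0
      events = [(random.expovariate(1.0), "arrival")]  # (time, kind) heap

      while t < 1000.0:
          t, kind = heapq.heappop(events)
          if kind == "arrival":
              heapq.heappush(events, (t + random.expovariate(1.0), "arrival"))
              if busy:
                  waiting += 1
              else:
                  busy = True
                  heapq.heappush(events, (t + random.expovariate(1.2), "departure"))
          else:  # departure
              served += 1
              if waiting:
                  waiting -= 1
                  heapq.heappush(events, (t + random.expovariate(1.2), "departure"))
              else:
                  busy = False
      print(f"completed {served} services by t={t:.0f}")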

  13. The MPO system for automatic workflow documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abla, G.; Coviello, E. N.; Flanagan, S. M.

    Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. Here, this article presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and an Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  14. The MPO system for automatic workflow documentation

    DOE PAGES

    Abla, G.; Coviello, E. N.; Flanagan, S. M.; ...

    2016-04-18

    Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. Here, this article presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and an Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.
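
    At its core, the provenance record described here is a directed acyclic graph whose nodes are data objects and processing steps. A minimal sketch of recording and querying such a DAG (plain Python; the node names are illustrative, not MPO's actual API):

      # Record workflow steps as a DAG and walk the ancestry of a result.
      edges = {}  # node -> list of parent nodes

      def record(node, parents=()):
          edges[node] = list(parents)

      record("raw_shot_1234")
      record("filter_step", parents=["raw_shot_1234"])
      record("fit_step", parents=["filter_step"])
      record("result_plot", parents=["fit_step"])

      def ancestors(node):
          for parent in edges.get(node, []):
              yield parent
              yield from ancestors(parent)

      print(list(ancestors("result_plot")))  # provenance of the final plot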

  15. Applying operations research to optimize a novel population management system for cancer screening

    PubMed Central

    Zai, Adrian H; Kim, Seokjin; Kamis, Arnold; Hung, Ken; Ronquillo, Jeremiah G; Chueh, Henry C; Atlas, Steven J

    2014-01-01

    Objective To optimize a new visit-independent, population-based cancer screening system (TopCare) by using operations research techniques to simulate changes in patient outreach staffing levels (delegates, navigators), modifications to user workflow within the information technology (IT) system, and changes in cancer screening recommendations. Materials and methods TopCare was modeled as a multiserver, multiphase queueing system. Simulation experiments implemented the queueing network model following a next-event time-advance mechanism, in which systematic adjustments were made to staffing levels, IT workflow settings, and cancer screening frequency in order to assess their impact on overdue screenings per patient. Results TopCare reduced the average number of overdue screenings per patient from 1.17 at inception to 0.86 during simulation to 0.23 at steady state. Increases in the workforce improved the effectiveness of TopCare. In particular, increasing the delegate or navigator staff level by one person improved screening completion rates by 1.3% or 12.2%, respectively. In contrast, changes in the amount of time a patient entry stays on delegate and navigator lists had little impact on overdue screenings. Finally, lengthening the screening interval increased efficiency within TopCare by decreasing overdue screenings at the patient level, resulting in a smaller number of overdue patients needing delegates for screening and a higher fraction of screenings completed by delegates. Conclusions Simulating the impact of changes in staffing, system parameters, and clinical inputs on the effectiveness and efficiency of care can inform the allocation of limited resources in population management. PMID:24043318

  16. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements, without considering the interaction that these elements have with other devices or the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of software-based medical devices. Medical devices are developed using the model-driven method Deterministic Models for Signal Processing of Embedded Systems (DMOSES). This method uses Unified Modeling Language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model checker. For this purpose, a formalization approach for the UML models using timed automata (TA) is presented. A set of requirements is verified by the proposed approach for navigation-guided biopsy. This shows the capability of identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition, an open-source Eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.

  17. The Evolution of Computer-Assisted Total Hip Arthroplasty and Relevant Applications.

    PubMed

    Chang, Jun-Dong; Kim, In-Sung; Bhardwaj, Atul M; Badami, Ramachandra N

    2017-03-01

    In total hip arthroplasty (THA), the accurate positioning of implants is the key to achieving a good clinical outcome. Computer-assisted orthopaedic surgery (CAOS) has been developed for more accurate positioning of implants during THA. There are passive, semi-active, and active systems in CAOS for THA. Navigation is a passive system that only provides information and guidance to the surgeon. There are 3 types of navigation: imageless navigation, computed tomography (CT)-based navigation, and fluoroscopy-based navigation. In imageless navigation systems, a new method of registration without the need to register the anterior pelvic plane was introduced. CT-based navigation can be used efficiently with a pelvic plane reference, the functional pelvic plane in the supine position, which adjusts the sagittal tilt of the anterior pelvic plane for targeting the cup orientation. Robot-assisted systems can be either active or semi-active. The active robotic system performs the preparation for implant positioning as programmed preoperatively. It has been used only for femoral implant cavity preparation; recently, a program for cup positioning was additionally developed. Alternatively, for ease of surgeon acceptance, semi-active robot systems have been developed. These were initially applied only to cup positioning; however, with the development of enhanced femoral workflows, such systems can now be used to position both cup and stem. Though there have been substantial advancements in computer-assisted THA, its use can still be controversial at present due to the steep learning curve, intraoperative technical issues, high cost, etc. However, in the future, CAOS will certainly enable the surgeon to operate more accurately and lead to improved outcomes in THA as the technology continues to evolve rapidly.

  18. Less is sometimes more: a comparison of distance-control and navigated-control concepts of image-guided navigation support for surgeons.

    PubMed

    Luz, Maria; Manzey, Dietrich; Modemann, Susanne; Strauss, Gero

    2015-01-01

    Image-guided navigation (IGN) systems provide automation support of intra-operative information analysis and decision-making for surgeons. Previous research showed that navigated-control (NC) systems which represent high levels of decision-support and directly intervene in surgeons' workflow provide benefits with respect to patient safety and surgeons' physiological stress but also involve several cost effects (e.g. prolonged surgery duration, reduced secondary-task performance). It was hypothesised that less automated distance-control (DC) systems would provide a better solution in terms of human performance consequences. N = 18 surgeons performed a simulated mastoidectomy with NC, DC and without IGN assistance. Effects on surgical performance, physiological effort, workload and situation awareness (SA) were compared. As expected, DC technology had the same benefits as the NC system but also led to less unwanted side effects on surgery duration, subjective workload and SA. This suggests that IGN systems just providing information analysis support are overall more beneficial than higher automated decision-support. This study investigates human performance consequences of different concepts of IGN support for surgeons. Less automated DC systems turned out to provide advantages for patient safety and surgeons' stress similar to higher automated NC systems with, at the same time, reduced negative consequences on surgery time and subjective workload.

  19. Navigation concepts for MR image-guided interventions.

    PubMed

    Moche, Michael; Trampel, Robert; Kahn, Thomas; Busse, Harald

    2008-02-01

    The ongoing development of powerful magnetic resonance imaging techniques also allows for advanced possibilities to guide and control minimally invasive interventions. Various navigation concepts have been described for practically all regions of the body. The specific advantages and limitations of these concepts largely depend on the magnet design of the MR scanner and the interventional environment. Open MR scanners involve minimal patient transfer, which improves the interventional workflow and reduces the need for coregistration, i.e., the mapping of spatial coordinates between imaging and intervention position. Most diagnostic scanners, in contrast, do not allow the physician to guide the instrument inside the magnet and, consequently, the patient needs to be moved out of the bore. Although adequate coregistration and navigation concepts for closed-bore scanners are technically more challenging, many developments are driven by the well-known capabilities of high-field systems and their better economic value. Advanced concepts such as multimodal overlays, augmented reality displays, and robotic assistance devices are still in their infancy but might propel the use of intraoperative navigation. The goal of this work is to give an update on MRI-based navigation and related techniques and to briefly discuss the clinical experience and limitations of selected systems. Copyright © 2008 Wiley-Liss, Inc.

  20. A hybrid image fusion system for endovascular interventions of peripheral artery disease.

    PubMed

    Lalys, Florent; Favre, Ketty; Villena, Alexandre; Durrmann, Vincent; Colleaux, Mathieu; Lucas, Antoine; Kaladji, Adrien

    2018-07-01

    Interventional endovascular treatment has become the first line of management in the treatment of peripheral artery disease (PAD). However, contrast and radiation exposure continue to limit the feasibility of these procedures. This paper presents a novel hybrid image fusion system for endovascular intervention of PAD. We present two different roadmapping methods from intra- and pre-interventional imaging that can be used either simultaneously or independently, constituting the navigation system. The navigation system is decomposed into several steps that can be entirely integrated within the procedure workflow, without modifying it, to benefit from the roadmapping. First, a 2D panorama of the entire peripheral artery system is automatically created based on a sequence of stepping fluoroscopic images acquired during the intra-interventional diagnosis phase. During the interventional phase, the live image can be synchronized on the panorama to form the basis of the image fusion system. Two types of augmented information are then integrated. First, an angiography panorama is proposed to avoid contrast media re-injection. Information exploiting the pre-interventional computed tomography angiography (CTA) is also brought to the surgeon by means of semiautomatic 3D/2D registration on the 2D panorama. Each step of the workflow was independently validated. Experiments on both the 2D panorama creation and the synchronization processes showed very accurate results (errors of 1.24 and [Formula: see text] mm, respectively), as did the registration on the 3D CTA (errors of [Formula: see text] mm), with minimal user interaction and very low computation time. First results of an ongoing clinical study highlighted its major clinical added value in terms of intraoperative parameters. No image fusion system has yet been proposed for endovascular procedures of PAD in the lower extremities. More globally, such a navigation system, combining image fusion from different 2D and 3D image sources, is novel in the field of endovascular procedures.
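
    Both the panorama creation and the live-image synchronization amount to estimating a 2D transform between overlapping X-ray frames. A generic feature-based sketch with OpenCV (this is a common approach, not necessarily the authors' registration method):

      # Estimate a homography mapping a moving frame into a fixed/panorama frame.
      import cv2
      import numpy as np

      def register(fixed_gray, moving_gray):
          orb = cv2.ORB_create(2000)
          k1, d1 = orb.detectAndCompute(fixed_gray, None)
          k2, d2 = orb.detectAndCompute(moving_gray, None)
          matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
          src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
          dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
          H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
          return H  # maps moving-image pixels into the fixed frame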

  1. The Evolution of Computer-Assisted Total Hip Arthroplasty and Relevant Applications

    PubMed Central

    Chang, Jun-Dong; Kim, In-Sung; Bhardwaj, Atul M.; Badami, Ramachandra N.

    2017-01-01

    In total hip arthroplasty (THA), the accurate positioning of implants is the key to achieving a good clinical outcome. Computer-assisted orthopaedic surgery (CAOS) has been developed for more accurate positioning of implants during THA. There are passive, semi-active, and active systems in CAOS for THA. Navigation is a passive system that only provides information and guidance to the surgeon. There are 3 types of navigation: imageless navigation, computed tomography (CT)-based navigation, and fluoroscopy-based navigation. In imageless navigation systems, a new method of registration without the need to register the anterior pelvic plane was introduced. CT-based navigation can be used efficiently with a pelvic plane reference, the functional pelvic plane in the supine position, which adjusts the sagittal tilt of the anterior pelvic plane for targeting the cup orientation. Robot-assisted systems can be either active or semi-active. The active robotic system performs the preparation for implant positioning as programmed preoperatively. It has been used only for femoral implant cavity preparation; recently, a program for cup positioning was additionally developed. Alternatively, for ease of surgeon acceptance, semi-active robot systems have been developed. These were initially applied only to cup positioning; however, with the development of enhanced femoral workflows, such systems can now be used to position both cup and stem. Though there have been substantial advancements in computer-assisted THA, its use can still be controversial at present due to the steep learning curve, intraoperative technical issues, high cost, etc. However, in the future, CAOS will certainly enable the surgeon to operate more accurately and lead to improved outcomes in THA as the technology continues to evolve rapidly. PMID:28316957

  2. A Schema for Extraction of Indoor Pedestrian Navigation Grid Network from Floor Plans

    NASA Astrophysics Data System (ADS)

    Niu, Lei; Song, Yiquan

    2016-06-01

    The requirements of indoor navigation tasks such as emergency evacuation call for efficient solutions for handling data sources. Therefore, the extraction of navigation grids from existing floor plans draws attention. To this end, we have to thoroughly analyse the source data, such as AutoCAD DXF files. Then, we can establish a sound navigation solution, which first complements the basic navigation rectangle boundaries, secondly subdivides these rectangles, and finally generates accessible networks from these refined rectangles. Test files are introduced to validate the whole workflow and to evaluate the solution's performance. In conclusion, we have achieved the preliminary step of forming an accessible network from the navigation grids.

  3. Experiments on robot-assisted navigated drilling and milling of bones for pedicle screw placement.

    PubMed

    Ortmaier, T; Weiss, H; Döbele, S; Schreiber, U

    2006-12-01

    This article presents experimental results for robot-assisted navigated drilling and milling for pedicle screw placement. The preliminary study was carried out in order to gain first insights into positioning accuracies and machining forces during hands-on robotic spine surgery. Additionally, the results formed the basis for the development of a new robot for surgery. A simplified anatomical model is used to derive the accuracy requirements. The experimental set-up consists of a navigation system and an impedance-controlled lightweight robot holding the surgical instrument. The navigation system is used to position the surgical instrument and to compensate for pose errors during machining. Holes are drilled in artificial bone and bovine spine. A quantitative comparison of the drill-hole diameters was carried out by computer. The interaction forces and pose errors are discussed with respect to the chosen machining technology and control parameters. Within the technological boundaries of the experimental set-up, it is shown that the accuracy requirements can be met and that milling is superior to drilling. It is expected that robot-assisted navigated surgery will help to improve the reliability of surgical procedures. Further experiments are necessary to take the whole workflow into account. Copyright 2006 John Wiley & Sons, Ltd.

  4. Virtual Preoperative Planning and Intraoperative Navigation in Facial Prosthetic Reconstruction: A Technical Note.

    PubMed

    Verma, Suzanne; Gonzalez, Marianela; Schow, Sterling R; Triplett, R Gilbert

    This technical protocol outlines the use of computer-assisted image-guided technology for the preoperative planning and intraoperative procedures involved in implant-retained facial prosthetic treatment. A contributing factor for a successful prosthetic restoration is accurate preoperative planning to identify prosthetically driven implant locations that maximize bone contact and enhance cosmetic outcomes. Navigational systems virtually transfer precise digital planning into the operative field for placing implants to support prosthetic restorations. In this protocol, there is no need to construct a physical, and sometimes inaccurate, surgical guide. The report addresses treatment workflow, radiologic data specifications, and special considerations in data acquisition, virtual preoperative planning, and intraoperative navigation for the prosthetic reconstruction of unilateral, bilateral, and midface defects. Utilization of this protocol for the planning and surgical placement of craniofacial bone-anchored implants allows positioning of implants to be prosthetically driven, accurate, precise, and efficient, and leads to a more predictable treatment outcome.

  5. How I do it-optimizing radiofrequency ablation in spinal metastases using iCT and navigation.

    PubMed

    Kavakebi, Pujan; Freyschlag, C F; Thomé, C

    2017-10-01

    Exact positioning of the radiofrequency ablation (RFA) probe for tumor treatment under fluoroscopic guidance can be difficult because of potentially small, poorly accessible lesions; in addition, the radiation dose to the medical staff during RFA and vertebroplasty (VP) can be significantly high. We describe the technique and workflow of RFA in spinal metastases using iCT (intraoperative computed tomography) and 3D-navigation-based probe placement followed by VP. RFA and VP can be successfully combined with iCT-based navigation, which leads to a reduction of radiation exposure for the staff and, thanks to 3D navigation, optimal probe positioning.

  6. Physician activity during outpatient visits and subjective workload.

    PubMed

    Calvitti, Alan; Hochheiser, Harry; Ashfaq, Shazia; Bell, Kristin; Chen, Yunan; El Kareh, Robert; Gabuzda, Mark T; Liu, Lin; Mortensen, Sara; Pandey, Braj; Rick, Steven; Street, Richard L; Weibel, Nadir; Weir, Charlene; Agha, Zia

    2017-05-01

    We describe methods for capturing and analyzing EHR use and the clinical workflow of physicians during outpatient encounters, and for relating activity to physicians' self-reported workload. We collected temporally resolved activity data including audio, video, EHR activity, and eye gaze, along with post-visit assessments of workload. These data are then analyzed through a combination of manual content analysis and computational techniques to temporally align the streams, providing a range of process measures of EHR usage, clinical workflow, and physician-patient communication. Data was collected from primary care and specialty clinics at the Veterans Administration San Diego Healthcare System and UCSD Health, which use the Electronic Health Record (EHR) platforms CPRS and Epic, respectively. Grouping visit activity by physician, site, specialty, and patient status enables rank-ordering activity factors by their correlation with physicians' subjective workload as captured by the NASA Task Load Index survey. We developed a coding scheme that enabled us to compare timing studies between CPRS and Epic and to extract patient and visit complexity profiles. We identified similar patterns of EHR use and navigation at the 2 sites despite differences in functions, user interfaces, and consequent coded representations. Both sites displayed similar proportions of EHR function use and navigation, and similar distributions of visit length, proportion of time physicians attended to EHRs (gaze), and subjective workload as measured by the task load survey. We found that visit activity was highly variable across individual physicians, and the observed activity metrics ranged widely as correlates of subjective workload. We discuss the implications of our study for methodology, clinical workflow, and EHR redesign. Copyright © 2017 Elsevier Inc. All rights reserved.
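
    Rank-ordering activity factors by their correlation with subjective workload is a simple dataframe operation. A sketch with pandas (the column names and numbers are illustrative, not the study's variables):

      # Rank activity metrics by Spearman correlation with NASA-TLX workload.
      import pandas as pd

      visits = pd.DataFrame({
          "tlx":         [55, 40, 72, 63, 30],
          "ehr_gaze_s":  [820, 400, 1100, 900, 310],
          "visit_len_s": [1500, 900, 1800, 1600, 700],
          "nav_events":  [60, 25, 95, 70, 20],
      })
      corr = visits.corr(method="spearman")["tlx"].drop("tlx")
      print(corr.sort_values(ascending=False))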

  7. From Panoramic Photos to a Low-Cost Photogrammetric Workflow for Cultural Heritage 3d Documentation

    NASA Astrophysics Data System (ADS)

    D'Annibale, E.; Tassetti, A. N.; Malinverni, E. S.

    2013-07-01

    The research aims to optimize a workflow for architecture documentation: starting from panoramic photos, it tackles available instruments and technologies to propose an integrated, quick and low-cost solution for Virtual Architecture. The broader research background shows how to use spherical panoramic images for architectural metric survey. The input data (oriented panoramic photos), the level of reliability, and Image-based Modeling methods constitute an integrated and flexible 3D reconstruction approach: from the professional survey of cultural heritage to its communication in a virtual museum. The proposed work results from the integration and implementation of different techniques (Multi-Image Spherical Photogrammetry, Structure from Motion, Image-based Modeling) with the aim of achieving high metric accuracy and photorealistic performance. Different documentation options are possible within the proposed workflow: from the virtual navigation of spherical panoramas to complex solutions for simulation and virtual reconstruction. VR tools allow the integration of different technologies and the development of new solutions for virtual navigation. Image-based Modeling techniques allow 3D model reconstruction with photorealistic, high-resolution texture. The high resolution of the panoramic photos and the algorithms for panorama orientation and photogrammetric restitution ensure high accuracy and high-resolution texture. Automated techniques and their subsequent integration are the subject of this research. The data, suitably processed and integrated, provide different levels of analysis and virtual reconstruction, joining photogrammetric accuracy to the photorealistic performance of the shaped surfaces. Lastly, a new virtual navigation solution is tested: within the same environment, it offers the chance to interact with the high-resolution oriented spherical panoramas and the 3D reconstructed model at once.

  8. MACS-Mar: a real-time remote sensing system for maritime security applications

    NASA Astrophysics Data System (ADS)

    Brauchle, Jörg; Bayer, Steven; Hein, Daniel; Berger, Ralf; Pless, Sebastian

    2018-04-01

    The modular aerial camera system (MACS) is a development platform for optical remote sensing concepts, algorithms and special environments. For real-time services for maritime security (EMSec joint project), a new multi-sensor configuration, MACS-Mar, was realized. It consists of four co-aligned sensor heads in the visible RGB, near-infrared (NIR, 700-950 nm), hyperspectral (HS, 450-900 nm) and thermal infrared (TIR, 7.5-14 µm) spectral ranges, a mid-cost navigation system, a processing unit and two data links. On-board image projection, cropping of redundant data and compression enable the instant generation of direct-georeferenced high-resolution image mosaics, automatic object detection, and vectorization and annotation of floating objects on the water surface. The results were transmitted over a distance of up to 50 km in real time via narrow and broadband data links and were visualized in a maritime situation awareness system. For the automatic onboard detection of floating objects, a segmentation and classification workflow based on RGB, IR and TIR information was developed and tested. In the experiment, the completeness of the object detection was 95% and the correctness 53%. Mostly, the bright backwash of ships led to an overestimation of the number of objects; further refinement using water homogeneity in the TIR, as implemented in the workflow, could not be carried out due to problems with the TIR sensor, otherwise distinctly better results could have been expected. The absolute positional accuracy of the projected real-time imagery was 2 m without postprocessing of images or navigation data; the relative measurement accuracy of distances is in the range of the image resolution, which is about 12 cm for RGB imagery in the EMSec experiment.
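
    Completeness and correctness are the standard recall and precision of detection: TP/(TP+FN) and TP/(TP+FP). The counts below are hypothetical, chosen to mirror the reported 95% and 53%:

      # Detection completeness (recall) and correctness (precision).
      tp, fn, fp = 19, 1, 17   # hypothetical matched/missed/false detections

      completeness = tp / (tp + fn)   # 0.95
      correctness = tp / (tp + fp)    # ~0.53
      print(f"completeness={completeness:.0%}, correctness={correctness:.0%}")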

  9. Intra-operative fiducial-based CT/fluoroscope image registration framework for image-guided robot-assisted joint fracture surgery.

    PubMed

    Dagnino, Giulio; Georgilas, Ioannis; Morad, Samir; Gibbons, Peter; Tarassoli, Payam; Atkins, Roger; Dogramadzi, Sanja

    2017-08-01

    Joint fractures must be accurately reduced while minimising soft tissue damage to avoid negative surgical outcomes. To this end, we have developed the RAFS surgical system, which allows the percutaneous reduction of intra-articular fractures and provides intra-operative real-time 3D image guidance to the surgeon. Earlier experiments showed the effectiveness of the RAFS system on phantoms, but also key issues which precluded its use in a clinical application. This work proposes a redesign of the RAFS's navigation system that overcomes the earlier version's issues, aiming to move the RAFS system into a surgical environment. The navigation system is improved through an image registration framework allowing the intra-operative registration between pre-operative CT images and intra-operative fluoroscopic images of a fractured bone using a custom-made fiducial marker. The objective of the registration is to estimate the relative pose between a bone fragment and an orthopaedic manipulation pin inserted into it intra-operatively. The actual pose of the bone fragment can be updated in real time using an optical tracker, enabling the image guidance. Experiments on phantoms and cadavers demonstrated the accuracy and reliability of the registration framework, showing a reduction accuracy (sTRE) of about [Formula: see text] (phantom) and [Formula: see text] (cadavers). Four distal femur fractures were successfully reduced in cadaveric specimens using the improved navigation system and the RAFS system following the new clinical workflow (reduction error [Formula: see text], [Formula: see text]). The experiments showed the feasibility of the image registration framework. It was successfully integrated into the navigation system, allowing the use of the RAFS system in a realistic surgical application.

  10. Editorial: Challenges for the usability of AR and VR for clinical neurosurgical procedures.

    PubMed

    de Ribaupierre, Sandrine; Eagleson, Roy

    2017-10-01

    There are a number of challenges that must be faced when trying to develop AR- and VR-based neurosurgical simulators, surgical navigation platforms, and "smart OR" systems. Simulating an operating room environment and surgical tasks in augmented and virtual reality is a challenge many are attempting to solve, in order to train surgeons or help them operate. What are the needs of the surgeon, and what are the challenges encountered (human-computer interface, perception, workflow, etc.)? We discuss these tradeoffs and conclude with critical remarks.

  11. National Positioning, Navigation, and Timing Architecture Study [Executive Summary

    DOT National Transportation Integrated Search

    2008-09-01

    The primary objective of this study is to develop a comprehensive workflow to guide the Oklahoma DOT in applying LIDAR technology to landslide monitoring and risk assessment on Oklahoma highways. In addition to the primary objective, this study also ai...

  12. SU-F-P-42: “To Navigate, Or Not to Navigate: HDR BT in Recurrent Spine Lesions”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voros, L; Cohen, G; Zaider, M

    Purpose: We compare the accuracy of HDR catheter placement for paraspinal lesions using O-arm CBCT imaging combined with StealthStation navigation and traditional fluoroscopically guided catheter placement. Methods: CT and MRI scans were acquired pre-treatment to outline the lesions and design treatment plans (pre-plans) to meet dosimetric constraints. The pre-planned catheter trajectories were transferred into the StealthStation navigation system prior to the surgery. The StealthStation is an infrared (IR) optical navigation system used for the guidance of surgical instruments. An intraoperative CBCT scan (O-arm) was acquired with reference IR optical fiducials anchored onto the patient and registered with the pre-plan image study to guide surgical instruments in relation to the patient's anatomy and to place the brachytherapy catheters along the pre-planned trajectories. The final treatment plan was generated based on a 2nd intraoperative CBCT scan reflecting the achieved implant geometry. The 2nd CBCT was later registered with the initial CT scan to compare the pre-planned dwell positions with the actual dwell positions (catheter placements). A similar workflow was used in the placement of 8 catheters (1 patient) without navigation, but under fluoroscopy guidance in an interventional radiology suite. Results: A total of 18 catheters (3 patients) were placed using navigation-assisted surgery. An average displacement of 0.66 cm (STD = 0.37 cm) was observed between the pre-planned source positions and the actual source positions in 3-dimensional space. This translates into an average 0.38 cm positioning error in one direction, including registration errors, digitization errors, and the surgeon's ability to follow the planned trajectory. In comparison, the average displacement of non-navigated catheters was 0.50 cm (STD = 0.22 cm). Conclusion: Spinal lesion HDR brachytherapy planning is a difficult task. Catheter placement has a direct impact on target coverage and dose to critical structures. While limited to a handful of patients, our experience shows that navigation and fluoroscopy guided placement yield similar results.

  13. Usability Testing of a National Substance Use Screening Tool Embedded in Electronic Health Records.

    PubMed

    Press, Anne; DeStio, Catherine; McCullagh, Lauren; Kapoor, Sandeep; Morley, Jeanne; Conigliaro, Joseph

    2016-07-08

    Screening, brief intervention, and referral to treatment (SBIRT) is currently being implemented into health systems nationally via paper and electronic methods. The purpose of this study was to evaluate the integration of an electronic SBIRT tool into an existing paper-based SBIRT clinical workflow in a patient-centered medical home. Usability testing was conducted in an academic ambulatory clinic. Two rounds of usability testing were done with medical office assistants (MOAs) using a paper and an electronic version of the SBIRT tool, with two and four participants, respectively. Qualitative and quantitative data were analyzed to determine the impact of both tools on clinical workflow. A second round of usability testing was done with the revised electronic version and compared with the first version. A personal workflow barrier cited in the first round of testing was that the electronic health record (EHR) tool was disruptive to the patient's visit. In Round 2 of testing, MOAs reported favoring the electronic version due to the improved layout and the inclusion of an alert system embedded in the EHR. For example, using the System Usability Scale (SUS), MOAs reported a grade "1" for the statement "I would like to use this system frequently" during the first round of testing but a "5" during the second round of analysis. The findings of this study highlight the importance of testing the usability of the various mediums of tools used in health care screening. In the first round of testing, the electronic tool was reported as less user friendly, difficult to navigate, and time consuming. Many issues faced in the first generation of the tool were improved in the second generation after usability was evaluated. This study demonstrates how usability testing of an electronic SBIRT tool can help to identify challenges that can impact clinical workflow. However, a limitation of this study was the small sample size of MOAs that participated. The results may have been biased toward Northwell Health workers' perceptions of the SBIRT tool and their specific clinical workflow.
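
    The SUS grades quoted above are single item ratings; the overall SUS score maps each of the ten 1-5 responses to a 0-4 contribution (odd items: response - 1; even items: 5 - response) and multiplies the sum by 2.5. A sketch of the standard scoring (the response vector is hypothetical):

      # Standard System Usability Scale scoring: ten 1-5 responses -> 0-100.
      def sus_score(responses):
          assert len(responses) == 10
          total = sum((r - 1) if i % 2 == 0 else (5 - r)  # items 1,3,5,... odd
                      for i, r in enumerate(responses))
          return total * 2.5

      print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # -> 90.0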

  14. Techniques for Interventional MRI Guidance in Closed-Bore Systems.

    PubMed

    Busse, Harald; Kahn, Thomas; Moche, Michael

    2018-02-01

    Efficient image guidance is the basis for minimally invasive interventions. In comparison with X-ray, computed tomography (CT), or ultrasound imaging, magnetic resonance imaging (MRI) provides the best soft tissue contrast without ionizing radiation and is therefore predestined for procedural control. But MRI is also characterized by spatial constraints, electromagnetic interactions, long imaging times, and resulting workflow issues. Although many technical requirements have been met over the years (most notably magnetic resonance (MR) compatibility of tools, interventional pulse sequences, and powerful processing hardware and software), there is still a large variety of stand-alone devices and systems for specific procedures only. Stereotactic guidance with the table outside the magnet is common and relies on proper registration of the guiding grids or manipulators to the MR images. Instrument tracking, often by optical sensing, can be added to provide the physicians with proper eye-hand coordination during their navigated approach. Only in very short wide-bore systems can needles be advanced at the extended arm under near real-time imaging. In standard magnets, control and workflow may be improved by remote operation using robotic or manual driving elements. This work highlights a number of devices and techniques for different interventional settings with a focus on percutaneous, interstitial procedures in different organ regions. The goal is to identify technical and procedural elements that might be relevant for interventional guidance in a broader context, independent of the clinical application given here. Key challenges remain the seamless integration into the interventional workflow, safe clinical translation, and proper cost effectiveness.

  15. Navigation and Image Injection for Control of Bone Removal and Osteotomy Planes in Spine Surgery.

    PubMed

    Kosterhon, Michael; Gutenberg, Angelika; Kantelhardt, Sven Rainer; Archavlis, Elefterios; Giese, Alf

    2017-04-01

    In contrast to cranial interventions, neuronavigation in spinal surgery is used in few applications, not tapping into its full technological potential. We have developed a method to preoperatively create virtual resection planes and volumes for spinal osteotomies and to export 3-D operation plans to a navigation system controlling intraoperative visualization via a surgical microscope's head-up display. The method was developed using a Sawbone® model of the lumbar spine, demonstrating feasibility with high precision. Computed tomography and magnetic resonance image data were imported into Amira®, a 3-D visualization software package. Resection planes were positioned, and resection volumes representing intraoperative bone removal were defined. Fused to the original Digital Imaging and Communications in Medicine (DICOM) data, the osteotomy planes were exported to the cranial version of a Brainlab® navigation system. A navigated surgical microscope with a video connection to the navigation system allowed intraoperative image injection to visualize the preplanned resection planes. The workflow was applied to a patient presenting with a congenital hemivertebra of the thoracolumbar spine. Dorsal instrumentation with pedicle screws and rods was followed by resection of the deformed vertebra, guided by the in-view image injection of the preplanned resection planes into the optical path of the surgical microscope. Postoperatively, the patient showed no neurological deficits, and the spine was found to be restored to near physiological posture. The intraoperative visualization of resection planes in a microscope's head-up display was found to assist the surgeon during the resection of a complex-shaped bone wedge and may help to further increase accuracy and patient safety. Copyright © 2017 by the Congress of Neurological Surgeons

  16. Binocular Goggle Augmented Imaging and Navigation System provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping

    NASA Astrophysics Data System (ADS)

    B. Mondal, Suman; Gao, Shengkui; Zhu, Nan; Sudlow, Gail P.; Liang, Kexian; Som, Avik; Akers, Walter J.; Fields, Ryan C.; Margenthaler, Julie; Liang, Rongguang; Gruev, Viktor; Achilefu, Samuel

    2015-07-01

    The inability to identify microscopic tumors and assess surgical margins in real-time during oncologic surgery leads to incomplete tumor removal, increases the chances of tumor recurrence, and necessitates costly repeat surgery. To overcome these challenges, we have developed a wearable goggle augmented imaging and navigation system (GAINS) that can provide accurate intraoperative visualization of tumors and sentinel lymph nodes in real-time without disrupting normal surgical workflow. GAINS projects both near-infrared fluorescence from tumors and the natural color images of tissue onto a head-mounted display without latency. Aided by tumor-targeted contrast agents, the system detected tumors in subcutaneous and metastatic mouse models with high accuracy (sensitivity = 100%, specificity = 98% ± 5% standard deviation). Human pilot studies in breast cancer and melanoma patients using a near-infrared dye show that the GAINS detected sentinel lymph nodes with 100% sensitivity. Clinical use of the GAINS to guide tumor resection and sentinel lymph node mapping promises to improve surgical outcomes, reduce rates of repeat surgery, and improve the accuracy of cancer staging.

  17. Navigation concepts for magnetic resonance imaging-guided musculoskeletal interventions.

    PubMed

    Busse, Harald; Kahn, Thomas; Moche, Michael

    2011-08-01

    Image-guided musculoskeletal (MSK) interventions are a widely used alternative to open surgical procedures for various pathological findings in different body regions. They traditionally involve one of the established x-ray imaging techniques (radiography, fluoroscopy, computed tomography) or ultrasound scanning. Over the last decades, magnetic resonance imaging (MRI) has evolved into one of the most powerful diagnostic tools for nearly the whole body and has therefore been increasingly considered for interventional guidance as well. The strength of MRI for MSK applications is a combination of well-known general advantages, such as multiplanar and functional imaging capabilities, wide choice of tissue contrasts, and absence of ionizing radiation, as well as a number of MSK-specific factors, for example, the excellent depiction of soft-tissue tumors, nonosteolytic bone changes, and bone marrow lesions. On the downside, the magnetic resonance-compatible equipment needed, restricted space in the magnet, longer imaging times, and the more complex workflow have so far limited the number of MSK procedures under MRI guidance. Navigation solutions are generally a natural extension of any interventional imaging system, in particular, because powerful hardware and software for image processing have become routinely available. They help to identify proper access paths, provide accurate feedback on the instrument positions, facilitate the workflow in an MRI environment, and ultimately contribute to procedural safety and success. The purposes of this work were to describe some basic concepts and devices for MRI guidance of MSK procedures and to discuss technical and clinical achievements and challenges for some selected implementations.

  18. A networked modular hardware and software system for MRI-guided robotic prostate interventions

    NASA Astrophysics Data System (ADS)

    Su, Hao; Shang, Weijian; Harrington, Kevin; Camilo, Alex; Cole, Gregory; Tokuda, Junichi; Hata, Nobuhiko; Tempany, Clare; Fischer, Gregory S.

    2012-02-01

    Magnetic resonance imaging (MRI) provides high-resolution multi-parametric imaging, strong soft tissue contrast, and interactive image updates, making it an ideal modality for diagnosing prostate cancer and guiding surgical tools. Although a substantial armamentarium of apparatuses and systems has been developed to assist surgical diagnosis and therapy for MRI-guided procedures over the last decade, a unified method for developing high-fidelity robotic systems, in terms of accuracy, dynamic performance, size, robustness and modularity, that work inside a closed-bore MRI scanner still remains a challenge. In this work, we develop and evaluate an integrated modular hardware and software system to support the surgical workflow of intra-operative MRI, with percutaneous prostate intervention as an illustrative case. Specifically, the distinct apparatuses and methods include: 1) a robot controller system for precision closed-loop control of piezoelectric motors, 2) a robot control interface software that connects the 3D Slicer navigation software and the robot controller to exchange robot commands and coordinates using the OpenIGTLink open network communication protocol, and 3) MRI scan plane alignment to the planned path and imaging of the needle as it is inserted into the target location. A preliminary experiment with an ex-vivo phantom validates the system workflow and MRI-compatibility and shows that the robotic system has a better than 0.01 mm positioning accuracy.
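
    The scan plane alignment step can be illustrated with a small geometric sketch: given a planned needle path, build an orthonormal frame whose z-axis follows the needle, from which in-plane scan planes containing the path can be derived. This is our own numpy illustration, not the authors' code:

        import numpy as np

        def needle_frame(entry, target):
            """Build an orthonormal frame whose z-axis is the planned needle
            direction; the (x, z) and (y, z) planes are two orthogonal scan
            planes that both contain the needle path. Sketch only."""
            entry = np.asarray(entry, dtype=float)
            target = np.asarray(target, dtype=float)
            z = target - entry
            z /= np.linalg.norm(z)                  # unit needle direction
            helper = np.array([1.0, 0.0, 0.0])
            if abs(helper @ z) > 0.9:               # avoid a near-parallel helper
                helper = np.array([0.0, 1.0, 0.0])
            x = np.cross(helper, z)
            x /= np.linalg.norm(x)
            y = np.cross(z, x)                      # completes right-handed frame
            return np.column_stack([x, y, z])       # rotation matrix, columns x,y,z

        R = needle_frame([0.0, 0.0, 0.0], [20.0, 10.0, 40.0])
        print(np.round(R.T @ R, 6))                 # identity: R is orthonormal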

  19. First clinical use of the EchoTrack guidance approach for radiofrequency ablation of thyroid gland nodules.

    PubMed

    Franz, Alfred Michael; Seitel, Alexander; Bopp, Nasrin; Erbelding, Christian; Cheray, Dominique; Delorme, Stefan; Grünwald, Frank; Korkusuz, Hüdayi; Maier-Hein, Lena

    2017-06-01

    Percutaneous radiofrequency ablation (RFA) of thyroid nodules is an alternative to surgical resection that offers the benefits of minimal scars for the patient, lower complication rates, and shorter treatment times. Ultrasound (US) is the preferred modality for guiding these procedures. The needle is usually kept within the US scanning plane to ensure needle visibility. However, this restricts flexibility in both transducer and needle movement and renders the procedure difficult, especially for inexperienced users. Existing navigation solutions often involve electromagnetic (EM) tracking, which requires placement of an external field generator (FG) in close proximity of the intervention site in order to avoid distortion of the EM field. This complicates the clinical workflow as placing the FG while ensuring that it neither restricts the physician's workspace nor affects tracking accuracy is awkward and time-consuming. The EchoTrack concept overcomes these issues by combining the US probe and the EM FG in one modality, simultaneously providing both real-time US and tracking data without requiring the placement of an external FG for tracking. We propose a system and workflow to use EchoTrack for RFA of thyroid nodules. According to our results, the overall error of the EchoTrack system resulting from errors related to tracking and calibration is below 2 mm. Navigated thyroid RFA with the proposed concept is clinically feasible. Motion of internal critical structures relative to external markers can be up to several millimeters in extreme cases. The EchoTrack concept with its simple setup, flexibility, improved needle visualization, and additional guidance information has high potential to be clinically used for thyroid RFA.
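
    The abstract does not state how the tracking- and calibration-related errors combine into the overall sub-2 mm figure; a common assumption is a root-sum-of-squares budget over independent error sources, sketched below with made-up component values:

        import math

        def combined_error(components):
            """Root-sum-of-squares combination of independent error sources (mm).
            Assumed model; the paper does not specify how components combine."""
            return math.sqrt(sum(e * e for e in components))

        # Hypothetical component values (mm): tracking and US calibration.
        print(round(combined_error([1.4, 1.2]), 2))  # -> 1.84, below 2 mm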

  20. Intelligent ePR system for evidence-based research in radiotherapy: proton therapy for prostate cancer.

    PubMed

    Le, Anh H; Liu, Brent; Schulte, Reinhard; Huang, H K

    2011-11-01

    Proton therapy (PT) utilizes a high-energy proton beam to kill cancer cells at the target region for targeted cancer therapy. Due to the physical properties of the proton beam, PT delivers dose with higher precision and no exit dose compared to conventional radiotherapy. In PT, patient data are distributed among multiple systems, a hindrance to research on efficacy and effectiveness. A data mining method and a treatment plan navigator utilizing the infrastructure and data repository of a PT electronic patient record (ePR) were developed to minimize radiation toxicity and improve outcomes in prostate cancer treatment. MATERIALS/METHOD(S): The workflow of a proton therapy treatment in a radiation oncology department was reviewed, and a clinical data model and data flow were designed. A prototype PT ePR system with DICOM compliance was developed to manage prostate cancer patient images, treatment plans, and related clinical data. The ePR system consists of four main components: (1) Data Gateway; (2) ePR Server; (3) Decision Support Tools; and (4) Visualization and Display Tools. Decision support and visualization tools were developed based on DICOM images and DICOM-RT and DICOM-RT-ION objects; data from prostate cancer patients treated with a hypofractionation protocol proton therapy were used for evaluating ePR system effectiveness. Each patient data set includes a set of computed tomography (CT) DICOM images and four DICOM-RT and RT-ION objects. In addition, clinical outcomes data collected from PT cases were included to establish a knowledge base for outcomes analysis. A data mining search engine and an intelligent treatment plan navigator (ITPN) were developed and integrated with the ePR system. Evaluation was based on a data set of 39 PT patients and a hypothetical patient. The ePR system was able to facilitate the proton therapy workflow and was shown to be feasible for prostate cancer patients treated with a hypofractionation protocol in proton therapy. This ePR system improves efficiency in data collection and integration to facilitate outcomes analysis.
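
    For context, DICOM-RT and RT-ION objects like those managed by such an ePR can be inspected with the pydicom library; a minimal sketch (the file name is hypothetical):

        import pydicom

        # Hypothetical file path; any DICOM-RT / RT-ION object reads the same way.
        ds = pydicom.dcmread("rt_ion_plan.dcm")

        print("Modality:", ds.Modality)            # e.g. 'RTPLAN' or 'RTIONPLAN'
        print("SOP Class UID:", ds.SOPClassUID)
        if "RTPlanLabel" in ds:
            print("Plan label:", ds.RTPlanLabel)   # present in RT Plan objects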

  1. Workflow management systems in radiology

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim

    1998-07-01

    In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, emphasize the separation of workflow management systems and application systems, and describe the consequences that arise for the architecture of workflow-oriented information systems. This includes an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
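
    The proposed separation of enactment service and applications can be made concrete with a toy sketch: an engine that holds process definitions and drives workflow-unaware applications through one standardized call interface. All names below are illustrative, not from the paper:

        from typing import Callable, Dict, List

        class WorkflowEngine:
            """Toy enactment service: holds process definitions (ordered task
            names) and drives registered applications through one standard
            interface, so RIS/PACS-style systems stay workflow-unaware."""
            def __init__(self):
                self.processes: Dict[str, List[str]] = {}
                self.applications: Dict[str, Callable[[dict], dict]] = {}

            def define_process(self, name: str, tasks: List[str]) -> None:
                self.processes[name] = tasks

            def register(self, task: str, app: Callable[[dict], dict]) -> None:
                self.applications[task] = app  # standardized call signature

            def enact(self, name: str, case: dict) -> dict:
                for task in self.processes[name]:
                    case = self.applications[task](case)  # engine controls order
                return case

        engine = WorkflowEngine()
        engine.define_process("radiology_exam", ["register", "acquire", "report"])
        engine.register("register", lambda c: {**c, "registered": True})
        engine.register("acquire", lambda c: {**c, "images": 120})
        engine.register("report", lambda c: {**c, "report": "done"})
        print(engine.enact("radiology_exam", {"patient": "12345"}))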

  2. Challenges and issues of geolocation in clinical environment.

    PubMed

    Issom, David-Zacharie; Hagry, Claire; Wodia Mendo, Laetitia; Seng, Henry; Ehrler, Frederic; Lovis, Christian

    2012-01-01

    Reaching good indoor geolocation without deploying extensive and expensive infrastructure is a challenge, because satellite positioning systems are not available indoors. Geolocation could be of major use in healthcare facilities: to help care providers, visitors and patients navigate, to improve the efficiency of movements and flows, or to implement location-awareness systems. A system able to provide the location of a person in a hospital requires precision, multi-floor and obstacle management, and should also perform in basements and outdoors. Such a system also needs to be insensitive to the environmental variations occurring in a hospital, such as the displacement of metallic objects and machines, strong magnetic fields, or simply the movement of people. A system conforming to these requirements can also address various security questions and operational workflow management, and assist the movement of people.

  3. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    NASA Astrophysics Data System (ADS)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services as a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Given this effort, it is not reasonable to expect that researchers will learn new workflow systems just to run workflows developed in other workflow systems; overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept. It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely pre-deployed workflow engines or submits workflow engines with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages the certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within the project (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows, and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.
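
    The coarse-grained interoperability idea, treating each non-native workflow together with its engine as an opaque invokable unit behind a common interface, can be sketched as follows (a hypothetical illustration, not SHIWA's actual API):

        from abc import ABC, abstractmethod

        class WorkflowEngineAdapter(ABC):
            """Common interface over pre-deployed or submitted workflow engines:
            a hypothetical sketch of coarse-grained interoperability, where the
            caller never sees engine-specific languages or job types."""
            @abstractmethod
            def execute(self, workflow_bundle: str, inputs: dict) -> dict: ...

        class TavernaAdapter(WorkflowEngineAdapter):
            def execute(self, workflow_bundle, inputs):
                # In reality: stage bundle and inputs to a DCI, invoke Taverna.
                return {"engine": "Taverna", "outputs": inputs}

        class GalaxyAdapter(WorkflowEngineAdapter):
            def execute(self, workflow_bundle, inputs):
                return {"engine": "Galaxy", "outputs": inputs}

        def run_meta_workflow(steps):
            """A meta-workflow chains workflows of different engines."""
            data = {"x": 1}
            for adapter, bundle in steps:
                data = adapter.execute(bundle, data)["outputs"]
            return data

        print(run_meta_workflow([(TavernaAdapter(), "wf1.t2flow"),
                                 (GalaxyAdapter(), "wf2.ga")]))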

  4. Binocular Goggle Augmented Imaging and Navigation System provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping

    PubMed Central

    B. Mondal, Suman; Gao, Shengkui; Zhu, Nan; Sudlow, Gail P.; Liang, Kexian; Som, Avik; Akers, Walter J.; Fields, Ryan C.; Margenthaler, Julie; Liang, Rongguang; Gruev, Viktor; Achilefu, Samuel

    2015-01-01

    The inability to identify microscopic tumors and assess surgical margins in real-time during oncologic surgery leads to incomplete tumor removal, increases the chances of tumor recurrence, and necessitates costly repeat surgery. To overcome these challenges, we have developed a wearable goggle augmented imaging and navigation system (GAINS) that can provide accurate intraoperative visualization of tumors and sentinel lymph nodes in real-time without disrupting normal surgical workflow. GAINS projects both near-infrared fluorescence from tumors and the natural color images of tissue onto a head-mounted display without latency. Aided by tumor-targeted contrast agents, the system detected tumors in subcutaneous and metastatic mouse models with high accuracy (sensitivity = 100%, specificity = 98% ± 5% standard deviation). Human pilot studies in breast cancer and melanoma patients using a near-infrared dye show that the GAINS detected sentinel lymph nodes with 100% sensitivity. Clinical use of the GAINS to guide tumor resection and sentinel lymph node mapping promises to improve surgical outcomes, reduce rates of repeat surgery, and improve the accuracy of cancer staging. PMID:26179014

  5. SMITH: a LIMS for handling next-generation sequencing workflows

    PubMed Central

    2014-01-01

    Background Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, and to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). Methods SMITH is a web application with a MySQL server at the backend. It was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. Results SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine, which performs de-multiplexing, quality control, alignments, etc. Conclusions SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis. PMID:25471934
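
    The attribute-value table described here is essentially the entity-attribute-value (EAV) pattern; a self-contained sketch with Python's built-in sqlite3 shows how unconstrained key-value metadata supports searching for specific samples (the schema and values are illustrative, not SMITH's actual schema):

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE sample (id INTEGER PRIMARY KEY, name TEXT)")
        # One row per (sample, key, value): unconstrained textual metadata.
        con.execute("""CREATE TABLE sample_attr (
                         sample_id INTEGER REFERENCES sample(id),
                         key TEXT, value TEXT)""")

        con.execute("INSERT INTO sample VALUES (1, 'ChIP-Seq_rep1')")
        con.executemany("INSERT INTO sample_attr VALUES (?, ?, ?)",
                        [(1, "organism", "mouse"),
                         (1, "antibody", "H3K4me3"),
                         (1, "protocol", "TruSeq")])

        # Metadata search: find samples by arbitrary key/value pairs.
        rows = con.execute("""SELECT s.name FROM sample s
                              JOIN sample_attr a ON a.sample_id = s.id
                              WHERE a.key = 'antibody' AND a.value = 'H3K4me3'""")
        print([r[0] for r in rows])  # -> ['ChIP-Seq_rep1']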

  6. SMITH: a LIMS for handling next-generation sequencing workflows.

    PubMed

    Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko

    2014-01-01

    Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, and to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. It was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine, which performs de-multiplexing, quality control, alignments, etc. SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis.

  7. Bioinformatics workflows and web services in systems biology made easy for experimentalists.

    PubMed

    Jimenez, Rafael C; Corpas, Manuel

    2013-01-01

    Workflows are useful for performing data analysis and integration in systems biology. Workflow management systems can help users create workflows without any previous knowledge of programming and web services. However, the computational skills required to build such workflows are usually above the level most biological experimentalists are comfortable with. In this chapter we introduce workflow management systems that reuse existing workflows instead of creating them, making it easier for experimentalists to perform computational tasks.

  8. Kwf-Grid workflow management system for Earth science applications

    NASA Astrophysics Data System (ADS)

    Tran, V.; Hluchy, L.

    2009-04-01

    In this paper, we present a workflow management tool for Earth science applications in EGEE. The workflow management tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the workflow management tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within the 'Knowledge-based Workflow System for Grid Applications' project under the 6th Framework Programme. The workflow management system was intended to: semi-automatically compose a workflow of Grid services; execute the composed workflow application in a Grid computing environment; monitor the performance of the Grid infrastructure and the Grid applications; analyze the resulting monitoring information; capture the knowledge that is contained in the information by means of intelligent agents; and finally reuse the joined knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g. GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, allowing it to manage and execute gLite jobs on the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential-management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite will allow EGEE users to use the system and benefit from its advanced features. The system is primarily tested and evaluated with applications from ES clusters.

  9. Automated metadata, provenance cataloging and navigable interfaces: ensuring the usefulness of extreme-scale data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schissel, David; Greenwald, Martin

    The MPO (Metadata, Provenance, Ontology) Project successfully addressed the goal of improving the usefulness and traceability of scientific data by building a system that could capture and display all steps in the process of creating, analyzing and disseminating that data. Throughout history, scientists have generated handwritten logbooks to keep track of data, their hypotheses, assumptions, experimental setup, and computational processes as well as reflections on observations and issues encountered. Over the last several decades, with the growth of personal computers, handheld devices, and the World Wide Web, the handwritten logbook has begun to be replaced by electronic logbooks. This transition has brought increased capability such as supporting multi-media, hypertext, and fast searching. However, content creation and metadata (a set of data that describes and gives information about other data) capturing has for the most part remained a manual activity just as it was with handwritten logbooks. This has led to a fragmentation of data, processing, and annotation that has only accelerated as scientific workflows continue to increase in complexity. From a scientific perspective, it is very important to be able to understand the lineage of any piece of data: who, what, when, how, and why. This is typically referred to as data provenance. The fragmentation discussed previously often means that data provenance is lost. As scientific workflows move to powerful computers and become more complex, the ability to track all of the steps involved in creating a piece of data becomes even more difficult. It was the goal of the MPO (Metadata, Provenance, Ontology) Project to create a system (the MPO System) that allows for automatic provenance and metadata capturing in such a way as to allow easy searching and browsing. This goal needed to be accomplished in a general way so that it may be used across a broad range of scientific domains, yet allow the addition of vocabulary (Ontology) that is domain specific as is required for intelligent searching and browsing in the scientific context. Through the creation and deployment of the MPO system, the goals of the project were achieved. An enhanced metadata, provenance, and ontology storage system was created. This was combined with innovative methodologies for navigating and exploring these data using a web browser for both experimental and simulation-based scientific research. In addition, a system to allow scientists to instrument their existing workflows for automatic metadata and provenance capture is part of the MPO system. In that way, a scientist can continue to use their existing methodology yet easily document their work. Workflows and data provenance can be displayed either graphically or in an electronic notebook format, and support advanced search features including via ontology. The MPO system was successfully used in both Climate and Magnetic Fusion Energy Research. The software for the MPO system is located at https://github.com/MPO-Group/MPO and is open source distributed under the Revised BSD License. A demonstration site of the MPO system is open to the public and is available at https://mpo.psfc.mit.edu/. A Docker container release of the command line client is available for public download using the command docker pull jcwright/mpo-cli at https://hub.docker.com/r/jcwright/mpo-cli.
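
    Instrumenting an existing workflow for automatic provenance capture can be as light as wrapping each step so that who/what/when records accumulate as the code runs; the following is a minimal sketch of that idea, not the MPO API itself:

        import functools, getpass, json, time

        PROVENANCE = []  # in MPO this would go to the provenance store

        def provenance_step(func):
            """Record who ran what, when, and with which inputs/outputs."""
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                t0 = time.time()
                result = func(*args, **kwargs)
                PROVENANCE.append({
                    "who": getpass.getuser(),
                    "what": func.__name__,
                    "when": t0,
                    "inputs": repr((args, kwargs)),
                    "output": repr(result),
                })
                return result
            return wrapper

        @provenance_step
        def calibrate(raw):
            return [x * 1.05 for x in raw]

        @provenance_step
        def analyze(cal):
            return sum(cal) / len(cal)

        analyze(calibrate([1.0, 2.0, 3.0]))
        print(json.dumps(PROVENANCE, indent=2))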

  10. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    PubMed

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a cloud-enabled web-interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org.

  11. National Centers for Environmental Prediction

    Science.gov Websites

    Environmental Modeling Center (EMC) training and documentation materials: Workflow Manager (7/24/12); Zeus Advanced Group B Training Agenda (4/17/12); Zeus Quickstart Training; NESCC HPC Group A End User Training (2/16/12); Adaptive

  12. A Technology Solution Strengthens Comprehensive Environmental Management

    DTIC Science & Technology

    2012-05-23

    Briefing outline: General Navigation; Chemical Approval Example; NEPA Coordination Example; Safety PPE Example; Summary. At the Marine Corps Support Facility, the system supports coordination, completion and documentation of various business processes (Chemical Approval, NEPA Coordination, Safety) through automated workflows. Roles in the Chemical Approval completion diagram include the Government Employee, MCMC Chemical Manager, MCMC HS&E Specialist, and IMO chemical safety and environmental specialists.

  13. 3D workflow for HDR image capture of projection systems and objects for CAVE virtual environments authoring with wireless touch-sensitive devices

    NASA Astrophysics Data System (ADS)

    Prusten, Mark J.; McIntyre, Michelle; Landis, Marvin

    2006-02-01

    A 3D workflow pipeline is presented for High Dynamic Range (HDR) image capture of projected scenes or objects for presentation in CAVE virtual environments. The methods of HDR digital photography of environments vs. objects are reviewed. Samples of both types of virtual authoring, the actual CAVE environment and a sculpture, are shown. A series of software tools are incorporated into a pipeline called CAVEPIPE, allowing high-resolution objects and scenes to be composited together in natural illumination environments [1] and presented in our CAVE virtual reality environment. We also present a way to enhance the user interface for CAVE environments. The traditional methods of controlling navigation through virtual environments include gloves, HUDs and 3D mouse devices. By integrating a wireless network that includes both WiFi (IEEE 802.11b/g) and Bluetooth (IEEE 802.15.1) protocols, the non-graphical input control device can be eliminated. Wireless devices can then be added, including PDAs, smart phones, Tablet PCs, portable gaming consoles, and Pocket PCs.
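
    The HDR capture step rests on merging bracketed exposures into a radiance estimate; a minimal sketch of the standard weighted-average merge (synthetic data, not the CAVEPIPE implementation):

        import numpy as np

        def hdr_radiance(exposures, times):
            """Merge bracketed LDR exposures (float arrays in [0, 1]) into a
            relative radiance map: a weighted average of pixel value divided
            by exposure time, weighting mid-range pixels most (hat weight)."""
            acc = np.zeros_like(exposures[0])
            wsum = np.zeros_like(exposures[0])
            for img, t in zip(exposures, times):
                w = 1.0 - np.abs(2.0 * img - 1.0)   # 0 at extremes, 1 mid-range
                acc += w * img / t
                wsum += w
            return acc / np.maximum(wsum, 1e-6)

        # Three synthetic exposures of the same scene at 0.25x, 1x, 4x times.
        radiance = np.array([[0.05, 0.2], [0.5, 0.9]])
        times = [0.25, 1.0, 4.0]
        exposures = [np.clip(radiance * t, 0, 1) for t in times]
        print(np.round(hdr_radiance(exposures, times), 3))  # recovers radiance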

  14. Indoor A* Pathfinding Through an Octree Representation of a Point Cloud

    NASA Astrophysics Data System (ADS)

    Rodenberg, O. B. P. M.; Verbree, E.; Zlatanova, S.

    2016-10-01

    There is a growing demand for 3D indoor pathfinding applications. Researched in the field of robotics during the last decades of the 20th century, pathfinding methods focused on 2D navigation. Nowadays we would like to have the ability to help people navigate inside buildings, or to send a drone inside a building when entering is too dangerous for people. What these examples have in common is that an object with a certain geometry needs to find an optimal collision-free path between a start and a goal point. This paper presents a new workflow for pathfinding through an octree representation of a point cloud. We applied the following steps: 1) the point cloud is processed so it fits best in an octree; 2) during octree generation the interior empty nodes are filtered and further processed; 3) for each interior empty node the distance to the closest occupied node directly under it is computed; 4) a network graph is computed for all empty nodes; 5) the A* pathfinding algorithm is conducted. This workflow takes into account the connectivity of each node to all possible neighbours (face, edge and vertex, and all sizes). In addition, a collision avoidance system is pre-processed in two steps: first, the clearance of each empty node is computed, and then the maximal crossing value between two empty neighbouring nodes is computed. The clearance is used to select interior empty nodes of appropriate size, and the maximal crossing value is used to filter the network graph. Finally, both these datasets are used in A* pathfinding.
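
    Step 5 can be illustrated with a compact A* implementation over a graph of empty-node centres, assuming the clearance and maximal-crossing filtering of the earlier steps has already pruned the edges (the graph and coordinates below are illustrative):

        import heapq, math

        def astar(graph, coords, start, goal):
            """A* over a node graph; graph[n] = iterable of (neighbour, cost).
            Heuristic: straight-line distance between node centres."""
            def h(n):
                return math.dist(coords[n], coords[goal])
            open_set = [(h(start), 0.0, start, [start])]
            best = {start: 0.0}
            while open_set:
                _, g, node, path = heapq.heappop(open_set)
                if node == goal:
                    return path, g
                for nbr, cost in graph[node]:
                    ng = g + cost
                    if ng < best.get(nbr, math.inf):
                        best[nbr] = ng
                        heapq.heappush(open_set,
                                       (ng + h(nbr), ng, nbr, path + [nbr]))
            return None, math.inf

        # Tiny illustrative graph of empty-node centres, already filtered.
        coords = {"A": (0, 0, 0), "B": (1, 0, 0), "C": (1, 1, 0), "D": (2, 1, 0)}
        graph = {"A": [("B", 1.0)], "B": [("C", 1.0)], "C": [("D", 1.0)], "D": []}
        print(astar(graph, coords, "A", "D"))  # -> (['A', 'B', 'C', 'D'], 3.0)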

  15. DEWEY: the DICOM-enabled workflow engine system.

    PubMed

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.

  16. Targeting Accuracy, Procedure Times and User Experience of 240 Experimental MRI Biopsies Guided by a Clinical Add-On Navigation System.

    PubMed

    Busse, Harald; Riedel, Tim; Garnov, Nikita; Thörmer, Gregor; Kahn, Thomas; Moche, Michael

    2015-01-01

    MRI is of great clinical utility for the guidance of special diagnostic and therapeutic interventions. The majority of such procedures are performed iteratively ("in-and-out") in standard, closed-bore MRI systems with control imaging inside the bore and needle adjustments outside the bore. The fundamental limitations of such an approach have led to the development of various assistance techniques, from simple guidance tools to advanced navigation systems. The purpose of this work was to thoroughly assess the targeting accuracy, workflow and usability of a clinical add-on navigation solution on 240 simulated biopsies by different medical operators. Navigation relied on a virtual 3D MRI scene with real-time overlay of the optically tracked biopsy needle. Smart reference markers on a freely adjustable arm ensured proper registration. Twenty-four operators - attending (AR) and resident radiologists (RR) as well as medical students (MS) - performed well-controlled biopsies of 10 embedded model targets (mean diameter: 8.5 mm, insertion depths: 17-76 mm). Targeting accuracy, procedure times and 13 Likert scores on system performance were determined (strong agreement: 5.0). Differences in diagnostic success rates (AR: 93%, RR: 88%, MS: 81%) were not significant. In contrast, between-group differences in biopsy times (AR: 4:15, RR: 4:40, MS: 5:06 min:sec) differed significantly (p<0.01). Mean overall rating was 4.2. The average operator would use the system again (4.8) and stated that the outcome justifies the extra effort (4.4). Lowest agreement was reported for the robustness against external perturbations (2.8). The described combination of optical tracking technology with an automatic MRI registration appears to be sufficiently accurate for instrument guidance in a standard (closed-bore) MRI environment. High targeting accuracy and usability was demonstrated on a relatively large number of procedures and operators. Between groups with different expertise there were significant differences in experimental procedure times but not in the number of successful biopsies.
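
    Registration of the optically tracked scene to the MRI volume via reference markers is typically computed as a point-based rigid alignment; the following generic Kabsch/SVD sketch (not the system's actual algorithm) recovers the rotation and translation mapping tracker-space marker positions into image space:

        import numpy as np

        def rigid_register(P, Q):
            """Least-squares rigid transform (R, t) mapping points P -> Q.
            P, Q: (N, 3) arrays of corresponding marker positions."""
            P, Q = np.asarray(P, float), np.asarray(Q, float)
            cP, cQ = P.mean(axis=0), Q.mean(axis=0)
            H = (P - cP).T @ (Q - cQ)              # cross-covariance matrix
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # no reflection
            R = Vt.T @ D @ U.T
            t = cQ - R @ cP
            return R, t

        # Illustrative marker sets: Q is P rotated 90 deg about z and shifted.
        P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
        Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
        Q = P @ Rz.T + np.array([10.0, 5.0, 2.0])
        R, t = rigid_register(P, Q)
        print(np.allclose(R, Rz), np.round(t, 6))  # -> True [10.  5.  2.]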

  17. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    PubMed Central

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a cloud-enabled web-interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org. PMID:22559942

  18. Agile parallel bioinformatics workflow management using Pwrake.

    PubMed

    Mishima, Hiroyuki; Sasaki, Kensaku; Tanaka, Masahiro; Tatebe, Osamu; Yoshiura, Koh-Ichiro

    2011-09-08

    In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of the scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. We implemented Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate the modularity of the GATK and Dindel workflows. Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain-specific language design built on Ruby gives rakefiles the flexibility needed for writing scientific workflows. Furthermore, the readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows.
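
    Rake-style workflows are dependency graphs of tasks; the following sketch mirrors the idea in Python (Pwrake itself is Ruby), running each wave of independent tasks in parallel once their prerequisites finish, with placeholder task bodies:

        from concurrent.futures import ThreadPoolExecutor

        # Rake-style dependency graph: task -> prerequisites (assumed acyclic).
        DEPS = {
            "align_1": [], "align_2": [],        # independent, run in parallel
            "merge": ["align_1", "align_2"],     # waits for both alignments
            "call_variants": ["merge"],          # sequential tail
        }

        def run_task(name):
            print("running", name)               # placeholder for a real command
            return name

        def run_workflow(deps):
            done = set()
            with ThreadPoolExecutor() as pool:
                while len(done) < len(deps):
                    ready = [t for t, reqs in deps.items()
                             if t not in done and all(r in done for r in reqs)]
                    done.update(pool.map(run_task, ready))  # one parallel wave
            return done

        run_workflow(DEPS)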

  19. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows. PMID:21899774

  20. Automated metadata--final project report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schissel, David

    This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project) funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for 6 months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and the Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project was able to successfully create a suite of software tools that can be used by a scientific community to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project's toolkit. Feedback was very positive on the project's toolkit and the value of such automatic workflow documentation to the scientific endeavor.

  1. GUEST EDITOR'S INTRODUCTION: Guest Editor's introduction

    NASA Astrophysics Data System (ADS)

    Chrysanthis, Panos K.

    1996-12-01

    Computer Science Department, University of Pittsburgh, Pittsburgh, PA 15260, USA

    This special issue focuses on current efforts to represent and support workflows that integrate information systems and human resources within a business or manufacturing enterprise. Workflows may also be viewed as an emerging computational paradigm for effective structuring of cooperative applications involving human users and access to diverse data types not necessarily maintained by traditional database management systems. A workflow is an automated organizational process (also called business process) which consists of a set of activities or tasks that need to be executed in a particular controlled order over a combination of heterogeneous database systems and legacy systems. Within workflows, tasks are performed cooperatively by either human or computational agents in accordance with their roles in the organizational hierarchy. The challenge in facilitating the implementation of workflows lies in developing efficient workflow management systems. A workflow management system (also called workflow server, workflow engine or workflow enactment system) provides the necessary interfaces for coordination and communication among human and computational agents to execute the tasks involved in a workflow and controls the execution orderings of tasks as well as the flow of data that these tasks manipulate. That is, the workflow management system is responsible for correctly and reliably supporting the specification, execution, and monitoring of workflows. The six papers selected (out of the twenty-seven submitted for this special issue of Distributed Systems Engineering) address different aspects of these three functional components of a workflow management system. In the first paper, 'Correctness issues in workflow management', Kamath and Ramamritham discuss the important issue of correctness in workflow management that constitutes a prerequisite for the use of workflows in the automation of the critical organizational/business processes. In particular, this paper examines the issues of execution atomicity and failure atomicity, differentiating between correctness requirements of system failures and logical failures, and surveys techniques that can be used to ensure data consistency in workflow management systems. While the first paper is concerned with correctness assuming transactional workflows in which selective transactional properties are associated with individual tasks or the entire workflow, the second paper, 'Scheduling workflows by enforcing intertask dependencies' by Attie et al, assumes that the tasks can be either transactions or other activities involving legacy systems. This second paper describes the modelling and specification of conditions involving events and dependencies among tasks within a workflow using temporal logic and finite state automata. It also presents a scheduling algorithm that enforces all stated dependencies by executing at any given time only those events that are allowed by all the dependency automata and in an order as specified by the dependencies. In any system with decentralized control, there is a need to effectively cope with the tension that exists between autonomy and consistency requirements. In 'A three-level atomicity model for decentralized workflow management systems', Ben-Shaul and Heineman focus on the specific requirement of enforcing failure atomicity in decentralized, autonomous and interacting workflow management systems.
Their paper describes a model in which each workflow manager must be able to specify the sequence of tasks that comprise an atomic unit for the purposes of correctness, and the degrees of local and global atomicity for the purpose of cooperation with other workflow managers. The paper also discusses a realization of this model in which treaties and summits provide an agreement mechanism, while underlying transaction managers are responsible for maintaining failure atomicity. The fourth and fifth papers are experience papers describing a workflow management system and a large scale workflow application, respectively. Schill and Mittasch, in `Workflow management systems on top of OSF DCE and OMG CORBA', describe a decentralized workflow management system and discuss its implementation using two standardized middleware platforms, namely, OSF DCE and OMG CORBA. The system supports a new approach to workflow management, introducing several new concepts such as data type management for integrating various types of data and quality of service for various services provided by servers. A problem common to both database applications and workflows is the handling of missing and incomplete information. This is particularly pervasive in an `electronic market' with a huge number of retail outlets producing and exchanging volumes of data, the application discussed in `Information flow in the DAMA project beyond database managers: information flow managers'. Motivated by the need for a method that allows a task to proceed in a timely manner if not all data produced by other tasks are available by its deadline, Russell et al propose an architectural framework and a language that can be used to detect, approximate and, later on, to adjust missing data if necessary. The final paper, `The evolution towards flexible workflow systems' by Nutt, is complementary to the other papers and is a survey of issues and of work related to both workflow and computer supported collaborative work (CSCW) areas. In particular, the paper provides a model and a categorization of the dimensions which workflow management and CSCW systems share. Besides summarizing the recent advancements towards efficient workflow management, the papers in this special issue suggest areas open to investigation and it is our hope that they will also provide the stimulus for further research and development in the area of workflow management systems.

  2. Research and Implementation of Key Technologies in Multi-Agent System to Support Distributed Workflow

    NASA Astrophysics Data System (ADS)

    Pan, Tianheng

    2018-01-01

    In recent years, the combination of workflow management systems and multi-agent technology has been a hot research field. The lack of flexibility in workflow management systems can be improved by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which solves the fragility of the traditional centralized workflow structure. In this paper, the agents of a distributed workflow management system are divided according to their functions, the execution process of each type of agent is analyzed, and key technologies such as process execution and resource management are discussed.

  3. New Directions in 3D Medical Modeling: 3D-Printing Anatomy and Functions in Neurosurgical Planning

    PubMed Central

    Árnadóttir, Íris; Gíslason, Magnús; Ólafsson, Ingvar

    2017-01-01

    This paper illustrates the feasibility and utility of combining cranial anatomy and brain function on the same 3D-printed model, as evidenced by a neurosurgical planning case study of a 29-year-old female patient with a low-grade frontal-lobe glioma. We herein report the rapid prototyping methodology utilized in conjunction with surgical navigation to prepare and plan a complex neurosurgery. The method introduced here combines CT and MRI images with DTI tractography, while using various image segmentation protocols to 3D model the skull base, tumor, and five eloquent fiber tracts. This 3D model is rapid-prototyped and coregistered with patient images and a reported surgical navigation system, establishing a clear link between the printed model and surgical navigation. This methodology highlights the potential for advanced neurosurgical preparation, which can begin before the patient enters the operation theatre. Moreover, the work presented here demonstrates the workflow developed at the National University Hospital of Iceland, Landspitali, focusing on the processes of anatomy segmentation, fiber tract extrapolation, MRI/CT registration, and 3D printing. Furthermore, we present a qualitative and quantitative assessment for fiber tract generation in a case study where these processes are applied in the preparation of brain tumor resection surgery. PMID:29065569

  4. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

Introducing workflow management technology into healthcare is a promising way to address the problem that current healthcare information systems cannot sufficiently support process management, although several challenges remain. The purpose of this paper is to study a method of developing a workflow-based information system, using the radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system was designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be treated as special components, which also corresponded to tasks and were integrated by converting non-workflow-aware interfaces into standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of a changing process, and that it enhanced process management in the department. It also provides a more workflow-aware integration method compared with other approaches such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more process-management functionality and more workflow-aware integration. The work of this paper is an initial endeavor to introduce workflow management technology into healthcare.
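
    The component-per-task design lends itself to a small sketch: each task is a component with a uniform interface, a legacy system is wrapped by an adapter, and the engine simply assembles them. All class and field names below are illustrative, not taken from the deployed system.

      # Sketch of loosely coupled task components assembled by a workflow
      # engine, with a legacy system wrapped behind the same interface.

      class RegistrationComponent:
          def run(self, ctx):
              ctx["patient_id"] = "P001"

      class LegacyPacsAdapter:
          """Wraps a non-workflow-aware legacy interface behind run()."""
          def run(self, ctx):
              ctx["images"] = f"fetched studies for {ctx['patient_id']}"

      class ReportingComponent:
          def run(self, ctx):
              ctx["report"] = f"report based on {ctx['images']}"

      def enact(process, ctx):
          for component in process:   # the engine assembles the components
              component.run(ctx)
          return ctx

      print(enact([RegistrationComponent(), LegacyPacsAdapter(),
                   ReportingComponent()], {}))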

  5. Development of a user customizable imaging informatics-based intelligent workflow engine system to enhance rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Martinez, Clarisa; Wang, Jing; Liu, Ye; Liu, Brent

    2014-03-01

Clinical trials usually need to collect, track and analyze multimedia data according to a defined workflow. Currently, clinical trial data management requirements are normally addressed with custom-built systems, and designing workflows that differ from trial to trial is a challenge. A traditional pre-defined custom-built system is usually limited to a specific clinical trial and normally requires time-consuming and resource-intensive software development. To provide a solution, we present a user-customizable imaging informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials. The intelligent workflow engine provides flexibility in building and tailoring the workflow at various stages of a clinical trial. By providing a means to tailor and automate the workflow, the system saves time and reduces errors for clinical trials. Although our system is designed for rehabilitation clinical trials, it may be extended to other imaging-based clinical trials as well.
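
    A toy sketch of the customization idea, under the assumption that the workflow is expressed as editable data rather than code; the step names and handlers are invented for illustration.

      # The trial workflow is plain data that a coordinator can edit;
      # a generic engine interprets it (no per-trial programming).

      HANDLERS = {
          "collect": lambda item: f"collected {item}",
          "review":  lambda item: f"reviewed {item}",
          "analyze": lambda item: f"analyzed {item}",
      }

      # A coordinator tailors this list per trial; no code changes needed.
      trial_workflow = [
          {"step": "collect", "item": "baseline MRI"},
          {"step": "review",  "item": "baseline MRI"},
          {"step": "analyze", "item": "motor-function scores"},
      ]

      for stage in trial_workflow:
          print(HANDLERS[stage["step"]](stage["item"]))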

  6. Towards ontology-driven navigation of the lipid bibliosphere

    PubMed Central

    Baker, Christopher JO; Kanagasabai, Rajaraman; Ang, Wee Tiong; Veeramani, Anitha; Low, Hong-Sang; Wenk, Markus R

    2008-01-01

Background The indexing of scientific literature and content is a relevant and contemporary requirement within life science information systems. Navigating information available in legacy formats continues to be a challenge both in enterprise and academic domains. The emergence of semantic web technologies and their fusion with artificial intelligence techniques has provided a new toolkit with which to address these data integration challenges. In the emerging field of lipidomics such navigation challenges are barriers to the translation of scientific results into actionable knowledge, critical to the treatment of diseases such as Alzheimer's syndrome, Mycobacterium infections and cancer. Results We present a literature-driven workflow involving document delivery and natural language processing steps generating tagged sentences containing lipid, protein and disease names, which are instantiated into a custom-designed lipid ontology. We describe the design challenges in capturing lipid nomenclature, the mandate of the ontology and its role as a query model in the navigation of the lipid bibliosphere. We illustrate the extent of the description logic-based A-box query capability provided by the instantiated ontology using a graphical query composer to query sentences describing lipid-protein and lipid-disease correlations. Conclusion As scientists accept the need to readjust the manner in which we search for information and derive knowledge, we illustrate a system that can constrain the literature explosion and knowledge navigation problems. Specifically we have focussed on solving this challenge for lipidomics researchers who have to deal with the lack of standardized vocabulary, differing classification schemes, and a wide array of synonyms before being able to derive scientific insights. The use of the OWL-DL variant of the Web Ontology Language (OWL) and description logic reasoning is pivotal in this regard, providing the lipid scientist with advanced query access to the results of text mining algorithms instantiated into the ontology. The visual query paradigm assists in the adoption of this technology. PMID:18315858

  7. Towards ontology-driven navigation of the lipid bibliosphere.

    PubMed

    Baker, Christopher Jo; Kanagasabai, Rajaraman; Ang, Wee Tiong; Veeramani, Anitha; Low, Hong-Sang; Wenk, Markus R

    2008-01-01

The indexing of scientific literature and content is a relevant and contemporary requirement within life science information systems. Navigating information available in legacy formats continues to be a challenge both in enterprise and academic domains. The emergence of semantic web technologies and their fusion with artificial intelligence techniques has provided a new toolkit with which to address these data integration challenges. In the emerging field of lipidomics such navigation challenges are barriers to the translation of scientific results into actionable knowledge, critical to the treatment of diseases such as Alzheimer's syndrome, Mycobacterium infections and cancer. We present a literature-driven workflow involving document delivery and natural language processing steps generating tagged sentences containing lipid, protein and disease names, which are instantiated into a custom-designed lipid ontology. We describe the design challenges in capturing lipid nomenclature, the mandate of the ontology and its role as a query model in the navigation of the lipid bibliosphere. We illustrate the extent of the description logic-based A-box query capability provided by the instantiated ontology using a graphical query composer to query sentences describing lipid-protein and lipid-disease correlations. As scientists accept the need to readjust the manner in which we search for information and derive knowledge, we illustrate a system that can constrain the literature explosion and knowledge navigation problems. Specifically we have focussed on solving this challenge for lipidomics researchers who have to deal with the lack of standardized vocabulary, differing classification schemes, and a wide array of synonyms before being able to derive scientific insights. The use of the OWL-DL variant of the Web Ontology Language (OWL) and description logic reasoning is pivotal in this regard, providing the lipid scientist with advanced query access to the results of text mining algorithms instantiated into the ontology. The visual query paradigm assists in the adoption of this technology.
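
    For a flavor of A-box querying over an instantiated ontology, here is a hedged sketch using rdflib with an invented toy namespace; the real system uses a custom OWL-DL lipid ontology and a graphical query composer, neither of which is reproduced here.

      # Toy A-box query: find sentences mentioning lipids correlated with
      # a disease (triples and URIs are invented for illustration).
      from rdflib import Graph, Namespace, RDF

      EX = Namespace("http://example.org/lipid#")
      g = Graph()
      g.add((EX.ceramide1, RDF.type, EX.Lipid))
      g.add((EX.ceramide1, EX.correlatesWith, EX.Alzheimers))
      g.add((EX.sentence42, EX.mentions, EX.ceramide1))

      rows = g.query(
          """SELECT ?sentence ?lipid WHERE {
                 ?lipid a ex:Lipid ;
                        ex:correlatesWith ex:Alzheimers .
                 ?sentence ex:mentions ?lipid .
             }""",
          initNs={"ex": EX},
      )
      for sentence, lipid in rows:
          print(sentence, lipid)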

  8. Data processing, multi-omic pathway mapping, and metabolite activity analysis using XCMS Online

    PubMed Central

    Forsberg, Erica M; Huan, Tao; Rinehart, Duane; Benton, H Paul; Warth, Benedikt; Hilmers, Brian; Siuzdak, Gary

    2018-01-01

Systems biology is the study of complex living organisms, and as such, analysis on a systems-wide scale involves the collection of information-dense data sets that are representative of an entire phenotype. To uncover dynamic biological mechanisms, bioinformatics tools have become essential to facilitating data interpretation in large-scale analyses. Global metabolomics is one such method for performing systems biology, as metabolites represent the downstream functional products of ongoing biological processes. We have developed XCMS Online, a platform that enables online metabolomics data processing and interpretation. A systems biology workflow recently implemented within XCMS Online enables rapid metabolic pathway mapping using raw metabolomics data for investigating dysregulated metabolic processes. In addition, this platform supports integration of multi-omic (such as genomic and proteomic) data to garner further systems-wide mechanistic insight. Here, we provide an in-depth procedure showing how to effectively navigate and use the systems biology workflow within XCMS Online without a priori knowledge of the platform, including uploading liquid chromatography (LC)–mass spectrometry (MS) data from metabolite-extracted biological samples, defining the job parameters to identify features, correcting for retention time deviations, conducting statistical analysis of features between sample classes and performing predictive metabolic pathway analysis. Additional multi-omics data can be uploaded and overlaid with previously identified pathways to enhance systems-wide analysis of the observed dysregulations. We also describe unique visualization tools to assist in elucidation of statistically significant dysregulated metabolic pathways. Parameter input takes 5–10 min, depending on user experience; data processing typically takes 1–3 h, and data analysis takes ~30 min. PMID:29494574
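
    The statistical step between sample classes can be illustrated with a small sketch on synthetic data (XCMS Online performs this server-side; the effect size and thresholds below are arbitrary):

      # Per-feature fold changes and Welch's t-test on a synthetic
      # 6-samples-per-class, 100-feature intensity matrix.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      control = rng.lognormal(mean=10, sigma=0.3, size=(6, 100))
      # Spike in a ~2.7-fold multiplicative increase for the treated class.
      treated = control * rng.lognormal(mean=1.0, sigma=0.3, size=(6, 100))

      fold_change = treated.mean(axis=0) / control.mean(axis=0)
      t, p = stats.ttest_ind(treated, control, axis=0, equal_var=False)

      significant = (p < 0.01) & (np.abs(np.log2(fold_change)) > 1)
      print(f"{significant.sum()} dysregulated features out of {len(p)}")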

  9. Sensor-based electromagnetic navigation to facilitate implantation of left ventricular leads in cardiac resynchronization therapy.

    PubMed

    Döring, Michael; Sommer, Philipp; Rolf, Sascha; Lucas, Johannes; Breithardt, Ole A; Hindricks, Gerhard; Richter, Sergio

    2015-02-01

Implantation of cardiac resynchronization therapy (CRT) devices can be challenging, time consuming, and fluoroscopy intense. To facilitate placement of left ventricular (LV) leads, a novel electromagnetic navigation system (MediGuide™, St. Jude Medical, St. Paul, MN, USA) has been developed, displaying the real-time 3-D location of sensor-embedded delivery tools superimposed on prerecorded X-ray cine-loops of coronary sinus venograms. We report our experience and progress in the use of this new electromagnetic tracking system to guide LV lead implantation. Between January 2012 and December 2013, 71 consecutive patients (69 ± 9 years, 76% male) were implanted with a CRT device using the new electromagnetic tracking system. Demographics, procedural data, and periprocedural adverse events were gathered. The impact of the operator's experience, optimized workflow, and improved software technology on procedural data was analyzed. LV lead implantation was successfully achieved in all patients without severe adverse events. Total procedure time measured 87 ± 37 minutes and the median total fluoroscopy time (skin-to-skin) was 4.9 (2.5-7.8) minutes with a median dose-area product of 476 (260-1056) cGy·cm². An additional comparison with conventional CRT device implantations showed a significant reduction in fluoroscopy time from 8.0 (5.8; 11.5) to 4.5 (2.8; 7.3) minutes (P = 0.016) and radiation dose from 603 (330; 969) to 338 (176; 680) cGy·cm², respectively (P = 0.044). Use of the new navigation system enables safe and successful LV lead placement with improved orientation and significantly reduced radiation exposure during CRT implantation. © 2014 Wiley Periodicals, Inc.

  10. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

The automation of the execution of computational tasks is at the heart of improving scientific productivity. In recent years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  11. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

The automation of the execution of computational tasks is at the heart of improving scientific productivity. In recent years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  12. Mimoza: web-based semantic zooming and navigation in metabolic networks.

    PubMed

    Zhukova, Anna; Sherman, David J

    2015-02-26

The complexity of genome-scale metabolic models makes them quite difficult for human users to read, since they contain thousands of reactions that must be included for accurate computer simulation. Interestingly, hidden similarities between groups of reactions can be discovered and generalized to reveal higher-level patterns. The web-based navigation system Mimoza allows a human expert to explore metabolic network models in a semantically zoomable manner: the most general view represents the compartments of the model; the next view shows the generalized versions of reactions and metabolites in each compartment; and the most detailed view represents the initial network with a generalization-based layout (where similar metabolites and reactions are placed next to each other). This allows a human expert to grasp the general structure of the network and analyze it in a top-down manner. Mimoza can be installed standalone, used online at http://mimoza.bordeaux.inria.fr/, or installed in a Galaxy server for use in workflows. Mimoza views can be embedded in web pages or downloaded as COMBINE archives.
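
    The generalization step behind the middle zoom level can be sketched as collapsing reactions whose metabolites share a class; the toy network below is invented.

      # Collapse metabolites by generalization class and merge reactions.
      from collections import defaultdict

      # metabolite -> generalization class (invented example data)
      classes = {"C16-ceramide": "ceramide", "C18-ceramide": "ceramide",
                 "palmitate": "fatty acid", "stearate": "fatty acid"}
      reactions = [("palmitate", "C16-ceramide"),
                   ("stearate", "C18-ceramide")]

      generalized = defaultdict(set)
      for substrate, product in reactions:
          key = (classes[substrate], classes[product])
          generalized[key].add((substrate, product))

      for (src, dst), members in generalized.items():
          print(f"{src} -> {dst}  (generalizes {len(members)} reactions)")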

  13. Multicriteria plan optimization in the hands of physicians: a pilot study in prostate cancer and brain tumors.

    PubMed

    Müller, Birgit S; Shih, Helen A; Efstathiou, Jason A; Bortfeld, Thomas; Craft, David

    2017-11-06

    The purpose of this study was to demonstrate the feasibility of physician driven planning in intensity modulated radiotherapy (IMRT) with a multicriteria optimization (MCO) treatment planning system and template based plan optimization. Exploiting the full planning potential of MCO navigation, this alternative planning approach intends to improve planning efficiency and individual plan quality. Planning was retrospectively performed on 12 brain tumor and 10 post-prostatectomy prostate patients previously treated with MCO-IMRT. For each patient, physicians were provided with a template-based generated Pareto surface of optimal plans to navigate, using the beam angles from the original clinical plans. We compared physician generated plans to clinically delivered plans (created by dosimetrists) in terms of dosimetric differences, physician preferences and planning times. Plan qualities were similar, however physician generated and clinical plans differed in the prioritization of clinical goals. Physician derived prostate plans showed significantly better sparing of the high dose rectum and bladder regions (p(D1) < 0.05; D1: dose received by 1% of the corresponding structure). Physicians' brain tumor plans indicated higher doses for targets and brainstem (p(D1) < 0.05). Within blinded plan comparisons physicians preferred the clinical plans more often (brain: 6:3 out of 12, prostate: 2:6 out of 10) (not statistically significant). While times of physician involvement were comparable for prostate planning, the new workflow reduced the average involved time for brain cases by 30%. Planner times were reduced for all cases. Subjective benefits, such as a better understanding of planning situations, were observed by clinicians through the insight into plan optimization and experiencing dosimetric trade-offs. We introduce physician driven planning with MCO for brain and prostate tumors as a feasible planning workflow. The proposed approach standardizes the planning process by utilizing site specific templates and integrates physicians more tightly into treatment planning. Physicians' navigated plan qualities were comparable to the clinical plans. Given the reduction of planning time of the planner and the equal or lower planning time of physicians, this approach has the potential to improve departmental efficiencies.
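
    MCO navigation can be caricatured as sliding along convex combinations of pre-computed Pareto-optimal plans; the dose numbers below are synthetic, and a real MCO system navigates deliverable treatment plans rather than bare dose vectors.

      # Blending two Pareto-optimal plans: a slider weight trades target
      # coverage against organ sparing (all values invented).
      import numpy as np

      plan_target  = np.array([70.0, 68.0, 25.0])  # dose to [PTV, PTV, rectum]
      plan_sparing = np.array([64.0, 62.0, 12.0])

      for w in (0.0, 0.5, 1.0):                    # navigation slider position
          navigated = w * plan_target + (1 - w) * plan_sparing
          print(f"w={w:.1f}: PTV mean {navigated[:2].mean():.1f} Gy, "
                f"rectum {navigated[2]:.1f} Gy")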

  14. The logistics of an inpatient dermatology service.

    PubMed

    Rosenbach, Misha

    2017-03-01

    Inpatient dermatology represents a unique challenge as caring for hospitalized patients with skin conditions is different from most dermatologists' daily outpatient practice. Declining rates of inpatient dermatology participation are often attributed to a number of factors, including challenges navigating the administrative burdens of hospital credentialing, acclimating to different hospital systems involving potential alternate electronic medical records systems, medical-legal concerns, and reimbursement concerns. This article aims to provide basic guidelines to help dermatologists establish a presence as a consulting physician in the inpatient hospital-based setting. The emphasis is on identifying potential pitfalls, problematic areas, and laying out strategies for tackling some of the challenges of inpatient dermatology including balancing financial concerns and optimizing reimbursements, tracking data and developing a plan for academic productivity, optimizing workflow, and identifying metrics to document the impact of an inpatient dermatology consult service. ©2017 Frontline Medical Communications.

  15. Toward Intraoperative Image-Guided Transoral Robotic Surgery

    PubMed Central

    Liu, Wen P.; Reaugamornrat, Sureerat; Deguet, Anton; Sorger, Jonathan M.; Siewerdsen, Jeffrey H.; Richmon, Jeremy; Taylor, Russell H.

    2014-01-01

This paper presents the development and evaluation of video augmentation on the stereoscopic da Vinci S system with intraoperative image guidance for base of tongue tumor resection in transoral robotic surgery (TORS). The proposed workflow for image-guided TORS begins by identifying and segmenting critical oropharyngeal structures (e.g., the tumor and adjacent arteries and nerves) from preoperative computed tomography (CT) and/or magnetic resonance (MR) imaging. These preoperative planning data can be deformably registered to the intraoperative endoscopic view using mobile C-arm cone-beam computed tomography (CBCT) [1, 2]. Augmentation of the TORS endoscopic video to delineate surgical targets and critical structures has the potential to improve navigation, spatial orientation, and confidence in tumor resection. Experiments in animal specimens achieved a statistically significant improvement in target localization error when comparing the proposed image guidance system to simulated current practice. PMID:25525474

  16. Design and implementation of a secure workflow system based on PKI/PMI

    NASA Astrophysics Data System (ADS)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

The traditional workflow system has the following weaknesses in privilege management: low privilege-management efficiency, an overburdened administrator, and a lack of a trust authority. After an in-depth study of the security requirements of workflow systems, a secure workflow model based on PKI/PMI is proposed. This model can achieve static and dynamic authorization by verifying a user's identity through a PKC and validating the user's privilege information by using an AC in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security but also ensures the integrity, confidentiality, availability and non-repudiation of the data in the system.

  17. Generic worklist handler for workflow-enabled products

    NASA Astrophysics Data System (ADS)

    Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas

    1999-07-01

Workflow management (WfM) is an emerging field of medical information technology. It appears to be a promising key technology for modeling, optimizing and automating processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic worklist handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded worklist handlers will be called 'workflow-enabled application systems'. In this paper we discuss the functional requirements of worklist handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards as defined by the Workflow Management Coalition are briefly reviewed.
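
    A minimal sketch of such a worklist-handler interface, assuming a fetch/claim/complete life cycle loosely inspired by WfMC terminology; the method names and work-item structure are assumptions, not the paper's specification.

      # Generic worklist handler between an enactment service and an
      # application (all names and fields are illustrative).

      class EnactmentService:
          items = [{"task": "read CT study", "role": "radiologist",
                    "state": "offered"}]

      class WorklistHandler:
          def __init__(self, enactment_service):
              self.service = enactment_service

          def fetch(self, participant):
              return [w for w in self.service.items
                      if w["role"] == participant and w["state"] == "offered"]

          def claim(self, item):
              item["state"] = "claimed"

          def complete(self, item, result):
              item["state"] = "completed"
              item["result"] = result

      handler = WorklistHandler(EnactmentService())
      item = handler.fetch("radiologist")[0]
      handler.claim(item)
      handler.complete(item, "report dictated")
      print(item)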

  18. Targeting Accuracy, Procedure Times and User Experience of 240 Experimental MRI Biopsies Guided by a Clinical Add-On Navigation System

    PubMed Central

    Busse, Harald; Riedel, Tim; Garnov, Nikita; Thörmer, Gregor; Kahn, Thomas; Moche, Michael

    2015-01-01

    Objectives MRI is of great clinical utility for the guidance of special diagnostic and therapeutic interventions. The majority of such procedures are performed iteratively ("in-and-out") in standard, closed-bore MRI systems with control imaging inside the bore and needle adjustments outside the bore. The fundamental limitations of such an approach have led to the development of various assistance techniques, from simple guidance tools to advanced navigation systems. The purpose of this work was to thoroughly assess the targeting accuracy, workflow and usability of a clinical add-on navigation solution on 240 simulated biopsies by different medical operators. Methods Navigation relied on a virtual 3D MRI scene with real-time overlay of the optically tracked biopsy needle. Smart reference markers on a freely adjustable arm ensured proper registration. Twenty-four operators – attending (AR) and resident radiologists (RR) as well as medical students (MS) – performed well-controlled biopsies of 10 embedded model targets (mean diameter: 8.5 mm, insertion depths: 17-76 mm). Targeting accuracy, procedure times and 13 Likert scores on system performance were determined (strong agreement: 5.0). Results Differences in diagnostic success rates (AR: 93%, RR: 88%, MS: 81%) were not significant. In contrast, between-group differences in biopsy times (AR: 4:15, RR: 4:40, MS: 5:06 min:sec) differed significantly (p<0.01). Mean overall rating was 4.2. The average operator would use the system again (4.8) and stated that the outcome justifies the extra effort (4.4). Lowest agreement was reported for the robustness against external perturbations (2.8). Conclusions The described combination of optical tracking technology with an automatic MRI registration appears to be sufficiently accurate for instrument guidance in a standard (closed-bore) MRI environment. High targeting accuracy and usability was demonstrated on a relatively large number of procedures and operators. Between groups with different expertise there were significant differences in experimental procedure times but not in the number of successful biopsies. PMID:26222443

  19. Self-contained image mapping of placental vasculature in 3D ultrasound-guided fetoscopy.

    PubMed

    Yang, Liangjing; Wang, Junchen; Ando, Takehiro; Kubota, Akihiro; Yamashita, Hiromasa; Sakuma, Ichiro; Chiba, Toshio; Kobayashi, Etsuko

    2016-09-01

    Surgical navigation technology directed at fetoscopic procedures is relatively underdeveloped compared with other forms of endoscopy. The narrow fetoscopic field of views and the vast vascular network on the placenta make examination and photocoagulation treatment of twin-to-twin transfusion syndrome challenging. Though ultrasonography is used for intraoperative guidance, its navigational ability is not fully exploited. This work aims to integrate 3D ultrasound imaging and endoscopic vision seamlessly for placental vasculature mapping through a self-contained framework without external navigational devices. This is achieved through development, integration, and experimentation of novel navigational modules. Firstly, a framework design that addresses the current limitations based on identified gaps is conceptualized. Secondly, integration of navigational modules including (1) ultrasound-based localization, (2) image alignment, and (3) vision-based tracking to update the scene texture map is implemented. This updated texture map is projected to an ultrasound-constructed 3D model for photorealistic texturing of the 3D scene creating a panoramic view of the moving fetoscope. In addition, a collaborative scheme for the integration of the modular workflow system is proposed to schedule updates in a systematic fashion. Finally, experiments are carried out to evaluate each modular variation and an integrated collaborative scheme of the framework. The modules and the collaborative scheme are evaluated through a series of phantom experiments with controlled trajectories for repeatability. The collaborative framework demonstrated the best accuracy (5.2 % RMS error) compared with all the three single-module variations during the experiment. Validation on an ex vivo monkey placenta shows visual continuity of the freehand fetoscopic panorama. The proposed developed collaborative framework and the evaluation study of the framework variations provide analytical insights for effective integration of ultrasonography and endoscopy. This contributes to the development of navigation techniques in fetoscopic procedures and can potentially be extended to other applications in intraoperative imaging.

  20. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    NASA Astrophysics Data System (ADS)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.
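
    As a concrete example of applying 'the right color transformation at the right point', the snippet below converts an image between ICC profiles with Pillow's ImageCms module; the CMYK profile path is hypothetical and must point to a real .icc file on your system.

      # Controlled profile-to-profile conversion at a chosen workflow step,
      # rather than an uncontrolled default applied later in the chain.
      from PIL import Image, ImageCms

      im = Image.new("RGB", (64, 64), (200, 30, 30))
      srgb = ImageCms.createProfile("sRGB")

      # "press_cmyk.icc" is a placeholder for an actual output profile.
      cmyk = ImageCms.profileToProfile(im, srgb, "press_cmyk.icc",
                                       outputMode="CMYK")
      print(cmyk.mode)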

  1. A Mobile App (BEDSide Mobility) to Support Nurses’ Tasks at the Patient's Bedside: Usability Study

    PubMed Central

    Weinhold, Thomas; Joe, Jonathan; Lovis, Christian; Blondon, Katherine

    2018-01-01

    Background The introduction of clinical information systems has increased the amount of clinical documentation. Although this documentation generally improves patient safety, it has become a time-consuming task for nurses, which limits their time with the patient. On the basis of a user-centered methodology, we have developed a mobile app named BEDSide Mobility to support nurses in their daily workflow and to facilitate documentation at the bedside. Objective The aim of the study was to assess the usability of the BEDSide Mobility app in terms of the navigation and interaction design through usability testing. Methods Nurses were asked to complete a scenario reflecting their daily work with patients. Their interactions with the app were captured with eye-tracking glasses and by using the think aloud protocol. After completing the tasks, participants filled out the system usability scale questionnaire. Descriptive statistics were used to summarize task completion rates and the users’ performance. Results A total of 10 nurses (aged 21-50) participated in the study. Overall, they were satisfied with the navigation, layout, and interaction design of the app, with the exception of one user who was unfamiliar with smartphones. The problems identified were related to the ambiguity of some icons, the navigation logic, and design inconsistency. Conclusions Besides the usability issues identified in the app, the participants’ results do indicate good usability, high acceptance, and high satisfaction with the developed app. However, the results must be taken with caution because of the poor ecological validity of the experimental setting. PMID:29563074

  2. Development of a novel imaging informatics-based system with an intelligent workflow engine (IWEIS) to support imaging-based clinical trials

    PubMed Central

    Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J

    2015-01-01

Imaging-based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, software development of custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging-based clinical trials. The system enables a project coordinator to build a data collection and management system specifically related to the study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging-related studies. The system was initially evaluated in an imaging-based rehabilitation clinical trial. The evaluation shows that development costs can be much reduced compared with a custom-built system. By providing a solution to customize a system and automate the workflow, the system saves development time and reduces errors, especially for imaging clinical trials. PMID:25870169

  3. Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems

    DOE PAGES

    Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...

    2016-07-21

The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc; they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (a mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.

  4. Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrix, Valerie; Fox, James; Ghoshal, Devarshi

The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc; they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (a mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
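
    The template idea can be mimicked generically in a few lines; this is a conceptual sketch only, not the actual Tigres API.

      # Two of the four template shapes: sequence chains task outputs,
      # parallel fans the same input out to several tasks.
      from concurrent.futures import ThreadPoolExecutor

      def sequence(tasks, data):
          for task in tasks:            # output of one task feeds the next
              data = task(data)
          return data

      def parallel(tasks, data):
          with ThreadPoolExecutor() as pool:
              return list(pool.map(lambda t: t(data), tasks))

      result = sequence([lambda x: x + 1, lambda x: x * 2], 3)        # -> 8
      chunks = parallel([lambda x: x - 1, lambda x: x ** 2], result)  # -> [7, 64]
      print(result, chunks)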

  5. Worklist handling in workflow-enabled radiological application systems

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part, for end-users, of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user-friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists - if present at all - are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.

  6. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms.

    PubMed

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2014-01-01

With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies.
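
    One plausible heuristic in the spirit of the paper, though not one of its four published algorithms, is to pick the cheapest VM type whose estimated runtime still meets the request's deadline; the VM catalog below is invented.

      # Greedy deadline-aware VM selection (illustrative figures only).

      VM_TYPES = [  # (name, relative speed, $ per hour)
          ("small", 1.0, 0.05), ("medium", 2.0, 0.12), ("large", 4.0, 0.30),
      ]

      def schedule(workload_hours, deadline_hours):
          candidates = []
          for name, speed, price in VM_TYPES:
              runtime = workload_hours / speed
              if runtime <= deadline_hours:
                  candidates.append((runtime * price, runtime, name))
          if not candidates:
              return None  # no single VM type meets the deadline
          cost, runtime, name = min(candidates)
          return name, runtime, cost

      print(schedule(workload_hours=8, deadline_hours=3))  # ('large', 2.0, 0.6)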

  7. Workflow based framework for life science informatics.

    PubMed

    Tiwari, Abhishek; Sekhar, Arvind K T

    2007-10-01

Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) that facilitates knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Applications of workflow technology have been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.

  8. Modelling and analysis of workflow for lean supply chains

    NASA Astrophysics Data System (ADS)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of the collaborating business process between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, this article illustrates the use of the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSCs. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operation management of LSCs in real-world settings.
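
    The underlying workflow-net semantics can be sketched with a plain Petri net (the labels and timing intervals of LTPNs are omitted; the places and transitions are invented):

      # A transition is enabled when all input places hold tokens;
      # firing consumes input tokens and produces output tokens.

      marking = {"received": 1, "checked": 0, "shipped": 0}
      transitions = {
          "check": (["received"], ["checked"]),  # (inputs, outputs)
          "ship":  (["checked"],  ["shipped"]),
      }

      def enabled(t):
          ins, _ = transitions[t]
          return all(marking[p] > 0 for p in ins)

      def fire(t):
          ins, outs = transitions[t]
          assert enabled(t), f"{t} is not enabled"
          for p in ins:
              marking[p] -= 1
          for p in outs:
              marking[p] += 1

      fire("check")
      fire("ship")
      print(marking)  # {'received': 0, 'checked': 0, 'shipped': 1}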

  9. Biowep: a workflow enactment portal for bioinformatics applications.

    PubMed

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling these researchers to profit from new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available online. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, and the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics - LITBIO.

  10. Biowep: a workflow enactment portal for bioinformatics applications

    PubMed Central

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-01-01

Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling these researchers to profit from new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available online. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, and the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics – LITBIO. PMID:17430563

  11. The standard-based open workflow system in GeoBrain (Invited)

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, workflows exist in the form of data and service types defined by ontologies. Workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate ISO 19115-compliant complete product provenance metadata before and after workflow execution. Generating provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119 - transparent, translucent, and opaque - are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models is conducted. Automated workflow composition based on ontologies and artificial intelligence technology has also been demonstrated successfully. The GeoBrain workflow system has been used in multiple Earth science applications, including the monitoring of global agricultural drought, the assessment of flood damage, the derivation of national crop condition and progress information, and the detection of nuclear proliferation facilities and events.

  12. Visual tracking for multi-modality computer-assisted image guidance

    NASA Astrophysics Data System (ADS)

    Basafa, Ehsan; Foroughi, Pezhman; Hossbach, Martin; Bhanushali, Jasmine; Stolka, Philipp

    2017-03-01

    With optical cameras, many interventional navigation tasks previously relying on EM, optical, or mechanical guidance can be performed robustly, quickly, and conveniently. We developed a family of novel guidance systems based on wide-spectrum cameras and vision algorithms for real-time tracking of interventional instruments and multi-modality markers. These navigation systems support the localization of anatomical targets, support placement of imaging probe and instruments, and provide fusion imaging. The unique architecture - low-cost, miniature, in-hand stereo vision cameras fitted directly to imaging probes - allows for an intuitive workflow that fits a wide variety of specialties such as anesthesiology, interventional radiology, interventional oncology, emergency medicine, urology, and others, many of which see increasing pressure to utilize medical imaging and especially ultrasound, but have yet to develop the requisite skills for reliable success. We developed a modular system, consisting of hardware (the Optical Head containing the mini cameras) and software (components for visual instrument tracking with or without specialized visual features, fully automated marker segmentation from a variety of 3D imaging modalities, visual observation of meshes of widely separated markers, instant automatic registration, and target tracking and guidance on real-time multi-modality fusion views). From these components, we implemented a family of distinct clinical and pre-clinical systems (for combinations of ultrasound, CT, CBCT, and MRI), most of which have international regulatory clearance for clinical use. We present technical and clinical results on phantoms, ex- and in-vivo animals, and patients.
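
    The 'instant automatic registration' step is typically a least-squares rigid alignment between marker positions segmented from the 3D image and the same markers seen by the cameras; the sketch below uses the standard Kabsch algorithm on synthetic points and is not the product's proprietary implementation.

      # Least-squares rigid registration (Kabsch) between paired 3D points.
      import numpy as np

      def rigid_register(image_pts, camera_pts):
          """Return R, t minimizing ||R @ image + t - camera||."""
          ci, cc = image_pts.mean(axis=0), camera_pts.mean(axis=0)
          H = (image_pts - ci).T @ (camera_pts - cc)
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          t = cc - R @ ci
          return R, t

      image_pts = np.array([[0., 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]])
      theta = np.deg2rad(20)
      R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                         [np.sin(theta),  np.cos(theta), 0],
                         [0, 0, 1]])
      camera_pts = image_pts @ R_true.T + np.array([5., -3, 12])

      R, t = rigid_register(image_pts, camera_pts)
      print(np.allclose(R, R_true), np.round(t, 3))  # True [ 5. -3. 12.]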

  13. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms

    PubMed Central

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2017-01-01

With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237

  14. Motorization of a surgical microscope for intra-operative navigation and intuitive control.

    PubMed

    Finke, M; Schweikard, A

    2010-09-01

During surgical procedures, various medical systems, e.g. a microscope or C-arm, are used. Their precise and repeatable manual positioning can be very cumbersome and interrupts the surgeon's workflow. Robotized systems can assist the surgeon, but they require suitable kinematics and control. However, positioning must be fast, flexible and intuitive. We describe a fully motorized surgical microscope. Hardware components as well as implemented applications are specified. The kinematic equations are described and a novel control concept is proposed. Our microscope combines fast manual handling with accurate, automatic positioning. Intuitive control is provided by a small remote control mounted on one of the surgical instruments. Positioning accuracy and repeatability are < 1 mm, and vibrations caused by automatic movements fade away in about 1 s. The robotic system assists the surgeon so that he can position the microscope precisely and repeatedly without interrupting the clinical workflow. The combination of manual and automatic control guarantees fast and flexible positioning during surgical procedures. Copyright 2010 John Wiley & Sons, Ltd.

  15. Workflow and Electronic Health Records in Small Medical Practices

    PubMed Central

    Ramaiah, Mala; Subrahmanian, Eswaran; Sriram, Ram D; Lide, Bettijoyce B

    2012-01-01

    This paper analyzes the workflow and implementation of electronic health record (EHR) systems across different functions in small physician offices. We characterize the differences in the offices based on the levels of computerization in terms of workflow, sources of time delay, and barriers to using EHR systems to support the entire workflow. The study was based on a combination of questionnaires, interviews, in situ observations, and data collection efforts. This study was not intended to be a full-scale time-and-motion study with precise measurements but was intended to provide an overview of the potential sources of delays while performing office tasks. The study follows an interpretive model of case studies rather than a large-sample statistical survey of practices. To identify time-consuming tasks, workflow maps were created based on the aggregated data from the offices. The results from the study show that specialty physicians are more favorable toward adopting EHR systems than primary care physicians are. The barriers to adoption of EHR systems by primary care physicians can be attributed to the complex workflows that exist in primary care physician offices, leading to nonstandardized workflow structures and practices. Also, primary care physicians would benefit more from EHR systems if the systems could interact with external entities. PMID:22737096

  16. The impact of computerized provider order entry systems on inpatient clinical workflow: a literature review.

    PubMed

    Niazkhani, Zahra; Pirnejad, Habibollah; Berg, Marc; Aarts, Jos

    2009-01-01

Previous studies have shown the importance of workflow issues in the implementation of CPOE systems and patient safety practices. To understand the impact of CPOE on clinical workflow, we developed a conceptual framework and conducted a literature search for CPOE evaluations between 1990 and June 2007. Fifty-one publications were identified that disclosed mixed effects of CPOE systems. Among the frequently reported workflow advantages were legible orders, remote accessibility of the systems, and shorter order turnaround times. Among the frequently reported disadvantages were time-consuming and problematic user-system interactions, and the enforcement of a predefined relationship between clinical tasks and between providers. Given the diversity of findings in the literature, we conclude that more multi-method research is needed to explore CPOE's multidimensional and collective impact, especially on collaborative workflow.

  17. A Mobile App (BEDSide Mobility) to Support Nurses' Tasks at the Patient's Bedside: Usability Study.

    PubMed

    Ehrler, Frederic; Weinhold, Thomas; Joe, Jonathan; Lovis, Christian; Blondon, Katherine

    2018-03-21

    The introduction of clinical information systems has increased the amount of clinical documentation. Although this documentation generally improves patient safety, it has become a time-consuming task for nurses, which limits their time with the patient. On the basis of a user-centered methodology, we have developed a mobile app named BEDSide Mobility to support nurses in their daily workflow and to facilitate documentation at the bedside. The aim of the study was to assess the usability of the BEDSide Mobility app in terms of the navigation and interaction design through usability testing. Nurses were asked to complete a scenario reflecting their daily work with patients. Their interactions with the app were captured with eye-tracking glasses and by using the think aloud protocol. After completing the tasks, participants filled out the system usability scale questionnaire. Descriptive statistics were used to summarize task completion rates and the users' performance. A total of 10 nurses (aged 21-50) participated in the study. Overall, they were satisfied with the navigation, layout, and interaction design of the app, with the exception of one user who was unfamiliar with smartphones. The problems identified were related to the ambiguity of some icons, the navigation logic, and design inconsistency. Besides the usability issues identified in the app, the participants' results do indicate good usability, high acceptance, and high satisfaction with the developed app. However, the results must be taken with caution because of the poor ecological validity of the experimental setting. ©Frederic Ehrler, Thomas Weinhold, Jonathan Joe, Christian Lovis, Katherine Blondon. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 21.03.2018.

  18. Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.

    PubMed

    Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher

    2012-12-01

    As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around-the-clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction of 94% from 2008 to 2011 in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution. The impact of unplanned downtimes was reduced by 72% while the impact of planned downtimes was reduced by 99.66% over the same period. Additionally, more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 utilizing the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized, if not eliminated entirely.

  19. A Model of Workflow Composition for Emergency Management

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    Commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. This paper proposes a novel model for defining emergency plans, in which workflow segments appear as constituent parts. A formal abstraction containing four operations is defined to compose workflow segments under constraint rules. The software system for constructing and composing business process resources has been implemented and integrated into the Emergency Plan Management Application System.
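
    The record does not name the four composition operations, so the sketch below is purely illustrative: it assumes a typical operation set (sequence and parallel are shown) and a constraint-rule hook that gates every composition. All names are invented.

        # Hypothetical sketch of composing workflow segments under a constraint
        # rule; the operation set is assumed, not taken from the paper.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Segment:
            """A reusable workflow segment, the constituent part of an emergency plan."""
            name: str
            steps: List[str] = field(default_factory=list)

        def sequence(a: Segment, b: Segment) -> Segment:
            # Run b strictly after a.
            return Segment(f"({a.name} ; {b.name})", a.steps + b.steps)

        def parallel(a: Segment, b: Segment) -> Segment:
            # Run a and b concurrently.
            return Segment(f"({a.name} || {b.name})", a.steps + b.steps)

        def compose(a, b, op, rule):
            # A constraint rule gates every composition, mirroring the
            # "composition under constraint rule" idea in the abstract.
            if not rule(a, b):
                raise ValueError(f"constraint violated: {a.name} + {b.name}")
            return op(a, b)

        evacuate = Segment("evacuate", ["alert", "route", "assemble"])
        triage = Segment("triage", ["assess", "tag", "dispatch"])
        plan = compose(evacuate, triage, parallel, rule=lambda a, b: a.name != b.name)
        print(plan.name)  # (evacuate || triage)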

  20. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  1. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE PAGES

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...

    2015-07-14

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  2. 2D–3D radiograph to cone-beam computed tomography (CBCT) registration for C-arm image-guided robotic surgery

    PubMed Central

    Liu, Wen Pei; Otake, Yoshito; Azizian, Mahdi; Wagner, Oliver J.; Sorger, Jonathan M.; Armand, Mehran; Taylor, Russell H.

    2015-01-01

    Purpose C-arm radiographs are commonly used for intraoperative image guidance in surgical interventions. Fluoroscopy is a cost-effective real-time modality, although image quality can vary greatly depending on the target anatomy. Cone-beam computed tomography (CBCT) scans are sometimes available, so 2D–3D registration is needed for intra-procedural guidance. C-arm radiographs were registered to CBCT scans and used for 3D localization of peritumor fiducials during a minimally invasive thoracic intervention with a da Vinci Si robot. Methods Intensity-based 2D–3D registration of intraoperative radiographs to CBCT was performed. The feasible range of X-ray projections achievable by a C-arm positioned around a da Vinci Si surgical robot, configured for robotic wedge resection, was determined using phantom models. Experiments were conducted on synthetic phantoms and animals imaged with an OEC 9600 and a Siemens Artis zeego, representing the spectrum of different C-arm systems currently available for clinical use. Results The image guidance workflow was feasible using either an optically tracked OEC 9600 or a Siemens Artis zeego C-arm, resulting in an angular difference of Δθ ≈ 30°. The two C-arm systems provided TREmean ≤ 2.5 mm and TREmean ≤ 2.0 mm, respectively (i.e., comparable to standard clinical intraoperative navigation systems). Conclusions C-arm 3D localization from dual 2D–3D registered radiographs was feasible and applicable for intraoperative image guidance during da Vinci robotic thoracic interventions using the proposed workflow. Tissue deformation and in vivo experiments are required before clinical evaluation of this system. PMID:25503592

  3. The future of scientific workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Peterka, Tom; Altintas, Ilkay

    Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE’s science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, workflow needs and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.

  4. A three-level atomicity model for decentralized workflow management systems

    NASA Astrophysics Data System (ADS)

    Ben-Shaul, Israel Z.; Heineman, George T.

    1996-12-01

    A workflow management system (WFMS) employs a workflow manager (WM) to execute and automate the various activities within a workflow. To protect the consistency of data, the WM encapsulates each activity with a transaction; a transaction manager (TM) then guarantees the atomicity of activities. Since workflows often group several activities together, the TM is responsible for guaranteeing the atomicity of these units. There are scalability issues, however, with centralized WFMSs. Decentralized WFMSs provide an architecture for multiple autonomous WFMSs to interoperate, thus accommodating multiple workflows and geographically-dispersed teams. When atomic units are composed of activities spread across multiple WFMSs, however, there is a conflict between global atomicity and local autonomy of each WFMS. This paper describes a decentralized atomicity model that enables workflow administrators to specify the scope of multi-site atomicity based upon the desired semantics of multi-site tasks in the decentralized WFMS. We describe an architecture that realizes our model and execution paradigm.
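
    A minimal sketch of what declaring these atomicity levels could look like, assuming the levels range from single-activity transactions to explicitly scoped multi-site units; the class, level, and method names are invented, not the paper's API.

        # Illustrative sketch of atomicity scopes in a decentralized WFMS.
        from enum import Enum

        class Atomicity(Enum):
            ACTIVITY = 1    # one activity, one local transaction
            LOCAL_UNIT = 2  # several activities, atomic within a single WFMS
            MULTI_SITE = 3  # activities spread across autonomous WFMSs

        class AtomicUnit:
            def __init__(self, level, activities, sites):
                self.level = level
                self.activities = activities
                self.sites = set(sites)

            def requires_global_commit(self):
                # Only multi-site units need cross-WFMS coordination -- exactly
                # where global atomicity conflicts with local autonomy, so the
                # administrator declares this scope deliberately.
                return self.level is Atomicity.MULTI_SITE and len(self.sites) > 1

        unit = AtomicUnit(Atomicity.MULTI_SITE,
                          activities=["review", "sign-off"],
                          sites=["site-A", "site-B"])
        print(unit.requires_global_commit())  # True -> coordination protocol needed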

  5. Asterism: an integrated, complete, and open-source approach for running seismologist continuous data-intensive analysis on heterogeneous systems

    NASA Astrophysics Data System (ADS)

    Ferreira da Silva, R.; Filgueira, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present Asterism, an open source data-intensive framework, which combines the Pegasus and dispel4py workflow systems. Asterism aims to simplify the effort required to develop data-intensive applications that run across multiple heterogeneous resources, without users having to: re-formulate their methods according to different enactment systems; manage the data distribution across systems; parallelize their methods; co-place and schedule their methods with computing resources; and store and transfer large/small volumes of data. Asterism's key element is to leverage the strengths of each workflow system: dispel4py allows developing scientific applications locally and then automatically parallelize and scale them on a wide range of HPC infrastructures with no changes to the application's code; Pegasus orchestrates the distributed execution of applications while providing portability, automated data management, recovery, debugging, and monitoring, without users needing to worry about the particulars of the target execution systems. Asterism leverages the level of abstractions provided by each workflow system to describe hybrid workflows where no information about the underlying infrastructure is required beforehand. The feasibility of Asterism has been evaluated using the seismic ambient noise cross-correlation application, a common data-intensive analysis pattern used by many seismologists. The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The Asterism workflow is implemented as a Pegasus workflow composed of two tasks (Phase1 and Phase2), where each phase represents a dispel4py workflow. Pegasus tasks describe the in/output data at a logical level, the data dependency between tasks, and the e-Infrastructures and the execution engine to run each dispel4py workflow. We have instantiated the workflow using data from 1000 stations from the IRIS services, and run it across two heterogeneous resources described as Docker containers: MPI (Container2) and Storm (Container3) clusters (Figure 1). Each dispel4py workflow is mapped to a particular execution engine, and data transfers between resources are automatically handled by Pegasus. Asterism is freely available online at http://github.com/dispel4py/pegasus_dispel4py.

  6. Intraoperative computed tomography with integrated navigation system in spinal stabilizations.

    PubMed

    Zausinger, Stefan; Scheder, Ben; Uhl, Eberhard; Heigl, Thomas; Morhard, Dominik; Tonn, Joerg-Christian

    2009-12-15

    STUDY DESIGN: A prospective interventional case-series study plus a retrospective analysis of historical patients for comparison of data. OBJECTIVE: To evaluate workflow, feasibility, and clinical outcome of navigated stabilization procedures with data acquisition by intraoperative computed tomography. SUMMARY OF BACKGROUND DATA: Routine fluoroscopy to assess pedicle screw placement is not consistently reliable. Our hypothesis was that image-guided spinal navigation using an intraoperative CT scanner can improve the safety and precision of spinal stabilization surgery. METHODS: CT data of 94 patients (thoracolumbar [n = 66], C1/2 [n = 12], cervicothoracic instability [n = 16]) were acquired after positioning the patient in the final surgical position. A sliding gantry 40-slice CT was used for image acquisition. Data were imported to a frameless infrared-based neuronavigation workstation. Intraoperative CT was obtained to assess the accuracy of instrumentation and, if necessary, the extent of decompression. All patients were clinically evaluated by Odom criteria after surgery and after 3 months. RESULTS: Computed accuracy of the navigation system reached <2 mm (0.95 ± 0.3 mm) in all cases. Additional time necessary for the preoperative image acquisition including data transfer was 14 ± 5 minutes. The duration of interrupting the surgical process for iCT until resumption of surgery was 9 ± 2.5 minutes. Control iCT revealed incorrect screw position ≥2 mm without persistent neurologic or vascular damage in 20/414 screws (4.8%), leading to immediate correction of 10 screws (2.4%). Control iCT changed the course of surgery in 8 cases (8.5% of all patients). The overall revision rate was 8.5% (4 wound revisions, 2 CSF fistulas, and 2 epidural hematomas). There was no reoperation due to implant malposition. According to Odom criteria, all patients experienced a clinical improvement. A retrospective analysis of 182 patients with navigated thoracolumbar transpedicular stabilizations in the pre-iCT era revealed an overall revision rate of 10.4%, with 4.4% of patients requiring screw revision. CONCLUSION: Intraoperative CT in combination with neuronavigation provides high accuracy of screw placement and thus safety for patients undergoing spinal stabilization. Reoperations due to implant malposition could be completely avoided. The system can be installed into a pre-existing operating environment without need for special surgical instruments. The procedure is rapid and easy to perform without restricted access to the patient and, by replacing pre- and postoperative imaging, is not associated with additional exposure to radiation. Multidisciplinary use increases utilization of the system and thus improves the cost-efficiency relation.

  7. Adaptive Texture Synthesis for Large Scale City Modeling

    NASA Astrophysics Data System (ADS)

    Despine, G.; Colleu, T.

    2015-02-01

    Large scale city models textured with aerial images are well suited for bird's-eye navigation, but generally the image resolution does not allow pedestrian navigation. One solution to this problem is to use high-resolution terrestrial photos, but these require a huge amount of manual work to remove occlusions. Another solution is to synthesize generic textures with a set of procedural rules and elementary patterns like bricks, roof tiles, doors and windows. This solution may give realistic textures but with no correlation to the ground truth. Instead of using pure procedural modelling, we present a method to extract information from aerial images and adapt the texture synthesis to each building. We describe a workflow allowing the user to drive the information extraction and to select the appropriate texture patterns. We also emphasize the importance of organizing the knowledge about elementary patterns in a texture catalogue, which allows attaching physical information and semantic attributes and executing selection requests. Roofs are processed according to the detected building material. Façades are first described in terms of principal colours, then opening positions are detected and some window features are computed. These features allow selecting the most appropriate patterns from the texture catalogue. We experimented with this workflow on two samples with 20 cm and 5 cm resolution images. The roof texture synthesis and opening detection were successfully conducted on hundreds of buildings. The window characterization is still sensitive to the distortions inherent in the projection of aerial images onto the facades.

  8. Experimental evaluation of a flexible I/O architecture for accelerating workflow engines in ultrascale environments

    DOE PAGES

    Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin; ...

    2016-10-06

    The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules—an in-memory data store—with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.

  9. Scientist-Centered Workflow Abstractions via Generic Actors, Workflow Templates, and Context-Awareness for Groundwater Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.

    2011-07-04

    A drawback of existing scientific workflow systems is the lack of support to domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.

  10. Development of the workflow kine systems for support on KAIZEN.

    PubMed

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of the workflow line investigation, we considered the anticipated effects and the problems for KAIZEN. Workflow line information included location information and action content information. These technologies suggest viewpoints to help improvement, for example, the exclusion of useless movement, the redesign of layout, and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time; concretely, a more efficient layout was suggested by this system. In the case of a hospital, similarly, it was pointed out that the workflow had problems of layout and setup operations, based on the effective movement patterns of experts. This system could adapt to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  11. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreher, M.; Peterka, T.

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.
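
    The abstract mentions a simple Python API for describing the workflow graph; the sketch below guesses at the shape of such a description. The class and method names are hypothetical, not the actual Decaf API.

        # Hypothetical, Decaf-style workflow-graph description.
        class WorkflowGraph:
            def __init__(self):
                self.nodes, self.links = {}, []

            def add_node(self, name, cmd, procs):
                # Each task receives its own MPI resource allocation.
                self.nodes[name] = {"cmd": cmd, "procs": procs}

            def add_link(self, src, dst, procs, transform="forward"):
                # A dataflow link may run custom transform code on its own
                # resources, from plain forwarding to full redistribution.
                self.links.append({"src": src, "dst": dst,
                                   "procs": procs, "transform": transform})

        g = WorkflowGraph()
        g.add_node("simulation", cmd="./md_sim", procs=128)
        g.add_node("visualization", cmd="./viz", procs=16)
        g.add_link("simulation", "visualization", procs=8, transform="redistribute")
        # A message-driven runtime fires "visualization" once all of its incoming
        # messages have arrived; cycles are allowed, enabling steering loops.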

  12. Requirements for Workflow-Based EHR Systems - Results of a Qualitative Study.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2016-01-01

    Today's high quality healthcare delivery strongly relies on efficient electronic health records (EHR). These EHR systems or in general healthcare IT-systems are usually developed in a static manner according to a given workflow. Hence, they are not flexible enough to enable access to EHR data and to execute individual actions within a consultation. This paper reports on requirements pointed by experts in the domain of diabetes mellitus to design a system for supporting dynamic workflows to serve personalization within a medical activity. Requirements were collected by means of expert interviews. These interviews completed a conducted triangulation approach, aimed to gather requirements for workflow-based EHR interactions. The data from the interviews was analyzed through a qualitative approach resulting in a set of requirements enhancing EHR functionality from the user's perspective. Requirements were classified according to four different categorizations: (1) process-related requirements, (2) information needs, (3) required functions, (4) non-functional requirements. Workflow related requirements were identified which should be considered when developing and deploying EHR systems.

  13. Data Mining to Chart the Arctic: Analysis of Approaches to Incorporate Outside Source Data into NOAA Office of Coast Survey Workflow

    NASA Astrophysics Data System (ADS)

    Rennoll, V.

    2016-02-01

    The National Centers for Environmental Information provide public access to a wealth of seafloor mapping data, both from National Ocean Service hydrographic surveys and outside source collections. Utilizing the outside source data to improve nautical charts created by the National Oceanic and Atmospheric Administration (NOAA) is an appealing alternative to traditional surveys, largely in areas with significant data gaps where hydrographic surveys are not planned. However, much of the outside data are collected in transit lines and lack traditional overlapping main scheme lines and crosslines. Spanning multiple years and vessels, these transit line data collections were obtained using disparate operating procedures and have inconsistent qualities. Here, a workflow was developed to ingest these variable depth data within a defined region by assessing their quality and utility for nautical charting. The workflow was evaluated with a navigationally significant area in the Bering Sea, where bathymetric data collected from ten vessels over a period of twelve years were available. The outside data were shown to be of sufficient quality through comparisons with existing NOAA surveys and then used to demonstrate where the data could provide new or updated information on nautical charts, and to provide reconnaissance for future hydrographic planning. The utility assessment of the data, however, was hindered by the lack of a verified survey-scale sounding database against which the outside source data could be compared. Having developed the workflow, it is recommended that further outside data be ingested by NOAA's Office of Coast Survey and that a database be developed with full-scale chart soundings for outside data comparisons.

  14. Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.

    PubMed

    Hartman, Douglas J

    2015-06-01

    Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to enhance/improve pathologist workflow. These include voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in the diagnostic coding and the more prominent role of centralized electronic health records represent potential areas for increased ways to enhance/improve the workflow for surgical pathologists. Additional unforeseen changes to the pathologist workflow may accompany the introduction of whole-slide imaging technology to the routine diagnostic work. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.

    PubMed

    Hartman, Douglas J

    2016-03-01

    Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to enhance/improve pathologist workflow. These include voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in the diagnostic coding and the more prominent role of centralized electronic health records represent potential areas for increased ways to enhance/improve the workflow for surgical pathologists. Additional unforeseen changes to the pathologist workflow may accompany the introduction of whole-slide imaging technology to the routine diagnostic work. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Preliminary development of a workstation for craniomaxillofacial surgical procedures: introducing a computer-assisted planning and execution system.

    PubMed

    Gordon, Chad R; Murphy, Ryan J; Coon, Devin; Basafa, Ehsan; Otake, Yoshito; Al Rakan, Mohammed; Rada, Erin; Susarla, Srinivas; Swanson, Edward; Fishman, Elliot; Santiago, Gabriel; Brandacher, Gerald; Liacouras, Peter; Grant, Gerald; Armand, Mehran

    2014-01-01

    Facial transplantation represents one of the most complicated scenarios in craniofacial surgery because of skeletal, aesthetic, and dental discrepancies between donor and recipient. However, standard off-the-shelf vendor computer-assisted surgery systems may not provide custom features to mitigate the increased complexity of this particular procedure. We propose to develop a computer-assisted surgery solution customized for preoperative planning, intraoperative navigation including cutting guides, and dynamic, instantaneous feedback of cephalometric measurements/angles as needed for facial transplantation and other related craniomaxillofacial procedures. We developed the Computer-Assisted Planning and Execution (CAPE) workstation to assist with planning and execution of facial transplantation. Preoperative maxillofacial computed tomography (CT) scans were obtained on 4 size-mismatched miniature swine encompassing 2 live face-jaw-teeth transplants. The system was tested in a laboratory setting using plastic models of mismatched swine, after which the system was used in 2 live swine transplants. Postoperative CT imaging was obtained and compared with the preoperative plan and intraoperative measures from the CAPE workstation for both transplants. Plastic model tests familiarized the team with the CAPE workstation and identified several defects in the workflow. Live swine surgeries demonstrated utility of the CAPE system in the operating room, showing submillimeter registration error of 0.6 ± 0.24 mm and promising qualitative comparisons between intraoperative data and postoperative CT imaging. The initial development of the CAPE workstation demonstrated that integration of computer planning and intraoperative navigation for facial transplantation are possible with submillimeter accuracy. This approach can potentially improve preoperative planning, allowing ideal donor-recipient matching despite significant size mismatch, and accurate surgical execution for numerous types of craniofacial and orthognathic surgical procedures.

  17. 76 FR 71928 - Defense Federal Acquisition Regulation Supplement; Updates to Wide Area WorkFlow (DFARS Case 2011...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-21

    ... Defense Federal Acquisition Regulation Supplement; Updates to Wide Area WorkFlow (DFARS Case 2011-D027... Wide Area WorkFlow (WAWF) and TRICARE Encounter Data System (TEDS). WAWF, which electronically... civil emergencies, when access to Wide Area WorkFlow by those contractors is not feasible; (4) Purchases...

  18. Flexible Early Warning Systems with Workflows and Decision Tables

    NASA Astrophysics Data System (ADS)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    An essential part of early warning systems and systems for crisis management are decision support systems that facilitate communication and collaboration. Often official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automate parts of the process. We have experienced, however, that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a certain organization and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to the flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity. This implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise and previous work suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows are usually only suited for rigid processes. We show how improvements can be achieved by using decision tables and rule-based adaptive workflows. Decision tables have been shown to be an intuitive tool that can be used by domain experts to express rule sets that can be interpreted automatically at runtime. Adaptive workflows use a rule-based approach to increase the flexibility of workflows by providing mechanisms to adapt workflows based on context changes, human intervention and availability of services. The combination of workflows, decision tables and rule-based adaptation creates a framework that opens up new possibilities for flexible and adaptable workflows, especially for use in early warning and crisis management systems.
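
    As an illustration of why decision tables suit domain experts: each row pairs conditions with actions and is evaluated mechanically at runtime, so changing policy means editing rows rather than code. A minimal sketch with invented rows:

        # Minimal decision-table evaluator; rows and condition names are
        # invented examples, not from the paper.
        decision_table = [
            # conditions                            -> actions
            ({"hazard": "flood", "severity": "high"}, ["alert_public", "notify_mayor"]),
            ({"hazard": "flood", "severity": "low"},  ["notify_operator"]),
            ({"hazard": "storm",
              "population_affected": lambda n: n > 10000}, ["alert_public"]),
        ]

        def evaluate(table, context):
            """Return the actions of every row whose conditions all match."""
            fired = []
            for conditions, actions in table:
                if all(cond(context.get(key, 0)) if callable(cond)
                       else context.get(key) == cond
                       for key, cond in conditions.items()):
                    fired.extend(actions)
            return fired

        print(evaluate(decision_table, {"hazard": "flood", "severity": "high"}))
        # ['alert_public', 'notify_mayor'] -- a rule engine would feed these
        # actions back into the running workflow, adapting it without redeployment.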

  19. Opportunistic Computing with Lobster: Lessons Learned from Scaling up to 25k Non-Dedicated Cores

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Yannakopoulos, Anna; Tovar, Benjamin; Donnelly, Patrick; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

    We previously described Lobster, a workflow management tool for exploiting volatile opportunistic computing resources for computation in HEP. We will discuss the various challenges that have been encountered while scaling up the simultaneous CPU core utilization and the software improvements required to overcome these challenges. Categories: Workflows can now be divided into categories based on their required system resources. This allows the batch queueing system to optimize assignment of tasks to nodes with the appropriate capabilities. Within each category, limits can be specified for the number of running jobs to regulate the utilization of communication bandwidth. System resource specifications for a task category can now be modified while a project is running, avoiding the need to restart the project if resource requirements differ from the initial estimates. Lobster now implements time limits on each task category to voluntarily terminate tasks. This allows partially completed work to be recovered. Workflow dependency specification: One workflow often requires data from other workflows as input. Rather than waiting for earlier workflows to be completed before beginning later ones, Lobster now allows dependent tasks to begin as soon as sufficient input data has accumulated. Resource monitoring: Lobster utilizes a new capability in Work Queue to monitor the system resources each task requires in order to identify bottlenecks and optimally assign tasks. The capability of the Lobster opportunistic workflow management system for HEP computation has been significantly increased. We have demonstrated efficient utilization of 25 000 non-dedicated cores and achieved a data input rate of 30 Gb/s and an output rate of 500 GB/h. This has required new capabilities in task categorization, workflow dependency specification, and resource monitoring.
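
    The task categories described above amount to per-category resource specifications whose limits can be edited while a project runs; a hedged sketch of that pattern (keys and values are invented, not Lobster's actual configuration format):

        # Sketch of per-category resource specs for an opportunistic workflow
        # manager; not Lobster's real configuration.
        categories = {
            "simulation": {"cores": 4, "memory_mb": 8000, "wall_s": 3600,
                           "max_running": 2000},  # caps bandwidth-heavy work
            "merge":      {"cores": 1, "memory_mb": 2000, "wall_s": 900,
                           "max_running": 200},
        }

        def admit(category, running):
            """Admit a new task only while its category is under its limit."""
            return running.get(category, 0) < categories[category]["max_running"]

        # Limits are adjustable at runtime, so a bad initial estimate does not
        # force a project restart; the wall-time cap lets tasks terminate
        # voluntarily so partially completed work can be recovered.
        categories["simulation"]["max_running"] = 1500
        print(admit("simulation", {"simulation": 1499}))  # True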

  20. Inferring Clinical Workflow Efficiency via Electronic Medical Record Utilization

    PubMed Central

    Chen, You; Xie, Wei; Gunter, Carl A; Liebovitz, David; Mehrotra, Sanjay; Zhang, He; Malin, Bradley

    2015-01-01

    Complexity in clinical workflows can lead to inefficiency in making diagnoses, ineffectiveness of treatment plans and uninformed management of healthcare organizations (HCOs). Traditional strategies to manage workflow complexity are based on measuring the gaps between workflows defined by HCO administrators and the actual processes followed by staff in the clinic. However, existing methods tend to neglect the influences of EMR systems on the utilization of workflows, which could be leveraged to optimize workflows facilitated through the EMR. In this paper, we introduce a framework to infer clinical workflows through the utilization of an EMR and show how such workflows roughly partition into four types according to their efficiency. Our framework infers workflows at several levels of granularity through data mining technologies. We study four months of EMR event logs from a large medical center, including 16,569 inpatient stays, and illustrate that over approximately 95% of workflows are efficient and that 80% of patients are on such workflows. At the same time, we show that the remaining 5% of workflows may be inefficient due to a variety of factors, such as complex patients. PMID:26958173
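
    The framework's internals are not given in this record; as a toy illustration of the general idea, inferring workflows from EMR logs amounts to grouping audit events per inpatient stay into time-ordered task sequences and scoring how common each sequence is. The event fields here are assumed.

        # Toy inference of clinical workflows from EMR event logs.
        from collections import Counter, defaultdict

        events = [  # (stay_id, timestamp, emr_module) -- hypothetical layout
            (1, 10, "triage"), (1, 20, "orders"), (1, 30, "discharge"),
            (2, 12, "triage"), (2, 25, "orders"), (2, 40, "discharge"),
            (3,  9, "triage"), (3, 15, "imaging"), (3, 22, "orders"),
        ]

        by_stay = defaultdict(list)
        for stay, ts, module in events:
            by_stay[stay].append((ts, module))

        sequences = Counter(
            tuple(m for _, m in sorted(evts)) for evts in by_stay.values()
        )

        total = sum(sequences.values())
        for seq, n in sequences.most_common():
            # Rare sequences are candidates for the "inefficient" partition,
            # e.g. workflows driven by unusually complex patients.
            print(f"{n / total:.0%}", " -> ".join(seq))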

  1. Innovations in Medication Preparation Safety and Wastage Reduction: Use of a Workflow Management System in a Pediatric Hospital.

    PubMed

    Davis, Stephen Jerome; Hurtado, Josephine; Nguyen, Rosemary; Huynh, Tran; Lindon, Ivan; Hudnall, Cedric; Bork, Sara

    2017-01-01

    Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. Of those 826,220 medication doses, there were 16 reported incorrect volume errors. The error rate after the implementation of the workflow management system averaged 8.4%, which was a 1.6% reduction. After the implementation of the workflow management system, the average number of reported oral liquid medication and injectable medication errors decreased to 0.4 and 0.2 times per week, respectively. Conclusion: The organization was able to achieve its purpose and goal of improving the provision of quality pharmacy care through optimal medication use and safety by reducing medication preparation errors. Error rates decreased and the workflow processes were streamlined, which has led to seamless operations within the pharmacy department. There has been significant cost avoidance and waste reduction and enhanced interdepartmental satisfaction due to the reduction of reported medication errors.

  2. Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.

    PubMed

    Handels, H; Ehrhardt, J

    2009-01-01

    Medical image computing has become one of the most challenging fields in medical informatics. In image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters to characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the grade of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods of different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer assisted image interpretation, modeling and simulation as well as visualization and virtual reality. Especially, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients and will gain importance in the diagnostics and therapy of the future. From a methodical point of view, the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or operation planning is a complex interdisciplinary process. Image computing methods enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.

  3. A Six‐Stage Workflow for Robust Application of Systems Pharmacology

    PubMed Central

    Gadkar, K; Kirouac, DC; Mager, DE; van der Graaf, PH

    2016-01-01

    Quantitative and systems pharmacology (QSP) is increasingly being applied in pharmaceutical research and development. One factor critical to the ultimate success of QSP is the establishment of commonly accepted language, technical criteria, and workflows. We propose an integrated workflow that bridges conceptual objectives with underlying technical detail to support the execution, communication, and evaluation of QSP projects. PMID:27299936

  4. An electronic dashboard to improve nursing care.

    PubMed

    Tan, Yung-Ming; Hii, Joshua; Chan, Katherine; Sardual, Robert; Mah, Benjamin

    2013-01-01

    With the introduction of CPOE systems, nurses in a Singapore hospital were facing difficulties monitoring key patient information such as critical tasks and alerts. Issues include unfriendly user interfaces of clinical systems, information overload, and the loss of visual cues for action due to paperless workflows. The hospital decided to implement an interactive electronic dashboard on top of their CPOE system to improve visibility of vital patient data. A post-implementation survey was performed to gather end-user feedback and evaluate factors that influence user satisfaction of the dashboard. Questionnaires were sent to all nurses of five pilot wards. 106 valid responses were received. User adoption was good with 86% of nurses using the dashboard every shift. Mean satisfaction score was 3.6 out of 5. User satisfaction was strongly and positively correlated to the system's perceived impact on work efficiency and care quality. From qualitative feedback, nurses generally agreed that the dashboard had improved their awareness of critical patient issues without the hassle of navigating a CPOE system. This study shows that an interactive clinical dashboard when properly integrated with a CPOE system could be a useful tool to improve daily patient care.

  5. Managing the life cycle of electronic clinical documents.

    PubMed

    Payne, Thomas H; Graham, Gail

    2006-01-01

    To develop a model of the life cycle of clinical documents from inception to use in a person's medical record, including workflow requirements from clinical practice, local policy, and regulation. We propose a model for the life cycle of clinical documents as a framework for research on documentation within electronic medical record (EMR) systems. Our proposed model includes three axes: the stages of the document, the roles of those involved with the document, and the actions those involved may take on the document at each stage. The model includes the rules to describe who (in what role) can perform what actions on the document, and at what stages they can perform them. Rules are derived from needs of clinicians, and requirements of hospital bylaws and regulators. Our model encompasses current practices for paper medical records and workflow in some EMR systems. Commercial EMR systems include methods for implementing document workflow rules. Workflow rules that are part of this model mirror functionality in the Department of Veterans Affairs (VA) EMR system where the Authorization/ Subscription Utility permits document life cycle rules to be written in English-like fashion. Creating a model of the life cycle of clinical documents serves as a framework for discussion of document workflow, how rules governing workflow can be implemented in EMR systems, and future research of electronic documentation.
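
    A minimal sketch of the model's rule triple, assuming an allow-list mapping (stage, role) to permitted actions; the stage, role, and action names are illustrative, not drawn from the VA utility.

        # Life-cycle rules for clinical documents: who (role) may perform
        # which action at which stage; all names are illustrative.
        RULES = {
            ("draft",   "author"):    {"edit", "sign"},
            ("signed",  "attending"): {"cosign", "addend"},
            ("signed",  "author"):    {"addend"},
            ("amended", "auditor"):   {"view"},
        }

        def allowed(stage, role, action):
            """English-like rule check: may <role> <action> a document at <stage>?"""
            return action in RULES.get((stage, role), set())

        assert allowed("draft", "author", "sign")
        assert not allowed("signed", "author", "edit")  # no edits after signing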

  6. Analog to digital workflow improvement: a quantitative study.

    PubMed

    Wideman, Catherine; Gallet, Jacqueline

    2006-01-01

    This study tracked a radiology department's conversion from utilization of a Kodak Amber analog system to a Kodak DirectView DR 5100 digital system. Through the use of ProModel Optimization Suite, a workflow simulation software package, significant quantitative information was derived from workflow process data measured before and after the change to a digital system. Once the digital room was fully operational and the radiology staff comfortable with the new system, average patient examination time was reduced from 9.24 to 5.28 min, indicating that a higher patient throughput could be achieved. Compared to the analog system, chest examination time for modality specific activities was reduced by 43%. The percentage of repeat examinations experienced with the digital system also decreased to 8% vs. the level of 9.5% experienced with the analog system. The study indicated that it is possible to quantitatively study clinical workflow and productivity by using commercially available software.

  7. Structured recording of intraoperative surgical workflows

    NASA Astrophysics Data System (ADS)

    Neumuth, T.; Durstewitz, N.; Fischer, M.; Strauss, G.; Dietz, A.; Meixensberger, J.; Jannin, P.; Cleary, K.; Lemke, H. U.; Burgert, O.

    2006-03-01

    Surgical Workflows are used for the methodical and scientific analysis of surgical interventions. The approach described here is a step towards developing surgical assist systems based on Surgical Workflows and integrated control systems for the operating room of the future. This paper describes concepts and technologies for the acquisition of Surgical Workflows by monitoring surgical interventions and their presentation. Establishing systems which support the Surgical Workflow in operating rooms requires a multi-staged development process beginning with the description of these workflows. A formalized description of surgical interventions is needed to create a Surgical Workflow. This description can be used to analyze and evaluate surgical interventions in detail. We discuss the subdivision of surgical interventions into work steps regarding different levels of granularity and propose a recording scheme for the acquisition of manual surgical work steps from running interventions. To support the recording process during the intervention, we introduce a new software architecture. Core of the architecture is our Surgical Workflow editor that is intended to deal with the manifold, complex and concurrent relations during an intervention. Furthermore, a method for an automatic generation of graphs is shown which is able to display the recorded surgical work steps of the interventions. Finally we conclude with considerations about extensions of our recording scheme to close the gap to S-PACS systems. The approach was used to record 83 surgical interventions from 6 intervention types from 3 different surgical disciplines: ENT surgery, neurosurgery and interventional radiology. The interventions were recorded at the University Hospital Leipzig, Germany and at the Georgetown University Hospital, Washington, D.C., USA.

  8. Text mining meets workflow: linking U-Compare with Taverna

    PubMed Central

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690

  9. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used for analyzing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
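
    As a toy illustration of rationality-based scheduling (the utility model below is invented, not the paper's): an agent scores candidate activity instances by blending its own fit with the benefit to the overall flow, weighted by a social-rationality factor.

        # Rationality-weighted task choice in a multi-agent workflow simulation.
        def score(agent, task, rationality=0.5):
            """Blend self-interest with group benefit via a rationality factor."""
            individual = -abs(agent["skill"] - task["difficulty"])  # fit to agent
            social = task["queue_delay"]  # clearing long-delayed work helps flow
            return (1 - rationality) * individual + rationality * social

        agent = {"name": "clerk-1", "skill": 3}
        tasks = [
            {"id": "approve", "difficulty": 5, "queue_delay": 9},
            {"id": "review",  "difficulty": 3, "queue_delay": 1},
        ]
        # A purely self-interested agent (rationality=0) picks the best-fitting
        # task ("review"); a socially rational one drains the delayed queue.
        best = max(tasks, key=lambda t: score(agent, t, rationality=0.7))
        print(best["id"])  # approve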

  10. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflows interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
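
    As an illustration of the workflow-as-resource style described here, publishing a workflow could be a plain HTTP POST of an Atom entry; the endpoint and payload below are hypothetical.

        # Hypothetical REST interaction with a workflow-as-resource service.
        import urllib.request

        BASE = "http://example.org/workflows"  # invented endpoint

        entry = """<entry xmlns="http://www.w3.org/2005/Atom">
          <title>NO2 plume analysis</title>
          <content type="application/xpdl+xml">...</content>
        </entry>"""

        req = urllib.request.Request(
            BASE, data=entry.encode(), method="POST",
            headers={"Content-Type": "application/atom+xml"})
        # urllib.request.urlopen(req) would publish the workflow resource; a GET
        # on the returned Location discovers it, and further requests could
        # coordinate runs of XPDL- and BPEL-structured sub-workflows.
        print(req.method, req.full_url)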

  11. Experimental evaluation of a flexible I/O architecture for accelerating workflow engines in ultrascale environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin

    The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules—an in-memory data store—with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.

  12. Using location tracking data to assess efficiency in established clinical workflows.

    PubMed

    Meyer, Mark; Fairbrother, Pamela; Egan, Marie; Chueh, Henry; Sandberg, Warren S

    2006-01-01

    Location tracking systems are becoming more prevalent in clinical settings, yet applications are still not common. We have designed a system to aid in the assessment of clinical workflow efficiency. Location data are captured from active RFID tags and processed into usable data. These data are stored and presented visually, with trending capability over time. The system allows quick assessment of the impact of process changes on workflow and isolates areas for improvement.
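
    As a hedged sketch of the processing step described above, raw tag reads can be reduced to per-location dwell times, the quantity that is then trended over time; the read layout is an assumption.

        # Toy reduction of raw RFID reads into per-location dwell times.
        from collections import defaultdict

        reads = [  # (tag_id, seconds, location) -- assumed layout
            ("pt-7", 0, "waiting"), ("pt-7", 600, "exam-2"),
            ("pt-7", 2400, "imaging"), ("pt-7", 3000, "waiting"),
        ]

        dwell = defaultdict(int)
        for (_, t0, loc), (_, t1, _) in zip(reads, reads[1:]):
            dwell[loc] += t1 - t0  # time between consecutive reads spent at loc

        for loc, seconds in dwell.items():
            print(f"{loc}: {seconds / 60:.0f} min")
        # Trending these aggregates before and after a process change gives a
        # quick read on whether the change actually improved workflow.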

  13. Workflows for microarray data processing in the Kepler environment.

    PubMed

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.

  14. Prototype of Kepler Processing Workflows For Microscopy And Neuroinformatics

    PubMed Central

    Astakhov, V.; Bandrowski, A.; Gupta, A.; Kulungowski, A.W.; Grethe, J.S.; Bouwer, J.; Molina, T.; Rowley, V.; Penticoff, S.; Terada, M.; Wong, W.; Hakozaki, H.; Kwon, O.; Martone, M.E.; Ellisman, M.

    2016-01-01

    We report on progress of employing the Kepler workflow engine to prototype “end-to-end” application integration workflows that concern data coming from microscopes deployed at the National Center for Microscopy Imaging Research (NCMIR). This system is built upon the mature code base of the Cell Centered Database (CCDB) and integrated rule-oriented data system (IRODS) for distributed storage. It provides integration with external projects such as the Whole Brain Catalog (WBC) and Neuroscience Information Framework (NIF), which benefit from NCMIR data. We also report on specific workflows which spawn from main workflows and perform data fusion and orchestration of Web services specific for the NIF project. This “Brain data flow” presents a user with categorized information about sources that have information on various brain regions. PMID:28479932

  15. Workflow technology: the new frontier. How to overcome the barriers and join the future.

    PubMed

    Shefter, Susan M

    2006-01-01

    Hospitals are catching up to the business world in the introduction of technology systems that support professional practice and workflow. The field of case management is highly complex and interrelates with diverse groups in diverse locations. The last few years have seen the introduction of Workflow Technology Tools, which can improve the quality and efficiency of discharge planning by the case manager. Despite the availability of these wonderful new programs, many case managers are hesitant to adopt the new technology and workflow. For a myriad of reasons, a computer-based workflow system can seem like a brick wall. This article discusses, from a practitioner's point of view, how professionals can gain confidence and skill to get around the brick wall and join the future.

  16. An Auto-management Thesis Program WebMIS Based on Workflow

    NASA Astrophysics Data System (ADS)

    Chang, Li; Jie, Shi; Weibo, Zhong

    This paper presents an auto-management web-based MIS (WebMIS), built on workflow technology, for managing a bachelor thesis program. A module for workflow dispatching is designed and realized using MySQL and J2EE, following the working principles of a workflow engine. The module automatically dispatches the workflow according to the system date, login information, and the user's work status. The WebMIS moves management from manual paperwork to computer-based processing, which not only standardizes the thesis program but also keeps the data and documents clean and consistent.
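
    A toy version of such a date-, role-, and status-driven dispatch rule might look as follows (statuses, roles, and the deadline are hypothetical; the paper does not publish its rule table):

        # Toy sketch of workflow dispatch driven by system date, login role,
        # and work status (hypothetical rules, not the paper's actual table).
        from datetime import date

        def dispatch(today, role, status):
            if status == "proposal_submitted" and role == "advisor":
                return "review_proposal"
            if status == "proposal_approved" and today >= date(2024, 5, 1):
                return "submit_thesis"  # deadline-driven transition
            return "wait"

        print(dispatch(date(2024, 5, 2), "student", "proposal_approved"))  # submit_thesis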

  17. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    NASA Astrophysics Data System (ADS)

    Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.

    Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  18. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisition at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, the focus is on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
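
    In effect, the three-stage model estimates T_total = T_transfer + T_queue + T_compute for each candidate site and picks the minimum. A toy sketch with illustrative numbers (not the paper's measured parameters or model form):

        # Toy three-stage estimate per site: transfer + queue + compute, in
        # seconds. Site figures are invented for illustration only.

        def estimate(site, data_gb):
            t_transfer = data_gb * 8 / site["bw_gbps"]      # transfer time
            t_compute = site["work_units"] / site["rate"]   # reconstruction time
            return t_transfer + site["queue_s"] + t_compute

        sites = {
            "hpc-a": {"bw_gbps": 10, "queue_s": 600, "work_units": 5000, "rate": 50},
            "hpc-b": {"bw_gbps": 1,  "queue_s": 30,  "work_units": 5000, "rate": 20},
        }
        best = min(sites, key=lambda name: estimate(sites[name], data_gb=200))
        print(best)  # "hpc-a": its queue penalty is outweighed by bandwidth and speed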

  19. Optimization of tomographic reconstruction workflows on geographically distributed resources

    PubMed Central

    Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisition at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, the focus is on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Moreover, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks. PMID:27359149

  20. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar

    New technological advancements in synchrotron light sources enable data acquisition at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, the focus is on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.

  1. Implementation of workflow engine technology to deliver basic clinical decision support functionality.

    PubMed

    Huser, Vojtech; Rasmussen, Luke V; Oberg, Ryan; Starren, Justin B

    2011-04-10

    Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment and the challenge of user-friendly representation of clinical logic. We present our implementation of a workflow engine technology that addresses the two above-described challenges in delivering clinical decision support. Our system is based on a cross-industry standard of XML (extensible markup language) process definition language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform which evaluates well against an established clinical decision support architecture evaluation framework. Due to cross-industry usage of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform.
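
    The flowchart-based execution idea can be shown in miniature (a minimal sketch, not the authors' XPDL-based engine): each node inspects a record and names the next node, and retrospective testing is simply replaying stored records through the same graph.

        # Minimal flowchart-style decision-support engine (illustrative only;
        # thresholds, node names, and record fields are hypothetical).

        def check_a1c(rec):
            return "alert" if rec["a1c"] > 9.0 else "done"

        def alert(rec):
            print(f"decision support alert for {rec['id']}")
            return "done"

        NODES = {"check_a1c": check_a1c, "alert": alert}

        def run(record, start="check_a1c"):
            node = start
            while node != "done":
                node = NODES[node](record)

        # Retrospective mode: replay stored records through the same flowchart.
        for rec in [{"id": "p1", "a1c": 10.2}, {"id": "p2", "a1c": 6.1}]:
            run(rec)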

  2. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherently dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, cataloguing, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; for service-centric applications, the main focus of the workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best-suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components, including a black-box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited to a wide spectrum of science applications. We will further design and develop efficient workflow mapping and scheduling algorithms to optimize workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
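
    Of the optimization targets above, minimum end-to-end delay is bounded below by the critical (longest) path of the workflow DAG; a small sketch with invented task costs:

        # Critical-path lower bound on end-to-end delay for a DAG workflow.
        # Edges and per-task costs are illustrative, not from the project.
        from functools import lru_cache

        edges = {"ingest": ["filter"], "filter": ["analyze", "visualize"],
                 "analyze": [], "visualize": []}
        cost = {"ingest": 5, "filter": 3, "analyze": 10, "visualize": 2}

        @lru_cache(maxsize=None)
        def longest_from(task):
            children = edges[task]
            return cost[task] + (max(map(longest_from, children)) if children else 0)

        print(longest_from("ingest"))  # 5 + 3 + 10 = 18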

  3. Information Issues and Contexts that Impair Team Based Communication Workflow: A Palliative Sedation Case Study.

    PubMed

    Cornett, Alex; Kuziemsky, Craig

    2015-01-01

    Implementing team-based workflows can be complex because of the scope of providers involved and the extent of information exchange and communication that needs to occur. While a workflow may represent the ideal structure of communication, information issues and contextual factors may affect how the workflow is implemented in practice. Understanding these issues will help us better design systems to support team-based workflows. In this paper we use a case study of palliative sedation therapy (PST) to model a PST workflow and then use it to identify purposes of communication, information issues, and contextual factors that impact them. We then suggest how our findings could inform health information technology (HIT) design to support team-based communication workflows.

  4. High-volume workflow management in the ITN/FBI system

    NASA Astrophysics Data System (ADS)

    Paulson, Thomas L.

    1997-02-01

    The Identification Tasking and Networking (ITN) system of the Federal Bureau of Investigation will manage the processing of more than 70,000 submissions per day. The workflow manager controls the routing of each submission through a combination of automated and manual processing steps whose exact sequence is dynamically determined by the results at each step. For most submissions, one or more of the steps involve the visual comparison of fingerprint images. The ITN workflow manager is implemented within a scalable client/server architecture. The paper describes the key aspects of the ITN workflow manager design that allow the high volume of daily processing to be accomplished successfully.

  5. [Integration of the radiotherapy irradiation planning in the digital workflow].

    PubMed

    Röhner, F; Schmucker, M; Henne, K; Momm, F; Bruggmoser, G; Grosu, A-L; Frommhold, H; Heinemann, F E

    2013-02-01

    At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflows are paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a major challenge: it requires interdisciplinary expertise, and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup, and long-term availability of data were considered and had already been realized during the OSS project. After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary working group, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular departmental system. The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards approved by the responsible authority.

  6. Design and implementation of workflow engine for service-oriented architecture

    NASA Astrophysics Data System (ADS)

    Peng, Shuqing; Duan, Huining; Chen, Deyun

    2009-04-01

    As computer networks develop rapidly and enterprise applications become increasingly distributed, traditional workflow engines show several deficiencies, such as complex structure, poor stability, poor portability, little reusability, and difficult maintenance. In this paper, in order to improve the stability, scalability, and flexibility of workflow management systems, a four-layer workflow engine architecture based on SOA is put forward according to the XPDL standard of the Workflow Management Coalition; the route control mechanism of the control model is implemented; a scheduling strategy for cyclic and acyclic routing is designed; and the workflow engine is implemented using technologies such as XML, JSP, and EJB.

  7. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.

  8. Panning artifacts in digital pathology images

    NASA Astrophysics Data System (ADS)

    Avanaki, Ali R. N.; Lanciault, Christian; Espig, Kathryn S.; Xthona, Albert; Kimpe, Tom R. L.

    2017-03-01

    In making a pathologic diagnosis, a pathologist uses cognitive processes: perception, attention, memory, and search (Pena and Andrade-Filho, 2009). Typically, this involves focus while panning from one region of a slide to another, using either a microscope in a traditional workflow or a software program and display in a digital pathology workflow (DICOM Standard Committee, 2010). We theorize that during the panning operation, the pathologist receives information important to diagnosis efficiency and/or correctness. As compared to an optical microscope, panning in a digital pathology image involves some visual artifacts due to the following: (i) the frame rate is finite; (ii) time-varying visual signals are reconstructed using an imperfect zero-order hold. Specifically, after a pixel's digital drive is changed, it takes time for the pixel to emit the expected amount of light. Previous work suggests that 49% of navigation is conducted in low-power/overview with digital pathology (Molin et al., 2015), but the influence of display factors has not been measured. We conducted a reader study to establish a relationship between display frame rate, panel response time, and threshold panning speed (above which the artifacts become noticeable). Our results suggest that visual tasks involving tissue structure are more affected by the simulated panning artifacts than those that only involve color (e.g., staining intensity estimation), and that the panning-artifact level versus normalized panning speed exhibits a peak, which is surprising and may change for a diagnostic task. This is work in progress, and our final findings should be considered in designing future digital pathology systems.
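
    The frame-rate factor is simple arithmetic: at a given panning speed the image jumps a fixed number of pixels per refresh, and large jumps (compounded by slow panel response) are what become visible. A back-of-the-envelope sketch with illustrative numbers; the actual threshold speeds are what the reader study measures:

        # How far the image jumps per refresh while panning, for a given
        # display frame rate. Numbers are illustrative, not from the study.

        def jump_per_frame(pan_speed_px_s, frame_rate_hz):
            return pan_speed_px_s / frame_rate_hz

        for fps in (30, 60, 120):
            print(fps, "Hz ->", jump_per_frame(3000, fps), "px/frame")
        # 30 Hz -> 100.0, 60 Hz -> 50.0, 120 Hz -> 25.0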

  9. NASA Tech Briefs, February 2014

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Topics include: JWST Integrated Simulation and Test (JIST) Core; Software for Non-Contact Measurement of an Individual's Heart Rate Using a Common Camera; Rapid Infrared Pixel Grating Response Testbed; Temperature Measurement and Stabilization in a Birefringent Whispering Gallery Resonator; JWST IV and V Simulation and Test (JIST) Solid State Recorder (SSR) Simulator; Development of a Precision Thermal Doubler for Deep Space; Improving Friction Stir Welds Using Laser Peening; Methodology of Evaluating Margins of Safety in Critical Brazed Joints; Interactive Inventory Monitoring; Sensor for Spatial Detection of Single-Event Effects in Semiconductor-Based Electronics; Reworked CCGA-624 Interconnect Package Reliability for Extreme Thermal Environments; Current-Controlled Output Driver for Directly Coupled Loads; Bulk Metallic Glasses and Matrix Composites as Spacecraft Shielding; Touch Temperature Coating for Electrical Equipment on Spacecraft; Li-Ion Electrolytes Containing Flame-Retardant Additives; Autonomous Robotic Manipulation (ARM); CARVE Log; Platform Perspective Toolkit; Convex Hull-Based Plume and Anomaly Detection; Pre-Filtration of GOSAT Data Using Only Level 1 Data and an Intelligent Filter to Remove Low Clouds; Affordability Comparison Tool - ACT; "Ascent - Commemorating Shuttle" for iPad; Cassini Mission App; Light-Weight Workflow Engine: A Server for Executing Generic Workflows; Model for System Engineering of the CheMin Instrument; Timeline Central Concepts; Parallel Particle Filter Toolkit; Particle Filter Simulation and Analysis Enabling Non-Traditional Navigation; Quasi-Terminator Orbits for Mapping Small Primitive Bodies; The Subgrid-Scale Scalar Variance Under Supercritical Pressure Conditions; Sliding Gait for ATHLETE Mobility; and Automated Generation of Adaptive Filter Using a Genetic Algorithm and Cyclic Rule Reduction.

  10. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    PubMed

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images have necessitated the inclusion of automated systems within the image workflow.
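
    For the label-data step, the abstract mentions OCR without naming an engine; a minimal sketch using the open-source Tesseract engine via pytesseract (an assumed tool choice, not necessarily the software used at Edinburgh):

        # Minimal OCR sketch for label-data capture, assuming the open-source
        # Tesseract engine via pytesseract (an assumption; the abstract does
        # not name the actual OCR software).
        # Requires: pip install pytesseract pillow, plus the tesseract binary.
        from PIL import Image
        import pytesseract

        def capture_label_text(image_path):
            """Return raw OCR text from a cropped specimen-label image."""
            return pytesseract.image_to_string(Image.open(image_path))

        # print(capture_label_text("herbarium_label.png"))  # hypothetical file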

  11. 78 FR 68861 - Certain Navigation Products, Including GPS Devices, Navigation and Display Systems, Radar Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-15

    ... Devices, Navigation and Display Systems, Radar Systems, Navigational Aids, Mapping Systems and Related... navigation products, including GPS devices, navigation and display systems, radar systems, navigational aids..., radar systems, navigational aids, mapping systems and related software by reason of infringement of one...

  12. An automated analysis workflow for optimization of force-field parameters using neutron scattering data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Vickie E.; Borreguero, Jose M.; Bhowmik, Debsindhu

    Highlights: • An automated workflow to optimize force-field parameters. • The workflow was used to optimize force-field parameters for a system containing nanodiamond and tRNA. • The mechanism relies on molecular dynamics simulation and neutron scattering experimental data. • The workflow can be generalized to other experimental and simulation techniques. Abstract: Large-scale simulations and data analysis are often required to explain neutron scattering experiments and to establish a connection between the fundamental physics at the nanoscale and the data probed by neutrons. However, to perform simulations at experimental conditions it is critical to use correct force-field (FF) parameters, which are unfortunately not available for most complex experimental systems. In this work, we have developed a workflow optimization technique to provide optimized FF parameters by comparing molecular dynamics (MD) to neutron scattering data. We describe the workflow in detail using an example system consisting of tRNA and hydrophilic nanodiamonds in a deuterated water (D₂O) environment. Quasi-elastic neutron scattering (QENS) data show a faster motion of the tRNA in the presence of nanodiamond than without it. To compare the QENS and MD results quantitatively, a proper choice of FF parameters is necessary. We use an efficient workflow to optimize the FF parameters between the hydrophilic nanodiamond and water by comparing to the QENS data. Our results show that we can obtain accurate FF parameters by using this technique. The workflow can be generalized to other types of neutron data for FF optimization, such as vibrational spectroscopy and spin echo.
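
    Conceptually, the optimization loop adjusts FF parameters to minimize the misfit between a simulated observable and the measured QENS curve. The schematic sketch below replaces the expensive MD-plus-scattering step with a cheap placeholder and uses synthetic data, so only the loop structure is faithful:

        # Schematic FF-parameter optimization loop: minimize the misfit between
        # a simulated observable and experimental data. run_md_and_compute_sqw
        # is a placeholder for the expensive MD + scattering calculation, and
        # the "experimental" curve here is synthetic.
        import numpy as np
        from scipy.optimize import minimize

        q = np.linspace(0.1, 2.0, 20)
        experimental = np.exp(-1.3 * q)            # stand-in for a QENS curve

        def run_md_and_compute_sqw(params, q):     # placeholder, not real MD
            return np.exp(-params[0] * q)

        def misfit(params):
            sim = run_md_and_compute_sqw(params, q)
            return np.sum((sim - experimental) ** 2)

        result = minimize(misfit, x0=[1.0], method="Nelder-Mead")
        print(result.x)  # recovers the parameter ~1.3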

  13. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of the data files; (4) setting up data access mechanisms; (5) setting up the data in tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate this process and realize efficiencies. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
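
    The first step, verifying the integrity of received files, typically reduces to comparing checksums against a provider-supplied manifest; a generic sketch (not the ORNL DAAC tooling, with hypothetical file names):

        # Generic integrity-check step: compare SHA-256 digests of received
        # files against a provider-supplied manifest. Not the DAAC tooling.
        import hashlib

        def sha256(path):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            return h.hexdigest()

        def verify(manifest):  # {"file.nc": "expected-hex-digest", ...}
            return {path: sha256(path) == digest for path, digest in manifest.items()}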

  14. Using EHR audit trail logs to analyze clinical workflow: A case study from community-based ambulatory clinics.

    PubMed

    Wu, Danny T Y; Smart, Nikolas; Ciemins, Elizabeth L; Lanham, Holly J; Lindberg, Curt; Zheng, Kai

    2017-01-01

    To develop a workflow-supported clinical documentation system, it is a critical first step to understand clinical workflow. While time-and-motion studies have been regarded as the gold standard of workflow analysis, this method can be resource-intensive, and its data may be biased due to the cognitive limitations of human observers. In this study, we aimed to evaluate the feasibility and validity of using EHR audit trail logs to analyze clinical workflow. Specifically, we compared three known workflow changes from our previous study with the corresponding EHR audit trail logs of the study participants. The results showed that EHR audit trail logs can be a valid source for clinical workflow analysis and can provide an objective view of clinicians' behaviors, multi-dimensional comparisons, and a highly extensible analysis framework.
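
    A basic analysis of such logs is measuring how long each EHR action lasts before the next one within a user session; a small pandas sketch (column names and events are hypothetical, since audit-trail schemas vary by vendor):

        # Time spent on each EHR action: the gap to the next event per user.
        # Column names and events are hypothetical examples.
        import pandas as pd

        log = pd.DataFrame({
            "user": ["dr1", "dr1", "dr1"],
            "action": ["open_chart", "order_entry", "close_chart"],
            "timestamp": pd.to_datetime(
                ["2017-01-01 09:00", "2017-01-01 09:04", "2017-01-01 09:11"]),
        })

        log = log.sort_values("timestamp")
        log["seconds_on_action"] = (
            log.groupby("user")["timestamp"].diff().shift(-1).dt.total_seconds()
        )
        print(log[["action", "seconds_on_action"]])  # 240 s, 420 s, NaN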

  15. An architecture model for multiple disease management information systems.

    PubMed

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program which attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers were spending the majority of their time on documentation and on coordinating the members of the care team. They need a tool to support them in daily practice and to optimize an inefficient workflow. Several discussions have indicated that information technology plays an important role in the era of disease management. Whereas applications have been developed, it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform the inefficient workflow, and propose an architecture model that enhances the reusability and time saving of information system development. The proposed architecture model has been successfully implemented into two disease management information systems, and the result was evaluated through reusability analysis, time-consumed analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half of the time was consumed, and the workflow was improved. The overall user response is positive. The supportiveness during daily workflow is high. The system empowers the case managers with better information and leads to better decision making.

  16. Task Delegation Based Access Control Models for Workflow Systems

    NASA Astrophysics Data System (ADS)

    Gaaloul, Khaled; Charoy, François

    e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined strict workflow modelling towards approaches supporting flexibility at the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about tasks from organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.
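
    The core mechanism, RBAC extended with per-task delegation of authority, fits in a toy sketch (roles, users, and tasks are invented here; this is not the authors' formal TAC model):

        # Toy RBAC with task delegation: a delegation grants one user another
        # user's permission for a single task. Names are invented examples.

        ROLE_PERMS = {"clerk": {"file_claim"}, "officer": {"approve_claim"}}
        user_role = {"alice": "officer", "bob": "clerk"}
        delegations = set()  # (delegator, delegatee, task)

        def delegate(frm, to, task):
            if task in ROLE_PERMS[user_role[frm]]:  # can only delegate what you hold
                delegations.add((frm, to, task))

        def can(user, task):
            if task in ROLE_PERMS[user_role[user]]:
                return True
            return any(d == user and t == task for _, d, t in delegations)

        delegate("alice", "bob", "approve_claim")
        print(can("bob", "approve_claim"))  # True, via delegation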

  17. Chang'E-3 data pre-processing system based on scientific workflow

    NASA Astrophysics Data System (ADS)

    Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed for the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct, and control of the data processing procedure, with the following capabilities: (1) describing a data processing task, including defining input and output data, the relationships among data, the sequence of tasks, the communication between tasks, mathematical formulas, and the relationships between tasks and data; and (2) automatic processing of tasks. Accordingly, how a task is described is the key to whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, around a three-level model: (1) data relationships are established through a product tree; (2) the process model is constructed as a directed acyclic graph (DAG), with a set of workflow constructs, including Sequence, Loop, Merge, and Fork, that are compositional with one another; (3) to reduce the modeling complexity of mathematical formulas in the DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.

  18. Metaworkflows and Workflow Interoperability for Heliophysics

    NASA Astrophysics Data System (ADS)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited to workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows, usually developed in TAVERNA. They implement Science Cases (the definition of a scientific challenge) by composing different Basic Workflows. The third and last layer, Iterative Science Workflows, is developed in WS-PGRADE. It executes sub-workflows (either Basic or Science Workflows) as parameter sweep jobs to investigate Science Cases on large multiple data sets. So far, this approach has proven fruitful for three Science Cases, of which one has been completed and two are still being tested.

  19. Ergonomic design in the operating room: information technologies

    NASA Astrophysics Data System (ADS)

    Morita, Mark M.; Ratib, Osman

    2005-04-01

    The ergonomic design of information technology systems in the surgical OR has been and continues to be a significant problem. Numerous disparate information systems with unique hardware and display configurations create an environment similar to the chaotic environments of air traffic control. Patient information systems tend to show all available statistics, making it difficult to isolate the key, relevant vitals for the patient. Interactions in this sterile environment are still being done with the traditional keyboard and mouse designed for cubicle office workflows. This presentation will address the shortcomings of the current design paradigm in the surgical OR that relate to information technology systems. It will offer a perspective that addresses the ergonomic deficiencies and predicts how future technological innovations will integrate into this vision. Part of this vision includes a surgical OR PACS prototype, developed by GE Healthcare Technologies, that addresses the ergonomic challenges of PACS in the OR, including lack of portability, sterile-field integrity, and a UI targeted at diagnostic radiologists. GWindows (gesture control), developed by Microsoft Research, and voice command will allow surgeons to navigate and review diagnostic imagery without using the conventional keyboard and mouse that disrupt the integrity of the sterile field. This prototype also demonstrates how a wireless, battery-powered, self-contained mobile PACS workstation can be optimally positioned for a surgeon to reference images during an intervention, as opposed to the current pre-operative review. Lessons learned from the creation of the surgical OR PACS prototype have demonstrated that PACS alone is not the end-all solution in the OR. Integration of other disparate information systems and presentation of this information in simple, easy-to-navigate information packets will enable smoother interactions for the surgeons and other healthcare professionals in the OR. More intuitive IT system interaction is required for all the key players in the OR, not just the surgeons. To improve interactions, there are a number of emerging technologies that have the potential to revolutionize the way healthcare professionals interact with computer-based applications in the surgical OR. A number of these technologies will enable surgeons to interact with vital data without interrupting the sterile field or maneuvering their bodies to view relevant information; information will automatically display for healthcare individuals in a just-in-time manner without navigational challenges.

  20. Support for Taverna workflows in the VPH-Share cloud platform.

    PubMed

    Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F

    2017-07-01

    To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical and bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. The main outcomes are: 1) seamless integration of VPH-Share with other components and systems; 2) an extended range of different tools for workflows; 3) successful integration of scientific workflows from other VPH projects; and 4) execution speed improvements for medical applications. The presented workflow integration provides VPH-Share users with a wide range of possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful, and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, we acknowledge that further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Interactive web-based portals to improve patient navigation and connect patients with primary care and specialty services in underserved communities.

    PubMed

    Highfield, Linda; Ottenweller, Cecelia; Pfanz, Andre; Hanks, Jeanne

    2014-01-01

    This article presents a case study in the redesign, development, and implementation of a web-based healthcare clinic search tool for virtual patient navigation in underserved populations in Texas. It describes the workflow, assessment of system requirements, and design and implementation of two online portals: Project Safety Net and the Breast Health Portal. The primary focus of the study was to demonstrate the use of health information technology for the purpose of bridging the gap between underserved populations and access to healthcare. A combination of interviews and focus groups was used to guide the development process. Interviewees were asked a series of questions about usage, usability, and desired features of the new system. The redeveloped system offers a multitier architecture consisting of data, business, and presentation layers. The technologies used in the new portals include Microsoft .NET Framework 3.5, Microsoft SQL Server 2008, Google Maps JavaScript API v3, jQuery, Telerik RadControls (ASP.NET AJAX), and HTML. The redesigned portals have 548 registered clinics, and they have averaged 355 visits per month since their launch in late 2011, with the average user visiting five pages per visit. Usage has remained relatively constant over time, with an average of 142 new users (40 percent) each month. This study demonstrates the successful application of health information technology to improve access to healthcare and the successful adoption of the technology by targeted end users. The portals described in this study could be replicated by health information specialists in other areas of the United States to address disparities in healthcare access.

  2. Interactive Web-based Portals to Improve Patient Navigation and Connect Patients with Primary Care and Specialty Services in Underserved Communities

    PubMed Central

    Highfield, Linda; Ottenweller, Cecelia; Pfanz, Andre; Hanks, Jeanne

    2014-01-01

    This article presents a case study in the redesign, development, and implementation of a web-based healthcare clinic search tool for virtual patient navigation in underserved populations in Texas. It describes the workflow, assessment of system requirements, and design and implementation of two online portals: Project Safety Net and the Breast Health Portal. The primary focus of the study was to demonstrate the use of health information technology for the purpose of bridging the gap between underserved populations and access to healthcare. A combination of interviews and focus groups was used to guide the development process. Interviewees were asked a series of questions about usage, usability, and desired features of the new system. The redeveloped system offers a multitier architecture consisting of data, business, and presentation layers. The technologies used in the new portals include Microsoft .NET Framework 3.5, Microsoft SQL Server 2008, Google Maps JavaScript API v3, jQuery, Telerik RadControls (ASP.NET AJAX), and HTML. The redesigned portals have 548 registered clinics, and they have averaged 355 visits per month since their launch in late 2011, with the average user visiting five pages per visit. Usage has remained relatively constant over time, with an average of 142 new users (40 percent) each month. This study demonstrates the successful application of health information technology to improve access to healthcare and the successful adoption of the technology by targeted end users. The portals described in this study could be replicated by health information specialists in other areas of the United States to address disparities in healthcare access. PMID:24808806

  3. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    PubMed Central

    Ho, Jonhan; Aridor, Orly; Parwani, Anil V.

    2012-01-01

    Background: For decades, the anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users' needs and is utilized for collecting, interpreting, and aggregating in-detail aspects of work. Objective: Contextual inquiry was utilized to document current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists' needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. (1998). Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchical organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories, namely technology, communication, synthesis/preparation, organization, and workflow. The current AP workflow was labor intensive and lacked scalability. A large number of processes that might improve following the introduction of digital pathology systems were identified. These work processes included case management, case examination and review, and final case reporting. Furthermore, a digital slide system should integrate with the anatomic pathology laboratory information system. Conclusions: To our knowledge, this is the first study to utilize the contextual inquiry method to document AP workflow. Findings were used to establish key requirements for the design of digital pathology systems. PMID:23243553

  4. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    PubMed Central

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images have necessitated the inclusion of automated systems within the image workflow. PMID:22859881

  5. A case study on the impacts of computerized provider order entry (CPOE) system on hospital clinical workflow.

    PubMed

    Mominah, Maher; Yunus, Faisel; Househ, Mowafa S

    2013-01-01

    Computerized provider order entry (CPOE) is a health informatics system that helps health care providers create and manage orders for medications and other health care services. Through the automation of the ordering process, CPOE has improved the overall efficiency of hospital processes and workflow. In Saudi Arabia, CPOE has been used for years, with only a few studies evaluating the impacts of CPOE on clinical workflow. In this paper, we discuss the experience of a local hospital with the use of CPOE and its impacts on clinical workflow. Results show that there are many issues related to the implementation and use of CPOE within Saudi Arabia that must be addressed, including design, training, medication errors, alert fatigue, and system dependability. Recommendations for improving CPOE use within Saudi Arabia are also discussed.

  6. A Chaotic Particle Swarm Optimization-Based Heuristic for Market-Oriented Task-Level Scheduling in Cloud Workflow Systems.

    PubMed

    Li, Xuejun; Xu, Jia; Yang, Yun

    2015-01-01

    A cloud workflow system is a kind of platform service based on cloud computing that facilitates the automation of workflow applications. Compared with its counterparts, the market-oriented business model is one of the most prominent features of a cloud workflow system, which makes the optimization of task-level scheduling in such systems a hot topic. As the scheduling problem is NP-hard, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, both suffer from premature convergence during optimization and therefore cannot effectively reduce the cost. To address this, a Chaotic Particle Swarm Optimization (CPSO) algorithm with a chaotic sequence and an adaptive inertia weight factor is applied to task-level scheduling. The chaotic sequence, with its high randomness, improves the diversity of solutions, while its regularity assures good global convergence. The adaptive inertia weight factor depends on the estimated cost of each solution; it helps the scheduling avoid premature convergence by properly balancing global and local exploration. Experimental simulation shows that the cost obtained by our scheduling is consistently lower than that of the two representative counterparts.
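
    A minimal runnable sketch of the CPSO ingredients named above: a logistic-map chaotic sequence in place of plain random numbers, and an inertia weight adapted from each particle's estimated cost. The task-to-VM cost matrix, parameter values, and rounding-based encoding are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Illustrative data: task_costs[i, j] = cost of running task i on VM type j.
    task_costs = rng.uniform(1.0, 10.0, size=(20, 5))

    def cost(position):
        """Decode a continuous position into a task-to-VM assignment and price it."""
        vms = np.clip(np.rint(position).astype(int), 0, task_costs.shape[1] - 1)
        return task_costs[np.arange(len(vms)), vms].sum()

    def cpso(n_particles=30, n_iter=200, w_min=0.4, w_max=0.9, c1=2.0, c2=2.0):
        dim, vmax = task_costs.shape[0], task_costs.shape[1] - 1
        pos = rng.uniform(0, vmax, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_cost = np.array([cost(p) for p in pos])
        chaos = 0.37  # logistic-map seed: any value in (0, 1) off the fixed points
        for _ in range(n_iter):
            costs = np.array([cost(p) for p in pos])
            better = costs < pbest_cost
            pbest[better], pbest_cost[better] = pos[better], costs[better]
            gbest = pbest[pbest_cost.argmin()]
            # Adaptive inertia weight from the cost estimate: cheap (good)
            # particles get a small w (exploit), expensive ones explore with w_max.
            mean_c, min_c = costs.mean(), costs.min()
            w = np.where(costs <= mean_c,
                         w_min + (w_max - w_min) * (costs - min_c) / (mean_c - min_c + 1e-12),
                         w_max)
            # Chaotic sequence (logistic map) replaces uniform random draws.
            chaos = 4.0 * chaos * (1.0 - chaos)
            r1 = chaos
            chaos = 4.0 * chaos * (1.0 - chaos)
            r2 = chaos
            vel = w[:, None] * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, 0, vmax)
        return pbest[pbest_cost.argmin()], pbest_cost.min()

    best, best_cost = cpso()
    print(f"best assignment cost: {best_cost:.2f}")
    ```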

  8. Implementation of workflow engine technology to deliver basic clinical decision support functionality

    PubMed Central

    2011-01-01

    Background: Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present an application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment, and the challenge of user-friendly representation of clinical logic. Results: We present our implementation of workflow engine technology that addresses these two challenges in delivering clinical decision support. Our system is based on the cross-industry XML Process Definition Language (XPDL) standard. The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for executing those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode, where the system is integrated with an EHR system and responds to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent); we do not address complex decision support content delivery mechanisms, owing to the lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as an architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. Conclusions: We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform which evaluates well against an established clinical decision support architecture evaluation framework. Given the cross-industry use of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform. PMID:21477364
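
    A toy illustration of the point that one flowchart representation can serve both retrospective and prospective execution; the node graph, patient fields, and thresholds are hypothetical and are not taken from the healthflow suite.

    ```python
    # Hypothetical flowchart: internal nodes are (condition, next-if-true,
    # next-if-false); leaves are recommendations. Not the paper's XPDL content.
    flowchart = {
        "start":     (lambda p: p["a1c"] >= 9.0, "intensify", "check_ldl"),
        "check_ldl": (lambda p: p["ldl"] >= 130, "add_statin", "no_action"),
    }
    leaves = {
        "intensify":  "intensify glycemic therapy",
        "add_statin": "consider adding a statin",
        "no_action":  "no recommendation",
    }

    def run(patient, node="start"):
        """Walk the flowchart for one patient record. The same function serves
        retrospective replay over stored data and prospective, event-driven use."""
        while node in flowchart:
            condition, if_true, if_false = flowchart[node]
            node = if_true if condition(patient) else if_false
        return leaves[node]

    # Retrospective mode: replay the decision logic over historical encounters.
    for record in [{"a1c": 9.4, "ldl": 110}, {"a1c": 7.1, "ldl": 150}]:
        print(run(record))
    ```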

  9. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    PubMed

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge, such as a surgical workflow model (SWM), to support intuitive cooperation with surgeons. Generating a robust and reliable SWM requires a large amount of training data. However, training data collected by physically recording surgical operations is often limited, and such data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost, labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model from the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy. The generated workflow was evaluated against 4 web-retrieved videos and 4 operating-room-recorded videos. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. The satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge in intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
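
    The selection step combines topic analysis of video metadata with sentiment analysis of viewer feedback; a toy sketch of such a combined quality score follows, in which the keyword list, sentiment lexicon, weights, and threshold are all invented for illustration.

    ```python
    # Toy video-quality scoring: topic relevance from the description plus
    # sentiment from comments. Lexicons, weights, and threshold are invented.
    def topic_relevance(description, keywords=("cholecystectomy", "robotic", "laparoscopic")):
        words = description.lower().split()
        return sum(words.count(k) for k in keywords) / max(len(words), 1)

    def comment_sentiment(comments):
        positive = {"clear", "helpful", "excellent"}
        negative = {"blurry", "wrong", "bad"}
        score = sum((w in positive) - (w in negative)
                    for c in comments for w in c.lower().split())
        return score / max(len(comments), 1)

    def select_videos(videos, w_topic=0.7, w_sentiment=0.3, threshold=0.05):
        scored = [(w_topic * topic_relevance(v["description"])
                   + w_sentiment * comment_sentiment(v["comments"]), v) for v in videos]
        return [v for s, v in sorted(scored, key=lambda t: t[0], reverse=True)
                if s >= threshold]

    videos = [
        {"description": "robotic cholecystectomy full procedure",
         "comments": ["very clear dissection", "helpful narration"]},
        {"description": "hospital tour vlog", "comments": ["bad audio"]},
    ]
    print([v["description"] for v in select_videos(videos)])
    ```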

  10. Modernizing Earth and Space Science Modeling Workflows in the Big Data Era

    NASA Astrophysics Data System (ADS)

    Kinter, J. L.; Feigelson, E.; Walker, R. J.; Tino, C.

    2017-12-01

    Modeling is a major aspect of Earth and space science research. The development of numerical models of the Earth system, planetary systems or astrophysical systems is essential to linking theory with observations. Optimal use of observations that are quite expensive to obtain and maintain typically requires data assimilation, which involves numerical models. In the Earth sciences, models of the physical climate system are typically used for data assimilation, climate projection, and inter-disciplinary research, spanning applications from analysis of multi-sensor data sets to decision-making in climate-sensitive sectors, with applications to ecosystems, hazards, and various biogeochemical processes. In space physics, most models are built from first principles, require considerable expertise to run, and are frequently modified significantly for each case study. The volume and variety of model output data from modeling Earth and space systems are rapidly increasing and have reached a scale where human interaction with the data is prohibitively inefficient. A major barrier to progress is that modeling workflows are not treated by practitioners as a design problem. Existing workflows have been created by a slow accretion of software, typically based on undocumented, inflexible scripts haphazardly modified by a succession of scientists and students not trained in modern software engineering methods. As a result, existing modeling workflows suffer from an inability to onboard new datasets into models, an inability to keep pace with accelerating data production rates, and irreproducibility, among other problems. These factors are creating an untenable situation for those conducting and supporting Earth system and space science. Improving modeling workflows requires investments in hardware, software and human resources. This paper describes the critical-path issues that must be targeted to accelerate modeling workflows, including script modularization, parallelization, and automation in the near term, and longer-term investments in virtualized environments for improved scalability, tolerance for lossy data compression, novel data-centric memory and storage technologies, and tools for peer reviewing, preserving and sharing workflows, as well as fundamental statistical and machine learning algorithms.

  11. Build and Execute Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Qiang

    At exascale, the challenge is to develop applications that run at scale and use exascale platforms reliably, efficiently, and flexibly. Workflows become much more complex because they must seamlessly integrate simulation and data analytics. They must include down-sampling, post-processing, feature extraction, and visualization. Power and data transfer limitations require these analysis tasks to be run in-situ or in-transit. We expect successful workflows will comprise multiple linked simulations along with tens of analysis routines. Users will have limited development time at scale and, therefore, must have rich tools to develop, debug, test, and deploy applications. At this scale, successful workflows will compose linked computations from an assortment of reliable, well-defined computation elements, ones that can come and go as required, based on the needs of the workflow over time. We propose a novel framework that utilizes both virtual machines (VMs) and software containers to create a workflow system that establishes a uniform build and execution environment (BEE) beyond the capabilities of current systems. In this environment, applications will run reliably and repeatably across heterogeneous hardware and software. Containers, both commercial (Docker and Rocket) and open-source (LXC and LXD), define a runtime that isolates all software dependencies from the machine operating system. Workflows may contain multiple containers that run different operating systems, different software, and even different versions of the same software. We will run containers in open-source virtual machines (KVM) and emulators (QEMU) so that workflows run on any machine entirely in user-space. On this platform of containers and virtual machines, we will deliver workflow software that provides services including repeatable execution, provenance, checkpointing, and future-proofing. We will capture provenance about how containers were launched and how they interact, to annotate workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware and to preserve native performance. Containers will link development to continuous integration: when application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.

  12. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

    Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and applying computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions, mainly sharing the experience we gained while creating the Atlas of Cancer Signalling Network. In a step-by-step procedure, we describe the map construction process and suggest solutions for managing map complexity by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology that uses the Google Maps API to explore comprehensive molecular maps in the same way as geographical maps, and explain the advantages of semantic zooming principles for map navigation. We also outline how to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383

  13. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    NASA Technical Reports Server (NTRS)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval, for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments, by modeling the whole ground truth, sensing, and retrieval chain, and for assessing retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions, where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by its respective domain experts.
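
    The web-service wrapping pattern described above can be sketched briefly; this assumes Flask, and the endpoint name, payload fields, and the stand-in legacy routine are hypothetical, since the abstract does not specify the actual service interfaces.

    ```python
    # Sketch of wrapping a legacy modeling component as a web service so any
    # workflow engine can chain it; assumes Flask. All names are hypothetical.
    import subprocess
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def run_radiative_transfer(params):
        """Stand-in for a legacy executable (e.g., FORTRAN) driven via the shell."""
        out = subprocess.run(["echo", str(params)], capture_output=True, text=True)
        return {"radiance_spectrum": out.stdout.strip()}

    @app.route("/radiative-transfer", methods=["POST"])
    def radiative_transfer():
        params = request.get_json()   # e.g., {"surface_albedo": 0.3, "aerosol": "rural"}
        result = run_radiative_transfer(params)
        return jsonify(result)        # consumed by the next service in the chain

    if __name__ == "__main__":
        app.run(port=8080)
    ```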

  14. A virtual data language and system for scientific workflow management in data grid environments

    NASA Astrophysics Data System (ADS)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high-performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.

  15. Digital transformation in home care. A case study.

    PubMed

    Bennis, Sandy; Costanzo, Diane; Flynn, Ann Marie; Reidy, Agatha; Tronni, Catherine

    2007-01-01

    Simply implementing software and technology does not assure that an organization's targeted clinical and financial goals will be realized. No longer is it possible to roll out a new system--by solely providing end-user training and overlaying it on top of already inefficient workflows and outdated roles--and know with certainty that targets will be met. At Virtua Health's Home Care, based in southern New Jersey, implementation of their electronic system initially followed this more traditional approach. Unable to fully attain the return on investment identified earlier, they enlisted the help of a new role within their health system: the nurse informaticist. Knowledgeable in complex clinical processes and not bound by the technology at hand, the informaticist analyzed physical workflow, digital workflow, roles, and physical layout. Leveraging specific tools such as change acceleration, workouts and LEAN, the informaticist was able to redesign workflow and support new levels of functionality. This article provides a view from the "finish line", recounting how this role worked with home care to assimilate information delivery into more efficient processes and align resources to support the new workflow, ultimately achieving real, tangible returns.

  16. It's All About the Data: Workflow Systems and Weather

    NASA Astrophysics Data System (ADS)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research, as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data-intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is support for workflow-driven analysis over all kinds of data sets, including real-time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data-driven science: for discovery, for determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems.
    Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes a largely manual task that is difficult, if not impossible, under high-volume conditions, or the searchability and manageability of the resulting data products is disappointingly low. The provenance of a data product is a record of its lineage, or a trace of the execution history that resulted in the product. The provenance of a forecast model result, for example, captures information about the executable version of the model, configuration parameters, input data products, execution environment, and owner. Provenance enables data to be properly attributed and captures critical parameters about the model run so the quality of the result can be ascertained. Proper provenance is essential to providing reproducible scientific computing results.
    Workflow languages used in science discovery are complete programming languages and in theory can support any logic expressible by a programming language. The execution environments supporting the workflow engines, on the other hand, are subject to constraints on physical resources, and hence in practice the workflow task graphs used in science utilize relatively few of the cataloged workflow patterns. It is important to note that these workflows are executed on demand, and are executed once. Into this context is introduced the need for science discovery that is responsive to real-time information.
If we can use simple programming models and abstractions to make scientific discovery involving real-time data accessible to specialists who share and utilize data across scientific domains, we bring science one step closer to solving the largest of human problems.
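
    The provenance record sketched in this abstract (model version, configuration parameters, input products, execution environment, owner) maps naturally onto a small structured type. A minimal sketch follows; the field names and example values are assumptions, not LEAD's actual schema.

    ```python
    # Minimal provenance record for one model run; the fields mirror the
    # attributes listed in the abstract but are assumptions, not LEAD's schema.
    from dataclasses import asdict, dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ProvenanceRecord:
        product_id: str        # the derived data product being described
        model_version: str     # executable version of the model
        configuration: dict    # configuration parameters of the run
        input_products: list   # lineage: upstream data products
        execution_host: str    # execution environment
        owner: str
        recorded_at: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat())

    record = ProvenanceRecord(
        product_id="forecast-run-42",                     # illustrative values
        model_version="wrf-2.2.1",
        configuration={"grid_km": 4, "lead_hours": 36},
        input_products=["radar-level2-KTLX", "nam-analysis-00z"],
        execution_host="gateway.example.edu",
        owner="jdoe",
    )
    print(asdict(record))  # searchable metadata for attribution and reproducibility
    ```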

  17. Dynamic reusable workflows for ocean science

    USGS Publications Warehouse

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access now make it possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely sensed and model data. Skill metrics are computed, and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products, which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic notebooks across the geoscience domains.
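
    The catalog-search step of such a workflow can be sketched with OWSLib, a Python client for OGC services commonly used in IOOS notebooks; the CSW endpoint URL and the query term are illustrative assumptions.

    ```python
    # Sketch of catalog-driven discovery via OGC CSW, assuming OWSLib is
    # installed; the endpoint and the free-text query term are illustrative.
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    csw = CatalogueServiceWeb("https://data.ioos.us/csw", timeout=60)  # assumed endpoint

    # Free-text filter over all record fields for water-temperature data sets.
    query = PropertyIsLike("csw:AnyText", literal="%sea_water_temperature%")
    csw.getrecords2(constraints=[query], maxrecords=10, esn="full")

    for record in csw.records.values():
        print(record.title)
        # Each metadata record lists service endpoints (e.g., SOS, OPeNDAP)
        # that the rest of the workflow uses for data access.
        for ref in record.references:
            print("   ", ref.get("scheme"), ref.get("url"))
    ```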

  18. Managing and Communicating Operational Workflow: Designing and Implementing an Electronic Outpatient Whiteboard.

    PubMed

    Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M

    2016-01-01

    Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Our objective is to describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative, user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since the initial release, features such as immunization clinical decision support have been integrated into the system based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.

  19. [Change in process management by implementing RIS, PACS and flat-panel detectors].

    PubMed

    Imhof, H; Dirisamer, A; Fischer, H; Grampp, S; Heiner, L; Kaderk, M; Krestan, C; Kainberger, F

    2002-05-01

    Implementation of radiological information systems (RIS) and picture archiving and communication systems (PACS) results in significant changes to the workflow of a radiology department, and connecting flat-panel detectors in addition shortens the work process further. RIS and PACS implementation alone reduces the complete workflow by 21-80%; with flat-panel technology the image production process is shortened by a further 25-30%. The number of workflow steps falls from the original 17 to 12 with the implementation of RIS and PACS, and to 5 with the integrated use of flat panels. These clearly recognizable advantages in workflow require a corresponding financial investment. Several studies have shown that the capitalization factor calculated over eight years is positive, with gains ranging between 5% and 25%. Whether the additional implementation of flat-panel detectors also results in positive capitalization over the years cannot yet be estimated reliably, because experience with the technology is still too limited. The interfaces are particularly critical and require constant quality control. Our flat-panel detector system is fixed; special images--required in about 3-5% of all cases--still need conventional film-screen or phosphor-plate systems, and full-spine and long-leg examinations cannot be performed with sufficient accuracy. Without question, implementation of an integrated RIS, PACS and flat-panel detector system requires excellent training of the employees because of the changes in workflow. The main benefits of such an integrated implementation are an increase in the quality of image and report data, easier handling--almost no cassettes are necessary any more--and a marked shortening of the workflow.

  20. Workflow Automation: A Collective Case Study

    ERIC Educational Resources Information Center

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  1. Improving adherence to the Epic Beacon ambulatory workflow.

    PubMed

    Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana

    2017-06-01

    Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System for computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance with the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance with this workflow in the ambulatory oncology setting. This was a retrospective, quasi-experimental study that analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was the composite compliance rate with the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multidisciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effect of the intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%; after the intervention, the composite compliance rate rose to 83%, a statistically significant increase (p < 0.001). This study supports that a pharmacist-initiated educational intervention can improve compliance with an ambulatory oncology infusion workflow.

  2. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    DOE PAGES

    Malawski, Maciej; Figiela, Kamil; Bubak, Marian; ...

    2015-01-01

    This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, a limited number of instances per cloud, and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows, from general-purpose cloud benchmarks, and from measurements in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
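
    A rough, minimal sketch of the level-based, deadline-constrained cost minimization described above, written with Python's PuLP rather than the paper's AMPL/CMPL models; the task counts, instance prices, per-task runtimes, and deadline are invented, and the timing model is deliberately simplified to sequential level execution.

    ```python
    # Toy mixed-integer program in the spirit of the paper's model (not a
    # reproduction). Task counts, prices, runtimes, and deadline are invented.
    from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum, value

    levels = {"mProject": 100, "mDiffFit": 80, "mBackground": 60}  # tasks per level
    vms = {"small": (0.06, 4.0), "large": (0.24, 1.0)}             # (hourly price, hours per task)
    deadline = 400.0                                               # total hours allowed

    prob = LpProblem("workflow_cost", LpMinimize)
    x = {(l, v): LpVariable(f"x_{l}_{v}", cat=LpBinary) for l in levels for v in vms}

    for l in levels:                               # exactly one VM type per level
        prob += lpSum(x[l, v] for v in vms) == 1

    def level_hours(l):
        return lpSum(x[l, v] * levels[l] * vms[v][1] for v in vms)

    prob += lpSum(level_hours(l) for l in levels) <= deadline      # deadline
    prob += lpSum(x[l, v] * levels[l] * vms[v][1] * vms[v][0]      # objective: cost
                  for l in levels for v in vms)

    prob.solve()
    for l in levels:
        print(l, "->", next(v for v in vms if value(x[l, v]) > 0.5))
    print("total cost:", value(prob.objective))
    ```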

  3. Computer imaging and workflow systems in the business office.

    PubMed

    Adams, W T; Veale, F H; Helmick, P M

    1999-05-01

    Computer imaging and workflow technology automates many business processes that currently are performed using paper processes. Documents are scanned into the imaging system and placed in electronic patient account folders. Authorized users throughout the organization, including preadmission, verification, admission, billing, cash posting, customer service, and financial counseling staff, have online access to the information they need when they need it. Such streamlining of business functions can increase collections and customer satisfaction while reducing labor, supply, and storage costs. Because the costs of a comprehensive computer imaging and workflow system can be considerable, healthcare organizations should consider implementing parts of such systems that can be cost-justified or include implementation as part of a larger strategic technology initiative.

  4. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    NASA Technical Reports Server (NTRS)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution that lets enterprises manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, the AQ Servlet, and Advanced Queuing technologies, this presentation demonstrates the step-by-step design and implementation of system solutions that integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks through the process of propagating organization name changes originating in the HRMS module to other applications without changing application code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.

  5. Progress on Platforms, Sensors and Applications with Unmanned Aerial Vehicles in soil science and geomorphology

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Suomalainen, Juha; Seeger, Manuel; Keesstra, Saskia; Bartholomeus, Harm; Paron, Paolo

    2014-05-01

    The recent increase in the performance and endurance of electronically controlled flying platforms, such as multi-copters and fixed-wing airplanes, together with the decreasing size and weight of sensors and batteries, has led to the increasing popularity of Unmanned Aerial Systems (UAS) for scientific purposes. Modern workflows that incorporate UAS include guided flight plan generation, 3D GPS navigation for fully automated piloting, and automated processing with new techniques such as "Structure from Motion" photogrammetry. UAS are often equipped with standard RGB cameras, multi- and hyperspectral sensors, radar, or other sensors, and provide a cheap and flexible solution for creating multi-temporal data sets. UAS have revolutionized multi-temporal research, allowing new applications related to change analysis and process monitoring. The EGU General Assembly 2014 is hosting a session on platforms, sensors and applications with UAS in soil science and geomorphology. This presentation briefly summarizes the outcome of this session, addressing the current state and future challenges of small-platform data acquisition in soil science and geomorphology.

  6. On the Scene: Developing a Nurse Care Coordinator Role at City of Hope.

    PubMed

    Johnson, Shirley A; Giesie, Pamela D; Ireland, Anne M; Rice, Robert David; Thomson, Brenda K

    2016-01-01

    We describe the development of an oncology solid tumor disease-focused care coordination model. Consistent with our strategic plan to provide patient- and family-centered care and to organize care around disease management teams, we developed the role of nurse care coordinator as an integral team member in our care delivery model. Managing a defined high-risk patient population across the care trajectory, these nurses provide stable points of contact and continuity for patients and families as they navigate the complex treatments and systems required to deliver cancer care. We describe role delineation and staffing models; role clarity between the role of the nurse care coordinator and the case manager; core curriculum development; the use of workflow management tools to support the touch points of the patient and members of the care team; and the incorporation of electronic medical records and data streams to inform the care delivery model. We identify measures that we will use to evaluate the success of our program.

  7. Impact of digital radiography on clinical workflow.

    PubMed

    May, G A; Deer, D D; Dackiewicz, D

    2000-05-01

    It is commonly accepted that digital radiography (DR) improves workflow and patient throughput compared with traditional film radiography or computed radiography (CR). DR eliminates the film development step and the time needed to acquire the image from a CR reader. In addition, the wide dynamic range of DR allows the technologist to perform the quality-control (QC) step directly at the modality in a few seconds, rather than having to transport the newly acquired image to a centralized QC station for review. Furthermore, additional workflow efficiencies can be achieved with DR by employing tight radiology information system (RIS) integration: in the DR imaging environment, patient demographic information is automatically downloaded from the RIS to populate the DR Digital Imaging and Communications in Medicine (DICOM) image header. To learn more about this workflow efficiency improvement, we performed a comparative study of workflow steps under three different conditions: traditional film/screen x-ray, DR without RIS integration (i.e., manual entry of patient demographics), and DR with RIS integration. This study was performed at the Cleveland Clinic Foundation (Cleveland, OH) using a newly acquired amorphous-silicon flat-panel DR system from Canon Medical Systems (Irvine, CA). Our data show that DR without RIS integration results in substantial workflow savings over traditional film/screen practice, and that there is an additional 30% reduction in total examination time using DR with RIS integration.
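
    The RIS-integration step being timed here amounts to pre-populating the DICOM header with demographics instead of typing them at the console; a minimal sketch using pydicom, in which the RIS record, file names, and the transfer mechanism (normally DICOM Modality Worklist) are simplified assumptions.

    ```python
    # Sketch of RIS-to-DICOM demographic transfer, assuming pydicom; the RIS
    # record and file names are illustrative, and real systems use DICOM
    # Modality Worklist rather than a plain dictionary.
    import pydicom

    ris_record = {  # demographics as they might arrive from the RIS
        "name": "DOE^JANE", "mrn": "00012345", "birth_date": "19651003", "sex": "F",
    }

    ds = pydicom.dcmread("acquired_image.dcm")  # image just produced by the detector

    # Populate the DICOM header automatically instead of manual entry.
    ds.PatientName = ris_record["name"]
    ds.PatientID = ris_record["mrn"]
    ds.PatientBirthDate = ris_record["birth_date"]
    ds.PatientSex = ris_record["sex"]

    ds.save_as("acquired_image_with_demographics.dcm")
    ```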

  8. Planning, guidance, and quality assurance of pelvic screw placement using deformable image registration

    NASA Astrophysics Data System (ADS)

    Goerres, J.; Uneri, A.; Jacobson, M.; Ramsay, B.; De Silva, T.; Ketcha, M.; Han, R.; Manbachi, A.; Vogt, S.; Kleinszig, G.; Wolinsky, J.-P.; Osgood, G.; Siewerdsen, J. H.

    2017-12-01

    Percutaneous pelvic screw placement is challenging due to narrow bone corridors surrounded by vulnerable structures and difficult visual interpretation of complex anatomical shapes in 2D x-ray projection images. To address these challenges, a system for planning, guidance, and quality assurance (QA) is presented, providing functionality analogous to surgical navigation, but based on robust 3D-2D image registration techniques using fluoroscopy images already acquired in routine workflow. Two novel aspects of the system are investigated: automatic planning of pelvic screw trajectories and the ability to account for deformation of surgical devices (K-wire deflection). Atlas-based registration is used to calculate a patient-specific plan of screw trajectories in preoperative CT. 3D-2D registration aligns the patient to CT within the projective geometry of intraoperative fluoroscopy. Deformable known-component registration (dKC-Reg) localizes the surgical device, and the combination of plan and device location is used to provide guidance and QA. A leave-one-out analysis evaluated the accuracy of automatic planning, and a cadaver experiment compared the accuracy of dKC-Reg to rigid approaches (e.g., optical tracking). Surgical plans conformed within the bone cortex by 3-4 mm for the narrowest corridor (superior pubic ramus) and >5 mm for the widest corridor (tear drop). The dKC-Reg algorithm localized the K-wire tip within 1.1 mm and 1.4° and was consistently more accurate than rigid-body tracking (errors up to 9 mm). The system was shown to automatically compute reliable screw trajectories and accurately localize deformed surgical devices (K-wires). Such capability could improve guidance and QA in orthopaedic surgery, where workflow is impeded by manual planning, conventional tool trackers add complexity and cost, rigid tool assumptions are often inaccurate, and qualitative interpretation of complex anatomy from 2D projections is prone to trial-and-error with extended fluoroscopy time.

  9. Contextual cloud-based service oriented architecture for clinical workflow.

    PubMed

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Multiple papers have highlighted that the acceptance of systems within the healthcare domain depends on their integration with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment will be able to integrate with the specifications promoted by the eHealth European Interoperability Framework. The paper proposes a cloud-based service-oriented architecture that implements a context management system aligned with the HL7 standard known as CCOW.

  10. A standard-enabled workflow for synthetic biology.

    PubMed

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools from a diversity of sources to be connected. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce many types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  11. Workflow computing. Improving management and efficiency of pathology diagnostic services.

    PubMed

    Buffone, G J; Moreau, D; Beck, J R

    1996-04-01

    Traditionally, information technology in health care has helped practitioners collect, store, and present information, and has added a degree of automation to simple tasks (for example, instrument interfaces supporting result entry). Thus, commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health care, describes the inherent advantages of the technology, and defines workflow system requirements for use in health-care delivery, with special reference to diagnostic pathology.

  12. SU-E-J-73: Extension of a Clinical OIS/EMR/R&V System to Deliver Safe and Efficient Adaptive Plan-Of-The-Day Treatments Using a Fully Customizable Plan-Library-Based Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akhiat, A.; Elekta, Sunnyvale, CA; Kanis, A.P.

    Purpose: To extend a clinical Record and Verify (R&V) system to enable a safe and fast workflow for Plan-of-the-Day (PotD) adaptive treatments based on patient-specific plan libraries. Methods: Plan libraries for PotD adaptive treatments contain several pre-treatment generated treatment plans for each patient. They may be generated for various patient anatomies or CTV-PTV margins. For each fraction, a Cone Beam CT scan is acquired to support the selection of the plan that best fits the patient's anatomy-of-the-day. To date, there are no commercial R&V systems that support PotD delivery strategies. Consequently, the clinical workflow requires many manual interventions. Moreover, multiple scheduled plans have a high risk of excessive dose delivery. In this work we extended a commercial R&V system (MOSAIQ) to support PotD workflows using IQ-scripting. The PotD workflow was designed after extensive risk analysis of the manual procedure, and all identified risks were incorporated as logical checks. Results: All manual PotD activities were automated. The workflow first identifies whether the patient is scheduled for PotD, then performs safety checks, and continues to treatment plan selection only if no issues were found. The user selects the plan to deliver from a list of candidate plans. After plan selection, the workflow makes the treatment fields of the selected plan available for delivery by adding them to the treatment calendar. Finally, control is returned to the R&V system to commence treatment. Additional logic was added to incorporate off-line changes such as updating the plan library. After extensive testing, including treatment fraction interrupts and plan-library updates during the treatment course, the workflow is running successfully in a clinical pilot in which 35 patients have been treated since October 2014. Conclusion: We have extended a commercial R&V system for improved safety and efficiency in library-based adaptive strategies, enabling a wide-spread implementation of those strategies. This work was in part funded by a research grant of Elekta AB, Stockholm, Sweden.

  13. Cadaveric feasibility study of da Vinci Si-assisted cochlear implant with augmented visual navigation for otologic surgery.

    PubMed

    Liu, Wen P; Azizian, Mahdi; Sorger, Jonathan; Taylor, Russell H; Reilly, Brian K; Cleary, Kevin; Preciado, Diego

    2014-03-01

    To our knowledge, this is the first reported cadaveric feasibility study of a master-slave-assisted cochlear implant procedure in the otolaryngology-head and neck surgery field using the da Vinci Si system (da Vinci Surgical System; Intuitive Surgical, Inc). We describe the surgical workflow adaptations using a minimally invasive system and image guidance integrating intraoperative cone beam computed tomography through augmented reality. To test the feasibility of da Vinci Si-assisted cochlear implant surgery with augmented reality, with visualization of critical structures and facilitation with precise cochleostomy for electrode insertion. Cadaveric case study of bilateral cochlear implant approaches conducted at Intuitive Surgical Inc, Sunnyvale, California. Bilateral cadaveric mastoidectomies, posterior tympanostomies, and cochleostomies were performed using the da Vinci Si system on a single adult human donor cadaveric specimen. Radiographic confirmation of successful cochleostomies, placement of a phantom cochlear implant wire, and visual confirmation of critical anatomic structures (facial nerve, cochlea, and round window) in augmented stereoendoscopy. With a surgical mean time of 160 minutes per side, complete bilateral cochlear implant procedures were successfully performed with no violation of critical structures, notably the facial nerve, chorda tympani, sigmoid sinus, dura, or ossicles. Augmented reality image overlay of the facial nerve, round window position, and basal turn of the cochlea was precise. Postoperative cone beam computed tomography scans confirmed successful placement of the phantom implant electrode array into the basal turn of the cochlea. To our knowledge, this is the first study in the otolaryngology-head and neck surgery literature examining the use of master-slave-assisted cochleostomy with augmented reality for cochlear implants using the da Vinci Si system. The described system for cochleostomy has the potential to improve the surgeon's confidence, as well as surgical safety, efficiency, and precision by filtering tremor. The integration of augmented reality may be valuable for surgeons dealing with complex cases of congenital anatomic abnormality, for revision cochlear implant with distorted anatomy and poorly pneumatized mastoids, and as a method of interactive teaching. Further research into the cost-benefit ratio of da Vinci Si-assisted otologic surgery, as well as refinements of the proposed workflow, are required before considering clinical studies.

  14. Schedule-Aware Workflow Management Systems

    NASA Astrophysics Data System (ADS)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.

  15. Nexus: A modular workflow management system for quantum simulation codes

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
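
    A self-contained toy of the composition style described above, in which a workflow script reads like an input file and a runner resolves dependencies; every name here is invented for illustration, and none of this reproduces Nexus's actual interface.

    ```python
    # Toy dependency-resolving runner in the spirit of the described design;
    # the Task/run_project names are invented and are not Nexus's API.
    class Task:
        def __init__(self, identifier, command, dependencies=()):
            self.identifier, self.command = identifier, command
            self.dependencies = list(dependencies)
            self.done = False

        def run(self):
            print(f"[{self.identifier}] {self.command}")  # submit to a scheduler here
            self.done = True

    def run_project(*tasks):
        """Run each task once all of its dependencies have completed."""
        pending = list(tasks)
        while pending:
            ready = [t for t in pending if all(d.done for d in t.dependencies)]
            if not ready:
                raise RuntimeError("unsatisfiable or cyclic dependencies")
            for task in ready:
                task.run()
                pending.remove(task)

    scf = Task("scf", "pw.x -in diamond.scf.in")              # DFT stage (Quantum Espresso)
    qmc = Task("vmc", "qmcpack vmc.xml", dependencies=[scf])  # QMC stage (QMCPACK)
    run_project(qmc, scf)  # declaration order does not matter; scf runs first
    ```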

  16. Managing and Communicating Operational Workflow

    PubMed Central

    Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.

    2016-01-01

    Background: Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective: To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods: The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results: The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions: The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407

  17. Sharing on Web 3d Models of Ancient Theatres. a Methodological Workflow

    NASA Astrophysics Data System (ADS)

    Scianna, A.; La Guardia, M.; Scaduto, M. L.

    2016-06-01

    In the last few years, the need to share knowledge of Cultural Heritage (CH) on the Web through navigable 3D models has increased. This need requires the availability of Web-based virtual reality systems and 3D WebGIS. In order to make the information available to all stakeholders, these instruments should be powerful and at the same time very user-friendly. However, research and experiments carried out so far show that a standardized methodology does not yet exist, owing both to the complexity and size of the geometric models to be published and to the high costs of hardware and software tools. Against this background, the paper describes a methodological approach for creating 3D models of CH, freely exportable on the Web, based on HTML5 and free and open-source software. HTML5, supporting the WebGL standard, allows the exploration of 3D spatial models using the most widely used Web browsers, such as Chrome, Firefox, Safari, and Internet Explorer. The methodological workflow described here has been tested in the construction of a multimedia geospatial platform developed for the three-dimensional exploration and documentation of the ancient theatres of Segesta and of Carthage, and their surrounding landscapes. The experimental application has allowed us to explore the potential and limitations of sharing 3D CH models on the Web using the WebGL standard. Sharing capabilities could be extended by defining suitable geospatial Web services based on the capabilities of HTML5 and WebGL technology.

  18. How a surgeon becomes superman by visualization of intelligently fused multi-modalities

    NASA Astrophysics Data System (ADS)

    Erat, Okan; Pauly, Olivier; Weidert, Simon; Thaller, Peter; Euler, Ekkehard; Mutschler, Wolf; Navab, Nassir; Fallavollita, Pascal

    2013-03-01

    Motivation: The existing visualization of the Camera-augmented mobile C-arm (CamC) system does not provide sufficient depth cues and presents anatomical information in a way that can confuse surgeons. Methods: We propose a method that segments anatomical information from the X-ray and then augments it onto the video images. To provide depth cues, pixels in the video images are classified into skin and object classes, and the anatomical information from the X-ray is overlaid only where pixels have a high probability of belonging to the skin class. Results: We tested our algorithm by displaying the new visualization to 2 expert surgeons and 1 medical student during three surgical workflow steps of the intramedullary nail interlocking procedure, namely skin incision, center punching, and drilling. Via a survey questionnaire, they were asked to assess the new visualization compared to the current alpha-blending overlay image displayed by CamC. All participants agreed (100%) that occlusion handling and instrument tip position detection were immediately improved with our technique. When asked whether our visualization has the potential to replace the existing alpha-blending overlay during interlocking procedures, all participants did not hesitate to suggest immediate integration of the visualization for correct navigation and guidance of the procedure. Conclusion: Current alpha-blending visualizations lack proper depth cues and can be a source of confusion for surgeons. Our visualization concept shows great potential for alleviating occlusion and facilitating clinician understanding during specific workflow steps of the intramedullary nailing procedure.
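    A minimal numpy sketch of the general idea described above (blending X-ray anatomy onto the video only where pixels are likely skin), assuming a registered X-ray image and a per-pixel skin-probability map already exist; this is not the authors' exact classifier or blending scheme.

    ```python
    import numpy as np

    def skin_aware_overlay(video_rgb, xray_gray, skin_prob, threshold=0.5, alpha=0.6):
        """Overlay X-ray anatomy onto a video frame only where pixels are
        likely skin, so instruments and hands are not occluded.

        video_rgb : (H, W, 3) floats in [0, 1], live camera frame
        xray_gray : (H, W)    floats in [0, 1], registered X-ray image
        skin_prob : (H, W)    floats in [0, 1], per-pixel skin probability
        """
        out = video_rgb.copy()
        mask = skin_prob > threshold                      # pixels classified as skin
        xray_rgb = np.repeat(xray_gray[..., None], 3, axis=2)
        # Blend only on skin pixels; objects (tools, hands) keep the pure video.
        out[mask] = (1 - alpha) * video_rgb[mask] + alpha * xray_rgb[mask]
        return out

    # Tiny synthetic demo: 4x4 frame, skin on the left half only.
    h, w = 4, 4
    video = np.full((h, w, 3), 0.8)
    xray = np.zeros((h, w)); xray[1:3, 1:3] = 1.0         # "anatomy" blob
    skin = np.zeros((h, w)); skin[:, :2] = 0.9            # left half is skin
    fused = skin_aware_overlay(video, xray, skin)
    print(fused[:, :, 0])                                 # overlay visible only at left
    ```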

  19. Standardizing clinical trials workflow representation in UML for international site comparison.

    PubMed

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-11-09

    With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows.

  20. Standardizing Clinical Trials Workflow Representation in UML for International Site Comparison

    PubMed Central

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M. O.; Rodrigues, Maria J.; Shah, Jatin; Loures, Marco R.; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-01-01

    Background With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Methods Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Results Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. Conclusions This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows. PMID:21085484
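    A toy illustration of the time-motion aggregation step reported in the Results (total time, activity counts, and mean activity duration by role); the observation records below are hypothetical, not the study's data.

    ```python
    from collections import defaultdict

    # Hypothetical time-motion records: (role, activity, minutes observed).
    observations = [
        ("nurse", "draw blood", 6), ("nurse", "chart vitals", 4),
        ("nurse", "chart vitals", 3), ("crc", "transcribe notes", 15),
        ("admin", "schedule visit", 8), ("admin", "file consent", 5),
    ]

    totals = defaultdict(float)
    counts = defaultdict(int)
    for role, _activity, minutes in observations:
        totals[role] += minutes
        counts[role] += 1

    for role in totals:
        mean = totals[role] / counts[role]
        print(f"{role}: {totals[role]:.0f} min total, "
              f"{counts[role]} activities, {mean:.1f} min/activity")
    ```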

  1. The TimeStudio Project: An open source scientific workflow system for the behavioral and brain sciences.

    PubMed

    Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf

    2016-06-01

    This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website ( http://timestudioproject.com ) contains the latest releases of TimeStudio, together with documentation and user forums.
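    TimeStudio itself is written in MATLAB, but the dynamic pipelining of plugins it describes can be sketched in a few lines of Python: each "plugin" is a callable that transforms a shared data structure, and a workflow is simply an ordered list of plugins. This is a conceptual analogue, not the TimeStudio API.

    ```python
    # Conceptual Python analogue of plugin pipelining (TimeStudio itself
    # is MATLAB): each plugin takes and returns a state dictionary.
    def load_data(state):
        state["samples"] = [1.0, 2.0, 8.0, 3.0]           # stand-in for a data file
        return state

    def remove_outliers(state, limit=5.0):
        state["samples"] = [s for s in state["samples"] if s <= limit]
        return state

    def summarize(state):
        xs = state["samples"]
        state["mean"] = sum(xs) / len(xs)
        return state

    def run_workflow(plugins, state=None):
        state = state or {}
        for plugin in plugins:                             # dynamic pipeline
            state = plugin(state)
        return state

    result = run_workflow([load_data, remove_outliers, summarize])
    print(result["mean"])                                  # 2.0
    ```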

  2. Financial and workflow analysis of radiology reporting processes in the planning phase of implementation of a speech recognition system

    NASA Astrophysics Data System (ADS)

    Whang, Tom; Ratib, Osman M.; Umamoto, Kathleen; Grant, Edward G.; McCoy, Michael J.

    2002-05-01

    The goal of this study is to determine the financial value and workflow improvements achievable by replacing traditional transcription services with a speech recognition system in a large, university hospital setting. Workflow metrics were measured at two hospitals, one of which exclusively uses a transcription service (UCLA Medical Center), and the other which exclusively uses speech recognition (West Los Angeles VA Hospital). Workflow metrics include time spent per report (the sum of time spent interpreting, dictating, reviewing, and editing), transcription turnaround, and total report turnaround. Compared to traditional transcription, speech recognition resulted in radiologists spending 13-32% more time per report, but it also resulted in reduction of report turnaround time by 22-62% and reduction of marginal cost per report by 94%. The model developed here helps justify the introduction of a speech recognition system by showing that the benefits of reduced operating costs and decreased turnaround time outweigh the cost of increased time spent per report. Whether the ultimate goal is to achieve a financial objective or to improve operational efficiency, it is important to conduct a thorough analysis of workflow before implementation.
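    A hedged back-of-the-envelope model of the trade-off the study quantifies: extra radiologist time per report versus the avoided transcription fee. All dollar figures and the baseline time below are assumptions, with the 25% time increase taken as a midpoint of the reported 13-32% range.

    ```python
    # Back-of-the-envelope trade-off model; all inputs are assumptions.
    baseline_min_per_report = 10.0       # assumed dictation-workflow time
    extra_fraction = 0.25                # midpoint of the reported 13-32%
    radiologist_cost_per_min = 5.00      # assumed loaded cost, $/min

    extra_cost = baseline_min_per_report * extra_fraction * radiologist_cost_per_min
    print(f"extra radiologist cost per report: ${extra_cost:.2f}")

    # Speech recognition removes ~94% of the marginal (transcription) cost
    # per report, so the break-even transcription fee is:
    break_even_fee = extra_cost / 0.94
    print(f"speech recognition pays off when transcription costs "
          f"more than ${break_even_fee:.2f} per report")
    ```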

  3. Flexible workflow sharing and execution services for e-scientists

    NASA Astrophysics Data System (ADS)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data- and/or compute-intensive applications on Distributed Computing Infrastructures (DCIs) have recently become standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation with a simulation platform that connects different workflow systems, workflow languages, DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package already used by various scientific communities, and it is used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: a database where workflows and metadata about workflows can be stored; it serves as a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: a desktop environment that provides access capabilities similar to those of the SHIWA Portal, but runs on the user's desktop/laptop instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already integrated with the execution engine of the SHIWA Portal; other engines can be added when required. Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. Via third-party workflow engines, the Portal supports the most widely used academic workflow engines and can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows even when those are written in different languages and require different interpreters for execution. Through the workflow repository and the portal, individual scientists and scientific collaborations can share and offer workflows for reuse and execution. Given the integrated nature of the SHIWA Simulation Platform, the shared workflows can be executed online, without installing any special client environment or downloading workflows. The FP7 "Building a European Research Community through Interoperable Workflows and Data" (ER-flow) project disseminates the achievements of the SHIWA project and uses them to build workflow user communities across Europe. ER-flow provides application support to research communities within and beyond the project consortium to develop, share and run workflows with the SHIWA Simulation Platform.

  4. Integration of Earth System Models and Workflow Management under iRODS for the Northeast Regional Earth System Modeling Project

    NASA Astrophysics Data System (ADS)

    Lengyel, F.; Yang, P.; Rosenzweig, B.; Vorosmarty, C. J.

    2012-12-01

    The Northeast Regional Earth System Model (NE-RESM, NSF Award #1049181) integrates weather research and forecasting models, terrestrial and aquatic ecosystem models, a water balance/transport model, and mesoscale and energy systems input-output economic models developed by an interdisciplinary research team from academia and government with expertise in physics, biogeochemistry, engineering, energy, economics, and policy. NE-RESM is intended to forecast the implications of planning decisions on the region's environment, ecosystem services, energy systems and economy through the 21st century. Integration of model components and the development of cyberinfrastructure for interacting with the system are facilitated with the integrated Rule-Oriented Data System (iRODS), a distributed data grid that provides archival storage with metadata facilities and a rule-based workflow engine for automating and auditing scientific workflows.

  5. Evolution of Medication Administration Workflow in Implementing Electronic Health Record System

    ERIC Educational Resources Information Center

    Huang, Yuan-Han

    2013-01-01

    This study focused on the clinical workflow evolutions when implementing the health information technology (HIT). The study especially emphasized on administrating medication when the electronic health record (EHR) systems were adopted at rural healthcare facilities. Mixed-mode research methods, such as survey, observation, and focus group, were…

  6. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; Peng, J

    Purpose: To implement an entire-workflow quality assurance (QA) process in the radiotherapy department and to reduce radiotherapy error rates through entire-workflow management in a developing country. Methods: The entire-workflow QA process starts from patient registration and runs to the end of the last treatment, covering all steps of the radiotherapy process. The chart-check error rate is used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data for a total of around 6,000 patients before and after implementing the entire-workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), documentation of treatment QA, and QA of the treatment history. The error rate derived from chart checks decreased from 1.7% to 0.9% after the entire-workflow QA process was introduced. All errors caught before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed accordingly to prevent recurrence. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider our QA experience applicable to heavily loaded radiotherapy departments in developing countries.
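    As a rough check that a drop from 1.7% to 0.9% is statistically meaningful at this scale, a two-proportion z-test can be sketched as follows; the even 3,000/3,000 before-after split is an assumption, since the abstract reports only the ~6,000 total.

    ```python
    from math import sqrt, erf

    # Two-proportion z-test for the reported drop in chart-check error
    # rate (1.7% -> 0.9%); the 3000/3000 before-after split is assumed.
    n1, n2 = 3000, 3000
    p1, p2 = 0.017, 0.009

    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)            # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 1 - erf(abs(z) / sqrt(2))                 # two-sided p-value

    print(f"z = {z:.2f}, p = {p_value:.4f}")            # z ~ 2.7, p < 0.01
    ```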

  7. Proving Value in Radiology: Experience Developing and Implementing a Shareable Open Source Registry Platform Driven by Radiology Workflow.

    PubMed

    Gichoya, Judy Wawira; Kohli, Marc D; Haste, Paul; Abigail, Elizabeth Mills; Johnson, Matthew S

    2017-10-01

    Numerous initiatives are in place to support value-based care in radiology, including decision support using appropriateness criteria, quality metrics like radiation dose monitoring, and efforts to improve the quality of the radiology report for consumption by referring providers. These initiatives are largely data driven. Organizations can choose to purchase proprietary registry systems, pay for a software-as-a-service solution, or deploy/build their own registry systems. Traditionally, registries are created for a single purpose, like radiation dose tracking or specific disease tracking such as a diabetes registry. This results in a fragmented view of the patient and increases the overhead of maintaining single-purpose registry systems, each of which requires an alternative data-entry workflow and additional infrastructure to host and maintain. This complexity is magnified in the healthcare enterprise, where radiology systems usually run in parallel with other clinical systems owing to the different clinical workflow of radiologists. In the new era of value-based care, where data needs are increasing and turnaround times for decision-ready data are shrinking, there is a critical gap: registries better adapted to the radiology workflow with minimal setup and maintenance overhead. We share our experience of developing and implementing an open source registry system for quality improvement and research in our academic institution that is driven by our radiology workflow.

  8. Unrealized potential and residual consequences of electronic prescribing on pharmacy workflow in the outpatient pharmacy.

    PubMed

    Nanji, Karen C; Rothschild, Jeffrey M; Boehne, Jennifer J; Keohane, Carol A; Ash, Joan S; Poon, Eric G

    2014-01-01

    Electronic prescribing systems have often been promoted as a tool for reducing medication errors and adverse drug events. Recent evidence has revealed that adoption of electronic prescribing systems can lead to unintended consequences such as the introduction of new errors. The purpose of this study is to identify and characterize the unrealized potential and residual consequences of electronic prescribing on pharmacy workflow in an outpatient pharmacy. A multidisciplinary team conducted direct observations of workflow in an independent pharmacy and semi-structured interviews with pharmacy staff members about their perceptions of the unrealized potential and residual consequences of electronic prescribing systems. We used qualitative methods to iteratively analyze text data using a grounded theory approach, and derive a list of major themes and subthemes related to the unrealized potential and residual consequences of electronic prescribing. We identified the following five themes: Communication, workflow disruption, cost, technology, and opportunity for new errors. These contained 26 unique subthemes representing different facets of our observations and the pharmacy staff's perceptions of the unrealized potential and residual consequences of electronic prescribing. We offer targeted solutions to improve electronic prescribing systems by addressing the unrealized potential and residual consequences that we identified. These recommendations may be applied not only to improve staff perceptions of electronic prescribing systems but also to improve the design and/or selection of these systems in order to optimize communication and workflow within pharmacies while minimizing both cost and the potential for the introduction of new errors.

  9. DIaaS: Data-Intensive workflows as a service - Enabling easy composition and deployment of data-intensive workflows on Virtual Research Environments

    NASA Astrophysics Data System (ADS)

    Filgueira, R.; Ferreira da Silva, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. The backbone of the DIaaS model is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of the dispel4py and Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while data movement between different e-Infrastructures and coordination of the application execution are automatically managed by Pegasus. DIaaS combines the Asterism framework with Docker containers to provide an integrated, complete, easy-to-use, portable approach to running data-intensive workflows on distributed platforms. Three containers make up the DIaaS model: a Pegasus node, an MPI cluster, and an Apache Storm cluster. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py), linked to Docker Hub for continuous integration (automated image builds) and image storage and sharing. In this model, all software required to run scientific applications (workflow systems and execution engines) is packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using a data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase 1) and cross-correlates (Phase 2) traces from several seismic stations. The application is submitted via Pegasus (Container 1), and Phase 1 and Phase 2 are executed in the MPI (Container 2) and Storm (Container 3) clusters, respectively. Although both phases could be executed within the same environment, this setup demonstrates the flexibility of DIaaS to run applications across e-Infrastructures. In summary, DIaaS delivers specialized software to execute data-intensive applications in a scalable, efficient, and robust manner, reducing engineering time and computational cost.
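    A conceptual sketch of the two-phase stream pipeline described above, using plain Python generators rather than the actual dispel4py/Pegasus APIs; station IDs and trace values are synthetic.

    ```python
    # Conceptual two-phase stream pipeline (preprocess, then cross-
    # correlate); the real DIaaS stack runs Phase 1 under MPI and
    # Phase 2 under Storm via dispel4py/Pegasus.
    import itertools

    def traces(station_ids):
        for k, sid in enumerate(station_ids, start=1):    # synthetic traces
            yield sid, [float((i * k) % 5) for i in range(8)]

    def preprocess(stream):
        for sid, xs in stream:                            # Phase 1: demean
            mean = sum(xs) / len(xs)
            yield sid, [x - mean for x in xs]

    def cross_correlate(stream):
        cached = list(stream)                             # Phase 2: station pairs
        for (a, xa), (b, xb) in itertools.combinations(cached, 2):
            cc = sum(x * y for x, y in zip(xa, xb))       # zero-lag correlation
            yield (a, b), cc

    for pair, cc in cross_correlate(preprocess(traces(["STA1", "STA2", "STA3"]))):
        print(pair, round(cc, 2))
    ```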

  10. Modeling workflow to design machine translation applications for public health practice

    PubMed Central

    Turner, Anne M.; Brownstein, Megumu K.; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin

    2014-01-01

    Objective Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited-English-proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). Materials and Methods We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. Results The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. Discussion This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary among these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. Conclusion The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice. PMID:25445922

  11. Using a contextualized sensemaking model for interaction design: A case study of tumor contouring.

    PubMed

    Aselmaa, Anet; van Herk, Marcel; Laprie, Anne; Nestle, Ursula; Götz, Irina; Wiedenmann, Nicole; Schimek-Jasch, Tanja; Picaud, Francois; Syrykh, Charlotte; Cagetti, Leonel V; Jolnerovski, Maria; Song, Yu; Goossens, Richard H M

    2017-01-01

    Sensemaking theories help designers understand the cognitive processes of a user performing a complicated task. This paper introduces a two-step approach to incorporating sensemaking support within the design of health information systems by: (1) modeling the sensemaking process of physicians while performing a task, and (2) identifying software interaction design requirements that support sensemaking based on this model. The two-step approach is presented through a case study of the tumor contouring clinical task for radiotherapy planning. In the first step of the approach, a contextualized sensemaking model was developed to describe the sensemaking process based on the goal, the workflow and the context of the task. In the second step, based on a research software prototype, an experiment was conducted in which eight physicians each performed three contouring tasks. Four types of navigation interactions and five types of interaction sequence patterns were identified by analyzing the interaction log data gathered from those twenty-four cases. Further in-depth study of each navigation interaction and interaction sequence pattern in relation to the contextualized sensemaking model revealed five main areas for design improvements to increase sensemaking support. Outcomes of the case study indicate that the proposed two-step approach was beneficial for gaining a deeper understanding of the sensemaking process during the task, as well as for identifying design requirements for better sensemaking support. Copyright © 2016. Published by Elsevier Inc.

  12. 33 CFR 62.51 - Western Rivers Marking System.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Section 62.51, Navigation and Navigable Waters; COAST GUARD, DEPARTMENT OF HOMELAND SECURITY; AIDS TO NAVIGATION; UNITED STATES AIDS TO NAVIGATION SYSTEM; The U.S. Aids to Navigation System. § 62.51 Western Rivers Marking System. (a) A variation of the standard U.S. aids to navigation system described above is employed...

  13. 33 CFR 62.51 - Western Rivers Marking System.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Section 62.51, Navigation and Navigable Waters; COAST GUARD, DEPARTMENT OF HOMELAND SECURITY; AIDS TO NAVIGATION; UNITED STATES AIDS TO NAVIGATION SYSTEM; The U.S. Aids to Navigation System. § 62.51 Western Rivers Marking System. (a) A variation of the standard U.S. aids to navigation system described above is employed...

  14. 33 CFR 62.51 - Western Rivers Marking System.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Section 62.51, Navigation and Navigable Waters; COAST GUARD, DEPARTMENT OF HOMELAND SECURITY; AIDS TO NAVIGATION; UNITED STATES AIDS TO NAVIGATION SYSTEM; The U.S. Aids to Navigation System. § 62.51 Western Rivers Marking System. (a) A variation of the standard U.S. aids to navigation system described above is employed...

  15. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roelof Versteeg; Roelof Versteeg; Trevor Rowe

    2006-03-01

    We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.

  16. Lessons from implementing a combined workflow-informatics system for diabetes management.

    PubMed

    Zai, Adrian H; Grant, Richard W; Estey, Greg; Lester, William T; Andrews, Carl T; Yee, Ronnie; Mort, Elizabeth; Chueh, Henry C

    2008-01-01

    Shortcomings surrounding the care of patients with diabetes have been attributed largely to a fragmented, disorganized, and duplicative health care system that focuses more on acute conditions and complications than on managing chronic disease. To address these shortcomings, we developed a diabetes registry population management application to change the way our staff manages patients with diabetes. Use of this new application has helped us coordinate the responsibilities for intervening and monitoring patients in the registry among different users. Our experiences using this combined workflow-informatics intervention system suggest that integrating a chronic disease registry into clinical workflow for the treatment of chronic conditions creates a useful and efficient tool for managing disease.

  17. Development and Appraisal of Multiple Accounting Record System (Mars).

    PubMed

    Yu, H C; Chen, M C

    2016-01-01

    The aim of the system is to simplify workflow, reduce recording time, and increase income for the study hospital. The project team decided to develop a multiple accounting record system that generates account records from the nursing records automatically, reducing the time and effort nurses spend reviewing procedures and keeping a separate note of material consumption. Three configuration files were identified to capture the relationship between treatments and reimbursement items. The workflow was simplified, nurses saved an average of 10 minutes of daily recording time, and reimbursement points increased by 7.49%. The project streamlined the workflow and gives the institution a better tool for financial management.
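    A minimal sketch of the configuration-driven mapping such a system relies on: a documented nursing treatment implies a fixed set of reimbursement items, so accounting rows can be generated from nursing records automatically. Treatment names, items, and point values below are hypothetical.

    ```python
    # Hypothetical configuration mapping a documented nursing treatment
    # to the reimbursement items it implies.
    TREATMENT_MAP = {
        "wound dressing": [("dressing kit", 120), ("nursing fee", 80)],
        "iv infusion":    [("infusion set", 150), ("nursing fee", 60)],
    }

    def generate_account_records(nursing_records):
        for patient_id, treatment in nursing_records:
            for item, points in TREATMENT_MAP[treatment]:
                yield patient_id, item, points        # one accounting row per item

    records = [("P001", "wound dressing"), ("P002", "iv infusion")]
    for row in generate_account_records(records):
        print(row)
    ```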

  18. A Drupal-Based Collaborative Framework for Science Workflows

    NASA Astrophysics Data System (ADS)

    Pinheiro da Silva, P.; Gandara, A.

    2010-12-01

    Cyber-infrastructure is built from utilizing technical infrastructure to support organizational practices and social norms to provide support for scientific teams working together or dependent on each other to conduct scientific research. Such cyber-infrastructure enables the sharing of information and data so that scientists can leverage knowledge and expertise through automation. Scientific workflow systems have been used to build automated scientific systems used by scientists to conduct scientific research and, as a result, create artifacts in support of scientific discoveries. These complex systems are often developed by teams of scientists who are located in different places, e.g., scientists working in distinct buildings, and sometimes in different time zones, e.g., scientist working in distinct national laboratories. The sharing of these specifications is currently supported by the use of version control systems such as CVS or Subversion. Discussions about the design, improvement, and testing of these specifications, however, often happen elsewhere, e.g., through the exchange of email messages and IM chatting. Carrying on a discussion about these specifications is challenging because comments and specifications are not necessarily connected. For instance, the person reading a comment about a given workflow specification may not be able to see the workflow and even if the person can see the workflow, the person may not specifically know to which part of the workflow a given comments applies to. In this paper, we discuss the design, implementation and use of CI-Server, a Drupal-based infrastructure, to support the collaboration of both local and distributed teams of scientists using scientific workflows. CI-Server has three primary goals: to enable information sharing by providing tools that scientists can use within their scientific research to process data, publish and share artifacts; to build community by providing tools that support discussions between scientists about artifacts used or created through scientific processes; and to leverage the knowledge collected within the artifacts and scientific collaborations to support scientific discoveries.

  19. Science Gateways, Scientific Workflows and Open Community Software

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way to enable large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure, enabling "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakeSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing Apache Airavata as a hosted service to provide these features.

  20. Addressing informatics challenges in Translational Research with workflow technology.

    PubMed

    Beaulah, Simon A; Correll, Mick A; Munro, Robin E J; Sheldon, Jonathan G

    2008-09-01

    Interest in Translational Research has been growing rapidly in recent years. In this collision of different data, technologies and cultures lie tremendous opportunities for the advancement of science and business for organisations that are able to integrate, analyse and deliver this information effectively to users. Workflow-based integration and analysis systems are becoming recognised as a fast and flexible way to build applications that are tailored to scientific areas, yet are built on a common platform. Workflow systems are allowing organisations to meet the key informatics challenges in Translational Research and improve disease understanding and patient care.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin

    This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism used for distributing intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can significantly improve the aggregated throughput of many-task workflows by up to 86x, reduce contention on the shared file system, exploit data locality, and trade off locality and load balance.
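    A minimal stand-in for the idea behind the Hercules-based placement: intermediate workflow data is served from a key-value store and the shared file system is read only on a cache miss. This is illustrative only, not the Swift/Hercules API.

    ```python
    # Key-value cache layer for intermediate workflow data; the shared
    # file system is touched only on a miss. Illustrative sketch.
    class IntermediateStore:
        def __init__(self):
            self.kv = {}                        # in-memory surrogate for the store

        def put(self, path, data):
            self.kv[path] = data                # placed close to the consuming task

        def get(self, path):
            if path in self.kv:                 # locality hit: no shared-FS traffic
                return self.kv[path]
            with open(path, "rb") as f:         # miss: fall back to the shared FS
                data = f.read()
            self.kv[path] = data
            return data

    store = IntermediateStore()
    store.put("stage1/out.dat", b"intermediate bytes")
    print(store.get("stage1/out.dat"))          # served from the cache, not the FS
    ```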

  2. Nexus: a modular workflow management system for quantum simulation codes

    DOE PAGES

    Krogel, Jaron T.

    2015-08-24

    The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
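    An illustrative, input-file-like composition of a two-step simulation workflow in Python; the class and function names below are hypothetical stand-ins, not the actual Nexus API.

    ```python
    # Hypothetical sketch of declarative workflow composition with
    # dependency-ordered submission; not the actual Nexus interface.
    from dataclasses import dataclass, field

    @dataclass
    class Job:
        code: str
        inputs: dict
        depends_on: list = field(default_factory=list)

        def run(self):
            print(f"submitting {self.code} with {self.inputs}")

    relax = Job(code="quantum espresso", inputs={"calc": "relax", "ecut": 80})
    qmc   = Job(code="qmcpack", inputs={"method": "dmc"}, depends_on=[relax])

    def run_project(jobs):
        done = set()
        def submit(job):
            for dep in job.depends_on:          # submit dependencies first
                submit(dep)
            if id(job) not in done:
                job.run()
                done.add(id(job))
        for j in jobs:
            submit(j)

    run_project([qmc])   # runs the relax step first, then the QMC step
    ```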

  3. Provenance Datasets Highlighting Capture Disparities

    DTIC Science & Technology

    2014-01-01

    Vistrails [20], Taverna [21] or Kepler [6], and an OS-observing system like PASS [18]. In less granular workflow systems, the data files, scripts...run, etc. are capturable as long as they are executed within the workflow system. In more granular OS-observing systems, the actual reads, writes..."rolling up" very granular information to less granular information. OS-level capture knows that a socket was opened and that data was sent to a foreign

  4. Real-Time System for Water Modeling and Management

    NASA Astrophysics Data System (ADS)

    Lee, J.; Zhao, T.; David, C. H.; Minsker, B.

    2012-12-01

    Working closely with the Texas Commission on Environmental Quality (TCEQ) and the University of Texas at Austin (UT-Austin), we are developing a real-time system for water modeling and management using advanced cyberinfrastructure, data integration and geospatial visualization, and numerical modeling. The state of Texas suffered a severe drought in 2011 that cost the state $7.62 billion in agricultural losses (crops and livestock). Devastating situations such as this could potentially be avoided with better water modeling and management strategies that incorporate state of the art simulation and digital data integration. The goal of the project is to prototype a near-real-time decision support system for river modeling and management in Texas that can serve as a national and international model to promote more sustainable and resilient water systems. The system uses National Weather Service current and predicted precipitation data as input to the Noah-MP Land Surface model, which forecasts runoff, soil moisture, evapotranspiration, and water table levels given land surface features. These results are then used by a river model called RAPID, along with an error model currently under development at UT-Austin, to forecast stream flows in the rivers. Model forecasts are visualized as a Web application for TCEQ decision makers, who issue water diversion (withdrawal) permits and any needed drought restrictions; permit holders; and reservoir operation managers. Users will be able to adjust model parameters to predict the impacts of alternative curtailment scenarios or weather forecasts. A real-time optimization system under development will help TCEQ to identify optimal curtailment strategies to minimize impacts on permit holders and protect health and safety. To develop the system we have implemented RAPID as a remotely-executed modeling service using the Cyberintegrator workflow system with input data downloaded from the North American Land Data Assimilation System. The Cyberintegrator workflow system provides RESTful web services for users to provide inputs, execute workflows, and retrieve outputs. Along with REST endpoints, PAW (Publishable Active Workflows) provides the web user interface toolkit for us to develop web applications with scientific workflows. The prototype web application is built on top of workflows with PAW, so that users will have a user-friendly web environment to provide input parameters, execute the model, and visualize/retrieve the results using geospatial mapping tools. In future work the optimization model will be developed and integrated into the workflow.
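    A hedged sketch of driving such a workflow over REST with the requests library, following the provide-inputs / execute / retrieve-outputs pattern described above. The base URL, endpoint paths, and JSON fields below are hypothetical, not the Cyberintegrator API.

    ```python
    # Hypothetical REST client for a workflow service: provide inputs,
    # trigger an execution, retrieve outputs. Endpoints are illustrative.
    import requests

    BASE = "https://example.org/workflow/api"          # hypothetical endpoint

    def run_rapid_forecast(precip_dataset_id, token):
        headers = {"Authorization": f"Bearer {token}"}
        # 1. provide inputs
        r = requests.post(f"{BASE}/inputs", json={"precip": precip_dataset_id},
                          headers=headers, timeout=30)
        r.raise_for_status()
        input_id = r.json()["id"]
        # 2. execute the workflow
        r = requests.post(f"{BASE}/executions",
                          json={"workflow": "noah-mp-rapid", "input": input_id},
                          headers=headers, timeout=30)
        r.raise_for_status()
        exec_id = r.json()["id"]
        # 3. retrieve outputs (polling for completion omitted for brevity)
        r = requests.get(f"{BASE}/executions/{exec_id}/outputs",
                         headers=headers, timeout=30)
        r.raise_for_status()
        return r.json()
    ```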

  5. A Scheduling Algorithm for the Distributed Student Registration System in Transaction-Intensive Environment

    ERIC Educational Resources Information Center

    Li, Wenhao

    2011-01-01

    Distributed workflow technology has been widely used in modern education and e-business systems. Distributed web applications have shown cross-domain and cooperative characteristics to meet the need of current distributed workflow applications. In this paper, the author proposes a dynamic and adaptive scheduling algorithm PCSA (Pre-Calculated…

  6. Real-time MRI guidance of cardiac interventions.

    PubMed

    Campbell-Washburn, Adrienne E; Tavallaei, Mohammad A; Pop, Mihaela; Grant, Elena K; Chubb, Henry; Rhode, Kawal; Wright, Graham A

    2017-10-01

    Cardiac magnetic resonance imaging (MRI) is appealing to guide complex cardiac procedures because it is ionizing radiation-free and offers flexible soft-tissue contrast. Interventional cardiac MR promises to improve existing procedures and enable new ones for complex arrhythmias, as well as congenital and structural heart disease. Guiding invasive procedures demands faster image acquisition, reconstruction and analysis, as well as intuitive intraprocedural display of imaging data. Standard cardiac MR techniques such as 3D anatomical imaging, cardiac function and flow, parameter mapping, and late-gadolinium enhancement can be used to gather valuable clinical data at various procedural stages. Rapid intraprocedural image analysis can extract and highlight critical information about interventional targets and outcomes. In some cases, real-time interactive imaging is used to provide a continuous stream of images displayed to interventionalists for dynamic device navigation. Alternatively, devices are navigated relative to a roadmap of major cardiac structures generated through fast segmentation and registration. Interventional devices can be visualized and tracked throughout a procedure with specialized imaging methods. In a clinical setting, advanced imaging must be integrated with other clinical tools and patient data. In order to perform these complex procedures, interventional cardiac MR relies on customized equipment, such as interactive imaging environments, in-room image display, audio communication, hemodynamic monitoring and recording systems, and electroanatomical mapping and ablation systems. Operating in this sophisticated environment requires coordination and planning. This review provides an overview of the imaging technology used in MRI-guided cardiac interventions. Specifically, this review outlines clinical targets, standard image acquisition and analysis tools, and the integration of these tools into clinical workflow. Level of Evidence: 1. Technical Efficacy: Stage 5. J. Magn. Reson. Imaging 2017;46:935-950. © 2017 International Society for Magnetic Resonance in Medicine.

  7. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
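    A minimal sketch of the second (Python-based) approach under stated assumptions: each analysis component is a plain function, and a small driver chains them while recording status so intermediate outputs remain inspectable. Component names and the toy shear formula are illustrative, not DESC code.

    ```python
    # Illustrative Python workflow: components are plain functions, and
    # a driver runs them in order while recording per-stage status.
    def generate_catalog(cfg):
        return {"n_galaxies": cfg["n"]}

    def estimate_shear(catalog):
        return {"shear": 0.01 * catalog["n_galaxies"] ** 0.5}   # toy formula

    def make_plots(shear):
        return f"plotted shear = {shear['shear']:.3f}"

    def run(components, cfg):
        status, data = [], cfg
        for comp in components:
            data = comp(data)                  # each output feeds the next stage
            status.append((comp.__name__, "done"))
        return data, status

    result, status = run([generate_catalog, estimate_shear, make_plots], {"n": 400})
    print(result)                              # plotted shear = 0.200
    print(status)
    ```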

  8. Clinical indications for high-field 1.5 T intraoperative magnetic resonance imaging and neuro-navigation for neurosurgical procedures. Review of initial 100 cases.

    PubMed

    Maesawa, Satoshi; Fujii, Masazumi; Nakahara, Norimoto; Watanabe, Tadashi; Saito, Kiyoshi; Kajita, Yasukazu; Nagatani, Tetsuya; Wakabayashi, Toshihiko; Yoshida, Jun

    2009-08-01

    Initial experiences are reviewed in an integrated operation theater equipped with an intraoperative high-field (1.5 T) magnetic resonance (MR) imager and neuro-navigation (BrainSUITE), to evaluate the indications and limitations. One hundred consecutive cases were treated, consisting of 38 gliomas, 49 other tumors, 11 cerebrovascular diseases, and 2 functional diseases. The feasibility and usefulness of the integrated theater were evaluated for individual diseases, focusing on whether intraoperative images (including diffusion tensor imaging) affected the surgical strategy. The extent of resection and outcomes in each histological category of brain tumors were examined. Intraoperative high-field MR imaging frequently affected or modified the surgical strategy in the glioma group (27/38 cases, 71.1%), but less in the other tumor group (13/49 cases, 26.5%). The surgical strategy was not modified in cerebrovascular or functional diseases, but the success of procedures and the absence of complications could be confirmed. In glioma surgery, subtotal or greater resection was achieved in 22 of the 31 patients (71%) excluding biopsies, and intraoperative images revealed tumor remnants resulting in the extension of resection in 21 of the 22 patients (95.4%), the highest rate of extension among all types of pathologies. The integrated neuro-navigation improved workflow. The best indication for intraoperative high-field MR imaging and integrated neuro-navigation is brain tumors, especially gliomas, and is supplementary in assuring quality in surgery for cerebrovascular or functional diseases. Immediate quality assurance is provided in several types of neurosurgical procedures.

  9. University of TX Bureau of Economic Geology's Core Research Centers: The Time is Right for Registering Physical Samples and Assigning IGSN's - Workflows, Stumbling Blocks, and Successes.

    NASA Astrophysics Data System (ADS)

    Averett, A.; DeJarnett, B. B.

    2016-12-01

    The University of Texas Bureau of Economic Geology (BEG) serves as the geological survey for Texas and operates three geological sample repositories that house well over 2 million boxes of geological samples (cores and cuttings) and an abundant amount of geoscience data (geophysical logs, thin sections, geochemical analyses, etc.). Material is accessible and searchable online, and it is publicly available to the geological community for research and education. Patrons access information about our collection by using our online core and log database (SQL format). BEG is currently undertaking a large project to: 1) improve the internal accuracy of metadata associated with the collection; 2) enhance the capabilities of the database for both BEG curators and researchers as well as our external patrons; and 3) ensure easy and efficient navigation for patrons through our online portal. As BEG undertakes this project, BEG is in the early stages of planning to export the metadata for its collection into SESAR (System for Earth Sample Registration) and have IGSNs (International GeoSample Numbers) assigned to its samples. Education regarding the value of IGSNs and an external registry (SESAR) has been crucial to receiving management support for the project because the concept and potential benefits of registering samples in a registry outside of the institution were not well-known prior to this project. Potential benefits such as increases in discoverability, repository recognition in publications, and interoperability were presented. The project was well-received by management, and BEG fully supports the effort to register our physical samples with SESAR. Since BEG is only in the initial phase of this project, any stumbling blocks, workflow issues, successes/failures, etc. can only be predicted at this point, but by mid-December, BEG expects to have several concrete issues to present in the session. Currently, our most pressing issue involves establishing the most efficient workflow for exporting large amounts of metadata in a format that SESAR can easily ingest, and how this can be best accomplished with very few BEG staff assigned to the project.
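    A sketch of the metadata-export step, assuming a flat CSV is an acceptable batch-ingest format; the exact SESAR template column names should be taken from SESAR's documentation, and the headers and sample rows below are illustrative only.

    ```python
    # Illustrative export of repository metadata to a flat CSV for batch
    # registration; column names and rows are placeholders.
    import csv

    samples = [  # rows as they might come from the repository database
        {"name": "BEG-CORE-000123", "material": "Rock", "lat": 31.99, "lon": -102.08},
        {"name": "BEG-CORE-000124", "material": "Rock", "lat": 28.97, "lon": -95.32},
    ]

    with open("sesar_batch.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Sample Name", "Material", "Latitude", "Longitude"])
        for s in samples:
            writer.writerow([s["name"], s["material"], s["lat"], s["lon"]])
    ```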

  10. [Measures to prevent patient identification errors in blood collection/physiological function testing utilizing a laboratory information system].

    PubMed

    Shimazu, Chisato; Hoshino, Satoshi; Furukawa, Taiji

    2013-08-01

    We constructed an integrated personal identification workflow chart using both bar-code reading and an all-in-one laboratory information system. The information system handles not only test data but also the information needed for patient guidance in the laboratory department. The reception terminals at the entrance, displays for patient guidance and patient identification tools at blood-sampling booths are all controlled by the information system. The number of patient identification errors was greatly reduced by the system. However, identification errors have not been eliminated in the ultrasound department. After re-evaluating the patient identification process in this department, we recognized that the major source of errors was an excessively complex identification workflow. Ordinarily, an ultrasound test requires patient identification three times, because three different systems are involved in the test process: the ultrasound modality system, the laboratory information system, and a reporting system. We are trying to connect the three systems to develop a one-time identification workflow, but this is not a simple task and has not yet been completed. Utilization of the laboratory information system is effective, but not yet sufficient for patient identification. Even today, the most fundamental procedure for patient identification is to ask the person's name. Everyday checks in the ordinary workflow and everyone's participation in safety-management activities are important for the prevention of patient identification errors.

  11. Visualizing Cross-sectional Data in a Real-World Context

    NASA Astrophysics Data System (ADS)

    Van Noten, K.; Lecocq, T.

    2016-12-01

    If you could fly around your research results in three dimensions, wouldn't you like to do it? Visualizing research results properly during scientific presentations already does half the job of informing the public about the geographic framework of your research. Many scientists use the Google Earth™ mapping service (V7.1.2.2041) because it is a great interactive mapping tool for assigning geographic coordinates to individual data points, localizing a research area, and draping maps of results over Earth's surface for 3D visualization. However, visualizations of research results in vertical cross-sections are often not shown simultaneously with the maps in Google Earth. A few tutorials and programs to display cross-sectional data in Google Earth do exist, and the workflow is rather simple. By importing a cross-sectional figure into the open software SketchUp Make [Trimble Navigation Limited, 2016], any spatial model can be exported as a vertical figure in Google Earth. In this presentation a clear workflow/tutorial is presented on how to display cross-sections manually in Google Earth. No software skills or programming are required. It is very easy to use, offers great possibilities for teaching, and allows fast figure manipulation in Google Earth. The full workflow can be found in "Van Noten, K. 2016. Visualizing Cross-Sectional Data in a Real-World Context. EOS, Transactions AGU, 97, 16-19". The video tutorial can be found at https://www.youtube.com/watch?v=Tr8LwFJ4RYU. Figure: Cross-sectional Research Examples Illustrated in Google Earth.
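    One way the end product of such a workflow can be expressed is a KML polygon with absolute altitudes forming a vertical plane hung between two surface points; the sketch below writes such a file by hand with purely illustrative coordinates, as an alternative to the SketchUp route.

    ```python
    # Write a minimal KML polygon whose vertices use absolute altitudes,
    # producing a vertical "frame" for a cross-section in Google Earth.
    corners = [  # (lon, lat, altitude_m): top edge, then bottom edge, closed ring
        (4.35, 50.85, 1000), (4.40, 50.86, 1000),
        (4.40, 50.86, 0), (4.35, 50.85, 0), (4.35, 50.85, 1000),
    ]
    coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in corners)
    kml = f"""<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2"><Placemark>
      <name>cross-section frame</name>
      <Polygon><altitudeMode>absolute</altitudeMode>
        <outerBoundaryIs><LinearRing><coordinates>{coords}</coordinates>
        </LinearRing></outerBoundaryIs></Polygon>
    </Placemark></kml>"""
    with open("section.kml", "w") as f:
        f.write(kml)
    ```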

  12. Patient-specific polyetheretherketone facial implants in a computer-aided planning workflow.

    PubMed

    Guevara-Rojas, Godoberto; Figl, Michael; Schicho, Kurt; Seemann, Rudolf; Traxler, Hannes; Vacariu, Apostolos; Carbon, Claus-Christian; Ewers, Rolf; Watzinger, Franz

    2014-09-01

    In the present study, we report an innovative workflow using polyetheretherketone (PEEK) patient-specific implants for esthetic corrections in the facial region through onlay grafting. The planning includes implant design according to virtual osteotomy and generation of a subtraction volume. The implant design was refined by stepwise changing the implant geometry according to soft tissue simulations. One patient was scanned using computed tomography. PEEK implants were interactively designed and manufactured using rapid prototyping techniques. Positioning intraoperatively was assisted by computer-aided navigation. Two months after surgery, a 3-dimensional surface model of the patient's face was generated using photogrammetry. Finally, the Hausdorff distance calculation was used to quantify the overall error, encompassing the failures in soft tissue simulation and implantation. The implant positioning process during surgery was satisfactory. The simulated soft tissue surface and the photogrammetry scan of the patient showed a high correspondence, especially where the skin covered the implants. The mean total error (Hausdorff distance) was 0.81 ± 1.00 mm (median 0.48, interquartile range 1.11). The spatial deviation remained less than 0.7 mm for the vast majority of points. The proposed workflow provides a complete computer-aided design, computer-aided manufacturing, and computer-aided surgery chain for implant design, allowing for soft tissue simulation, fabrication of patient-specific implants, and image-guided surgery to position the implants. Much of the surgical complexity resulting from osteotomies of the zygoma, chin, or mandibular angle might be transferred into the planning phase of patient-specific implants. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
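    A sketch of the error-quantification step on toy point clouds, using SciPy's directed Hausdorff distance; the symmetric Hausdorff distance is the maximum of the two directed distances. The synthetic surfaces below stand in for the simulated soft-tissue surface and the photogrammetry scan.

    ```python
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    rng = np.random.default_rng(0)
    simulated = rng.normal(size=(500, 3))                       # stand-in surface (mm)
    scanned = simulated + rng.normal(scale=0.5, size=(500, 3))  # noisy "scan"

    # Symmetric Hausdorff distance = max of the two directed distances.
    d_ab = directed_hausdorff(simulated, scanned)[0]
    d_ba = directed_hausdorff(scanned, simulated)[0]
    print(f"Hausdorff distance: {max(d_ab, d_ba):.2f} mm")
    ```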

  13. Barriers to effective, safe communication and workflow between nurses and non-consultant hospital doctors during out-of-hours.

    PubMed

    Brady, Anne-Marie; Byrne, Gobnait; Quirke, Mary Brigid; Lynch, Aine; Ennis, Shauna; Bhangu, Jaspreet; Prendergast, Meabh

    2017-11-01

    This study aimed to evaluate the nature and type of communication and workflow arrangements between nurses and doctors out-of-hours (OOH). Effective communication and workflow arrangements between nurses and doctors are essential to minimize risk in hospital settings, particularly in the out-of-hours period. Timely patient flow is a priority for all healthcare organizations and the quality of communication and workflow arrangements influences patient safety. Qualitative descriptive design and data collection methods included focus groups and individual interviews. A 500-bed tertiary referral acute hospital in Ireland. Junior and senior Non-Consultant Hospital Doctors, staff nurses and nurse managers. Both nurses and doctors acknowledged the importance of good interdisciplinary communication and collaborative working, in sustaining effective workflow and enabling a supportive working environment and patient safety. Indeed, issues of safety and missed care OOH were found to be primarily due to difficulties of communication and workflow. Medical workflow OOH is often dependent on cues and communication to/from nursing. However, communication systems and, in particular, the bleep system, considered central to the process of communication between doctors and nurses OOH, can contribute to workflow challenges and increased staff stress. It was reported as commonplace for routine work that should be completed during normal hours to fall into the OOH period when resources were most limited, further compounding risk to patient safety. Enhancement of communication strategies between nurses and doctors has the potential to remove barriers to effective decision-making and patient flow. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  14. A SINS/SRS/GNS Autonomous Integrated Navigation System Based on Spectral Redshift Velocity Measurements.

    PubMed

    Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke

    2018-04-09

    To meet the autonomy and reliability requirements of navigation systems, a new scheme consisting of a Strapdown Inertial Navigation System (SINS), Spectral Redshift (SRS) navigation, and a Geomagnetic Navigation System (GNS) is designed for autonomous integrated navigation, using velocity measurements derived from the spectral redshift information of natural celestial bodies. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. Simulation experiments show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, thus providing a new solution for autonomous navigation technology.
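    The spectral redshift velocity idea can be illustrated with the Doppler relation between an observed line shift and radial velocity; the following sketch is ours, not the paper's filter, and the spectral line and shift values are made-up examples:

    ```python
    C = 299_792_458.0  # speed of light, m/s

    def redshift_to_radial_velocity(lambda_obs, lambda_rest):
        """Radial velocity from an observed spectral line shift.
        Uses the special-relativistic Doppler relation; for small z this
        reduces to v ~ c * z."""
        z = (lambda_obs - lambda_rest) / lambda_rest
        r = (1.0 + z) ** 2
        return C * (r - 1.0) / (r + 1.0)

    # Example: hydrogen-alpha line (656.281 nm) observed shifted by 0.01 nm.
    print(redshift_to_radial_velocity(656.291e-9, 656.281e-9))  # ~4.6 km/s
    ```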

  15. Navigation for space shuttle approach and landing using an inertial navigation system augmented by data from a precision ranging system or a microwave scan beam landing guidance system

    NASA Technical Reports Server (NTRS)

    Mcgee, L. A.; Smith, G. L.; Hegarty, D. M.; Merrick, R. B.; Carson, T. M.; Schmidt, S. F.

    1970-01-01

    A preliminary study has been made of the navigation performance which might be achieved for the high cross-range space shuttle orbiter during final approach and landing by using an optimally augmented inertial navigation system. Computed navigation accuracies are presented for an on-board inertial navigation system augmented (by means of an optimal filter algorithm) with data from two different ground navigation aids: a precision ranging system and a microwave scanning beam landing guidance system. These results show that augmentation with either type of ground navigation aid is capable of providing navigation performance at touchdown which should be adequate for the space shuttle. In addition, adequate navigation performance for space shuttle landing is obtainable from the precision ranging system even with a complete dropout of precision range measurements as much as 100 seconds before touchdown.
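    The "optimal filter algorithm" referred to above blends inertial estimates with external measurements in the Kalman fashion; a minimal single-measurement update is sketched here for illustration (not the original shuttle implementation):

    ```python
    import numpy as np

    def kalman_update(x, P, z, H, R):
        """One measurement update: state x (n,), covariance P (n,n),
        measurement z (m,), measurement model H (m,n), noise covariance R (m,m)."""
        y = z - H @ x                      # innovation
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x_new = x + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new

    # Example: correct a 1D position estimate with a noisy range measurement.
    x, P = np.array([100.0]), np.array([[25.0]])
    z, H, R = np.array([97.0]), np.array([[1.0]]), np.array([[4.0]])
    print(kalman_update(x, P, z, H, R))  # estimate pulled toward 97, variance shrinks
    ```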

  16. From chart tracking to workflow management.

    PubMed Central

    Srinivasan, P.; Vignes, G.; Venable, C.; Hazelwood, A.; Cade, T.

    1994-01-01

    The current interest in system-wide integration appears to be based on the assumption that an organization, by digitizing information and accepting a common standard for the exchange of such information, will improve the accessibility of this information and automatically experience benefits resulting from its more productive use. We do not dispute this reasoning, but assert that an organization's capacity for effective change is proportional to the understanding of the current structure among its personnel. Our workflow manager is based on the use of a Parameterized Petri Net (PPN) model which can be configured to represent an arbitrarily detailed picture of an organization. The PPN model can be animated to observe the model organization in action, and the results of the animation analyzed. This simulation is a dynamic, ongoing process which changes with the system and allows members of the organization to pose "what if" questions as a means of exploring opportunities for change. We present the "workflow management system" as the natural successor to the tracking program, incorporating modeling, scheduling, reactive planning, performance evaluation, and simulation. This workflow management system is more than adequate for meeting the needs of a paper chart tracking system and, as the patient record is computerized, will serve as a planning and evaluation tool in converting the paper-based health information system into a computer-based system. PMID:7950051
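    A Petri net of the kind underlying the PPN model can be sketched in a few lines; the following toy chart-tracking net is illustrative only and much simpler than the paper's parameterized formulation:

    ```python
    # Minimal place/transition (Petri net) firing sketch -- illustrative only.
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)   # place -> token count
            self.transitions = {}          # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

        def fire(self, name):
            inputs, outputs = self.transitions[name]
            assert self.enabled(name), f"{name} not enabled"
            for p, n in inputs.items():
                self.marking[p] -= n
            for p, n in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + n

    # Toy chart-tracking flow: a requested chart moves from filing to a clinic.
    net = PetriNet({"in_filing": 1, "requested": 1})
    net.add_transition("pull_chart", {"in_filing": 1, "requested": 1}, {"at_clinic": 1})
    net.add_transition("return_chart", {"at_clinic": 1}, {"in_filing": 1})
    net.fire("pull_chart")
    print(net.marking)  # {'in_filing': 0, 'requested': 0, 'at_clinic': 1}
    ```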

  17. INL Autonomous Navigation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2005-03-30

    The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation, including obstacle avoidance, waypoint navigation, and path planning, in both indoor and outdoor environments.

  18. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    PubMed

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large-scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost-effectively evolve such applications over a long lifetime.
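    The flavor of injecting a decision point into an existing workflow at runtime can be sketched as follows; this is an illustrative Python analogy, not the PDD toolchain, and all names are hypothetical:

    ```python
    # Illustrative sketch of a runtime decision point: policies registered
    # after deployment are consulted before a workflow step runs.
    POLICIES = []  # each policy: callable(context) -> bool

    def register_policy(fn):
        POLICIES.append(fn)
        return fn

    def decision_point(step):
        """Wrap a workflow step so all registered policies vote on execution."""
        def wrapped(context):
            if all(policy(context) for policy in POLICIES):
                return step(context)
            raise PermissionError(f"policy denied step {step.__name__}")
        return wrapped

    @decision_point
    def release_dataset(context):
        return f"released to {context['user']}"

    # A stakeholder group injects an access-control policy without touching the step.
    @register_policy
    def only_members(context):
        return context["user"] in {"alice", "bob"}

    print(release_dataset({"user": "alice"}))
    ```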

  19. A SINS/SRS/GNS Autonomous Integrated Navigation System Based on Spectral Redshift Velocity Measurements

    PubMed Central

    Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke

    2018-01-01

    To meet the autonomy and reliability requirements of navigation systems, a new scheme consisting of a Strapdown Inertial Navigation System (SINS), Spectral Redshift (SRS) navigation, and a Geomagnetic Navigation System (GNS) is designed for autonomous integrated navigation, using velocity measurements derived from the spectral redshift information of natural celestial bodies. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. Simulation experiments show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, thus providing a new solution for autonomous navigation technology. PMID:29642549

  20. Confidentiality Protection of User Data and Adaptive Resource Allocation for Managing Multiple Workflow Performance in Service-Based Systems

    ERIC Educational Resources Information Center

    An, Ho

    2012-01-01

    In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing performance of multiple workflows in SBS. Current SBSs pose serious limitations to protecting users' data confidentiality. Since users' sensitive data is sent in…

  1. Changes in the cardiac rehabilitation workflow process needed for the implementation of a self-management system.

    PubMed

    Wiggers, Anne-Marieke; Vosbergen, Sandra; Kraaijenhagen, Roderik; Jaspers, Monique; Peek, Niels

    2013-01-01

    E-health interventions are of growing importance for the self-management of chronic conditions. This study aimed to describe the process adaptations needed in cardiac rehabilitation (CR) to implement a self-management system called MyCARDSS. We created a generic workflow model based on interviews and observations at three CR clinics. Subsequently, a workflow model of the ideal situation after implementation of MyCARDSS was created. We found that the implementation will increase the complexity of existing working procedures because 1) not all patients will use MyCARDSS, 2) there is a transfer of tasks and responsibilities from professionals to patients, and 3) information in MyCARDSS needs to be synchronized with the EPR system used by professionals.

  2. Flexible add-on solution for MR image-guided interventions in a closed-bore scanner environment.

    PubMed

    Busse, Harald; Garnov, Nikita; Thörmer, Gregor; Zajonz, Dirk; Gründer, Wilfried; Kahn, Thomas; Moche, Michael

    2010-09-01

    MRI is of great clinical utility for the guidance of various diagnostic and therapeutic procedures. In a standard closed-bore scanner, the simplest approach is to manipulate the instrument outside the bore and move the patient into the bore for reference and control imaging only. Without navigational assistance, however, such an approach can be difficult, inaccurate, and time consuming. Therefore, an add-on navigation solution is described that addresses these limitations. Patient registration is established by an automatic, robust, and fast (<30 sec) localization of table-mounted MR reference markers, and the instrument is tracked optically. Good hand-eye coordination is provided by following the virtual instrument on MR images that are reconstructed in real time from the reference data. Needle displacements of 2.2 ± 0.6 mm and 3.9 ± 2.4 mm were determined in a phantom (P < 0.05), depending on whether the reference markers were placed at smaller (98-139 mm) or larger (147-188 mm) distances from the isocenter. Clinical functionality of the navigation concept is demonstrated by a double oblique, subscapular hook-wire insertion in a patient with a body mass index of 30.1 kg/m(2). Ease of use, compactness, and flexibility of this technique suggest that it can be used for many other procedures in different body regions. More patient cases are needed to evaluate clinical performance and workflow. Copyright © 2010 Wiley-Liss, Inc.

  3. A Novel Augmented Reality Navigation System for Endoscopic Sinus and Skull Base Surgery: A Feasibility Study

    PubMed Central

    Li, Liang; Yang, Jian; Chu, Yakui; Wu, Wenbo; Xue, Jin; Liang, Ping; Chen, Lei

    2016-01-01

    Objective To verify the reliability and clinical feasibility of a self-developed navigation system based on an augmented reality technique for endoscopic sinus and skull base surgery. Materials and Methods In this study we performed a head phantom and cadaver experiment to determine the display effect and accuracy of our navigational system. We compared cadaver head-based simulated operations, the target registration error, operation time, and National Aeronautics and Space Administration Task Load Index scores of our navigation system to conventional navigation systems. Results The navigation system developed in this study has a novel display mode capable of fusing endoscopic images to three-dimensional (3-D) virtual images. In the cadaver head experiment, the target registration error was 1.28 ± 0.45 mm, which met the accepted standards of a navigation system used for nasal endoscopic surgery. Compared with conventional navigation systems, the new system was more effective in terms of operation time and the mental workload of surgeons, which is especially important for less experienced surgeons. Conclusion The self-developed augmented reality navigation system for endoscopic sinus and skull base surgery appears to have advantages that outweigh those of conventional navigation systems. We conclude that this navigational system will provide rhinologists with more intuitive and more detailed imaging information, thus reducing the judgment time and mental workload of surgeons when performing complex sinus and skull base surgeries. Ultimately, this new navigational system has potential to increase the quality of surgeries. In addition, the augmented reality navigational system could be of interest to junior doctors being trained in endoscopic techniques because it could speed up their learning. However, it should be noted that the navigation system serves as an adjunct to a surgeon’s skills and knowledge, not as a substitute. PMID:26757365

  4. A Novel Augmented Reality Navigation System for Endoscopic Sinus and Skull Base Surgery: A Feasibility Study.

    PubMed

    Li, Liang; Yang, Jian; Chu, Yakui; Wu, Wenbo; Xue, Jin; Liang, Ping; Chen, Lei

    2016-01-01

    To verify the reliability and clinical feasibility of a self-developed navigation system based on an augmented reality technique for endoscopic sinus and skull base surgery. In this study we performed a head phantom and cadaver experiment to determine the display effect and accuracy of our navigational system. We compared cadaver head-based simulated operations, the target registration error, operation time, and National Aeronautics and Space Administration Task Load Index scores of our navigation system to conventional navigation systems. The navigation system developed in this study has a novel display mode capable of fusing endoscopic images to three-dimensional (3-D) virtual images. In the cadaver head experiment, the target registration error was 1.28 ± 0.45 mm, which met the accepted standards of a navigation system used for nasal endoscopic surgery. Compared with conventional navigation systems, the new system was more effective in terms of operation time and the mental workload of surgeons, which is especially important for less experienced surgeons. The self-developed augmented reality navigation system for endoscopic sinus and skull base surgery appears to have advantages that outweigh those of conventional navigation systems. We conclude that this navigational system will provide rhinologists with more intuitive and more detailed imaging information, thus reducing the judgment time and mental workload of surgeons when performing complex sinus and skull base surgeries. Ultimately, this new navigational system has potential to increase the quality of surgeries. In addition, the augmented reality navigational system could be of interest to junior doctors being trained in endoscopic techniques because it could speed up their learning. However, it should be noted that the navigation system serves as an adjunct to a surgeon's skills and knowledge, not as a substitute.
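    The target registration error reported in both records above is computed after a rigid, point-based registration; a minimal sketch using the Kabsch method follows (illustrative, not the authors' implementation):

    ```python
    import numpy as np

    def rigid_fit(src, dst):
        """Least-squares rigid transform (Kabsch): rotate/translate src onto dst."""
        cs, cd = src.mean(0), dst.mean(0)
        H = (src - cs).T @ (dst - cd)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cd - R @ cs
        return R, t

    def target_registration_error(R, t, targets_src, targets_dst):
        """Mean distance between mapped targets and their true positions."""
        mapped = targets_src @ R.T + t
        return np.sqrt(((mapped - targets_dst) ** 2).sum(axis=1)).mean()

    # Fiducials drive the registration; separate anatomical targets give the TRE.
    fid_src = np.random.rand(4, 3)
    R, t = rigid_fit(fid_src, fid_src + [1, 0, 0])
    tgt = np.random.rand(5, 3)
    print(target_registration_error(R, t, tgt, tgt + [1, 0, 0]))  # ~0
    ```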

  5. A practical workflow for making anatomical atlases for biological research.

    PubMed

    Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles

    2012-01-01

    The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise awareness of the use of artists' tools in scientific research and to promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.

  6. Separating Business Logic from Medical Knowledge in Digital Clinical Workflows Using Business Process Model and Notation and Arden Syntax.

    PubMed

    de Bruin, Jeroen S; Adlassnig, Klaus-Peter; Leitich, Harald; Rappelsberger, Andrea

    2018-01-01

    Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. We implemented a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. We used Activiti, a Business Process Model and Notation (BPMN) language system, for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B in which institution-specific medical decision-making processes could be adapted without modifying the workflow business logic. Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.
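    The separation described above can be sketched generically: the workflow layer owns routing and task creation, while an injected rule engine owns the medical logic. The sketch below is illustrative Python, not Activiti or ARDENSUITE, and the clinical rule shown is a placeholder, not guidance:

    ```python
    # Illustrative separation of business logic from medical knowledge.
    def hbv_rule_engine(facts):
        """Stand-in for the medical rule engine: returns a recommendation.
        The rule below is a made-up placeholder, not clinical guidance."""
        if facts["mother_hbsag_positive"]:
            return {"action": "vaccinate_and_HBIG_within_12h"}
        return {"action": "routine_vaccination_schedule"}

    def workflow_newborn_step(patient, decide=hbv_rule_engine):
        """Business logic only: routing, documentation, task creation.
        The medical decision itself is delegated to the injected engine,
        so institutions can swap engines without touching the workflow."""
        recommendation = decide(patient)
        return {"task": recommendation["action"], "assignee": "neonatal_ward"}

    print(workflow_newborn_step({"mother_hbsag_positive": True}))
    ```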

  7. Highlights of X-Stack ExM Deliverable: MosaStore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ripeanu, Matei

    2016-07-20

    This brief report highlights the experience gained with MosaStore, an exploratory part of the X-Stack project “ExM: System support for extreme-scale, many-task applications”. The ExM project proposed to use concurrent workflows supported by the Swift language and runtime as an innovative programming model to exploit parallelism in exascale computers. MosaStore aims to support this endeavor by improving storage support for workflow-based applications, more precisely by exploring the gains that can be obtained from co-designing the storage system and the workflow runtime engine. MosaStore has been developed primarily at the University of British Columbia.

  8. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    PubMed

    Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.

  9. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing

    PubMed Central

    Brown, David K.; Penkler, David L.; Musyoka, Thommas M.; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS. PMID:26280450

  10. Relative navigation requirements for automatic rendezvous and capture systems

    NASA Technical Reports Server (NTRS)

    Kachmar, Peter M.; Polutchko, Robert J.; Chu, William; Montez, Moises

    1991-01-01

    This paper will discuss in detail the relative navigation system requirements and sensor trade-offs for Automatic Rendezvous and Capture. Rendezvous navigation filter development will be discussed in the context of navigation performance requirements for a 'Phase One' AR&C system capability. Navigation system architectures and the resulting relative navigation performance for both cooperative and uncooperative target vehicles will be assessed. Relative navigation performance using rendezvous radar, star tracker, radiometric, laser and GPS navigation sensors during appropriate phases of the trajectory will be presented. The effect of relative navigation performance on the Integrated AR&C system performance will be addressed. Linear covariance and deterministic simulation results will be used. Evaluation of relative navigation and IGN&C system performance for several representative relative approach profiles will be presented in order to demonstrate the full range of system capabilities. A summary of the sensor requirements and recommendations for AR&C system capabilities for several programs requiring AR&C will be presented.

  11. Apollo Onboard Navigation Techniques

    NASA Technical Reports Server (NTRS)

    Interbartolo, Michael

    2009-01-01

    This viewgraph presentation reviews basic navigation concepts, describes coordinate systems and identifies attitude determination techniques including Primary Guidance, Navigation and Control System (PGNCS) IMU management and Command and Service Module Stabilization and Control System/Lunar Module (LM) Abort Guidance System (AGS) attitude management. The presentation also identifies state vector determination techniques, including PGNCS coasting flight navigation, PGNCS powered flight navigation and LM AGS navigation.

  12. 33 CFR 62.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM General § 62.1 Purpose. (a) The Coast Guard administers the U.S. Aids to Navigation System. The system consists of Federal aids to navigation operated by the Coast Guard, aids to...

  13. 33 CFR 62.1 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM General § 62.1 Purpose. (a) The Coast Guard administers the U.S. Aids to Navigation System. The system consists of Federal aids to navigation operated by the Coast Guard, aids to...

  14. 33 CFR 62.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM General § 62.1 Purpose. (a) The Coast Guard administers the U.S. Aids to Navigation System. The system consists of Federal aids to navigation operated by the Coast Guard, aids to...

  15. 33 CFR 62.1 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM General § 62.1 Purpose. (a) The Coast Guard administers the U.S. Aids to Navigation System. The system consists of Federal aids to navigation operated by the Coast Guard, aids to...

  16. 33 CFR 62.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM General § 62.1 Purpose. (a) The Coast Guard administers the U.S. Aids to Navigation System. The system consists of Federal aids to navigation operated by the Coast Guard, aids to...

  17. COSMOS: Python library for massively parallel workflows

    PubMed Central

    Gafni, Erik; Luquette, Lovelace J.; Lancaster, Alex K.; Hawkins, Jared B.; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P.; Tonellato, Peter J.

    2014-01-01

    Summary: Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Availability and implementation: Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. Contact: dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24982428

  18. COSMOS: Python library for massively parallel workflows.

    PubMed

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  19. Structuring clinical workflows for diabetes care: an overview of the OntoHealth approach.

    PubMed

    Schweitzer, M; Lasierra, N; Oberbichler, S; Toma, I; Fensel, A; Hoerbst, A

    2014-01-01

    Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study lead us to identify basic building blocks, named as actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view.

  20. Structuring Clinical Workflows for Diabetes Care

    PubMed Central

    Lasierra, N.; Oberbichler, S.; Toma, I.; Fensel, A.; Hoerbst, A.

    2014-01-01

    Summary Background Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. Objectives The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. Methods A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. Results This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study lead us to identify basic building blocks, named as actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. Conclusions The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view. PMID:25024765

  1. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and the Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop ecosystem to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  2. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and the Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop ecosystem to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
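    The kind of aggregation described in both records might look as follows in a generic Spark job; this is a sketch under assumed paths and field names, not the CMS system's actual code:

    ```python
    # Generic sketch: read JSON job-report documents from HDFS and aggregate
    # a daily metric. The path and fields below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("wma-archive-sketch").getOrCreate()

    reports = spark.read.json("hdfs:///archive/fwjr/2018/03/*.json")
    daily = (reports
             .groupBy("task", F.to_date("timestamp").alias("day"))
             .agg(F.count("*").alias("jobs"),
                  F.avg("cpu_time").alias("avg_cpu_time")))
    daily.show()
    ```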

  3. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.

  4. Three-dimensional virtual bone bank system for selecting massive bone allograft in orthopaedic oncology.

    PubMed

    Wu, Zhigang; Fu, Jun; Wang, Zhen; Li, Xiangdong; Li, Jing; Pei, Yanjun; Pei, Guoxian; Li, Dan; Guo, Zheng; Fan, Hongbin

    2015-06-01

    Although structural bone allografts have been used for years to treat large defects caused by tumour or trauma, selecting the most appropriate allograft is still challenging. The objectives of this study were to: (1) describe the establishment of a visual bone bank system and the workflow of allograft selection, and (2) show mid-term follow-up results of patients after allograft implantation. Allografts were scanned and stored as Digital Imaging and Communications in Medicine (DICOM) files. Then, image segmentation was conducted and a 3D model reconstructed to establish a visual bone bank system. Based on the volume registration method, allografts were selected after a careful matching process. From November 2010 to June 2013, with the help of the Computer-assisted Orthopaedic Surgery (CAOS) navigation system, allografts were implanted in 14 patients to fill defects after tumour resection. By combining the virtual bone bank and CAOS, selection time was reduced and matching accuracy was increased. After 27.5 months of follow-up, the mean Musculoskeletal Tumor Society (MSTS) 93 functional score was 25.7 ± 1.1 points. Except for two patients with pulmonary metastases, 12 patients were alive without evidence of disease at the time this report was written. The virtual bone bank system was helpful for allograft selection, tumour excision and bone reconstruction, thereby improving the safety and effectiveness of limb-salvage surgery.

  5. Towards an intelligent hospital environment: OR of the future.

    PubMed

    Sutherland, Jeffrey V; van den Heuvel, Willem-Jan; Ganous, Tim; Burton, Matthew M; Kumar, Animesh

    2005-01-01

    Patients, providers, payers, and government demand more effective and efficient healthcare services, and the healthcare industry needs innovative ways to re-invent core processes. Business process reengineering (BPR) showed that adopting new hospital information systems can leverage this transformation and that workflow management technologies can automate process management. Our research indicates that workflow technologies in healthcare require real-time patient monitoring, detection of adverse events, and adaptive responses to breakdowns in normal processes. Adaptive workflow systems are rarely implemented, making current workflow implementations inappropriate for healthcare. The advent of evidence-based medicine, guideline-based practice, and better understanding of cognitive workflow, combined with novel technologies including Radio Frequency Identification (RFID), mobile/wireless technologies, internet workflow, intelligent agents, and Service Oriented Architectures (SOA), opens up new and exciting ways of automating business processes. Total situational awareness of the events, timing, and location of healthcare activities can generate self-organizing change in the behaviors of humans and machines. A test bed of a novel approach to continuous process management was designed for the new Weinburg Surgery Building at the University of Maryland Medical. Early results based on clinical process mapping and analysis of patient flow bottlenecks demonstrated 100% improvement in delivery of supplies and instruments at surgery start time. This work has been directly applied to the design of the DARPA Trauma Pod research program, where robotic surgery will be performed on wounded soldiers on the battlefield.

  6. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    DOE PAGES

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; ...

    2015-12-23

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC): the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  7. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Koo, Michelle; Cao, Yu

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance of complex workflows involving terabytes or petabytes of workflow data or measurement data of the executions, over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.

  8. Planning bioinformatics workflows using an expert system.

    PubMed

    Chen, Xiaoling; Chang, Jeffrey T

    2017-04-15

    Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system composed of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
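    Backward chaining of the sort BETSY performs can be illustrated in miniature: start from the requested result and recursively plan the inputs each rule requires. The rules and data types below are made-up placeholders, not BETSY's knowledge base:

    ```python
    # Minimal backward-chaining sketch: each rule maps a goal to required inputs.
    RULES = {
        "aligned_reads": [["fastq", "reference_genome"]],
        "variant_calls": [["aligned_reads", "reference_genome"]],
        "expression_matrix": [["aligned_reads", "gene_annotation"]],
    }

    def plan(goal, available, steps=None):
        """Work backwards from the goal, recursively planning missing inputs."""
        steps = [] if steps is None else steps
        if goal in available:
            return steps
        for inputs in RULES.get(goal, []):
            for needed in inputs:
                plan(needed, available, steps)
            steps.append((goal, inputs))
            available.add(goal)  # goal is now producible
            return steps
        raise ValueError(f"no rule produces {goal!r}")

    print(plan("variant_calls", {"fastq", "reference_genome"}))
    # [('aligned_reads', [...]), ('variant_calls', [...])]
    ```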

  9. Planning bioinformatics workflows using an expert system

    PubMed Central

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system composed of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928

  10. A Web Interface for Eco System Modeling

    NASA Astrophysics Data System (ADS)

    McHenry, K.; Kooper, R.; Serbin, S. P.; LeBauer, D. S.; Desai, A. R.; Dietze, M. C.

    2012-12-01

    We have developed the Predictive Ecosystem Analyzer (PEcAn) as an open-source scientific workflow system and ecoinformatics toolbox that manages the flow of information in and out of regional-scale terrestrial biosphere models, facilitates heterogeneous data assimilation, tracks data provenance, and enables more effective feedback between models and field research. The over-arching goal of PEcAn is to make otherwise complex analyses transparent, repeatable, and accessible to a diverse array of researchers, allowing both novice and expert users to focus on using the models to examine complex ecosystems rather than having to deal with complex computer system setup and configuration in order to run the models. Through the developed web interface we hide much of the data and model details and allow the user to simply select locations, ecosystem models, and desired data sources as inputs to the model. Novice users are guided by the web interface through setting up a model execution and plotting the results. At the same time, expert users are given enough freedom to modify specific parameters before the model gets executed. This will become more important as more models are added to the PEcAn workflow and as more data become available when NEON comes online. On the backend we support the execution of potentially computationally expensive models on different High Performance Computers (HPC) and/or clusters. The system can be configured with a single XML file that gives it the flexibility needed for configuring and running the different models on different systems, using a combination of information stored in a database as well as pointers to files on the hard disk. While the web interface usually creates this configuration file, expert users can still directly edit it to fine-tune the configuration. Once a workflow is finished, the web interface allows for the easy creation of plots over result data while also allowing the user to download the results for further processing. The current workflow in the web interface is a simple linear workflow, but it will be expanded to allow for more complex workflows. We are working with Kepler and Cyberintegrator to allow for these more complex workflows as well as to collect provenance of the workflow being executed. This provenance regarding model executions is stored in a database along with the derived results. All of this information is then accessible using the BETY database web frontend.

  11. Development of an autonomous treatment planning strategy for radiation therapy with effective use of population-based prior data.

    PubMed

    Wang, Huan; Dong, Peng; Liu, Hongcheng; Xing, Lei

    2017-02-01

    Current treatment planning remains a costly and labor-intensive procedure and requires multiple trial-and-error adjustments of system parameters such as the weighting factors and prescriptions. The purpose of this work is to develop an autonomous treatment planning strategy, with effective use of prior knowledge and in a clinically realistic treatment planning platform, to facilitate the radiation therapy workflow. Our technique consists of three major components: (i) a clinical treatment planning system (TPS); (ii) a decision function constructed using an ensemble of prior treatment plans; (iii) a plan evaluator and an outer-loop optimization, independent of the clinical TPS, to assess the TPS-generated plan and to drive the search toward a solution optimizing the decision function. Microsoft (MS) Visual Studio Coded UI is applied to record some common planner-TPS interactions as subroutines for querying and interacting with the TPS. These subroutines are called back in the outer-loop optimization program to navigate the plan selection process through the solution space iteratively. The utility of the approach is demonstrated using clinical prostate and head-and-neck cases. An autonomous treatment planning technique with effective use of an ensemble of prior treatment plans is developed to automatically maneuver the clinical treatment planning process in the platform of a commercial TPS. The process mimics the decision-making process of a human planner and provides a clinically sensible treatment plan automatically, thus reducing or eliminating the tedious manual trial and error of treatment planning. It is found that the prostate and head-and-neck treatment plans generated using the approach compare favorably with those used for the patients' actual treatments. The clinical inverse treatment planning process can be automated effectively with the guidance of an ensemble of prior treatment plans. The approach has the potential to significantly improve the radiation therapy workflow. © 2016 American Association of Physicists in Medicine.
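    The outer loop described above can be sketched as a simple stochastic search over weighting factors scored by a decision function; everything below (the TPS stub, the prior-plan targets, the metrics) is an illustrative placeholder, not the paper's system:

    ```python
    import random

    # Made-up prior-plan targets the decision function steers toward.
    PRIOR_TARGETS = {"ptv_coverage": 0.98, "oar_mean_dose": 20.0}

    def tps_stub(weights):
        """Stand-in for a clinical TPS optimization call."""
        w_tumor, w_oar = weights
        coverage = w_tumor / (w_tumor + w_oar)
        return {"ptv_coverage": coverage, "oar_mean_dose": 40.0 * coverage}

    def decision_function(metrics):
        """Lower is better: distance from prior-plan targets."""
        return (abs(metrics["ptv_coverage"] - PRIOR_TARGETS["ptv_coverage"])
                + abs(metrics["oar_mean_dose"] - PRIOR_TARGETS["oar_mean_dose"]) / 40.0)

    # Outer loop: perturb weights, re-run the (stubbed) TPS, keep improvements.
    weights, best = [1.0, 1.0], float("inf")
    for _ in range(200):
        trial = [max(0.1, w * (1 + 0.1 * (random.random() - 0.5))) for w in weights]
        score = decision_function(tps_stub(trial))
        if score < best:
            weights, best = trial, score
    print(weights, best)
    ```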

  12. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using "service casts" and "interest casts" (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH's Mining Workflow Composer and the open-source Active BPEL engine, and JPL's SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicate ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the "sociological" problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  13. Identifying impact of software dependencies on replicability of biomedical workflows.

    PubMed

    Miksa, Tomasz; Rauber, Andreas; Mina, Eleni

    2016-12-01

    Complex data-driven experiments form the basis of biomedical research. Recent findings warn that the context in which the software is run, that is, the infrastructure and the third-party dependencies, can have a crucial impact on the final results delivered by a computational experiment. This implies that in order to replicate the same result, not only must the same data be used, but it must also be run on an equivalent software stack. In this paper we present the VFramework, which enables assessing the replicability of workflows. It identifies whether any differences in software dependencies exist between two executions of the same workflow and whether they have an impact on the produced results. We also conduct a case study in which we investigate the impact of software dependencies on the replicability of Taverna workflows used in biomedical research on Huntington's disease. We re-execute the analysed workflows in environments differing in operating system distribution and configuration. The results show that the VFramework can be used to identify the impact of software dependencies on the replicability of biomedical workflows. Furthermore, we observe that even though the workflows are executed in a controlled environment, they still depend on specific tools installed in the environment. The context model used by the VFramework addresses the deficiencies of provenance traces and also documents such tools. Based on our findings we define guidelines that enable workflow owners to improve the replicability of their workflows. Copyright © 2016 Elsevier Inc. All rights reserved.
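
    The core of such a replicability check is a diff over the captured software dependencies of two executions. A minimal sketch, assuming dependencies are captured as name-to-version mappings (the capture format is an assumption, not the VFramework's actual context model):

    ```python
    # Diff the software dependencies captured for two executions of the
    # same workflow and report missing or version-changed packages.
    def diff_dependencies(run_a, run_b):
        """Return dependencies that are absent or differ in version."""
        report = {}
        for name in set(run_a) | set(run_b):
            va, vb = run_a.get(name), run_b.get(name)
            if va != vb:
                report[name] = (va, vb)
        return report

    # Example: the same Taverna workflow executed on two hosts.
    run1 = {"taverna-core": "2.5.0", "R": "3.0.2", "libxml2": "2.9.1"}
    run2 = {"taverna-core": "2.5.0", "R": "3.2.0"}  # libxml2 absent, R upgraded
    print(diff_dependencies(run1, run2))
    # {'R': ('3.0.2', '3.2.0'), 'libxml2': ('2.9.1', None)}
    ```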

  14. Automated quality control in a file-based broadcasting workflow

    NASA Astrophysics Data System (ADS)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) have become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that adopting automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
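
    One way to realize the parallel-processing speed-up mentioned above is to fan independent per-file QC checks out over worker processes. A minimal sketch, with the checks themselves as hypothetical placeholders for real audio/video analyses:

    ```python
    # Run independent QC checks over many media files in parallel.
    from concurrent.futures import ProcessPoolExecutor

    def qc_check(path):
        """Placeholder QC: return a list of detected problems for one file."""
        problems = []
        # A real check would decode the file and test for black frames,
        # silence, loudness violations, audio/video sync drift, etc.
        if path.endswith(".bad"):
            problems.append("container error")
        return path, problems

    def run_qc(paths, workers=8):
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return dict(pool.map(qc_check, paths))

    if __name__ == "__main__":
        report = run_qc(["news_0412.mxf", "promo_17.mxf", "clip.bad"])
        print(report)
    ```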

  15. A navigation system for the visually impaired using colored navigation lines and RFID tags.

    PubMed

    Seto, Tatsuya

    2009-01-01

    In this paper, we describe a navigation system we developed to support independent walking by the visually impaired in indoor spaces. The instrument consists of a navigation system and a map information system, both installed on a white cane. The navigation system can follow a colored navigation line set on the floor: a color sensor installed on the tip of the white cane senses the colored line, and the system informs the user by vibration that he/she is walking along it. The color recognition system is controlled by a one-chip microprocessor and can discriminate six colored navigation lines. RFID tags and a receiver for these tags form the map information system; the tags and the RFID receiver are also installed on the white cane. The receiver reads tag information and announces map information to the user through pre-recorded voice messages in MP3 format. Three sighted subjects, blindfolded with an eye mask, tested the system. All of them were able to walk along the navigation line, and the performance of the map information system was good. Therefore, our system should be extremely valuable in supporting the activities of the visually impaired.
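
    A sketch of the color-discrimination step, reading the paper's description as nearest-color classification; read_rgb and vibrate are hypothetical hardware stubs, not the authors' firmware:

    ```python
    # Classify an RGB reading from the cane-tip sensor against six known
    # line colors and vibrate while the cane stays on the chosen line.
    LINE_COLORS = {
        "red": (200, 40, 40), "green": (40, 180, 60), "blue": (40, 60, 200),
        "yellow": (220, 210, 60), "orange": (230, 140, 40), "purple": (140, 60, 180),
    }

    def classify(rgb, max_dist=60):
        """Nearest known line color, or None if the bare floor is read."""
        best, best_d = None, float("inf")
        for name, ref in LINE_COLORS.items():
            d = sum((a - b) ** 2 for a, b in zip(rgb, ref)) ** 0.5
            if d < best_d:
                best, best_d = name, d
        return best if best_d <= max_dist else None

    def follow(target_line, read_rgb, vibrate):
        color = classify(read_rgb())
        vibrate(on=(color == target_line))   # vibrate while on the chosen line
    ```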

  16. 33 CFR 62.63 - Recommendations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Recommendations. 62.63 Section 62.63 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM Public Participation in the Aids to Navigation System § 62.63...

  17. Using conceptual work products of health care to design health IT.

    PubMed

    Berry, Andrew B L; Butler, Keith A; Harrington, Craig; Braxton, Melissa O; Walker, Amy J; Pete, Nikki; Johnson, Trevor; Oberle, Mark W; Haselkorn, Jodie; Paul Nichol, W; Haselkorn, Mark

    2016-02-01

    This paper introduces a new, model-based design method for interactive health information technology (IT) systems. This method extends workflow models with models of conceptual work products. When the health care work being modeled is substantially cognitive, tacit, and complex in nature, graphical workflow models can become too complex to be useful to designers. Conceptual models complement and simplify workflows by providing an explicit specification for the information product they must produce. We illustrate how conceptual work products can be modeled using standard software modeling language, which allows them to provide fundamental requirements for what the workflow must accomplish and the information that a new system should provide. Developers can use these specifications to envision how health IT could enable an effective cognitive strategy as a workflow with precise information requirements. We illustrate the new method with a study conducted in an outpatient multiple sclerosis (MS) clinic. This study shows specifically how the different phases of the method can be carried out, how the method allows for iteration across phases, and how the method generated a health IT design for case management of MS that is efficient and easy to use. Copyright © 2015 Elsevier Inc. All rights reserved.
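
    One concrete way to read a "conceptual work product" is as an explicit, machine-checkable specification of the information the workflow must produce. The hypothetical sketch below models a minimal MS case-management product; the fields are illustrative, not the study's actual specification:

    ```python
    # A conceptual work product as an explicit information specification.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Intervention:
        problem: str          # e.g. "fatigue", "spasticity"
        action: str           # planned action addressing the problem
        status: str = "open"  # open / in-progress / resolved

    @dataclass
    class CaseManagementProduct:
        """Information the case-management workflow must deliver."""
        patient_id: str
        active_problems: List[str] = field(default_factory=list)
        interventions: List[Intervention] = field(default_factory=list)

        def incomplete(self):
            """Problems with no planned intervention: gaps the UI must surface."""
            covered = {i.problem for i in self.interventions}
            return [p for p in self.active_problems if p not in covered]
    ```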

  18. Terrain matching image pre-process and its format transform in autonomous underwater navigation

    NASA Astrophysics Data System (ADS)

    Cao, Xuejun; Zhang, Feizhou; Yang, Dongkai; Yang, Bogang

    2007-06-01

    Underwater passive navigation is one of the important development directions in modern navigation. With its high autonomy, stealth at sea, resistance to jamming and high precision, passive navigation meets actual navigation requirements and has become a standard navigation method for underwater vehicles, attracting increasing attention from scientists and researchers in the field. Underwater passive navigation can supply the main Inertial Navigation System (INS) with accurate navigation information, such as position and speed, over long periods. With the development of micro-electronics technology, AUV navigation is based primarily on INS assisted by other methods, such as terrain matching navigation, which provides long-duration navigation capability, corrects the errors of the INS, and avoids the need for the AUV to surface periodically. With terrain matching navigation, assisted by digital charts and sensors of ocean geographical characteristics, underwater image matching yields higher positioning precision and thereby meets the requirements of an underwater, long-duration, high-precision, all-weather navigation system for Autonomous Underwater Vehicles. Terrain-aided navigation (TAN) relies directly on image (map) information to assist the primary navigation system along a pre-appointed path. In TAN, a factor as important as system operation is the precision and practicability of the stored images and of the database from which the image data are produced; if the data used for the terrain characteristics are unsuitable, navigation precision will be low. Compared with terrain matching assistant navigation, image matching navigation is a high-precision, low-cost assistant navigation method, and its matching precision directly influences the final precision of the integrated navigation system. Image matching assistant navigation spatially registers two underwater scene images of the same scene, acquired by two different sensors, in order to determine the relative displacement between them. In this way, the vehicle's location can be obtained in a fiducial image with known geographic reference, and the precise location from image matching is fed back to the INS to eliminate its location error and greatly enhance the navigation precision of the vehicle. Digital image data analysis and processing for image matching in underwater passive navigation is therefore important. For underwater geographic data analysis, we focus on the acquisition, processing, analysis, representation and measurement of database information. These analyses form an important part of underwater terrain matching and help characterize the seabed terrain of the navigation areas, so that the most advantageous seabed terrain districts and dependable navigation algorithms can be selected, improving the precision and reliability of the terrain-aided navigation system. This paper expounds the pre-processing and format transformation of digital images during underwater image matching. Information on the terrain conditions in navigation areas needs further study to provide reliable terrain characteristic data for navigation. Through sea-route selection, danger district prediction and navigation algorithm analysis, TAN can achieve higher positioning precision and probability, and hence provide technological support for image matching in underwater passive navigation.
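
    The displacement estimation at the heart of image matching can be illustrated with normalized cross-correlation between a sensed patch and a fiducial reference image; a minimal NumPy sketch (exhaustive search, synthetic data, none of the pre-processing discussed above):

    ```python
    # Normalized cross-correlation (NCC) template matching: find the patch
    # offset in the reference image that maximizes the NCC score.
    import numpy as np

    def ncc(a, b):
        a = a - a.mean(); b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom else 0.0

    def match_displacement(reference, patch):
        """Exhaustive search for the offset maximizing NCC."""
        H, W = reference.shape; h, w = patch.shape
        best, best_xy = -1.0, (0, 0)
        for y in range(H - h + 1):
            for x in range(W - w + 1):
                score = ncc(reference[y:y+h, x:x+w], patch)
                if score > best:
                    best, best_xy = score, (x, y)
        return best_xy, best

    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))
    pat = ref[20:36, 30:46] + 0.05 * rng.random((16, 16))  # noisy sensed patch
    print(match_displacement(ref, pat))   # expect offset (30, 20), score near 1
    ```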

  19. The equivalency between logic Petri workflow nets and workflow nets.

    PubMed

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed in this paper based on LPNs. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and log noise. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.
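
    For readers unfamiliar with the Petri-net semantics underlying WF-nets: a transition is enabled when every input place holds a token, and firing moves tokens from input places to output places. A toy sketch (the net is an illustrative order-handling fragment, not the paper's online-shop model):

    ```python
    # Minimal Petri-net firing semantics.
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)          # place -> token count
            self.transitions = {}                 # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) > 0 for p in inputs)

        def fire(self, name):
            inputs, outputs = self.transitions[name]
            assert self.enabled(name), f"{name} not enabled"
            for p in inputs:  self.marking[p] -= 1
            for p in outputs: self.marking[p] = self.marking.get(p, 0) + 1

    net = PetriNet({"start": 1})
    net.add_transition("receive_order", ["start"], ["ordered"])
    net.add_transition("ship", ["ordered"], ["end"])
    net.fire("receive_order"); net.fire("ship")
    print(net.marking)   # {'start': 0, 'ordered': 0, 'end': 1}
    ```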

  20. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    PubMed Central

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed in this paper based on LPNs. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and log noise. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented. PMID:25821845

  1. Workflow Management for Complex HEP Analyses

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.

    2017-10-01

    We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large-scale workflow systems, e.g. Apache's Airavata [1]. The approach represents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step may define running a certain executable over multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes, containing error codes and output data or file references. Required files, e.g. those created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from the step's perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view of the analysis. The workflow system is developed and tested alongside a ttbb cross-section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interface and dependencies between steps enable a make-like execution of the whole analysis.
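
    The make-like execution model can be sketched as a small dependency graph in which a step runs only after its dependencies have produced their reports; the step contents below are illustrative, not AWM's API:

    ```python
    # Make-like step execution: a step runs only once, after its
    # dependencies, and produces a bookkeeping report.
    class Step:
        def __init__(self, name, deps, action):
            self.name, self.deps, self.action = name, deps, action
            self.report = None                    # set after a successful run

        def run(self, done):
            if self.report is not None:           # already produced: skip
                return
            for d in self.deps:                   # run required upstream steps
                if d.report is None:
                    d.run(done)
            self.report = {"step": self.name, "exit_code": self.action()}
            done.append(self.name)

    selection = Step("event_selection", [], lambda: 0)
    inference = Step("bayesian_inference", [selection], lambda: 0)

    order = []
    inference.run(order)
    print(order)   # ['event_selection', 'bayesian_inference']
    ```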

  2. Application of aircraft navigation sensors to enhanced vision systems

    NASA Technical Reports Server (NTRS)

    Sweet, Barbara T.

    1993-01-01

    In this presentation, the applicability of various aircraft navigation sensors to enhanced vision system design is discussed. First, the accuracy requirements of the FAA for precision landing systems are presented, followed by the current navigation systems and their characteristics. These systems include Instrument Landing System (ILS), Microwave Landing System (MLS), Inertial Navigation, Altimetry, and Global Positioning System (GPS). Finally, the use of navigation system data to improve enhanced vision systems is discussed. These applications include radar image rectification, motion compensation, and image registration.

  3. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    PubMed

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and the inevitable local adaptation of applications, and that can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge of radiology operations. Our design adopts the "4+1" architectural view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can be reasonably substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system provides powerful query and statistical functions for managing resources and improving productivity. This paper potentially points to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.

  4. An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles.

    PubMed

    Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo

    2017-03-25

    Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of mission. Such limitations, especially with low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems but also impair the ability to successfully employ terrain-aided navigation. To address this problem, a tightly coupled navigation approach is presented that successfully estimates the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for an absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance.
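
    The particle-filter measurement update central to terrain-aided navigation can be sketched as weighting position hypotheses by how well the map depth at each particle matches the measured depth, then resampling; the bathymetric map and noise levels below are synthetic stand-ins for real sensors:

    ```python
    # Terrain-aided particle filter: propagate particles with a known
    # dead-reckoned increment, weight by depth-measurement likelihood,
    # resample, and jitter to keep particle diversity.
    import numpy as np

    rng = np.random.default_rng(1)

    def depth(xy):                                 # synthetic bathymetric map
        return 50 + 5 * np.sin(0.1 * xy[..., 0]) + 3 * np.cos(0.07 * xy[..., 1])

    N = 2000
    particles = rng.uniform(0, 100, size=(N, 2))   # initial position hypotheses
    truth = np.array([40.0, 60.0])
    sigma = 0.3                                    # altimeter noise (m)

    for step in range(30):                         # vehicle moves on a known track
        truth += [1.0, 0.5]
        particles += [1.0, 0.5]                    # propagate with INS/DVL increment
        z = depth(truth[None])[0] + rng.normal(0, sigma)
        w = np.exp(-0.5 * ((depth(particles) - z) / sigma) ** 2)
        w /= w.sum()
        particles = particles[rng.choice(N, N, p=w)]       # resample
        particles += rng.normal(0, 0.2, particles.shape)   # jitter

    print(truth, particles.mean(axis=0))           # estimate should approach truth
    ```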

  5. An Effective Terrain Aided Navigation for Low-Cost Autonomous Underwater Vehicles

    PubMed Central

    Zhou, Ling; Cheng, Xianghong; Zhu, Yixian; Dai, Chenxi; Fu, Jinbo

    2017-01-01

    Terrain-aided navigation is a potentially powerful solution for obtaining submerged position fixes for autonomous underwater vehicles. The application of terrain-aided navigation with high-accuracy inertial navigation systems has demonstrated meter-level navigation accuracy in sea trials. However, available sensors may be limited depending on the type of mission. Such limitations, especially with low-grade navigation sensors, not only degrade the accuracy of traditional navigation systems but also impair the ability to successfully employ terrain-aided navigation. To address this problem, a tightly coupled navigation approach is presented that successfully estimates the critical sensor errors by incorporating raw sensor data directly into an augmented navigation system. Furthermore, three-dimensional distance errors are calculated, providing measurement updates through the particle filter for an absolute and bounded position error. The development of the terrain-aided navigation system is elaborated for a vehicle equipped with a non-inertial-grade strapdown inertial navigation system, a 4-beam Doppler Velocity Log range sensor and a sonar altimeter. Using experimental data for navigation performance evaluation in areas with different terrain characteristics, the results further show that the proposed method can be successfully applied to low-cost AUVs and significantly improves navigation performance. PMID:28346346

  6. 33 CFR 62.31 - Special marks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Special marks. 62.31 Section 62.31 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.31 Special marks. Special...

  7. 33 CFR 62.25 - Lateral marks.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Lateral marks. 62.25 Section 62.25 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.25 Lateral marks. (a...

  8. 33 CFR 62.32 - Inland waters obstruction mark.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Inland waters obstruction mark. 62.32 Section 62.32 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.32...

  9. 33 CFR 62.33 - Information and regulatory marks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Information and regulatory marks. 62.33 Section 62.33 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.33...

  10. 33 CFR 62.41 - Ranges.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Ranges. 62.41 Section 62.41 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.41 Ranges. Ranges are aids to...

  11. 33 CFR 62.29 - Isolated danger marks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Isolated danger marks. 62.29 Section 62.29 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.29 Isolated danger...

  12. 33 CFR 62.32 - Inland waters obstruction mark.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Inland waters obstruction mark. 62.32 Section 62.32 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.32...

  13. 33 CFR 62.31 - Special marks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Special marks. 62.31 Section 62.31 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.31 Special marks. Special...

  14. 33 CFR 62.29 - Isolated danger marks.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Isolated danger marks. 62.29 Section 62.29 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.29 Isolated danger...

  15. 33 CFR 62.33 - Information and regulatory marks.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Information and regulatory marks. 62.33 Section 62.33 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.33...

  16. 33 CFR 62.31 - Special marks.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Special marks. 62.31 Section 62.31 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.31 Special marks. Special...

  17. 33 CFR 62.41 - Ranges.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Ranges. 62.41 Section 62.41 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.41 Ranges. Ranges are aids to...

  18. 33 CFR 62.25 - Lateral marks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Lateral marks. 62.25 Section 62.25 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.25 Lateral marks. (a...

  19. 33 CFR 62.33 - Information and regulatory marks.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Information and regulatory marks. 62.33 Section 62.33 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.33...

  20. 33 CFR 62.25 - Lateral marks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Lateral marks. 62.25 Section 62.25 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.25 Lateral marks. (a...

  1. 33 CFR 62.29 - Isolated danger marks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Isolated danger marks. 62.29 Section 62.29 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.29 Isolated danger...

  2. 33 CFR 62.31 - Special marks.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Special marks. 62.31 Section 62.31 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.31 Special marks. Special...

  3. 33 CFR 62.32 - Inland waters obstruction mark.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Inland waters obstruction mark. 62.32 Section 62.32 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.32...

  4. 33 CFR 62.37 - Lighthouses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Lighthouses. 62.37 Section 62.37 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.37 Lighthouses. Lighthouses are...

  5. 33 CFR 62.37 - Lighthouses.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Lighthouses. 62.37 Section 62.37 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.37 Lighthouses. Lighthouses are...

  6. 33 CFR 62.37 - Lighthouses.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Lighthouses. 62.37 Section 62.37 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.37 Lighthouses. Lighthouses are...

  7. 33 CFR 62.33 - Information and regulatory marks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Information and regulatory marks. 62.33 Section 62.33 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.33...

  8. 33 CFR 62.32 - Inland waters obstruction mark.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Inland waters obstruction mark. 62.32 Section 62.32 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.32...

  9. 33 CFR 62.29 - Isolated danger marks.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Isolated danger marks. 62.29 Section 62.29 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.29 Isolated danger...

  10. 33 CFR 62.41 - Ranges.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Ranges. 62.41 Section 62.41 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.41 Ranges. Ranges are aids to...

  11. 33 CFR 62.25 - Lateral marks.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Lateral marks. 62.25 Section 62.25 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.25 Lateral marks. (a...

  12. 33 CFR 62.37 - Lighthouses.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Lighthouses. 62.37 Section 62.37 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.37 Lighthouses. Lighthouses are...

  13. 33 CFR 62.41 - Ranges.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Ranges. 62.41 Section 62.41 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.41 Ranges. Ranges are aids to...

  14. 33 CFR 62.37 - Lighthouses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Lighthouses. 62.37 Section 62.37 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.37 Lighthouses. Lighthouses are...

  15. 33 CFR 62.31 - Special marks.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Special marks. 62.31 Section 62.31 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.31 Special marks. Special...

  16. 33 CFR 62.41 - Ranges.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Ranges. 62.41 Section 62.41 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.41 Ranges. Ranges are aids to...

  17. 33 CFR 62.29 - Isolated danger marks.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Isolated danger marks. 62.29 Section 62.29 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.29 Isolated danger...

  18. 33 CFR 62.33 - Information and regulatory marks.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Information and regulatory marks. 62.33 Section 62.33 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.33...

  19. 33 CFR 62.32 - Inland waters obstruction mark.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Inland waters obstruction mark. 62.32 Section 62.32 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.32...

  20. 33 CFR 62.25 - Lateral marks.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Lateral marks. 62.25 Section 62.25 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.25 Lateral marks. (a...

  1. Using technology to improve and support communication and workflow processes.

    PubMed

    Bahlman, Deborah Tuke; Johnson, Fay C

    2005-07-01

    In conjunction with a large expansion project, a team of perioperative staff members reviewed their workflow processes and designed their ideal patient tracking and communication system. Technologies selected and deployed included a passive infrared tracking system, an enhanced nurse call system, wireless telephones, and a web-based electronic grease board. The new system provides staff members with an easy way to obtain critical pieces of patient information, as well as track the progress of patients and locate equipment.

  2. Research on the error model of airborne celestial/inertial integrated navigation system

    NASA Astrophysics Data System (ADS)

    Zheng, Xiaoqiang; Deng, Xiaoguo; Yang, Xiaoxu; Dong, Qiang

    2015-02-01

    The celestial navigation subsystem of an airborne celestial/inertial integrated navigation system periodically corrects the positioning error and heading drift of the inertial navigation system, by which the inertial navigation system can greatly improve the accuracy of long-endurance navigation. Thus, when the system operates for a long time, the accuracy of the airborne celestial navigation subsystem directly determines the accuracy of the integrated navigation system. By building the mathematical model of the airborne celestial navigation system based on the inertial navigation system and using linear coordinate transformation, we establish the error transfer equation for the positioning algorithm of the airborne celestial system, and on this basis we build the positioning error model of the celestial navigation. Based on the positioning error model, we analyze and simulate in MATLAB the positioning error caused by errors of the star-tracking platform. Finally, the positioning error model is verified using star observations obtained from an optical measurement device at a test range, at a location that is accurately known. The analysis and simulation results show that the leveling accuracy and north-alignment accuracy of the tracking platform are important factors limiting the positioning accuracy of airborne celestial navigation systems, and that the positioning error has an approximately linear relationship with the level error and north error of the tracking platform. The verification errors are within 1000 m, which shows that the model is correct.
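
    The reported near-linear relationship is consistent with the standard small-angle relation for celestial fixes (stated here as background, not as the paper's derivation):

    ```latex
    % A platform attitude error \delta\theta maps near-linearly into a
    % celestial position error of roughly Earth radius times the tilt.
    \[
      \delta p \;\approx\; R_E \,\delta\theta ,
      \qquad R_E \approx 6371\ \mathrm{km},
    \]
    \[
      \text{e.g. } \delta\theta = 30''\ (\approx 1.45\times10^{-4}\ \mathrm{rad})
      \;\Rightarrow\; \delta p \approx 0.93\ \mathrm{km},
    \]
    % consistent in order of magnitude with the ~1000 m verification error above.
    ```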

  3. 33 CFR 62.35 - Mooring buoys.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....35 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.35 Mooring buoys. Mooring... identification and to avoid confusion with aids to navigation. ...

  4. 33 CFR 62.35 - Mooring buoys.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....35 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.35 Mooring buoys. Mooring... identification and to avoid confusion with aids to navigation. ...

  5. 33 CFR 62.35 - Mooring buoys.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ....35 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.35 Mooring buoys. Mooring... identification and to avoid confusion with aids to navigation. ...

  6. 33 CFR 62.35 - Mooring buoys.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ....35 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.35 Mooring buoys. Mooring... identification and to avoid confusion with aids to navigation. ...

  7. 33 CFR 62.35 - Mooring buoys.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....35 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.35 Mooring buoys. Mooring... identification and to avoid confusion with aids to navigation. ...

  8. Provenance-Powered Automatic Workflow Generation and Composition

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: an Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance to further facilitate collaboration in the science community. We have also established a Petri-net-based verification instrument for provenance-based automatic workflow generation and recommendation.
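
    The generation step can be sketched as replaying a provenance trace and chaining calls whose outputs feed later inputs; the trace contents below are illustrative:

    ```python
    # Build an ordered workflow from a provenance trace of service calls,
    # where each event records (tool, inputs, outputs) as file names.
    def generate_workflow(events):
        produced = {}                                  # file -> producing step id
        steps = []
        for i, ev in enumerate(events):
            deps = sorted({produced[f] for f in ev["inputs"] if f in produced})
            steps.append({"id": i, "tool": ev["tool"], "depends_on": deps})
            for f in ev["outputs"]:
                produced[f] = i
        return steps

    trace = [
        {"tool": "subset_argo",  "inputs": ["argo.nc"],  "outputs": ["argo_sub.nc"]},
        {"tool": "subset_amsre", "inputs": ["amsre.nc"], "outputs": ["sst_sub.nc"]},
        {"tool": "compare_sst",  "inputs": ["argo_sub.nc", "sst_sub.nc"],
         "outputs": ["anomaly.png"]},
    ]
    for s in generate_workflow(trace):
        print(s)
    # step 2 depends on steps 0 and 1, reproducing the scientist's session order
    ```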

  9. Space shuttle navigation analysis. Volume 2: Baseline system navigation

    NASA Technical Reports Server (NTRS)

    Jones, H. L.; Luders, G.; Matchett, G. A.; Rains, R. G.

    1980-01-01

    Studies related to the baseline navigation system for the orbiter are presented. The baseline navigation system studies include a covariance analysis of the Inertial Measurement Unit calibration and alignment procedures, postflight IMU error recovery for the approach and landing phases, on-orbit calibration of IMU instrument biases, and a covariance analysis of entry and prelaunch navigation system performance.

  10. Maximum Correntropy Unscented Kalman Filter for Ballistic Missile Navigation System based on SINS/CNS Deeply Integrated Mode.

    PubMed

    Hou, Bowen; He, Zhangming; Li, Dong; Zhou, Haiyin; Wang, Jiongqi

    2018-05-27

    Strap-down inertial navigation system/celestial navigation system (SINS/CNS) integrated navigation is a high-precision navigation technique for ballistic missiles. The traditional navigation method suffers from divergence in the position error. A deeply integrated mode for the SINS/CNS navigation system is proposed to improve the navigation accuracy of ballistic missiles. The deeply integrated navigation principle is described and the observability of the navigation system is analyzed. Nonlinearity, as well as large outliers and Gaussian mixture noises, often exists during the actual navigation process, leading to divergence of the navigation filter. A new nonlinear Kalman filter based on maximum correntropy theory and the unscented transformation, named the maximum correntropy unscented Kalman filter, is derived, and its computational complexity is analyzed. The unscented transformation is used to handle the nonlinearity of the system equation, and the maximum correntropy theory is used to deal with the non-Gaussian noises. Finally, numerical simulation illustrates the superiority of the proposed filter compared with the traditional unscented Kalman filter. The comparison results show that the influence of large outliers and non-Gaussian noises on SINS/CNS deeply integrated navigation is significantly reduced by the proposed filter.
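
    For reference, the textbook definitions behind the filter (background only, not the paper's exact formulation): correntropy under a Gaussian kernel and the maximum correntropy criterion:

    ```latex
    % Correntropy between X and Y under a Gaussian kernel, and the
    % maximum correntropy criterion (MCC) for an estimate \hat{x}.
    \[
      V_\sigma(X,Y) \;=\; \mathbb{E}\!\left[\kappa_\sigma(X-Y)\right],
      \qquad
      \kappa_\sigma(e) \;=\; \exp\!\left(-\frac{e^{2}}{2\sigma^{2}}\right),
    \]
    \[
      \hat{x} \;=\; \arg\max_{x}\; \sum_{i=1}^{N} \kappa_\sigma\!\big(y_i - h_i(x)\big).
    \]
    % Large residuals are exponentially down-weighted, which is what suppresses
    % outliers and non-Gaussian noise relative to the quadratic Kalman update.
    ```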

  11. 33 CFR 62.27 - Safe water marks.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Safe water marks. 62.27 Section 62.27 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.27 Safe water marks. Safe...

  12. Quantifying nursing workflow in medication administration.

    PubMed

    Keohane, Carol A; Bane, Anne D; Featherstone, Erica; Hayes, Judy; Woolf, Seth; Hurley, Ann; Bates, David W; Gandhi, Tejal K; Poon, Eric G

    2008-01-01

    New medication administration systems are showing promise in improving patient safety at the point of care, but adoption of these systems requires significant changes in nursing workflow. To prepare for these changes, the authors report on a time-motion study that measured the proportion of time that nurses spend on various patient care activities, focusing on medication administration-related activities. Implications of their findings are discussed.

  13. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    PubMed

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

    With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and the Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version on GitHub (https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB) (http://soykb.org/Pegasus/index.php). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage; 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects, bringing the total to more than 500 soybean germplasm lines, have been integrated. These SNPs are being utilized for trait improvement using genotype-to-phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser (http://soykb.org/NGS_Resequence/NGS_index.php) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. The PGen workflow has been optimized for the most efficient analysis of soybean data through thorough testing and validation. This research serves as an example of best practices for the development of genomics data analysis workflows, integrating remote HPC resources and efficient data management with ease of use for biological users. The PGen workflow can also be easily customized for analysis of data from other species.
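
    As a generic illustration of the kind of command chain such a resequencing workflow wraps (alignment, sorting, variant calling), and emphatically not the PGen/Pegasus implementation, a minimal sketch with assumed file names:

    ```python
    # Chain standard resequencing steps; each stage fails fast on error.
    import subprocess

    def sh(cmd):
        print("+", cmd)
        subprocess.run(cmd, shell=True, check=True)

    def call_variants(ref, reads, sample):
        sam = f"{sample}.sam"; bam = f"{sample}.sort.bam"; vcf = f"{sample}.vcf"
        sh(f"bwa mem {ref} {reads} > {sam}")          # align reads to the reference
        sh(f"samtools sort -o {bam} {sam}")           # coordinate-sort alignments
        sh(f"samtools index {bam}")
        # Caller shown generically; substitute the site's preferred SNP/indel caller.
        sh(f"bcftools mpileup -f {ref} {bam} | bcftools call -mv -o {vcf}")
        return vcf

    # call_variants("Gmax_ref.fa", "line_001.fq", "line_001")
    ```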

  14. Navigating Without Road Maps: The Early Business of Automobile Route Guide Publishing in the United States

    NASA Astrophysics Data System (ADS)

    Bauer, John T.

    2018-05-01

    In the United States, automobile route guides were important precursors to the road maps that Americans are familiar with today. Listing turn-by-turn directions between cities, they helped drivers navigate unmarked, local roads. This paper examines the early business of route guide publishing through the Official Automobile Blue Book series of guides. It focuses specifically on the expansion, contraction, and eventual decline of the Blue Book publishing empire and also the work of professional "pathfinders" that formed the company's data-gathering infrastructure. Beginning in 1901 with only one volume, the series steadily grew until 1920, when thirteen volumes were required to record thousands of routes throughout the country. Bankruptcy and corporate restructuring in 1921 forced the publishers to condense the guide into a four-volume set in 1922. Competition from emerging sheet maps, along with the nationwide standardization of highway numbers, pushed a switch to an atlas format in 1926. Blue Books, however, could not remain competitive and disappeared after 1937. "Pathfinders" were employed by the publishers and equipped with reliable automobiles. Soon they developed a shorthand notation system for recording field notes and efficiently incorporating them into the development workflow. Although pathfinders did not call themselves cartographers, they were geographical data field collectors and considered their work to be an "art and a science," much the same as modern-day cartographers. The paper concludes with some comments about the place of route guides in the history of American commercial cartography and draws some parallels between "pathfinders" and the digital road mappers of today.

  15. 33 CFR 62.63 - Recommendations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....63 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM Public Participation in the Aids to Navigation System § 62.63 Recommendations. (a) The public may recommend changes to existing aids to navigation, request new aids or the...

  16. 33 CFR 62.63 - Recommendations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....63 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM Public Participation in the Aids to Navigation System § 62.63 Recommendations. (a) The public may recommend changes to existing aids to navigation, request new aids or the...

  17. Optical surgical navigation system causes pulse oximeter malfunction.

    PubMed

    Satoh, Masaaki; Hara, Tetsuhito; Tamai, Kenji; Shiba, Juntaro; Hotta, Kunihisa; Takeuchi, Mamoru; Watanabe, Eiju

    2015-01-01

    An optical surgical navigation system is used to facilitate surgical approaches, and pulse oximeters provide valuable information for anesthetic management. However, saw-tooth waves on the monitor of a pulse oximeter, and an inability of the pulse oximeter to accurately record percutaneous arterial oxygen saturation, were observed when a surgeon started an optical navigation system. The current case is thought to be the first report of this navigation system interfering with pulse oximetry. The causes of the pulse jamming and how to manage an optical navigation system are discussed.

  18. Gravity Gradiometry and Map Matching: An Aid to Aircraft Inertial Navigation Systems

    DTIC Science & Technology

    2010-03-01

    ...improve its performance. In all of these cases, because information from two or more different navigation systems feeds into a navigation solution... [The remainder of this DTIC record is title-page residue; the recoverable information is the thesis title, "Gravity Gradiometry and Map Matching: An Aid to Aircraft Inertial Navigation Systems."]

  19. How the provenance of electronic health record data matters for research: a case example using system mapping.

    PubMed

    Johnson, Karin E; Kamineni, Aruna; Fuller, Sharon; Olmstead, Danielle; Wernli, Karen J

    2014-01-01

    The use of electronic health records (EHRs) for research is proceeding rapidly, driven by computational power, analytical techniques, and policy. However, EHR-based research is limited by the complexity of EHR data and a lack of understanding about data provenance, meaning the context under which the data were collected. This paper presents system flow mapping as a method to help researchers more fully understand the provenance of their EHR data as it relates to local workflow. We provide two specific examples of how this method can improve data identification, documentation, and processing. EHRs store clinical and administrative data, often in unstructured fields. Each clinical system has a unique and dynamic workflow, as well as an EHR customized for local use. The EHR customization may be influenced by a broader context such as documentation required for billing. We present a case study with two examples of using system flow mapping to characterize EHR data for a local colorectal cancer screening process. System flow mapping demonstrated that information entered into the EHR during clinical practice required interpretation and transformation before it could be accurately applied to research. We illustrate how system flow mapping shaped our knowledge of the quality and completeness of data in two examples: (1) determining colonoscopy indication as recorded in the EHR, and (2) discovering a specific EHR form that captured family history. Researchers who do not consider data provenance risk compiling data that are systematically incomplete or incorrect. For example, researchers who are not familiar with the clinical workflow under which data were entered might miss or misunderstand patient information or procedure and diagnostic codes. Data provenance is a fundamental characteristic of research data from EHRs. Given the diversity of EHR platforms and system workflows, researchers need tools for evaluating and reporting data availability, quality, and transformations. Our case study illustrates how system mapping can inform researchers about the provenance of their data as it pertains to local workflows.

  20. Cognitive Learning, Monitoring and Assistance of Industrial Workflows Using Egocentric Sensor Networks

    PubMed Central

    Bleser, Gabriele; Damen, Dima; Behera, Ardhendu; Hendeby, Gustaf; Mura, Katharina; Miezal, Markus; Gee, Andrew; Petersen, Nils; Maçães, Gustavo; Domingues, Hugo; Gorecky, Dominic; Almeida, Luis; Mayol-Cuevas, Walterio; Calway, Andrew; Cohn, Anthony G.; Hogg, David C.; Stricker, Didier

    2015-01-01

    Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: 1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user’s pose in magnetically polluted environments; 2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; 3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and 4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. These limited size datasets indicate and highlight the potential of the chosen technology as a combined entity as well as point out limitations of the system. PMID:26126116

  1. Workflow in clinical trial sites & its association with near miss events for data quality: ethnographic, workflow & systems simulation.

    PubMed

    de Carvalho, Elias Cesar Araujo; Batilana, Adelia Portero; Claudino, Wederson; Reis, Luiz Fernando Lima; Schmerling, Rafael A; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify and address threats of near-miss events for data quality in two cancer trial sites in Brazil. Two sites, in São Paulo and Rio de Janeiro, were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats of near-miss events for data quality were derived from the observations, then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, a visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage professionals to reduce near-miss events and save time and cost. Clinical trial sponsors should improve the relevant support systems.

  2. Workflow in Clinical Trial Sites & Its Association with Near Miss Events for Data Quality: Ethnographic, Workflow & Systems Simulation

    PubMed Central

    Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    Background With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify and address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, the visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories, and (c) scarcity of decision support systems at the point of research intervention. Simulation with a policy model demonstrated a reduction of the rework problem. Conclusions/Significance Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures, and manage professionals to reduce near miss events and save time and cost. Clinical trial sponsors should improve relevant support systems. PMID:22768105

  3. 33 CFR 62.27 - Safe water marks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Safe water marks. 62.27 Section 62.27 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.27 Safe water marks. Safe water marks indicate that there is...

  4. Using Workflows to Explore and Optimise Named Entity Recognition for Chemistry

    PubMed Central

    Kolluru, BalaKrishna; Hawizy, Lezan; Murray-Rust, Peter; Tsujii, Junichi; Ananiadou, Sophia

    2011-01-01

    Chemistry text mining tools should be interoperable and adaptable regardless of system-level implementation, installation or even programming issues. We aim to abstract the functionality of these tools from the underlying implementation via reconfigurable workflows for automatically identifying chemical names. To achieve this, we refactored an established named entity recogniser in the chemistry domain, OSCAR, and studied the impact of each component on the net performance. We developed two reconfigurable workflows from OSCAR using an interoperable text mining framework, U-Compare. These workflows can be altered using the drag-&-drop mechanism of the graphical user interface of U-Compare. They also provide a platform to study the relationship between text mining components such as tokenisation and named entity recognition (using maximum entropy Markov model (MEMM) and pattern recognition based classifiers). Results indicate that, for chemistry in particular, eliminating the noise generated by tokenisation techniques leads to slightly better named entity recognition (NER) accuracy. Poor tokenisation translates into poorer input to the classifier components, which in turn leads to an increase in Type I or Type II errors, thus lowering the overall performance. On the Sciborg corpus, the workflow-based system, which uses a new tokeniser whilst retaining the same MEMM component, increases the F-score from 82.35% to 84.44%. On the PubMed corpus, it recorded an F-score of 84.84%, against 84.23% for OSCAR. PMID:21633495
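
    The central idea here - swapping one pipeline component while holding the rest fixed - can be sketched as simple function composition. The components below are hypothetical stand-ins, not OSCAR's or U-Compare's actual classes; only the F-score arithmetic is standard.

        # Reconfigurable pipeline sketch: change the tokeniser, keep the recogniser.
        def naive_tokeniser(text):
            return text.split()

        def chemistry_tokeniser(text):
            # keeps hyphenated chemical names intact, strips stray punctuation
            return [tok.strip(",;().") for tok in text.split()]

        def lexicon_recogniser(tokens):
            lexicon = {"2-acetoxybenzoic", "acid", "ethanol"}
            return [t for t in tokens if t.lower() in lexicon]

        def run_pipeline(text, tokeniser, recogniser):
            return recogniser(tokeniser(text))

        def f_score(tp, fp, fn):
            precision = tp / (tp + fp)
            recall = tp / (tp + fn)
            return 2 * precision * recall / (precision + recall)

        # The chemistry tokeniser recovers "ethanol" that the naive split misses:
        print(run_pipeline("Dissolve 2-acetoxybenzoic acid in ethanol.",
                           chemistry_tokeniser, lexicon_recogniser))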

  5. Using workflows to explore and optimise named entity recognition for chemistry.

    PubMed

    Kolluru, Balakrishna; Hawizy, Lezan; Murray-Rust, Peter; Tsujii, Junichi; Ananiadou, Sophia

    2011-01-01

    Chemistry text mining tools should be interoperable and adaptable regardless of system-level implementation, installation or even programming issues. We aim to abstract the functionality of these tools from the underlying implementation via reconfigurable workflows for automatically identifying chemical names. To achieve this, we refactored an established named entity recogniser in the chemistry domain, OSCAR, and studied the impact of each component on the net performance. We developed two reconfigurable workflows from OSCAR using an interoperable text mining framework, U-Compare. These workflows can be altered using the drag-&-drop mechanism of the graphical user interface of U-Compare. They also provide a platform to study the relationship between text mining components such as tokenisation and named entity recognition (using maximum entropy Markov model (MEMM) and pattern recognition based classifiers). Results indicate that, for chemistry in particular, eliminating the noise generated by tokenisation techniques leads to slightly better named entity recognition (NER) accuracy. Poor tokenisation translates into poorer input to the classifier components, which in turn leads to an increase in Type I or Type II errors, thus lowering the overall performance. On the Sciborg corpus, the workflow-based system, which uses a new tokeniser whilst retaining the same MEMM component, increases the F-score from 82.35% to 84.44%. On the PubMed corpus, it recorded an F-score of 84.84%, against 84.23% for OSCAR.

  6. 33 CFR 62.53 - Racons.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.53 Racons. (a) Aids to navigation may... non-laterally significant aids alike, the racon signal itself is for identification purposes only, and...

  7. 33 CFR 62.53 - Racons.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.53 Racons. (a) Aids to navigation may... non-laterally significant aids alike, the racon signal itself is for identification purposes only, and...

  8. 33 CFR 62.53 - Racons.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.53 Racons. (a) Aids to navigation may... non-laterally significant aids alike, the racon signal itself is for identification purposes only, and...

  9. 33 CFR 62.23 - Beacons and buoys.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... navigation. The primary components of the U.S. Aids to Navigation System are beacons and buoys. (b) Beacons are aids to navigation structures which are permanently fixed to the earth's surface. They range from... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.23 Beacons and buoys. (a...

  10. 33 CFR 62.23 - Beacons and buoys.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... navigation. The primary components of the U.S. Aids to Navigation System are beacons and buoys. (b) Beacons are aids to navigation structures which are permanently fixed to the earth's surface. They range from... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.23 Beacons and buoys. (a...

  11. 33 CFR 62.23 - Beacons and buoys.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... navigation. The primary components of the U.S. Aids to Navigation System are beacons and buoys. (b) Beacons are aids to navigation structures which are permanently fixed to the earth's surface. They range from... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.23 Beacons and buoys. (a...

  12. 33 CFR 62.23 - Beacons and buoys.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... navigation. The primary components of the U.S. Aids to Navigation System are beacons and buoys. (b) Beacons are aids to navigation structures which are permanently fixed to the earth's surface. They range from... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.23 Beacons and buoys. (a...

  13. 33 CFR 62.23 - Beacons and buoys.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... navigation. The primary components of the U.S. Aids to Navigation System are beacons and buoys. (b) Beacons are aids to navigation structures which are permanently fixed to the earth's surface. They range from... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.23 Beacons and buoys. (a...

  14. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-12-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers with easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions has come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using “service casts” and “interest casts” (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH’s Mining Workflow Composer and the open-source Active BPEL engine, and JPL’s SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicating ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the “sociological” problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  15. 33 CFR 62.54 - Ownership identification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Section 62.54 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.54 Ownership identification. Ownership identification on private or state aids to navigation is permitted so long as it does...

  16. 33 CFR 62.54 - Ownership identification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Section 62.54 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.54 Ownership identification. Ownership identification on private or state aids to navigation is permitted so long as it does...

  17. 33 CFR 62.54 - Ownership identification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Section 62.54 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.54 Ownership identification. Ownership identification on private or state aids to navigation is permitted so long as it does...

  18. 33 CFR 62.54 - Ownership identification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Section 62.54 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.54 Ownership identification. Ownership identification on private or state aids to navigation is permitted so long as it does...

  19. 33 CFR 62.54 - Ownership identification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Section 62.54 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.54 Ownership identification. Ownership identification on private or state aids to navigation is permitted so long as it does...

  20. 33 CFR 62.45 - Light characteristics.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Light characteristics. 62.45... NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.45 Light characteristics. (a) Lights on aids to navigation are differentiated by color and rhythm. Lighthouses and range...

  1. 33 CFR 62.45 - Light characteristics.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Light characteristics. 62.45... NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.45 Light characteristics. (a) Lights on aids to navigation are differentiated by color and rhythm. Lighthouses and range...

  2. 33 CFR 62.45 - Light characteristics.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Light characteristics. 62.45... NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.45 Light characteristics. (a) Lights on aids to navigation are differentiated by color and rhythm. Lighthouses and range...

  3. 33 CFR 62.45 - Light characteristics.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Light characteristics. 62.45... NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.45 Light characteristics. (a) Lights on aids to navigation are differentiated by color and rhythm. Lighthouses and range...

  4. 33 CFR 62.45 - Light characteristics.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Light characteristics. 62.45... NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.45 Light characteristics. (a) Lights on aids to navigation are differentiated by color and rhythm. Lighthouses and range...

  5. Area navigation and required navigation performance procedures and depictions

    DOT National Transportation Integrated Search

    2012-09-30

    Area navigation (RNAV) and required navigation performance (RNP) procedures are fundamental to the implementation of a performance based navigation (PBN) system, which is a key enabling technology for the Next Generation Air Transportation System (Ne...

  6. An on-line monitoring system for navigation equipment

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Yang, Ping; Liu, Jing; Yang, Zhengbo; Liang, Fei

    2017-10-01

    Civil air navigation equipment is among the most important infrastructure of civil aviation and is closely related to flight safety. In addition to regular flight inspection, patrol measurement, maintenance measurement, and operational measurement under special weather conditions are important means of ensuring aviation flight safety. According to the safety maintenance requirements for Civil Aviation Air Traffic Control navigation equipment, this paper describes an on-line monitoring system for navigation equipment, developed with independent intellectual property rights. The system addresses the key technologies of measuring navigation equipment on-line, including the Instrument Landing System (ILS) and VHF Omni-directional Range (VOR), and meets the requirements for ground measurement of navigation equipment set by ICAO Doc 8071. It provides a technical means of on-line ground measurement for navigation equipment, improves the operational safety of navigation equipment, and reduces the impact of measurement on airport operations.

  7. Using Technology, Clinical Workflow Redesign, and Team Solutions to Achieve the Patient Centered Medical Home

    DTIC Science & Technology

    2011-01-01

    Presentation at the 2011 Military Health System Conference, "The Quadruple Aim: Working Together, Achieving Success," by LTC Nicole Kerkenbush, MHA, MN, Chief Medical Information Officer, Army Medical Department, Office of the Surgeon General, on using technology, clinical workflow redesign, and team solutions to achieve the Patient Centered Medical Home.

  8. An access control model with high security for distributed workflow and real-time application

    NASA Astrophysics Data System (ADS)

    Han, Ruo-Fei; Wang, Hou-Xiang

    2007-11-01

    The traditional mandatory access control policy (MAC) is regarded as strict but inflexible. The security policy of MAC is so restrictive that few information systems adopt it at the cost of convenience, except in particular cases with high security requirements such as military or government applications. However, with the increasing demand for flexibility, even some access control systems in military applications have switched to role-based access control (RBAC), which is well known for its flexibility. Though RBAC meets the demand for flexibility, it is weak in dynamic authorization and consequently does not fit well in workflow management systems. Task-role-based access control (T-RBAC) was introduced to solve this problem; it combines the advantages of RBAC with those of task-based access control (TBAC), which uses tasks to manage permissions dynamically. To satisfy the requirements of a system that is distributed, defined by a workflow process, and time-critical, this paper analyzes the spirit of MAC and introduces it into an improved model based on T-RBAC. Finally, a conceptual task-role-based access control model with high security for distributed workflow and real-time applications (A_T&RBAC) is built, and its performance is briefly analyzed.
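
    As a loose illustration of the combination described here - role-based rights, task-level activation, and a mandatory sensitivity check - the snippet below grants access only when all three layers agree. It is a hypothetical sketch, not the paper's A_T&RBAC model; the roles, tasks, and levels are invented.

        # Layered check: RBAC (role allows task) + TBAC (task currently active
        # in the workflow) + MAC-style no-read-up on sensitivity levels.
        ROLE_TASKS = {
            "clerk":   {"submit_report"},
            "officer": {"submit_report", "approve_order"},
        }

        def authorized(role, task, active_tasks, clearance, classification):
            if task not in ROLE_TASKS.get(role, set()):
                return False          # RBAC: the role is never allowed this task
            if task not in active_tasks:
                return False          # TBAC: permission exists only while the task is active
            return clearance >= classification  # MAC: subject must dominate object level

        print(authorized("officer", "approve_order", {"approve_order"}, 3, 2))  # True
        print(authorized("clerk", "approve_order", {"approve_order"}, 3, 2))    # False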

  9. The BioExtract Server: a web-based bioinformatic workflow platform

    PubMed Central

    Lushbough, Carol M.; Jennewein, Douglas M.; Brendel, Volker P.

    2011-01-01

    The BioExtract Server (bioextract.org) is an open, web-based system designed to aid researchers in the analysis of genomic data by providing a platform for the creation of bioinformatic workflows. Scientific workflows are created within the system by recording tasks performed by the user. These tasks may include querying multiple, distributed data sources, saving query results as searchable data extracts, and executing local and web-accessible analytic tools. The series of recorded tasks can then be saved as a reproducible, sharable workflow available for subsequent execution with the original or modified inputs and parameter settings. Integrated data resources include interfaces to the National Center for Biotechnology Information (NCBI) nucleotide and protein databases, the European Molecular Biology Laboratory (EMBL-Bank) non-redundant nucleotide database, the Universal Protein Resource (UniProt), and the UniProt Reference Clusters (UniRef) database. The system offers access to numerous preinstalled, curated analytic tools and also provides researchers with the option of selecting computational tools from a large list of web services including the European Molecular Biology Open Software Suite (EMBOSS), BioMoby, and the Kyoto Encyclopedia of Genes and Genomes (KEGG). The system further allows users to integrate local command line tools residing on their own computers through a client-side Java applet. PMID:21546552

  10. Design of all-weather celestial navigation system

    NASA Astrophysics Data System (ADS)

    Sun, Hongchi; Mu, Rongjun; Du, Huajun; Wu, Peng

    2018-03-01

    In order to realize autonomous navigation in the atmosphere, an all-weather celestial navigation system is designed. The research covers a comentropy-based discrimination method and an adaptive navigation algorithm based on the P value. The comentropy discrimination method enables autonomous switching between the two celestial navigation modes, starlight and radio. Finally, an adaptive filtering algorithm based on the P value is proposed, which can greatly improve the disturbance rejection capability of the system. The experimental results show that the three-axis attitude accuracy is better than 10″ and that the system can work in all weather conditions. In a perturbed environment, the position accuracy of the integrated navigation system increases by 20% compared with the traditional method. The system basically meets the requirements of all-weather celestial navigation, offering stability, reliability, high accuracy, and strong anti-interference capability.
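
    Comentropy (information entropy) is computable directly from an image histogram, which suggests one plausible reading of the mode-switching idea: a star field with sharp points on a dark background has low entropy, while cloud or interference clutter raises it. The sketch below is an assumption about the mechanism, with a hypothetical threshold; the paper's actual criterion may differ.

        import numpy as np

        def image_entropy(img, bins=256):
            """Shannon entropy (bits) of an intensity image normalized to [0, 1]."""
            hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
            p = hist / hist.sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log2(p)))

        def select_mode(star_image, threshold=4.0):
            # low entropy -> clean star field -> starlight mode; otherwise radio
            return "starlight" if image_entropy(star_image) < threshold else "radio"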

  11. Imaging workflow and calibration for CT-guided time-domain fluorescence tomography

    PubMed Central

    Tichauer, Kenneth M.; Holt, Robert W.; El-Ghussein, Fadi; Zhu, Qun; Dehghani, Hamid; Leblond, Frederic; Pogue, Brian W.

    2011-01-01

    In this study, several key optimization steps are outlined for a non-contact, time-correlated single photon counting small animal optical tomography system, using simultaneous collection of both fluorescence and transmittance data. The system is presented for time-domain image reconstruction in vivo, illustrating the sensitivity from single photon counting and the calibration steps needed to accurately process the data. In particular, laser time- and amplitude-referencing, detector and filter calibrations, and collection of a suitable instrument response function are all presented in the context of time-domain fluorescence tomography and a fully automated workflow is described. Preliminary phantom time-domain reconstructed images demonstrate the fidelity of the workflow for fluorescence tomography based on signal from multiple time gates. PMID:22076264

  12. A pattern-based analysis of clinical computer-interpretable guideline modeling languages.

    PubMed

    Mulyar, Nataliya; van der Aalst, Wil M P; Peleg, Mor

    2007-01-01

    Languages used to specify computer-interpretable guidelines (CIGs) differ in their approaches to addressing particular modeling challenges. The main goals of this article are: (1) to examine the expressive power of CIG modeling languages, and (2) to define the differences, from the control-flow perspective, between process languages in workflow management systems and modeling languages used to design clinical guidelines. The pattern-based analysis was applied to the guideline modeling languages Asbru, EON, GLIF, and PROforma. We focused on control-flow and left other perspectives out of consideration. We evaluated the selected CIG modeling languages and identified their degree of support for 43 control-flow patterns. We used a set of explicitly defined evaluation criteria to determine whether each pattern is supported directly, indirectly, or not at all. PROforma offers direct support for 22 of the 43 patterns, Asbru 20, GLIF 17, and EON 11. All four directly support basic control-flow patterns, cancellation patterns, and some advanced branching and synchronization patterns. None support multiple instances patterns. They offer varying levels of support for synchronizing merge patterns and state-based patterns. Some support a few scenarios not covered by the 43 control-flow patterns. CIG modeling languages are remarkably close to traditional workflow languages from the control-flow perspective, but cover many fewer workflow patterns. CIG languages offer some flexibility that supports modeling of complex decisions and provide ways of modeling some decisions not covered by workflow management systems. Workflow management systems may be suitable for clinical guideline applications.

  13. A New Survey on Self-Tuning Integrated Low-Cost GPS/INS Vehicle Navigation System in Harsh Environment

    NASA Astrophysics Data System (ADS)

    Navidi, N.; Landry, R., Jr.

    2015-08-01

    Nowadays, Global Positioning System (GPS) receivers are aided by complementary radio navigation systems and Inertial Navigation Systems (INS) to obtain more accuracy and robustness in land vehicular navigation. The Extended Kalman Filter (EKF) is an accepted conventional method for estimating the position, velocity, and attitude of the navigation system when INS measurements are fused with GPS data. However, the use of low-cost Inertial Measurement Units (IMUs) based on Micro-Electro-Mechanical Systems (MEMS) in land navigation systems reduces the precision and stability of the navigation solution because of their inherent errors. The main goal of this paper is to provide a new model for fusing low-cost IMU and GPS measurements. The proposed model is based on an EKF aided by Fuzzy Inference Systems (FIS), a promising method for solving the aforementioned problems: it uses the parameters of the measurement noise to adjust the measurement and process noise covariances. The simulation results show the efficiency of the proposed method in reducing navigation system errors compared with the standard EKF.
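
    The abstract describes an EKF whose noise covariances are adjusted by a fuzzy inference system. The sketch below substitutes a crude innovation-based rule for the FIS - inflating the GPS measurement covariance when the normalized innovation is implausibly large - just to show where such adaptation enters the update. The gating threshold and the scaling rule are illustrative, not the paper's.

        import numpy as np

        def adaptive_ekf_update(x, P, z, H, R):
            """One EKF measurement update with innovation-scaled R."""
            y = z - H @ x                               # innovation
            S = H @ P @ H.T + R
            nis = float(y.T @ np.linalg.inv(S) @ y)     # normalized innovation squared
            scale = max(1.0, nis / 9.0)                 # inflate R beyond a ~3-sigma gate
            S = H @ P @ H.T + scale * R
            K = P @ H.T @ np.linalg.inv(S)              # Kalman gain with adapted R
            x_new = x + K @ y
            P_new = (np.eye(len(x)) - K @ H) @ P
            return x_new, P_new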

  14. A high throughput geocomputing system for remote sensing quantitative retrieval and a case study

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting

    2011-12-01

    The quality and accuracy of remote sensing instruments have improved significantly; however, the rapid processing of large-scale remote sensing data has become the bottleneck for quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computing application and a research issue in high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study a middleware component of the remote sensing Grid - the dynamic Grid workflow - based on a remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed graphical user interface (GUI) tools to compose remote sensing processing Grid workflows, taking aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.

  15. Making Sense of Complexity with FRE, a Scientific Workflow System for Climate Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Langenhorst, A. R.; Balaji, V.; Yakovlev, A.

    2010-12-01

    A workflow is a description of a sequence of activities that is both precise and comprehensive. Capturing the workflow of climate experiments provides a record which can be queried or compared, and allows reproducibility of the experiments - sometimes even to the bit level of the model output. This reproducibility helps to verify the integrity of the output data and enables easy perturbation experiments. GFDL's Flexible Modeling System Runtime Environment (FRE) is a production-level software project which defines and implements building blocks of the workflow as command line tools. The scientific, numerical and technical input needed to complete the workflow of an experiment is recorded in an experiment description file in XML format. Several key features add convenience and automation to the FRE workflow:
    ● Experiment inheritance makes it possible to define a new experiment with only a reference to the parent experiment and the parameters to override (see the sketch after this list).
    ● Testing is a basic element of the FRE workflow: experiments define short test runs which are verified before the main experiment is run, and a set of standard experiments is verified with new code releases.
    ● FRE is flexible enough to support everything from short runs with mere megabytes of data to high-resolution experiments that run on thousands of processors for months, producing terabytes of output data. Experiments run in segments of model time; after each segment, the state is saved and the model can be checkpointed at that level. Segment length is defined by the user, but the number of segments per system job is calculated to fit optimally within the batch scheduler requirements. FRE provides job control across multiple segments, and tools to monitor and alter the state of long-running experiments.
    ● Experiments are entered into a Curator Database, which stores queryable metadata about the experiment and the experiment's output.
    ● FRE includes a set of standardized post-processing functions as well as the ability to incorporate user-level functions. FRE post-processing can take us all the way to preparing graphical output for a scientific audience and publishing data on a public portal.
    ● Recent FRE development includes incorporating a distributed workflow to support remote computing.
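
    The inheritance mechanism can be pictured as a recursive parameter merge: a child experiment names its parent and overrides only selected values. The dictionaries below are a hypothetical stand-in for FRE's XML experiment description files, not their actual schema.

        # Child experiments inherit the parent's parameters and override a subset.
        EXPERIMENTS = {
            "CM_control": {"parent": None, "params": {"ocean_res": "1deg", "co2_ppm": 286.0}},
            "CM_2xCO2":   {"parent": "CM_control", "params": {"co2_ppm": 572.0}},
        }

        def resolve(name):
            exp = EXPERIMENTS[name]
            base = resolve(exp["parent"]) if exp["parent"] else {}
            return {**base, **exp["params"]}   # child values win over inherited ones

        print(resolve("CM_2xCO2"))  # {'ocean_res': '1deg', 'co2_ppm': 572.0}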

  16. Method, accuracy and limitation of computer interaction in the operating room by a navigated surgical instrument.

    PubMed

    Hurka, Florian; Wenger, Thomas; Heininger, Sebastian; Lueth, Tim C

    2011-01-01

    This article describes a new interaction device for surgical navigation systems--the so-called navigation mouse system. The idea is to use a tracked instrument of a surgical navigation system, such as a pointer, to control the software. The new interaction system extends existing navigation systems with a microcontroller unit. The microcontroller unit uses the existing communication line to extract the needed 3D information of an instrument and to calculate cursor positions and click events analogous to a PC mouse. These positions and events are used to operate the navigation system. An experimental setup demonstrates the accuracy reachable with the new mouse system.
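
    The geometry behind such a pointer-as-mouse scheme can be sketched as projecting the tracked tip onto a virtual interaction plane and converting plane coordinates to pixels, with a dwell test standing in for a button click. All coordinates and thresholds below are hypothetical; the paper's actual mapping may differ.

        import numpy as np

        def tip_to_cursor(tip, plane_origin, x_axis, y_axis, px_per_mm=4.0):
            """Project the instrument tip (mm, tracker frame) onto a screen plane."""
            d = tip - plane_origin
            u = float(d @ x_axis) * px_per_mm    # pixels along the plane's x direction
            v = float(d @ y_axis) * px_per_mm
            return int(u), int(v)

        def dwell_click(cursor_history, radius_px=5, frames=30):
            """Emit a click when the cursor stays within radius_px for `frames` samples."""
            if len(cursor_history) < frames:
                return False
            pts = np.array(cursor_history[-frames:], dtype=float)
            return bool(np.all(np.linalg.norm(pts - pts.mean(axis=0), axis=1) < radius_px))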

  17. Integrated system for point cloud reconstruction and simulated brain shift validation using tracked surgical microscope

    NASA Astrophysics Data System (ADS)

    Yang, Xiaochen; Clements, Logan W.; Luo, Ma; Narasimhan, Saramati; Thompson, Reid C.; Dawant, Benoit M.; Miga, Michael I.

    2017-03-01

    Intra-operative soft tissue deformation, referred to as brain shift, compromises the application of current image-guided surgery (IGS) navigation systems in neurosurgery. A computational model driven by sparse data has been used as a cost-effective method to compensate for cortical surface and volumetric displacements. Stereoscopic microscopes and laser range scanners (LRS) are the two most investigated sparse intra-operative imaging modalities for driving these systems. However, integrating these devices into the clinical workflow to facilitate development and evaluation requires systems that easily permit data acquisition and processing. In this work we present a mock environment developed to acquire stereo images from a tracked operating microscope and to reconstruct 3D point clouds from these images. A reconstruction error of 1 mm is estimated using a phantom with a known geometry and independently measured deformation extent. The microscope is tracked via an attached rigid body that allows a commercial optical tracking system to record the microscope's position as it moves during the procedure. Point clouds, reconstructed under different microscope positions, are registered into the same space in order to compute feature displacements. Using our mock craniotomy device, realistic cortical deformations are generated. Our experimental results show approximately 2 mm average displacement error compared with the optical tracking system. These results demonstrate the practicality of using a tracked stereoscopic microscope as an alternative to LRS for collecting sufficient intraoperative information for brain shift correction.

  18. Validating automatic semantic annotation of anatomy in DICOM CT images

    NASA Astrophysics Data System (ADS)

    Pathak, Sayan D.; Criminisi, Antonio; Shotton, Jamie; White, Steve; Robertson, Duncan; Sparks, Bobbi; Munasinghe, Indeera; Siddiqui, Khan

    2011-03-01

    In the current health-care environment, the time available for physicians to browse patients' scans is shrinking due to the rapid increase in the sheer number of images. This is further aggravated by mounting pressure to become more productive in the face of decreasing reimbursement. Hence, there is an urgent need to deliver technology which enables faster and effortless navigation through sub-volume image visualizations. Annotating image regions with semantic labels such as those derived from the RADLEX ontology can vastly enhance image navigation and sub-volume visualization. This paper uses random regression forests for efficient, automatic detection and localization of anatomical structures within DICOM 3D CT scans. A regression forest is a collection of decision trees which are trained to achieve direct mapping from voxels to organ location and size in a single pass. This paper focuses on comparing automated labeling with expert-annotated ground-truth results on a database of 50 highly variable CT scans. Initial investigations show that regression forest derived localization errors are smaller and more robust than those achieved by state-of-the-art global registration approaches. The simplicity of the algorithm's context-rich visual features yield typical runtimes of less than 10 seconds for a 512³ voxel DICOM CT series on a single-threaded, single-core machine running multiple trees; each tree taking less than a second. Furthermore, qualitative evaluation demonstrates that using the detected organs' locations as index into the image volume improves the efficiency of the navigational workflow in all the CT studies.
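
    The regression-forest formulation - voxels vote for the offset to an organ's location - maps naturally onto a multi-output regressor. The sketch below uses scikit-learn on synthetic features purely to show the voting pattern; the paper's context-rich visual features and forest implementation are its own.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        X = rng.normal(size=(5000, 20))                  # per-voxel context features (synthetic)
        offsets = X[:, :3] * 10.0 + rng.normal(scale=1.0, size=(5000, 3))  # voxel -> organ center

        forest = RandomForestRegressor(n_estimators=10, max_depth=8).fit(X, offsets)
        votes = forest.predict(X[:200])                  # each voxel votes for an offset
        print("aggregated organ-center estimate:", votes.mean(axis=0))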

  19. Implementation of a vector-based tracking loop receiver in a pseudolite navigation system.

    PubMed

    So, Hyoungmin; Lee, Taikjin; Jeon, Sanghoon; Kim, Chongwon; Kee, Changdon; Kim, Taehee; Lee, Sanguk

    2010-01-01

    We propose a vector tracking loop (VTL) algorithm for an asynchronous pseudolite navigation system. It was implemented in a software receiver and experiments in an indoor navigation system were conducted. Test results show that the VTL successfully tracks signals against the near-far problem, one of the major limitations in pseudolite navigation systems, and could improve positioning availability by extending pseudolite navigation coverage.

  20. Institute of Navigation, Annual Meeting, 47th, Williamsburg, VA, June 10-12, 1991, Proceedings

    NASA Astrophysics Data System (ADS)

    1991-11-01

    The present volume on navigation and exploration discusses space exploration, mapping and geodesy, aircraft navigation, undersea navigation, land and vehicular location, international and legal aspects of navigation, the history of navigation technology and applications, Loran development and implementation, GPS and GLONASS developments, and search and rescue. Topics addressed include stabilization of low orbiting spacecraft using GPS, the employment of laser navigation for automatic rendezvous and docking systems, enhanced pseudostatic processing, and the expanding role of sensor fusion. Attention is given to a gravity-aided inertial navigation system, recent developments in aviation products liability and navigation, the ICAO future air navigation system, and Loran's implementation in NAS. Also discussed are Inmarsat integrated navigation/communication activities, the GPS program status, the evolution of military GPS technology into the Navcore V receiver engine, and Sarsat location algorithms.

  1. Content and Workflow Management for Library Websites: Case Studies

    ERIC Educational Resources Information Center

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  2. 33 CFR 146.105 - General alarm system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... manned facility must have a general alarm system. When operated, this system shall be audible in all... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false General alarm system. 146.105 Section 146.105 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED...

  3. Interference and deception detection technology of satellite navigation based on deep learning

    NASA Astrophysics Data System (ADS)

    Chen, Weiyi; Deng, Pingke; Qu, Yi; Zhang, Xiaoguang; Li, Yaping

    2017-10-01

    Satellite navigation systems play an important role in daily life and in war. The strategic position of a satellite navigation system is prominent, so it is very important to ensure that the system is not disturbed or destroyed. Detecting jamming signals is a critical means of avoiding accidents in a navigation system. At present, the technology for detecting jamming signals in satellite navigation systems is not intelligent, relying mainly on human judgment and experience. To address this issue, this paper proposes a deep learning-based method for monitoring interference sources in satellite navigation. By training on interference signal data and extracting the features of the interference signals, a detection system model is constructed. The simulation results show that the detection accuracy of our system can reach nearly 70%. The method in this paper provides a new idea for research on the intelligent detection of interference and deception signals in satellite navigation systems.
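
    The abstract reports a learned classifier over extracted interference features without naming the architecture; a minimal stand-in is a small neural network trained on labelled feature vectors. The synthetic features below merely mimic a statistical shift between clean and jammed signals; real receiver features (e.g., spectral statistics or AGC levels) would replace them.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        clean  = rng.normal(0.0, 1.0, size=(500, 8))   # feature vectors, clean signal
        jammed = rng.normal(0.8, 1.3, size=(500, 8))   # shifted statistics under jamming
        X = np.vstack([clean, jammed])
        y = np.array([0] * 500 + [1] * 500)

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X, y)
        print("training accuracy:", clf.score(X, y))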

  4. Modified Navigation Instructions for Spatial Navigation Assistance Systems Lead to Incidental Spatial Learning

    PubMed Central

    Gramann, Klaus; Hoepner, Paul; Karrer-Gauss, Katja

    2017-01-01

    Spatial cognitive skills deteriorate with the increasing use of automated GPS navigation, and a general decrease in the ability to orient in space might have further impact on independence, autonomy, and quality of life. In the present study we investigate whether modified navigation instructions support incidental spatial knowledge acquisition. A virtual driving environment was used to examine the impact of modified navigation instructions on spatial learning while using a GPS navigation assistance system. Participants navigated through a simulated urban and suburban environment, using navigation support to reach their destination. Driving performance as well as spatial learning was thereby assessed. Three navigation instruction conditions were tested: (i) a control group that was provided with classical navigation instructions at decision points, and two other groups that received navigation instructions at decision points including either (ii) additional irrelevant information about landmarks or (iii) additional personally relevant information (i.e., individual preferences regarding food, hobbies, etc.), associated with landmarks. Driving performance revealed no differences between navigation instructions. Significant improvements were observed in both modified navigation instruction conditions on three different measures of spatial learning and memory: subsequent navigation of the initial route without navigation assistance, landmark recognition, and sketch map drawing. Future navigation assistance systems could incorporate modified instructions to promote incidental spatial learning and to foster more general spatial cognitive abilities. Such systems might extend mobility across the lifespan. PMID:28243219

  5. Intraoperative computed tomography with an integrated navigation system in stabilization surgery for complex craniovertebral junction malformation.

    PubMed

    Yu, Xinguang; Li, Lianfeng; Wang, Peng; Yin, Yiheng; Bu, Bo; Zhou, Dingbiao

    2014-07-01

    This study was designed to report our preliminary experience with stabilization procedures for complex craniovertebral junction malformation (CVJM) using intraoperative computed tomography (iCT) with an integrated neuronavigation system (NNS). The aim was to evaluate the workflow, feasibility, and clinical outcome of stabilization procedures using iCT image-guided navigation for complex CVJM. Stabilization procedures in CVJM are complex because of the area's intricate geometry and bony structures, its critical relationship to neurovascular structures, and the intricate biomechanical issues involved. A sliding gantry 40-slice computed tomography scanner was installed in a preexisting operating room. The images were transferred directly from the scanner to the NNS using an automated registration system. On the basis of the analysis of intraoperative computed tomographic images, 23 cases (11 males, 12 females) with complicated CVJM underwent navigated stabilization procedures to allow more control over screw placement. The ages of the patients were 19-52 years (mean: 33.5 y). We performed C1-C2 transarticular screw fixation in 6 patients to produce atlantoaxial arthrodesis with better reliability. Because of a high-riding transverse foramen on at least 1 side of the C2 vertebra and an anomalous vertebral artery position, 7 patients underwent C1 lateral mass and C2 pedicle screw fixation. Ten additional patients were treated with individualized occipitocervical fixation surgery because of hypoplasia of C1 or constraints due to C2 bone structure. In total, 108 screws were inserted into 23 patients using navigational assistance. The screws comprised 20 C1 lateral mass screws, 26 C2, 14 C3, and 4 C4 pedicle screws, 32 occipital screws, and 12 C1-C2 transarticular screws. There were no vascular or neural complications except for pedicle perforations that were detected in 2 screws (1.9%) and were corrected intraoperatively without any persistent nerve or vessel damage. The overall accuracy of the image guidance system was 98.1%. The duration of interruption of the surgical process for the iCT was 8±1.5 minutes. All patients were clinically evaluated using Nurick grade criteria and for neurological deficits 3 months after surgery. Twenty-one patients (91.3%) improved by at least 1 Nurick grade, whereas the grade remained unchanged in 2 (8.7%) patients. Craniovertebral stability and solid bone fusion were achieved in all patients. The NNS was found to correlate well with the intraoperative findings, and recalibration was uneventful in all cases, with an accuracy of 1.8 mm (range, 0.6-2.2 mm). iCT scanning with integrated NNS was found to be both feasible and beneficial in stabilization procedures for complex CVJM. In this unusual patient population, the technique seemed to be of value for negotiating complex anatomy and for achieving more control over screw placement.

  6. Mission Operations and Navigation Toolkit Environment

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Hanna, Robert A.; Mossey, Michael P.; Duncan, Courtney B.; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.; Martin Mur, Tomas J.

    2009-01-01

    MONTE (Mission Operations and Navigation Toolkit Environment) Release 7.3 is an extensible software system designed to support trajectory and navigation analysis/design for space missions. MONTE is intended to replace the current navigation and trajectory analysis software systems, which, at the time of this reporting, are used by JPL's Navigation and Mission Design section. The software provides an integrated, simplified, and flexible system that can be easily maintained to serve the needs of future missions in need of navigation services.

  7. A navigation system for the visually impaired an intelligent white cane.

    PubMed

    Fukasawa, A Jin; Magatani, Kazusihge

    2012-01-01

    In this paper, we describe a navigation system that we developed to support the independent walking of the visually impaired in indoor spaces. The instrument consists of a navigation system and a map information system, both installed on a white cane. The navigation system follows a colored navigation line set on the floor: a color sensor installed on the tip of the white cane senses the color of the navigation line, and the system informs the user by vibration that he or she is walking along the line. This color recognition system is controlled by a one-chip microprocessor. RFID tags and a receiver for these tags are used in the map information system. The RFID tags are set along the colored navigation line; an antenna for the RFID tags and a tag receiver are also installed on the white cane. The receiver reads the area information as a tag number and reports the corresponding map information to the user by pre-recorded MP3 voice. We also developed a direction identification technique that detects the user's walking direction using a triaxial acceleration sensor. Three sighted subjects who were blindfolded with an eye mask were tested with our navigation system. All of them were able to walk along the navigation line perfectly. We consider the performance of the system good; it should therefore be extremely valuable in supporting the activities of the visually impaired.

  8. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902

  9. Create, run, share, publish, and reference your LC-MS, FIA-MS, GC-MS, and NMR data analysis workflows with the Workflow4Metabolomics 3.0 Galaxy online infrastructure for metabolomics.

    PubMed

    Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A

    2017-12-01

    Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Integrated workflows for spiking neuronal network simulations.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.

  11. A novel platform for electromagnetic navigated ultrasound bronchoscopy (EBUS).

    PubMed

    Sorger, Hanne; Hofstad, Erlend Fagertun; Amundsen, Tore; Langø, Thomas; Leira, Håkon Olav

    2016-08-01

    Endobronchial ultrasound transbronchial needle aspiration (EBUS-TBNA) of mediastinal lymph nodes is essential for lung cancer staging and for the distinction between curative and palliative treatment. Precise sampling is crucial. Navigation and multimodal imaging may improve the efficiency of EBUS-TBNA. We demonstrate a novel EBUS-TBNA navigation system in a dedicated airway phantom. Using a convex probe EBUS bronchoscope (CP-EBUS) with an integrated sensor for electromagnetic (EM) position tracking, we performed navigated CP-EBUS in a phantom. Preoperative computed tomography (CT) and real-time ultrasound (US) images were integrated into a navigation platform for EM navigated bronchoscopy. The coordinates of targets in the CT and US volumes were registered in the navigation system, and the position deviation was calculated. The system visualized all tumor models and displayed their fused CT and US images in correct positions in the navigation system. Navigating the EBUS bronchoscope was fast and easy. The mean error observed between US and CT positions for 11 target lesions (37 measurements) was [Formula: see text] mm; the maximum error was 5.9 mm. The feasibility of our novel navigated CP-EBUS system was successfully demonstrated. An EBUS navigation system is needed to meet future requirements for precise mediastinal lymph node mapping, and it provides new opportunities for procedure documentation in EBUS-TBNA.
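
    The reported error metric appears to be the per-target deviation between matched CT and US coordinates once both are expressed in the common navigation frame; computing it reduces to Euclidean distances. The coordinates below are illustrative, not the study's data.

        import numpy as np

        # Matched target coordinates (mm) in the common navigation frame.
        ct_targets = np.array([[10.0, 22.5, 31.0], [40.2, 18.9, 27.4], [25.7, 30.1, 15.3]])
        us_targets = np.array([[11.1, 21.9, 32.0], [39.5, 19.6, 26.8], [26.2, 29.0, 16.1]])

        deviations = np.linalg.norm(ct_targets - us_targets, axis=1)
        print(f"mean error {deviations.mean():.2f} mm, max {deviations.max():.2f} mm")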

  12. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    PubMed Central

    Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems such as cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem that requires considering multiple criteria to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different supply voltage levels by sacrificing clock frequency; this multi-voltage operation involves a compromise between schedule quality and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361
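
    The DVFS compromise described above can be illustrated with the standard first-order model, in which dynamic power grows as C·V²·f while runtime shrinks as 1/f; per-task dynamic energy therefore scales with V², and lower frequencies matter because they permit lower voltages. A sketch with illustrative constants:

    ```python
    # First-order DVFS model: dynamic power ~ C * V^2 * f, runtime ~ cycles / f.
    # Per-task dynamic energy is then C * V^2 * cycles, so it depends on the
    # voltage alone; lower frequencies help because they permit lower voltages.
    # All constants are illustrative, not measured values.
    CYCLES = 2e9          # work for one task, in CPU cycles
    CAPACITANCE = 1e-9    # effective switched capacitance (illustrative)

    def energy_and_time(voltage, freq_hz):
        power = CAPACITANCE * voltage**2 * freq_hz   # dynamic power (W)
        runtime = CYCLES / freq_hz                   # seconds
        return power * runtime, runtime              # (joules, seconds)

    for v, f in [(1.2, 2.0e9), (1.0, 1.5e9), (0.8, 1.0e9)]:
        e, t = energy_and_time(v, f)
        print(f"V={v:.1f} V, f={f/1e9:.1f} GHz -> {e:.2f} J in {t:.2f} s")
    ```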

  13. Enabling Real-time Water Decision Support Services Using Model as a Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.

    2014-12-01

    Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, a river routing model developed at the University of Texas at Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application have been prototyped in the San Antonio and Guadalupe River Basins in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
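
    From a client's perspective, "model as a service" boils down to an HTTP round trip. The endpoint URL, payload fields, and response shape below are hypothetical placeholders, not the actual RAPID service interface:

    ```python
    # Hedged sketch of driving a published model workflow over HTTP using the
    # third-party `requests` library. Endpoint, fields, and response keys are
    # all invented for illustration.
    import requests

    ENDPOINT = "https://example.org/api/rapid/run"  # hypothetical URL

    payload = {
        "basin": "San Antonio-Guadalupe",
        "start": "2014-06-01",
        "end": "2014-06-07",
    }
    resp = requests.post(ENDPOINT, json=payload, timeout=60)
    resp.raise_for_status()
    for reach in resp.json().get("discharge", [])[:5]:
        print(reach["river_id"], reach["flow_cms"])
    ```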

  14. Characterizing Strain Variation in Engineered E. coli Using a Multi-Omics-Based Workflow

    DOE PAGES

    Brunk, Elizabeth; George, Kevin W.; Alonso-Gutierrez, Jorge; ...

    2016-05-19

    Understanding the complex interactions that occur between heterologous and native biochemical pathways represents a major challenge in metabolic engineering and synthetic biology. We present a workflow that integrates metabolomics, proteomics, and genome-scale models of Escherichia coli metabolism to study the effects of introducing a heterologous pathway into a microbial host. This workflow incorporates complementary approaches from computational systems biology, metabolic engineering, and synthetic biology; provides molecular insight into how the host organism microenvironment changes due to pathway engineering; and demonstrates how biological mechanisms underlying strain variation can be exploited as an engineering strategy to increase product yield. As a proof of concept, we present the analysis of eight engineered strains producing three biofuels: isopentenol, limonene, and bisabolene. Application of this workflow identified the roles of candidate genes, pathways, and biochemical reactions in observed experimental phenomena and facilitated the construction of a mutant strain with improved productivity. The contributed workflow is available as an open-source tool in the form of iPython notebooks.

  15. Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.

    PubMed

    Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems such as cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem that requires considering multiple criteria to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different supply voltage levels by sacrificing clock frequency; this multi-voltage operation involves a compromise between schedule quality and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.

  16. Camera-augmented mobile C-arm (CamC): A feasibility study of augmented reality imaging in the operating room.

    PubMed

    von der Heide, Anna Maria; Fallavollita, Pascal; Wang, Lejing; Sandner, Philipp; Navab, Nassir; Weidert, Simon; Euler, Ekkehard

    2018-04-01

    In orthopaedic trauma surgery, image-guided procedures are mostly based on fluoroscopy. The reduction of radiation exposure is an important goal. The purpose of this work was to investigate the impact of a camera-augmented mobile C-arm (CamC) on radiation exposure and the surgical workflow during a first clinical trial. Applying a workflow-oriented approach, 10 general workflow steps were defined to compare the CamC to traditional C-arms. The surgeries included were arbitrarily identified and assigned to the study. The evaluation criteria were radiation exposure and operation time for each workflow step and the entire surgery. The evaluation protocol was designed and conducted in a single-centre study. Radiation exposure was markedly reduced, by 18 X-ray shots (46%), when using the CamC, while surgery times remained similar. The intuitiveness of the system, its easy integration into the surgical workflow, and its great potential to reduce radiation have been demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Regionalized Lunar South Pole Surface Navigation System Analysis

    NASA Technical Reports Server (NTRS)

    Welch, Bryan W.

    2008-01-01

    Apollo missions utilized Earth-based assets for navigation because the landings took place at lunar locations in constant view from the Earth. The new exploration campaign to the lunar south pole region will have limited Earth visibility, but the extent to which a navigation system comprised solely of Earth-based tracking stations will provide adequate navigation solutions in this region is unknown. This report presents a dilution-of-precision (DoP)-based, stationary surface navigation analysis of the performance of multiple lunar satellite constellations, Earth-based deep space network assets, and combinations thereof. Results show that kinematic and integrated solutions cannot be provided by the Earth-based deep space network stations. Also, the stationary surface navigation system needs to be operated either as a two-way navigation system or as a one-way navigation system with local terrain information, while the position solution is integrated over a short duration of time with navigation signals being provided by a lunar satellite constellation.
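
    The DoP figure of merit underlying this kind of analysis comes directly from the user-to-source geometry. A minimal sketch, with arbitrary illustrative satellite positions:

    ```python
    # Dilution-of-precision sketch: each row of the geometry matrix holds the
    # unit line-of-sight vector from the surface user to one navigation source
    # plus a clock-bias column. Positions are invented for illustration (km).
    import numpy as np

    user = np.array([0.0, 0.0, -1737.4])          # notional south-pole site
    sats = np.array([[3000.0, 500.0, -4000.0],    # illustrative lunar orbiters
                     [-2500.0, 1500.0, -3800.0],
                     [500.0, -2800.0, -4200.0],
                     [-800.0, -900.0, -5000.0]])

    los = sats - user
    unit = los / np.linalg.norm(los, axis=1, keepdims=True)
    H = np.hstack([unit, np.ones((len(sats), 1))])  # geometry + clock column
    Q = np.linalg.inv(H.T @ H)
    print("GDOP =", np.sqrt(np.trace(Q)))
    print("PDOP =", np.sqrt(np.trace(Q[:3, :3])))
    ```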

  18. FlyAR: augmented reality supported micro aerial vehicle navigation.

    PubMed

    Zollmann, Stefanie; Hoppe, Christof; Langlotz, Tobias; Reitmayr, Gerhard

    2014-04-01

    Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In that context automatic flight path planning and autonomous flying is often applied but so far cannot fully replace the human in the loop, supervising the flight on-site to assure that there are no collisions with obstacles. Unfortunately, this workflow yields several issues, such as the need to mentally transfer the aerial vehicle’s position between 2D map positions and the physical environment, and the complicated depth perception of objects flying in the distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality supported navigation and flight planning of micro aerial vehicles by augmenting the user’s view with relevant information for flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints supporting the user in understanding the spatial relationship of virtual waypoints in the physical world and investigate the effect of these visualization techniques on the spatial understanding.

  19. Hybrid surgical guidance based on the integration of radionuclear and optical technologies

    PubMed Central

    Valdés-Olmos, Renato; Buckle, Tessa; Vidal-Sicart, Sergi

    2016-01-01

    With the evolution of imaging technologies and tracers, the applications for nuclear molecular imaging are growing rapidly. For example, nuclear medicine is increasingly being used to guide surgical resections in complex anatomical locations. Here, a future workflow is envisioned that uses a combination of pre-operative diagnostics, navigation and intraoperative guidance. Radioguidance can provide means for pre-operative and intraoperative identification of “hot” lesions, forming the basis of a virtual data set that can be used for navigation. Luminescence guidance has shown great potential in the intraoperative setting by providing optical feedback, in some cases even in real time. Both of these techniques have distinct drawbacks, which include inaccuracy in areas that contain a background signal (radioactivity) or a limited degree of signal penetration (luminescence). We, and others, have reasoned that hybrid/multimodal approaches that integrate the use of these complementary modalities may help overcome their individual weaknesses. Ultimately, this will lead to advancement of the field of interventional molecular imaging/image-guided surgery. In this review, an overview of clinically applied hybrid surgical guidance technologies is given, whereby the focus is placed on tracers and hardware. PMID:26943463

  20. Sensitivity analysis of helicopter IMC decelerating steep approach and landing performance to navigation system parameters

    NASA Technical Reports Server (NTRS)

    Karmali, M. S.; Phatak, A. V.

    1982-01-01

    Results of a study to investigate, by means of a computer simulation, the performance sensitivity of helicopter IMC DSAL operations as a function of navigation system parameters are presented. A mathematical model representing a generic navigation system is formulated. The scenario simulated consists of a straight-in helicopter approach to landing along a 6 deg glideslope. The deceleration magnitude chosen is 0.3 g. The navigation model parameters are varied and the statistics of the total system error (TSE) computed. These statistics are used to determine the critical navigation system parameters that affect the performance of the closed-loop navigation, guidance and control system of a UH-1H helicopter.

  1. An XML Representation for Crew Procedures

    NASA Technical Reports Server (NTRS)

    Simpson, Richard C.

    2005-01-01

    NASA ensures safe operation of complex systems through the use of formally-documented procedures, which encode the operational knowledge of the system as derived from system experts. Crew members use procedure documentation on the ground for training purposes and on-board space shuttle and space station to guide their activities. Investigators at JSC are developing a new representation for procedures that is content-based (as opposed to display-based). Instead of specifying how a procedure should look on the printed page, the content-based representation will identify the components of a procedure and (more importantly) how the components are related (e.g., how the activities within a procedure are sequenced; what resources need to be available for each activity). This approach will allow different sets of rules to be created for displaying procedures on a computer screen, on a hand-held personal digital assistant (PDA), verbally, or on a printed page, and will also allow intelligent reasoning processes to automatically interpret and use procedure definitions. During his NASA fellowship, Dr. Simpson examined how various industries represent procedures (also called business processes or workflows), in areas such as manufacturing, accounting, shipping, or customer service. A useful method for designing and evaluating workflow representation languages is by determining their ability to encode various workflow patterns, which depict abstract relationships between the components of a procedure removed from the context of a specific procedure or industry. Investigators have used this type of analysis to evaluate how well-suited existing workflow representation languages are for various industries based on the workflow patterns that commonly arise across industry-specific procedures. Based on this type of analysis, it is already clear that existing workflow representations capture discrete flow of control (i.e., when one activity should start and stop based on when other activities start and stop), but do not capture the flow of data, materials, resources or priorities. Existing workflow representation languages are also limited to representing sequences of discrete activities, and cannot encode procedures involving continuous flow of information or materials between activities.
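
    As a rough illustration of the gap identified above, the sketch below models a procedure whose representation carries both control-flow ordering and explicit data/resource flow. The structure and all names are hypothetical, not NASA's representation:

    ```python
    # Content-based procedure sketch: activities plus two kinds of relations,
    # control flow (ordering) and data/resource flow (what moves between
    # activities). Everything here is invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Activity:
        name: str
        needs: list = field(default_factory=list)      # resources required
        produces: list = field(default_factory=list)   # data/material outputs

    @dataclass
    class Procedure:
        activities: dict
        control_flow: list   # (before, after) ordering constraints
        data_flow: list      # (producer, consumer, item) links

    proc = Procedure(
        activities={
            "power_up": Activity("power_up", needs=["bus_power"]),
            "self_test": Activity("self_test", produces=["test_report"]),
            "log_result": Activity("log_result", needs=["test_report"]),
        },
        control_flow=[("power_up", "self_test"), ("self_test", "log_result")],
        data_flow=[("self_test", "log_result", "test_report")],
    )
    print(len(proc.activities), "activities,", len(proc.data_flow), "data links")
    ```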

  2. 33 CFR 127.1109 - Lighting systems.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Lighting systems. 127.1109 Section 127.1109 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Hazardous Gas Design and Construction § 127.1109 Lighting systems...

  3. 33 CFR 127.1109 - Lighting systems.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Lighting systems. 127.1109 Section 127.1109 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Hazardous Gas Design and Construction § 127.1109 Lighting systems...

  4. 33 CFR 127.1109 - Lighting systems.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Lighting systems. 127.1109 Section 127.1109 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Hazardous Gas Design and Construction § 127.1109 Lighting systems...

  5. 33 CFR 127.1109 - Lighting systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Lighting systems. 127.1109 Section 127.1109 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Hazardous Gas Design and Construction § 127.1109 Lighting systems...

  6. 33 CFR 127.1109 - Lighting systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Lighting systems. 127.1109 Section 127.1109 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Hazardous Gas Design and Construction § 127.1109 Lighting systems...

  7. Knowledge Annotations in Scientific Workflows: An Implementation in Kepler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gandara, Aida G.; Chin, George; Pinheiro Da Silva, Paulo

    2011-07-20

    Scientific research products are the result of long-term collaborations between teams. Scientific workflows are capable of helping scientists in many ways, including the collection of information as to how research was conducted; e.g., scientific workflow tools often collect and manage information about datasets used and data transformations. However, knowledge about why data was collected is rarely documented in scientific workflows. In this paper we describe a prototype system built to support the collection of scientific expertise that influences scientific analysis. Through evaluating a scientific research effort underway at Pacific Northwest National Laboratory, we identified features that would most benefit PNNL scientists in documenting how and why they conduct their research, making this information available to the entire team. The prototype system was built by enhancing the Kepler Scientific Workflow System to create knowledge-annotated scientific workflows and to publish them as semantic annotations.

  8. Internet's impact on publishing

    NASA Astrophysics Data System (ADS)

    Beretta, Giordano B.

    1997-04-01

    In 1990, the first monochrome print-on-demand (POD) systems were successfully brought to market. Subsequent color versions have been less successful, in my view mostly because they require a different workflow than traditional systems and the highly skilled specialists have not been trained. This hypothesis is based on the observation that direct-to-plate systems for short-run printing, which do not require a new workflow, are quite successful in the marketplace. The internet and the World Wide Web are the enabling technologies that are fostering a new print model that is very likely to replace color POD before the latter can establish itself. In this model the consumers locate the material they desire from a contents provider, pay through a digital cash clearinghouse, and print the material at their own cost on their local printer. All the basic technologies for this model are in place; the main challenge is to make the workflow sufficiently robust for individual use.

  9. Medical Devices Transition to Information Systems: Lessons Learned

    PubMed Central

    Charters, Kathleen G.

    2012-01-01

    Medical devices designed to network can share data with a Clinical Information System (CIS), making that data available within the clinician workflow. Lessons learned by transitioning anesthesia reporting and monitoring devices (ARMDs) on a local area network (LAN) to integration of anesthesia documentation within a CIS fall into the following categories: access, contracting, deployment, implementation, planning, security, support, training, and workflow integration. Areas identified for improvement include: reconciling vendor requirements for access with the organization's security policies and procedures; including clauses in the medical device procurement contract that support the transition from stand-alone devices to information integrated into clinical workflow; resolving deployment and implementation barriers that make the process less efficient and more costly; including effective field communication and creative alternatives in planning; building training on the baseline knowledge of trainees; including effective help desk processes and metrics; and establishing a process for determining where problems originate when systems share information. PMID:24199054

  10. Integrating pathology and radiology disciplines: an emerging opportunity?

    PubMed Central

    2012-01-01

    Pathology and radiology form the core of cancer diagnosis, yet the workflows of both specialties remain ad hoc and occur in separate "silos," with no direct linkage between their case accessioning and/or reporting systems, even when both departments belong to the same host institution. Because both radiologists' and pathologists' data are essential to making correct diagnoses and appropriate patient management and treatment decisions, this isolation of radiology and pathology workflows can be detrimental to the quality and outcomes of patient care. These detrimental effects underscore the need for pathology and radiology workflow integration and for systems that facilitate the synthesis of all data produced by both specialties. With the enormous technological advances currently occurring in both fields, the opportunity has emerged to develop an integrated diagnostic reporting system that supports both specialties and, therefore, improves the overall quality of patient care. PMID:22950414

  11. Using AI and Semantic Web Technologies to attack Process Complexity in Open Systems

    NASA Astrophysics Data System (ADS)

    Thompson, Simon; Giles, Nick; Li, Yang; Gharib, Hamid; Nguyen, Thuc Duong

    Recently many vendors and groups have advocated using BPEL and WS-BPEL as a workflow language to encapsulate business logic. While encapsulating workflow and process logic in one place is a sensible architectural decision the implementation of complex workflows suffers from the same problems that made managing and maintaining hierarchical procedural programs difficult. BPEL lacks constructs for logical modularity such as the requirements construct from the STL [12] or the ability to adapt constructs like pure abstract classes for the same purpose. We describe a system that uses semantic web and agent concepts to implement an abstraction layer for BPEL based on the notion of Goals and service typing. AI planning was used to enable process engineers to create and validate systems that used services and goals as first class concepts and compiled processes at run time for execution.

  12. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    The coming deluge of genome data presents significant challenges for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  13. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The coming deluge of genome data presents significant challenges for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.
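
    Because the platform builds on Galaxy, workflows on such a system can in principle be driven programmatically through Galaxy's REST API. A hedged sketch using the community BioBlend client (the URL, API key, and IDs are placeholders, and the Globus-specific extensions described above are out of scope here):

    ```python
    # Sketch only: invoke a Galaxy workflow via the BioBlend client library.
    # Assumes a reachable Galaxy server, a valid API key, and at least one
    # published workflow; the dataset ID below is a placeholder.
    from bioblend.galaxy import GalaxyInstance

    gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

    workflow = gi.workflows.get_workflows()[0]        # first available workflow
    history = gi.histories.create_history("ngs-run")  # container for outputs
    inputs = {"0": {"src": "hda", "id": "dataset-id-placeholder"}}

    invocation = gi.workflows.invoke_workflow(
        workflow["id"], inputs=inputs, history_id=history["id"])
    print("invocation state:", invocation["state"])
    ```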

  14. The anatomy of decision support during inpatient care provider order entry (CPOE): Empirical observations from a decade of CPOE experience at Vanderbilt

    PubMed Central

    Miller, Randolph A.; Waitman, Lemuel R.; Chen, Sutin; Rosenbloom, S. Trent

    2006-01-01

    The authors describe a pragmatic approach to the introduction of clinical decision support at the point of care, based on a decade of experience in developing and evolving Vanderbilt’s inpatient “WizOrder” care provider order entry (CPOE) system. The inpatient care setting provides a unique opportunity to interject CPOE-based decision support features that restructure clinical workflows, deliver focused relevant educational materials, and influence how care is delivered to patients. From their empirical observations, the authors have developed a generic model for decision support within inpatient CPOE systems. They believe that the model’s utility extends beyond Vanderbilt, because it is based on characteristics of end-user workflows and on decision support considerations that are common to a variety of inpatient settings and CPOE systems. The specific approach to implementing a given clinical decision support feature within a CPOE system should involve evaluation along three axes: what type of intervention to create (for which the authors describe 4 general categories); when to introduce the intervention into the user’s workflow (for which the authors present 7 categories), and how disruptive, during use of the system, the intervention might be to end-users’ workflows (for which the authors describe 6 categories). Framing decision support in this manner may help both developers and clinical end-users plan future alterations to their systems when needs for new decision support features arise. PMID:16290243

  15. Geo-navigation system for rotary percussion drilling in rocks of high and low electrical conductivity

    NASA Astrophysics Data System (ADS)

    Konurin, AI; Khmelinin, AP; Denisova, EV

    2018-03-01

    The currently available drill navigation systems are reviewed, together with their benefits and shortcomings. A mathematical model is built to describe the inertial navigation system movement in horizontal and inclined drilling. A prototype model of the inertial navigation system for rotary percussion drills has been designed.
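
    At its core, any inertial navigation scheme of this kind dead-reckons by integrating sensed motion. A toy 2-D sketch on synthetic data (orientation, gravity compensation, and bias handling, which dominate real designs, are omitted):

    ```python
    # Toy dead-reckoning: integrate acceleration twice to track displacement.
    # Accelerometer samples are synthetic and assumed to be expressed in the
    # navigation frame with gravity already removed.
    import numpy as np

    dt = 0.01                                    # s, sample interval
    accel = np.tile([0.2, 0.05], (500, 1))       # synthetic samples, m/s^2

    vel = np.zeros(2)
    pos = np.zeros(2)
    for a in accel:
        vel += a * dt        # first integration: velocity
        pos += vel * dt      # second integration: position
    print("estimated displacement (m):", pos)
    ```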

  16. Preliminary navigation accuracy analysis for the TDRSS Onboard Navigation System (TONS) experiment on EP/EUVE

    NASA Technical Reports Server (NTRS)

    Gramling, C. J.; Long, A. C.; Lee, T.; Ottenstein, N. A.; Samii, M. V.

    1991-01-01

    A Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous navigation capability for users of TDRSS and its successor, the Advanced TDRSS (ATDRSS). The fully autonomous user onboard navigation system will support orbit determination, time determination, and frequency determination, based on observation of a continuously available, unscheduled navigation beacon signal. A TONS experiment will be performed in conjunction with the Explorer Platform (EP) Extreme Ultraviolet Explorer (EUVE) mission to flight qualify TONS Block 1. An overview is presented of TONS and a preliminary analysis of the navigation accuracy anticipated for the TONS experiment. Descriptions of the TONS experiment and the associated navigation objectives, as well as a description of the onboard navigation algorithms, are provided. The accuracy of the selected algorithms is evaluated based on the processing of realistic simulated TDRSS one-way forward-link Doppler measurements. The analysis process is discussed and the associated navigation accuracy results are presented.

  17. Comparison of Precision between Optical and Electromagnetic Navigation Systems in Total Knee Arthroplasty

    PubMed Central

    Rhee, Seung Joon; Park, Shi Hwan; Cho, He Myung

    2014-01-01

    Purpose The purpose of this study is to compare and analyze the precision of optical and electromagnetic navigation systems in total knee arthroplasty (TKA). Materials and Methods We retrospectively reviewed 60 patients who underwent TKA using an optical navigation system and 60 patients who underwent TKA using an electromagnetic navigation system from June 2010 to March 2012. The mechanical axis measurements obtained from preoperative radiographs and from the intraoperative navigation systems were compared between the groups. The postoperative positions of the femoral and tibial components in the sagittal and coronal planes were assessed. Results The difference between the mechanical axis measured on the preoperative radiograph and by the intraoperative navigation system was 0.6 degrees more varus in the electromagnetic navigation system group than in the optical navigation system group, but showed no statistically significant difference between the two groups (p>0.05). The positions of the femoral and tibial components in the sagittal and coronal planes on the postoperative radiographs also showed no statistically significant difference between the two groups (p>0.05). Conclusions In TKA, both optical and electromagnetic navigation systems showed high accuracy and reproducibility, and the measurements from the postoperative radiographs showed no significant difference between the two groups. PMID:25505703

  18. A Self-Tuning Kalman Filter for Autonomous Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, Son H.

    1999-01-01

    Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman filter and GPS (Global Positioning Systems) data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning from analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can perform the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.
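
    The self-tuning idea can be illustrated with a crude innovation-based adaptation of the process-noise variance in a one-dimensional Kalman filter. This stands in for the neuro-fuzzy tuner described above and is not NASA's algorithm:

    ```python
    # 1-D Kalman filter with a naive self-tuning step: when innovations are
    # statistically inconsistent with the filter's own uncertainty, the
    # process-noise variance Q is inflated. Illustrative sketch only.
    import numpy as np

    rng = np.random.default_rng(0)
    R, Q = 4.0, 0.01          # measurement / (initial) process noise variances
    x, P = 0.0, 1.0           # state estimate and its variance
    truth = 0.0

    for k in range(200):
        truth += 0.1                               # constant drift (unmodeled)
        z = truth + rng.normal(0.0, np.sqrt(R))    # noisy measurement
        P += Q                                     # predict (identity dynamics)
        S = P + R                                  # innovation variance
        innov = z - x
        if innov**2 > 9.0 * S:                     # 3-sigma check: grow Q
            Q *= 2.0
        K = P / S                                  # Kalman gain
        x += K * innov                             # update state
        P *= (1.0 - K)                             # update variance
    print(f"final estimate {x:.2f} vs truth {truth:.2f}, adapted Q={Q:.3f}")
    ```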

  19. Automatic segmentation and centroid detection of skin sensors for lung interventions

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Xu, Sheng; Xue, Zhong; Wong, Stephen T.

    2012-02-01

    Electromagnetic (EM) tracking has been recognized as a valuable tool for locating interventional devices in procedures such as lung and liver biopsy or ablation. The advantage of this technology is its real-time connection to the 3D volumetric roadmap, i.e. the CT, of a patient's anatomy while the intervention is performed. EM-based guidance requires tracking the tip of the interventional device, transforming the location of the device onto pre-operative CT images, and superimposing the device in the 3D images to assist physicians to complete the procedure more effectively. A key requirement of this data integration is to automatically find the mapping between the EM and CT coordinate systems. Thus, skin fiducial sensors are attached to patients before acquiring the pre-operative CTs. Those sensors can then be recognized in both CT and EM coordinate systems and used to calculate the transformation matrix. In this paper, to enable the EM-based navigation workflow and reduce procedural preparation time, an automatic fiducial detection method is proposed to obtain the centroids of the sensors from the pre-operative CT. The approach has been applied to 13 rabbit datasets derived from an animal study and eight human images from an observation study. The numerical results show that it is a reliable and efficient method for use in EM-guided applications.
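
    Once fiducial centroids are available in both coordinate systems, the EM-to-CT mapping is a rigid point-set registration. A compact Kabsch/SVD sketch with illustrative coordinates:

    ```python
    # Rigid registration sketch: recover the rotation and translation mapping
    # fiducial centroids from EM-tracker space into CT space. The point sets
    # below are invented (CT = EM rotated 90 deg about z, then shifted).
    import numpy as np

    em = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0],
                   [0.0, 60.0, 0.0], [0.0, 0.0, 40.0]])
    ct = np.array([[100.0, 200.0, 50.0], [100.0, 250.0, 50.0],
                   [40.0, 200.0, 50.0], [100.0, 200.0, 90.0]])

    em_c, ct_c = em - em.mean(0), ct - ct.mean(0)      # centre both sets
    U, _, Vt = np.linalg.svd(em_c.T @ ct_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # no reflection
    Rot = (U @ D @ Vt).T                   # rotation EM -> CT
    t = ct.mean(0) - Rot @ em.mean(0)      # translation EM -> CT
    residual = np.linalg.norm(ct - (em @ Rot.T + t), axis=1)
    print("mean fiducial registration error:", residual.mean())
    ```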

  20. Navigation Operations for the Magnetospheric Multiscale Mission

    NASA Technical Reports Server (NTRS)

    Long, Anne; Farahmand, Mitra; Carpenter, Russell

    2015-01-01

    The Magnetospheric Multiscale (MMS) mission employs four identical spinning spacecraft flying in highly elliptical Earth orbits. These spacecraft will fly in a series of tetrahedral formations with separations of less than 10 km. MMS navigation operations use onboard navigation to satisfy the mission's definitive orbit and time determination requirements and, in addition, to minimize operations cost and complexity. The onboard navigation subsystem consists of the Navigator GPS receiver with Goddard Enhanced Onboard Navigation System (GEONS) software, and an Ultra-Stable Oscillator. The four MMS spacecraft are operated from a single Mission Operations Center, which includes a Flight Dynamics Operations Area (FDOA) that supports MMS navigation operations, as well as maneuver planning, conjunction assessment and attitude ground operations. The System Manager component of the FDOA automates routine operations processes. The GEONS Ground Support System component of the FDOA provides the tools needed to support MMS navigation operations. This paper provides an overview of the MMS mission and associated navigation requirements and constraints and discusses MMS navigation operations and the associated MMS ground system components built to support navigation-related operations.

  1. Producing an Infrared Multiwavelength Galactic Plane Atlas Using Montage, Pegasus, and Amazon Web Services

    NASA Astrophysics Data System (ADS)

    Rynge, M.; Juve, G.; Kinney, J.; Good, J.; Berriman, B.; Merrihew, A.; Deelman, E.

    2014-05-01

    In this paper, we describe how to leverage cloud resources to generate large-scale mosaics of the galactic plane in multiple wavelengths. Our goal is to generate a 16-wavelength infrared Atlas of the Galactic Plane at a common spatial sampling of 1 arcsec, processed so that the images appear to have been measured with a single instrument. This will be achieved by using the Montage image mosaic engine to process observations from the 2MASS, GLIMPSE, MIPSGAL, MSX and WISE datasets, over a wavelength range of 1 μm to 24 μm, and by using the Pegasus Workflow Management System for managing the workload. When complete, the Atlas will be made available to the community as a data product. We are generating images that cover ±180° in Galactic longitude and ±20° in Galactic latitude, to the extent permitted by the spatial coverage of each dataset. Each image will be 5°×5° in size (including an overlap of 1° with neighboring tiles), resulting in an atlas of 1,001 images. The final size will be about 50 TB. This paper will focus on the computational challenges, solutions, and lessons learned in producing the Atlas. To manage the computation we are using the Pegasus Workflow Management System, a mature, highly fault-tolerant system now in release 4.2.2 that has found wide applicability across many science disciplines. A scientific workflow describes the dependencies between tasks; in most cases the workflow is described as a directed acyclic graph, where the nodes are tasks and the edges denote the task dependencies. A defining property of a scientific workflow is that it manages data flow between tasks. Applied to the galactic plane project, each 5°×5° mosaic is a Pegasus workflow. Pegasus is used to fetch the source images, execute the image mosaicking steps of Montage, and store the final outputs in a storage system. As these workflows are very I/O intensive, care has to be taken when choosing the infrastructure on which to execute them. In our setup, we chose to use dynamically provisioned compute clusters running on the Amazon Elastic Compute Cloud (EC2). All our instances use the same base image, which is configured to come up as a master node by default. The master node is a central instance from which the workflow can be managed. Additional worker instances are provisioned and configured to accept work assignments from the master node. The system allows for adding/removing workers in an ad hoc fashion, and could be run in large configurations. To date we have performed 245,000 CPU hours of computing and generated 7,029 images totaling 30 TB. With the current setup, the runtime would be 340,000 CPU hours for the whole project. Using spot m2.4xlarge instances, the cost would be approximately $5,950. Using faster AWS instances, such as cc2.8xlarge, could potentially decrease the total CPU hours and further reduce the compute costs. The paper will explore these tradeoffs.
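
    The quoted cost figure can be sanity-checked with simple arithmetic. The 8-vCPU count for m2.4xlarge and the implied spot rate are assumptions inferred from the abstract's totals, not AWS-published prices:

    ```python
    # Back-of-the-envelope check of the quoted compute cost.
    cpu_hours_total = 340_000      # projected CPU hours for the whole project
    vcpus_per_instance = 8         # m2.4xlarge (assumed)
    quoted_cost = 5_950.0          # USD, from the abstract

    instance_hours = cpu_hours_total / vcpus_per_instance
    implied_spot_rate = quoted_cost / instance_hours
    print(f"{instance_hours:,.0f} instance-hours "
          f"-> implied spot rate ${implied_spot_rate:.3f}/hr")
    # 42,500 instance-hours at ~$0.14/hr is consistent with the ~$5,950 quote.
    ```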

  2. Big Data Challenges in Global Seismic 'Adjoint Tomography' (Invited)

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Smith, J.

    2013-12-01

    The challenge of imaging Earth's interior on a global scale is closely linked to the challenge of handling large data sets. The related iterative workflow involves five distinct phases, namely, 1) data gathering and culling, 2) synthetic seismogram calculations, 3) pre-processing (time-series analysis and time-window selection), 4) data assimilation and adjoint calculations, 5) post-processing (pre-conditioning, regularization, model update). In order to implement this workflow on modern high-performance computing systems, a new seismic data format is being developed. The Adaptable Seismic Data Format (ASDF) is designed to replace currently used data formats with a more flexible format that allows for fast parallel I/O. The metadata is divided into abstract categories, such as "source" and "receiver", along with provenance information for complete reproducibility. The structure of ASDF is designed keeping in mind three distinct applications: earthquake seismology, seismic interferometry, and exploration seismology. Existing time-series analysis tool kits, such as SAC and ObsPy, can be easily interfaced with ASDF so that seismologists can use robust, previously developed software packages. ASDF accommodates an automated, efficient workflow for global adjoint tomography. Manually managing the large number of simulations associated with the workflow can rapidly become a burden, especially with increasing numbers of earthquakes and stations. Therefore, it is of importance to investigate the possibility of automating the entire workflow. Scientific Workflow Management Software (SWfMS) allows users to execute workflows almost routinely. SWfMS provides additional advantages. In particular, it is possible to group independent simulations in a single job to fit the available computational resources. They also give a basic level of fault resilience as the workflow can be resumed at the correct state preceding a failure. Some of the best candidates for our particular workflow are Kepler and Swift, and the latter appears to be the most serious candidate for a large-scale workflow on a single supercomputer, remaining sufficiently simple to accommodate further modifications and improvements.
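
    A minimal sketch of the ASDF idea using the pyasdf reference implementation together with ObsPy (both assumed installed; the file name and waveform tag are placeholders):

    ```python
    # Waveforms, metadata, and provenance live in one parallel-I/O-friendly
    # ASDF file; any ObsPy-based tool can read the data straight back out.
    import obspy
    import pyasdf

    ds = pyasdf.ASDFDataSet("example.h5", compression="gzip-3")

    stream = obspy.read()                      # ObsPy's bundled demo waveforms
    ds.add_waveforms(stream, tag="raw_recording")

    for station in ds.waveforms:               # iterate stored stations
        print(station.raw_recording)           # retrieve the tagged Stream
    ```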

  3. Structuring research methods and data with the research object model: genomics workflows as a case study.

    PubMed

    Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco

    2014-01-01

    One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide necessary meta-data for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?", and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428. The Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.

  4. 33 CFR 127.705 - Security systems.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Security systems. 127.705 Section 127.705 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Natural Gas Security § 127.705 Security systems. The operator shall...

  5. 33 CFR 127.705 - Security systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Security systems. 127.705 Section 127.705 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Natural Gas Security § 127.705 Security systems. The operator shall...

  6. 33 CFR 127.705 - Security systems.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Security systems. 127.705 Section 127.705 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Natural Gas Security § 127.705 Security systems. The operator shall...

  7. 33 CFR 127.705 - Security systems.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Security systems. 127.705 Section 127.705 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Natural Gas Security § 127.705 Security systems. The operator shall...

  8. 33 CFR 127.705 - Security systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Security systems. 127.705 Section 127.705 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Waterfront Facilities Handling Liquefied Natural Gas Security § 127.705 Security systems. The operator shall...

  9. A cognitive task analysis of a visual analytic workflow: Exploring molecular interaction networks in systems biology.

    PubMed

    Mirel, Barbara; Eichinger, Felix; Keller, Benjamin J; Kretzler, Matthias

    2011-03-21

    Bioinformatics visualization tools are often not robust enough to support biomedical specialists’ complex exploratory analyses. Tools need to accommodate the workflows that scientists actually perform for specific translational research questions. To understand and model one of these workflows, we conducted a case-based, cognitive task analysis of a biomedical specialist’s exploratory workflow for the question: What functional interactions among gene products of high throughput expression data suggest previously unknown mechanisms of a disease? From our cognitive task analysis four complementary representations of the targeted workflow were developed. They include: usage scenarios, flow diagrams, a cognitive task taxonomy, and a mapping between cognitive tasks and user-centered visualization requirements. The representations capture the flows of cognitive tasks that led a biomedical specialist to inferences critical to hypothesizing. We created representations at levels of detail that could strategically guide visualization development, and we confirmed this by making a trial prototype based on user requirements for a small portion of the workflow. Our results imply that visualizations should make available to scientific users “bundles of features” consonant with the compositional cognitive tasks purposefully enacted at specific points in the workflow. We also highlight certain aspects of visualizations that: (a) need more built-in flexibility; (b) are critical for negotiating meaning; and (c) are necessary for essential metacognitive support.

  10. Towards a Unified Architecture for Data-Intensive Seismology in VERCE

    NASA Astrophysics Data System (ADS)

    Klampanos, I.; Spinuso, A.; Trani, L.; Krause, A.; Garcia, C. R.; Atkinson, M.

    2013-12-01

    Modern seismology involves managing, storing and processing large datasets, typically geographically distributed across organisations. Performing computational experiments using these data generates more data, which in turn have to be managed, further analysed and frequently be made available within or outside the scientific community. As part of the EU-funded project VERCE (http://verce.eu), we research and develop a number of use-cases, interfacing technologies to satisfy the data-intensive requirements of modern seismology. Our solution seeks to support: (1) familiar programming environments to develop and execute experiments, in particular via Python/ObsPy, (2) a unified view of heterogeneous computing resources, public or private, through the adoption of workflows, (3) monitoring the experiments and validating the data products at varying granularities, via a comprehensive provenance system, (4) reproducibility of experiments and consistency in collaboration, via a shared registry of processing units and contextual metadata (computing resources, data, etc.) Here, we provide a brief account of these components and their roles in the proposed architecture. Our design integrates heterogeneous distributed systems, while allowing researchers to retain current practices and control data handling and execution via higher-level abstractions. At the core of our solution lies the workflow language Dispel. While Dispel can be used to express workflows at fine detail, it may also be used as part of meta- or job-submission workflows. User interaction can be provided through a visual editor or through custom applications on top of parameterisable workflows, which is the approach VERCE follows. According to our design, the scientist may use versions of Dispel/workflow processing elements offered by the VERCE library or override them introducing custom scientific code, using ObsPy. This approach has the advantage that, while the scientist uses a familiar tool, the resulting workflow can be executed on a number of underlying stream-processing engines, such as STORM or OGSA-DAI, transparently. While making efficient use of arbitrarily distributed resources and large data-sets is of priority, such processing requires adequate provenance tracking and monitoring. Hiding computation and orchestration details via a workflow system, allows us to embed provenance harvesting where appropriate without impeding the user's regular working patterns. Our provenance model is based on the W3C PROV standard and can provide information of varying granularity regarding execution, systems and data consumption/production. A video demonstrating a prototype provenance exploration tool can be found at http://bit.ly/15t0Fz0. Keeping experimental methodology and results open and accessible, as well as encouraging reproducibility and collaboration, is of central importance to modern science. As our users are expected to be based at different geographical locations, to have access to different computing resources and to employ customised scientific codes, the use of a shared registry of workflow components, implementations, data and computing resources is critical.

  11. Scientific Workflows + Provenance = Better (Meta-)Data Management

    NASA Astrophysics Data System (ADS)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata. DataONE is a federation of member nodes that store data and metadata for discovery and access. By enriching metadata with provenance information, search and reuse of data is enhanced, and the 'social life' of data (being the product of many workflow runs, different people, etc.) is revealed. We are currently prototyping a provenance repository (PBase) to demonstrate what can be achieved with advanced provenance queries. The ProvExplorer and ProPub tools support advanced ad-hoc querying and visualization of provenance as well as customized provenance publications (e.g., to address privacy issues, or to focus provenance on relevant details). In a parallel line of work, we are exploring ways to add provenance support to widely-used scripting platforms (e.g. R and Python) and then expose that information via D-PROV.
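
    The W3C PROV backbone that D-PROV extends can be sketched with the community `prov` Python package (assumed installed); D-PROV's additional workflow-level observables are not modeled here:

    ```python
    # Minimal retrospective-provenance record in W3C PROV terms: a run
    # consumed a raw dataset and generated a derived product under a
    # person's control. All identifiers are illustrative.
    from prov.model import ProvDocument

    doc = ProvDocument()
    doc.add_namespace("ex", "http://example.org/")

    raw = doc.entity("ex:raw_dataset")
    clean = doc.entity("ex:derived_product")
    step = doc.activity("ex:qc_filter_run_42")
    scientist = doc.agent("ex:alice")

    doc.used(step, raw)                       # the run consumed the raw data
    doc.wasGeneratedBy(clean, step)           # ...and produced the derived data
    doc.wasAssociatedWith(step, scientist)    # under this person's control
    doc.wasDerivedFrom(clean, raw)

    print(doc.get_provn())                    # serialize as PROV-N text
    ```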

  12. Objectifying user critique. A means of continuous quality assurance for physician discharge letter composition.

    PubMed

    Oschem, M; Mahler, V; Prokosch, H U

    2011-01-01

    The aim of this study is to objectify user critique rendering it usable for quality assurance. Based on formative and summative evaluation results we strive to promote software improvements; in our case, the physician discharge letter composition process at the Department of Dermatology, University Hospital Erlangen, Germany. We developed a novel six-step approach to objectify user critique: 1) acquisition of user critique using subjectivist methods, 2) creation of a workflow model, 3) definition of hypothesis and indicators, 4) measuring of indicators, 5) analyzing results, 6) optimization of the system regarding both subjectivist and objectivist evaluation results. In particular, we derived indicators and workflows directly from user critique/narratives. The identified indicators were mapped onto workflow activities, creating a link between user critique and the evaluated system. Users criticized a new discharge letter system as "too slow" and "too labor-intensive" in comparison with the previously used system. In a stepwise approach we collected subjective user critique, derived a comprehensive process model including deviations and deduced a set of five indicators for objectivist evaluation: processing time, system-related waiting time, number of mouse clicks, number of keyboard inputs, and throughput time. About 3500 measurements have been performed to compare the workflow-steps of both systems, regarding 20 discharge letters. Although the difference of the mean total processing time between both systems was statistically insignificant (2011.7 s vs. 1971.5 s; p = 0.457), we detected a significant difference in waiting times (101.8 s vs. 37.2 s; p <0.001) and number of user interactions (77 vs. 69; p <0.001) in favor of the old system, thus objectifying user critique. Our six-step approach enables objectification of user critique, resulting in objective values for continuous quality assurance. To our knowledge no previous study in medical informatics mapped user critique onto workflow steps. Subjectivist analysis prompted us to use the indicator system-related waiting time for the objectivist study, which was rarely done before. We consider combining subjectivist and objectivist methods as a key point of our approach. Future work will concentrate on automated measurement of indicators.

  13. Combinatorial materials research applied to the development of new surface coatings VII: An automated system for adhesion testing

    NASA Astrophysics Data System (ADS)

    Chisholm, Bret J.; Webster, Dean C.; Bennett, James C.; Berry, Missy; Christianson, David; Kim, Jongsoo; Mayo, Bret; Gubbins, Nathan

    2007-07-01

    An automated, high-throughput adhesion workflow that enables pseudobarnacle adhesion and coating/substrate adhesion to be measured on coating patches arranged in an array format on 4 × 8 in.² panels was developed. The adhesion workflow consists of the following process steps: (1) application of an adhesive to the coating array; (2) insertion of panels into a clamping device; (3) insertion of aluminum studs into the clamping device and onto coating surfaces, aligned with the adhesive; (4) curing of the adhesive; and (5) automated removal of the aluminum studs. Validation experiments comparing data generated using the automated, high-throughput workflow to data obtained using conventional, manual methods showed that the automated system allows for accurate ranking of relative coating adhesion performance.

  14. Defining Usability Heuristics for Adoption and Efficiency of an Electronic Workflow Document Management System

    ERIC Educational Resources Information Center

    Fuentes, Steven

    2017-01-01

    Usability heuristics have been established for different uses and applications as general guidelines for user interfaces. These can affect the implementation of industry solutions and play a significant role regarding cost reduction and process efficiency. The area of electronic workflow document management (EWDM) solutions, also known as…

  15. Analyzing data flows of WLCG jobs at batch job level

    NASA Astrophysics Data System (ADS)

    Kuehn, Eileen; Fischer, Max; Giffels, Manuel; Jung, Christopher; Petzold, Andreas

    2015-05-01

    With the introduction of federated data access to the workflows of WLCG, it is becoming increasingly important for data centers to understand specific data flows regarding storage element accesses and firewall configurations, as well as the scheduling of the batch jobs themselves. As existing batch system monitoring and related system monitoring tools do not support measurements at batch job level, a new tool has been developed and put into operation at the GridKa Tier 1 center for monitoring continuous data streams and characteristics of WLCG jobs and pilots. Long-term measurements and data collection are in progress. These measurements have already proven useful for analyzing misbehavior and various issues; we therefore aim for an automated, real-time approach to anomaly detection. As a prerequisite, prototypes for standard workflows have to be examined. Based on measurements over several months, different features of HEP jobs are evaluated regarding their effectiveness in data mining approaches to identify these common workflows. The paper introduces the measurement approach and statistics, as well as the general concept and first results of classifying different HEP job workflows derived from the measurements at GridKa.
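
    A hypothetical sketch of the data-mining step described above: per-job features are standardized and clustered so that recurring workflow prototypes (e.g., production jobs versus pilots) separate out. Feature names and values are invented for illustration.

        # Group batch jobs into candidate workflow classes by clustering.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Rows: jobs; columns: bytes read, bytes written, wall time (s),
        # CPU efficiency -- all hypothetical measurements.
        jobs = np.array([
            [5e9, 1e8, 3600, 0.92],
            [4e9, 2e8, 3400, 0.90],
            [1e7, 5e6,  300, 0.15],   # pilot-like job
            [2e7, 4e6,  280, 0.12],
        ])

        X = StandardScaler().fit_transform(jobs)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        print(labels)   # two candidate workflow classes, e.g. [0 0 1 1]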

  16. A recommended workflow methodology in the creation of an educational and training application incorporating a digital reconstruction of the cerebral ventricular system and cerebrospinal fluid circulation to aid anatomical understanding.

    PubMed

    Manson, Amy; Poyade, Matthieu; Rea, Paul

    2015-10-19

    The use of computer-aided learning in education can be advantageous, especially when interactive three-dimensional (3D) models are used to aid learning of complex 3D structures. The anatomy of the ventricular system of the brain is difficult to fully understand as it is seldom seen in 3D, as is the flow of cerebrospinal fluid (CSF). This article outlines a workflow for the creation of an interactive training tool for the cerebral ventricular system, an educationally challenging area of anatomy. The outline is based on the use of widely available computer software packages. Using MR images of the cerebral ventricular system and several widely available commercial and free software packages, the techniques of 3D modelling, texturing, sculpting, image editing and animation were combined into a workflow for the creation of an interactive educational and training tool. This was focussed on cerebral ventricular system anatomy and the flow of cerebrospinal fluid. We have successfully created a robust methodology using key software packages to build an interactive education and training tool. This resulted in an application that details the anatomy of the ventricular system, and the flow of cerebrospinal fluid, using an anatomically accurate 3D model. In addition, our established workflow pattern presented here shows how tutorials, animations and self-assessment tools can be embedded into the training application. The workflow established here for generating educational and training material demonstrating cerebral ventricular anatomy and the flow of cerebrospinal fluid has enormous potential to be adopted into student training in this field. With the digital age advancing rapidly, it could serve as an innovative tool alongside other methodologies for the training of future healthcare practitioners and scientists. This workflow could also be used in the creation of other tools, developed for use not only on desktop and laptop computers but also on smartphones, tablets and fully immersive stereoscopic environments. It could also form the basis on which to build surgical simulations enhanced with haptic interaction.

  17. Integrated Georeferencing of Stereo Image Sequences Captured with a Stereovision Mobile Mapping System - Approaches and Practical Results

    NASA Astrophysics Data System (ADS)

    Eugster, H.; Huber, F.; Nebiker, S.; Gisi, A.

    2012-07-01

    Stereovision-based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies, imagery can be captured at high data rates, resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment which builds the foundation for a wide range of 3d mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3d mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks or image-based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations - in our case of the imaging sensors - normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS-based trajectory with independently estimated positions during prolonged GNSS signal outages, in order to increase the georeferencing accuracy up to the project requirements.

  18. Percutaneous needle placement using laser guidance: a practical solution

    NASA Astrophysics Data System (ADS)

    Xu, Sheng; Kapoor, Ankur; Abi-Jaoudeh, Nadine; Imbesi, Kimberly; Hong, Cheng William; Mazilu, Dumitru; Sharma, Karun; Venkatesan, Aradhana M.; Levy, Elliot; Wood, Bradford J.

    2013-03-01

    In interventional radiology, various navigation technologies have emerged aiming to improve the accuracy of device deployment and potentially the clinical outcomes of minimally invasive procedures. While these technologies' performance has been explored extensively, their impact on daily clinical practice remains undetermined due to the additional cost and complexity, modification of standard devices (e.g. electromagnetic tracking), and different levels of experience among physicians. Taking these factors into consideration, a robotic laser guidance system for percutaneous needle placement is developed. The laser guidance system projects a laser guide line onto the skin entry point of the patient, helping the physician to align the needle with the planned path of the preoperative CT scan. To minimize changes to the standard workflow, the robot is integrated with the CT scanner via optical tracking. As a result, no registration between the robot and CT is needed. The robot can compensate for the motion of the equipment and keep the laser guide line aligned with the biopsy path in real-time. Phantom experiments showed that the guidance system can benefit physicians at different skill levels, while clinical studies showed improved accuracy over conventional freehand needle insertion. The technology is safe, easy to use, and does not involve additional disposable costs. It is our expectation that this technology can be accepted by interventional radiologists for CT guided needle placement procedures.

  19. A Railway Track Geometry Measuring Trolley System Based on Aided INS

    PubMed Central

    Chen, Qijin; Niu, Xiaoji; Zuo, Lili; Zhang, Tisheng; Xiao, Fuqin; Liu, Yi; Liu, Jingnan

    2018-01-01

    Accurate measurement of the railway track geometry is a task of fundamental importance for ensuring track quality in both the construction phase and the regular maintenance stage. Conventional track geometry measuring trolleys (TGMTs), in combination with classical geodetic surveying apparatus such as total stations, cannot meet the requirements of measurement accuracy and surveying efficiency at the same time. Accurate and fast track geometry surveying calls for an innovative surveying method that can measure all or most of the track geometric parameters in a short time without interrupting railway traffic. We provide a novel solution to this problem by integrating an inertial navigation system (INS) with a geodetic surveying apparatus, and design a modular TGMT system based on aided INS, which can be configured for different surveying tasks including precise adjustment of slab track, provision of tamping measurements, measurement of track deformation and irregularities, and determination of the track axis. A TGMT based on aided INS can operate in mobile surveying mode to significantly improve surveying efficiency. Key points in the design of the TGMT's architecture and the data processing concept and workflow are introduced in detail, which should benefit subsequent research and provide a reference for the implementation of this kind of TGMT. The surveying performance of the proposed TGMT with different configurations is assessed in track geometry surveying experiments and actual projects. PMID:29439423

  20. 46 CFR 112.43-7 - Navigating bridge distribution panel.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Navigating bridge distribution panel. 112.43-7 Section... EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Lighting Systems § 112.43-7 Navigating bridge distribution... supplied from a distribution panel on the navigating bridge: (1) Navigation lights not supplied by the...

  1. 33 CFR 62.63 - Recommendations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Recommendations. 62.63 Section 62... UNITED STATES AIDS TO NAVIGATION SYSTEM Public Participation in the Aids to Navigation System § 62.63 Recommendations. (a) The public may recommend changes to existing aids to navigation, request new aids or the...

  2. 46 CFR 112.43-7 - Navigating bridge distribution panel.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Navigating bridge distribution panel. 112.43-7 Section... EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Lighting Systems § 112.43-7 Navigating bridge distribution... supplied from a distribution panel on the navigating bridge: (1) Navigation lights not supplied by the...

  3. NFC Internal: An Indoor Navigation System

    PubMed Central

    Ozdenizci, Busra; Coskun, Vedat; Ok, Kerem

    2015-01-01

    Indoor navigation systems have recently become a popular research field due to the lack of GPS signals indoors. Several indoor navigation systems have already been proposed to eliminate these deficiencies; however, each has technical and usability limitations. In this study, we propose NFC Internal, a Near Field Communication (NFC)-based indoor navigation system, which enables users to navigate through a building or complex via simple location updates: by touching NFC tags that are spread around the building, users are oriented toward their destination. In this paper, we initially present the system requirements, give the design details and study the viability of NFC Internal with a prototype application and a case study. Moreover, we evaluate the performance of the system and compare it with existing indoor navigation systems. NFC Internal is seen to offer considerable advantages over, and significant contributions to, existing indoor navigation systems in terms of security and privacy, cost, performance, robustness, complexity, user preference and commercial availability. PMID:25825976
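
    The core mechanism lends itself to a small illustration (not the authors' implementation): each NFC tag maps to a node in a walkway graph, a tag touch fixes the user's position, and a shortest-path search yields the route to the destination. The building graph below is invented.

        # Tag-touch location update followed by Dijkstra routing.
        import heapq

        graph = {   # hypothetical walkway graph: node -> {neighbor: metres}
            "entrance": {"lobby": 10},
            "lobby": {"entrance": 10, "elevator": 15, "cafe": 20},
            "elevator": {"lobby": 15, "room_204": 25},
            "cafe": {"lobby": 20},
            "room_204": {"elevator": 25},
        }

        def shortest_path(start, goal):
            queue, seen = [(0, start, [start])], set()
            while queue:
                dist, node, path = heapq.heappop(queue)
                if node == goal:
                    return dist, path
                if node in seen:
                    continue
                seen.add(node)
                for nxt, w in graph[node].items():
                    heapq.heappush(queue, (dist + w, nxt, path + [nxt]))
            return None

        current = "lobby"   # location update from the touched NFC tag
        print(shortest_path(current, "room_204"))
        # -> (40, ['lobby', 'elevator', 'room_204'])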

  4. A Self-Tuning Kalman Filter for Autonomous Navigation using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.

    1999-01-01

    Most navigation systems currently operated by NASA are ground-based and require extensive support to produce accurate results. Recently developed systems that use a Kalman filter and GPS data for orbit determination greatly reduce dependency on ground support and have the potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning by analysts. A sophisticated neuro-fuzzy component, fully integrated with the flight navigation system, can provide self-tuning of the Kalman filter and help the navigation system recover from estimation errors in real time.
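
    The self-tuning idea can be sketched in one dimension: the filter monitors its innovation sequence and re-scales the process-noise variance when the observed innovation spread disagrees with the predicted one. The paper uses a neuro-fuzzy component for this step; the moving-window heuristic below is a simple stand-in, and all values are simulated.

        # 1-D Kalman filter with innovation-based retuning of q.
        import numpy as np

        rng = np.random.default_rng(1)
        q, r = 0.01, 1.0        # process / measurement noise variances
        x, p = 0.0, 1.0         # state estimate and its variance
        innovations = []

        for k in range(200):
            z = np.sin(0.05 * k) + rng.normal(scale=np.sqrt(r))  # measurement
            p = p + q                      # predict (identity dynamics)
            s = p + r                      # innovation variance
            nu = z - x                     # innovation
            kgain = p / s
            x += kgain * nu                # update
            p *= (1 - kgain)
            innovations.append(nu)
            # Self-tuning: inflate/deflate q when recent innovations are
            # inconsistent with s (crude stand-in for the neuro-fuzzy tuner).
            if len(innovations) >= 20:
                ratio = np.var(innovations[-20:]) / s
                q = float(np.clip(q * ratio, 1e-4, 1.0))

        print(f"final estimate {x:.3f}, tuned q {q:.4f}")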

  5. Relative Navigation of Formation Flying Satellites

    NASA Technical Reports Server (NTRS)

    Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, Russell; Gramling, Cheryl; Bauer, Frank (Technical Monitor)

    2002-01-01

    The Guidance, Navigation, and Control Center (GNCC) at Goddard Space Flight Center (GSFC) has successfully developed high-accuracy autonomous satellite navigation systems using the National Aeronautics and Space Administration's (NASA's) space and ground communications systems and the Global Positioning System (GPS). In addition, an autonomous navigation system that uses celestial object sensor measurements is currently under development and has been successfully tested using real Sun and Earth horizon measurements. The GNCC has developed advanced spacecraft systems that provide autonomous navigation and control of formation flyers in near-Earth, high-Earth, and libration point orbits. To support this effort, the GNCC is assessing the relative navigation accuracy achievable for proposed formations using GPS, intersatellite crosslink, ground-to-satellite Doppler, and celestial object sensor measurements. This paper evaluates the performance of these relative navigation approaches for three proposed missions with two or more vehicles maintaining relatively tight formations. High-fidelity simulations were performed to quantify the absolute and relative navigation accuracy as a function of navigation algorithm and measurement type. Realistically simulated measurements were processed using the extended Kalman filter implemented in the GPS-Enhanced Onboard Navigation System (GEONS) flight software developed by the GSFC GNCC. Solutions obtained by simultaneously estimating all satellites in the formation were compared with the results obtained using a simpler approach based on differencing independently estimated state vectors.
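
    The difference between the two strategies compared in the last sentence can be made concrete with a toy example: differencing independently estimated states gives the relative state directly, but without the cross-covariance the relative uncertainty must be approximated by the sum of the individual covariances. All numbers are invented.

        # Relative state by differencing independent estimates.
        import numpy as np

        x1 = np.array([7000.0, 0.10])   # sat 1: position (km), velocity (km/s)
        x2 = np.array([7001.2, 0.10])   # sat 2
        P1 = np.diag([0.02**2, 1e-8])
        P2 = np.diag([0.02**2, 1e-8])

        rel_state = x2 - x1
        rel_cov = P1 + P2   # simultaneous estimation would also subtract
                            # the (here unknown) cross-covariance terms
        print(rel_state, np.sqrt(np.diag(rel_cov)))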

  6. 33 CFR 164.43 - Automatic Identification System Shipborne Equipment-Prince William Sound.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Automatic Identification System Shipborne Equipment-Prince William Sound. 164.43 Section 164.43 Navigation and Navigable Waters COAST GUARD... Automatic Identification System Shipborne Equipment—Prince William Sound. (a) Until December 31, 2004, each...

  7. 33 CFR 164.43 - Automatic Identification System Shipborne Equipment-Prince William Sound.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Automatic Identification System Shipborne Equipment-Prince William Sound. 164.43 Section 164.43 Navigation and Navigable Waters COAST GUARD... Automatic Identification System Shipborne Equipment—Prince William Sound. (a) Until December 31, 2004, each...

  8. 33 CFR 164.43 - Automatic Identification System Shipborne Equipment-Prince William Sound.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Automatic Identification System Shipborne Equipment-Prince William Sound. 164.43 Section 164.43 Navigation and Navigable Waters COAST GUARD... Automatic Identification System Shipborne Equipment—Prince William Sound. (a) Until December 31, 2004, each...

  9. Compensation of Horizontal Gravity Disturbances for High Precision Inertial Navigation

    PubMed Central

    Cao, Juliang; Wu, Meiping; Lian, Junxiang; Cai, Shaokun; Wang, Lin

    2018-01-01

    Horizontal gravity disturbances are an important factor affecting the accuracy of inertial navigation systems in long-duration ship navigation. In this paper, from the perspective of the coordinate system and vector calculation, the effects of horizontal gravity disturbances on the initial alignment and the navigation calculation are analyzed simultaneously. Horizontal gravity disturbances cause the navigation coordinate frame established during initial alignment to be inconsistent with the navigation coordinate frame in which the navigation calculation is implemented. This mismatch of coordinate frames violates the rules of vector calculation, which has an adverse effect on the precision of the inertial navigation system. To address this issue, two compensation methods suitable for two different navigation coordinate frames are proposed; one implements the compensation in the velocity calculation, and the other in the attitude calculation. Finally, simulations and ship navigation experiments confirm the effectiveness of the proposed methods. PMID:29562653
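
    Where the disturbance enters can be shown with a single velocity-update step of the INS mechanization; the paper's actual compensation schemes in the velocity and attitude calculations are not reproduced, and all values below are hypothetical.

        # One velocity-integration step with and without compensation.
        import numpy as np

        dt = 0.01                                   # s
        C_bn = np.eye(3)                            # body-to-navigation DCM
        f_b = np.array([0.0, 0.0, 9.80665])         # specific force (m/s^2)
        g_n = np.array([0.0, 0.0, -9.80665])        # normal gravity, nav frame
        dg_n = np.array([3e-4, -2e-4, 0.0])         # horizontal disturbance

        v = np.zeros(3)
        v_uncomp = v + (C_bn @ f_b + g_n) * dt          # disturbance ignored
        v_comp = v + (C_bn @ f_b + g_n + dg_n) * dt     # compensated update
        print(v_comp - v_uncomp)    # per-step velocity error if ignored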

  10. Information Engineering and Workflow Design in a Clinical Decision Support System for Colorectal Cancer Screening in Iran.

    PubMed

    Maserat, Elham; Seied Farajollah, Seiede Sedigheh; Safdari, Reza; Ghazisaeedi, Marjan; Aghdaei, Hamid Asadzadeh; Zali, Mohammad Reza

    2015-01-01

    Colorectal cancer is a major cause of morbidity and mortality throughout the world. Colorectal cancer screening is an optimal way of reducing morbidity and mortality, and a clinical decision support system (CDSS) plays an important role in predicting the success of screening processes. A DSS is a computer-based information system that improves the delivery of preventive care services. The aim of this article was to detail the engineering of the information requirements and the workflow design of a CDSS for a colorectal cancer screening program. In the first stage, a screening minimum data set was determined; screening programs in developed and developing countries were analyzed to identify this data set, and information deficiencies and gaps were then determined by checklist. The second stage was a qualitative survey with a semi-structured interview as the study tool, covering the perspectives of 15 users and stakeholders on the CDSS workflow. Finally, the workflow of the DSS for the control program was designed from standard clinical practice guidelines and these perspectives. The screening minimum data set of the national colorectal cancer screening program was defined in five sections: colonoscopy, surgery, pathology, genetics and pedigree data sets. Deficiencies and information gaps were analyzed, and a standard screening work process was then designed. Finally, the DSS workflow and the entry stage were determined. A CDSS facilitates complex decision making for screening and plays a key role in designing optimal interactions between the colonoscopy, pathology and laboratory departments. Workflow analysis is also useful for identifying data reconciliation strategies to address documentation gaps. Following the recommendations of the CDSS should improve the quality of colorectal cancer screening.
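
    The "screening minimum data set" idea can be sketched as typed record sections that a CDSS could validate at data entry; the field names below are illustrative placeholders, not the study's actual data elements.

        # Hypothetical minimum-data-set sections for screening records.
        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class ColonoscopyRecord:
            patient_id: str
            exam_date: str
            cecal_intubation: bool
            polyps_found: int

        @dataclass
        class PathologyRecord:
            patient_id: str
            specimen_id: str
            histology: str
            dysplasia_grade: Optional[str] = None

        @dataclass
        class ScreeningMinimumDataSet:
            colonoscopy: ColonoscopyRecord
            pathology: List[PathologyRecord] = field(default_factory=list)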

  11. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    PubMed Central

    Mathew, Cherian; Obst, Matthias; Vicario, Saverio; Haines, Robert; Williams, Alan R.; de Jong, Yde; Goble, Carole

    2014-01-01

    Abstract The compilation and cleaning of data needed for analyses and prediction of species distributions is a time consuming process requiring a solid understanding of data formats and service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of underlying service infrastructures. The workflow can be freely used both locally and through a web-portal which does not require additional software installations by users. PMID:25535486

  12. Routine Digital Pathology Workflow: The Catania Experience

    PubMed Central

    Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana

    2017-01-01

    Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized their sticking to the scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory. PMID:29416914

  13. Routine Digital Pathology Workflow: The Catania Experience.

    PubMed

    Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana

    2017-01-01

    Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized their sticking to the scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory.

  14. SigWin-detector: a Grid-enabled workflow for discovering enriched windows of genomic features related to DNA sequences.

    PubMed

    Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M

    2008-08-08

    Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation are therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept of integrative bioinformatics experimentation.
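
    The underlying detection idea can be sketched conceptually: slide a fixed-size window along a sequence of values and flag windows whose summary statistic exceeds a threshold. SigWin-detector's analytical null-distribution computation and optimized moving-median algorithm are not reproduced here; the data are invented.

        # Naive window-based enrichment detection with a moving median.
        import numpy as np

        def enriched_windows(values, win=5, threshold=1.0):
            """Return start indices of windows whose median > threshold."""
            return [s for s in range(len(values) - win + 1)
                    if np.median(values[s:s + win]) > threshold]

        expression = np.array([0.2, 0.3, 0.1, 1.5, 1.8, 2.0, 1.7, 1.6, 0.4, 0.1])
        print(enriched_windows(expression))   # candidate RIDGE-like region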

  15. Interplay between Clinical Guidelines and Organizational Workflow Systems. Experience from the MobiGuide Project.

    PubMed

    Shabo, Amnon; Peleg, Mor; Parimbelli, Enea; Quaglini, Silvana; Napolitano, Carlo

    2016-12-07

    Implementing a decision-support system within a healthcare organization requires integration of clinical domain knowledge with resource constraints. Computer-interpretable guidelines (CIGs) are excellent instruments for addressing clinical aspects, while business process management (BPM) languages and workflow (Wf) engines manage the logistical organizational constraints. Our objective is the orchestration of all the relevant factors needed for successful execution of a patient's care pathways, especially when spanning the continuum of care, from acute to community or home care. We considered three strategies for integrating CIGs with organizational workflows: extending the CIG or BPM languages and their engines, or creating an interplay between them. We used the interplay approach to implement a set of use cases arising from a CIG implementation in the domain of atrial fibrillation. To provide a more scalable and standards-based solution, we explored the use of the Cross-Enterprise Document Workflow Integration Profile. We describe our proof-of-concept implementation of five use cases. We utilized the Personal Health Record of the MobiGuide project to implement a loosely coupled approach between the Activiti BPM engine and the Picard CIG engine. Changes in the PHR were detected by polling. IHE profiles were used to develop workflow documents that orchestrate cross-enterprise execution of cardioversion. Interplay between CIG and BPM engines can support orchestration of care flows within organizational settings.
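
    The loose coupling via polling can be reduced to a few lines; the endpoint, fields, and interval below are hypothetical, not the project's actual integration code.

        # Poll a PHR resource and react to version changes.
        import time
        import requests

        last_version = None
        while True:
            doc = requests.get("https://phr.example.org/patients/123/state").json()
            if doc.get("version") != last_version:
                last_version = doc.get("version")
                print("PHR changed; notify BPM/CIG interplay layer:", doc)
            time.sleep(30)   # polling interval (s)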

  16. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    PubMed

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
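
    The top-down principle reduces to a simple allocation: all of a subarea's costs, including labor and overhead, are spread over its slide volume. The figures in this sketch are invented and are not the study's results.

        # Fully loaded cost per slide for a laboratory subarea.
        def cost_per_slide(labor, overhead, consumables, instruments, n_slides):
            return (labor + overhead + consumables + instruments) / n_slides

        # Hypothetical annual figures for a histology subarea:
        print(cost_per_slide(labor=420_000, overhead=180_000,
                             consumables=95_000, instruments=60_000,
                             n_slides=150_000))   # ~5.03 per slide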

  17. 33 CFR 62.51 - Western Rivers Marking System.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.51 Western Rivers... toward the Gulf of Mexico. (b) The Western Rivers System varies from the standard U.S. system as follows...

  18. 33 CFR 62.51 - Western Rivers Marking System.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.51 Western Rivers... toward the Gulf of Mexico. (b) The Western Rivers System varies from the standard U.S. system as follows...

  19. Computer-assisted navigation in orthopedic surgery.

    PubMed

    Mavrogenis, Andreas F; Savvidou, Olga D; Mimidis, George; Papanastasiou, John; Koulalis, Dimitrios; Demertzis, Nikolaos; Papagelopoulos, Panayiotis J

    2013-08-01

    Computer-assisted navigation has a role in some orthopedic procedures. It allows the surgeons to obtain real-time feedback and offers the potential to decrease intra-operative errors and optimize the surgical result. Computer-assisted navigation systems can be active or passive. Active navigation systems can either perform surgical tasks or prohibit the surgeon from moving past a predefined zone. Passive navigation systems provide intraoperative information, which is displayed on a monitor, but the surgeon is free to make any decisions he or she deems necessary. This article reviews the available types of computer-assisted navigation, summarizes the clinical applications and reviews the results of related series using navigation, and informs surgeons of the disadvantages and pitfalls of computer-assisted navigation in orthopedic surgery. Copyright 2013, SLACK Incorporated.

  20. NASA tracking ship navigation systems

    NASA Technical Reports Server (NTRS)

    Mckenna, J. J.

    1976-01-01

    The ship position and attitude measurement system that was installed aboard the tracking ship Vanguard is described. An overview of the entire system is given along with a description of how precise time and frequency is utilized. The instrumentation is broken down into its basic components. Particular emphasis is given to the inertial navigation system. Each navigation system used, a mariner star tracker, navigation satellite system, Loran C and OMEGA in conjunction with the inertial system is described. The accuracy of each system is compared along with their limitations.

  1. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    NASA Astrophysics Data System (ADS)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations through the orchestration of semi-structured analysis pipelines, which involves the execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case: a workflow which requires the execution of a continuum ice flow model and a discrete-element-based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of the ice flow and calving models through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks, including the data management and transfers of the workflow model, becomes complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. As more scripts and modifications were introduced in response to user requirements, debugging and validation of results became increasingly cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all the above-mentioned processes can be achieved in an efficient and usable manner. We decided to use the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for the coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high-level scientific workflow middleware makes the reproducibility of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements: This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
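
    The control flow of the coupling can be sketched schematically; in the real workflow each step is an Elmer/Ice or HiDEM job managed by UNICORE on HPC resources, so the functions below are placeholder stand-ins that only expose the iteration structure.

        # Iterative continuum/discrete coupling loop (placeholders only).
        def run_continuum_ice_flow(geometry):       # Elmer/Ice stand-in
            return {"velocity_field": "flow", "geometry": geometry}

        def convert_to_particles(flow):             # format-conversion step
            return "particles"

        def run_discrete_calving(particles):        # HiDEM stand-in
            return {"calved_front": "front"}

        def update_geometry(geometry, calving):     # feed calved front back
            return geometry + 1                     # toy state change

        geometry = 0
        for cycle in range(5):                      # coupling iterations
            flow = run_continuum_ice_flow(geometry)
            particles = convert_to_particles(flow)
            calving = run_discrete_calving(particles)
            geometry = update_geometry(geometry, calving)
        print("completed 5 coupling cycles, state:", geometry)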

  2. An Agent-Based Model for Navigation Simulation in a Heterogeneous Environment

    ERIC Educational Resources Information Center

    Shanklin, Teresa A.

    2012-01-01

    Complex navigation (e.g. indoor and outdoor environments) can be studied as a system-of-systems problem. The model is made up of disparate systems that can aid a user in navigating from one location to another, utilizing whatever sensor system or information is available. By using intelligent navigation sensors and techniques (e.g. RFID, Wifi,…

  3. 33 CFR 62.47 - Sound signals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Sound signals. 62.47 Section 62... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.47 Sound signals. (a) Often sound signals are located on or adjacent to aids to navigation. When visual signals are obscured...

  4. 33 CFR 62.43 - Numbers and letters.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Section 62.43 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.43 Numbers and letters. (a) All solid red and solid green aids are numbered, with red aids bearing even numbers and green...

  5. 33 CFR 62.43 - Numbers and letters.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Section 62.43 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.43 Numbers and letters. (a) All solid red and solid green aids are numbered, with red aids bearing even numbers and green...

  6. 33 CFR 62.43 - Numbers and letters.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Section 62.43 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.43 Numbers and letters. (a) All solid red and solid green aids are numbered, with red aids bearing even numbers and green...

  7. 33 CFR 62.43 - Numbers and letters.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Section 62.43 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY AIDS TO NAVIGATION UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.43 Numbers and letters. (a) All solid red and solid green aids are numbered, with red aids bearing even numbers and green...

  8. 33 CFR 62.47 - Sound signals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Sound signals. 62.47 Section 62... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.47 Sound signals. (a) Often sound signals are located on or adjacent to aids to navigation. When visual signals are obscured...

  9. 33 CFR 62.47 - Sound signals.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Sound signals. 62.47 Section 62... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.47 Sound signals. (a) Often sound signals are located on or adjacent to aids to navigation. When visual signals are obscured...

  10. 33 CFR 62.47 - Sound signals.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Sound signals. 62.47 Section 62... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.47 Sound signals. (a) Often sound signals are located on or adjacent to aids to navigation. When visual signals are obscured...

  11. 33 CFR 62.47 - Sound signals.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Sound signals. 62.47 Section 62... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.47 Sound signals. (a) Often sound signals are located on or adjacent to aids to navigation. When visual signals are obscured...

  12. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Michelle M.; Wu, Chase Q.

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.
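
    Advisor logic of this general kind can be illustrated with a toy rule set; these rules are invented for illustration and are not WINDMA's actual decision procedure.

        # Toy transfer-method advisor based on data size and resources.
        def advise(size_gb, circuit_available):
            if circuit_available and size_gb > 100:
                return "provision dedicated circuit (e.g. OSCARS) + striped GridFTP"
            if size_gb > 1:
                return "GridFTP/GlobusOnline with parallel streams"
            return "plain HTTPS transfer"

        print(advise(500, circuit_available=True))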

  13. A strategy for systemic toxicity assessment based on non-animal approaches: The Cosmetics Europe Long Range Science Strategy programme.

    PubMed

    Desprez, Bertrand; Dent, Matt; Keller, Detlef; Klaric, Martina; Ouédraogo, Gladys; Cubberley, Richard; Duplan, Hélène; Eilstein, Joan; Ellison, Corie; Grégoire, Sébastien; Hewitt, Nicola J; Jacques-Jamin, Carine; Lange, Daniela; Roe, Amy; Rothe, Helga; Blaauboer, Bas J; Schepky, Andreas; Mahony, Catherine

    2018-08-01

    When performing safety assessment of chemicals, the evaluation of their systemic toxicity based only on non-animal approaches is a challenging objective. The Safety Evaluation Ultimately Replacing Animal Test programme (SEURAT-1) addressed this question from 2011 to 2015 and showed that further research and development of adequate tools in toxicokinetic and toxicodynamic are required for performing non-animal safety assessments. It also showed how to implement tools like thresholds of toxicological concern (TTCs) and read-across in this context. This paper shows a tiered scientific workflow and how each tier addresses the four steps of the risk assessment paradigm. Cosmetics Europe established its Long Range Science Strategy (LRSS) programme, running from 2016 to 2020, based on the outcomes of SEURAT-1 to implement this workflow. Dedicated specific projects address each step of this workflow, which is introduced here. It tackles the question of evaluating the internal dose when systemic exposure happens. The applicability of the workflow will be shown through a series of case studies, which will be published separately. Even if the LRSS puts the emphasis on safety assessment of cosmetic relevant chemicals, it remains applicable to any type of chemical. Copyright © 2018. Published by Elsevier Ltd.

  14. Synergies in Astrometry: Predicting Navigational Error of Visual Binary Stars

    NASA Astrophysics Data System (ADS)

    Gessner Stewart, Susan

    2015-08-01

    Celestial navigation can employ a number of bright stars that are in binary systems. Often these are unresolved, appearing as a single, center-of-light object. A number of these are, however, wide systems, which could introduce error into the navigation solution if not handled properly. To illustrate the importance of good orbital solutions for binary systems - as well as good astrometry in general - the relationship between the center-of-light versus individual catalog positions of celestial bodies and the error in terrestrial position derived via celestial navigation is demonstrated. From the list of navigational binary stars, fourteen binary systems with at least 3.0 arcseconds apparent separation are explored. Maximum navigational error is estimated under the assumption that the bright star in the pair is observed at maximum separation but the center-of-light is employed in the navigational solution. The relationships between navigational error and separation, orbital period, and observer's latitude are discussed.
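
    The scale of the effect follows from the classic rule that one arcminute of altitude error corresponds to roughly one nautical mile of position error, so a few arcseconds of offset between the observed star and its catalogued center-of-light position translate to tens of metres. A quick conversion:

        # Convert an angular offset (arcsec) to terrestrial position error (m).
        def position_error_m(offset_arcsec):
            nautical_mile_m = 1852.0
            return (offset_arcsec / 60.0) * nautical_mile_m

        print(position_error_m(3.0))   # ~92.6 m for a 3-arcsecond offset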

  15. CoSEC: Connecting Living With a Star Research

    NASA Astrophysics Data System (ADS)

    Hurlburt, N.; Freeland, S.; Bose, P.; Zimdars, A.; Slater, G.

    2006-12-01

    The Collaborative Sun-Earth Connector (CoSEC) provides the means for heliophysics researchers to compose the data sources and processing services published by their peers into processing workflows that reliably generate publication-worthy data. It includes: composition of computational and data services into easy-to-read workflows with data quality and version traceability; straightforward translation of existing services into workflow components, and advertisement of those components to other members of the CoSEC community; annotation of published services with functional attributes to enable discovery of capabilities required by particular workflows and to identify peer subgroups in the CoSEC community; and annotation of published services with nonfunctional attributes to enable selection on the basis of quality of service (QoS). We present an overview and demonstration of the CoSEC system, and discuss applications, lessons learned and future developments.

  16. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    PubMed

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks reoccur frequently, e.g., geometry optimizations and benchmarking series. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex, monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, and sharing them is almost impossible. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.

  17. Performing statistical analyses on quantitative data in Taverna workflows: an example using R and maxdBrowse to identify differentially-expressed genes from microarray data.

    PubMed

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-08-07

    There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data.
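
    The statistical core of such a workflow (per-gene tests followed by multiple-testing correction) can be sketched directly in Python rather than through Taverna's RShell; the data here are simulated, and the Benjamini-Hochberg step is one common choice rather than the workflow's prescribed method.

        # Per-gene t-tests with Benjamini-Hochberg FDR control.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_genes = 1000
        control = rng.normal(size=(n_genes, 5))
        treated = rng.normal(size=(n_genes, 5))
        treated[:50] += 2.0                     # 50 truly changed genes

        _, p = stats.ttest_ind(treated, control, axis=1)

        order = np.argsort(p)
        crit = p[order] * n_genes / (np.arange(n_genes) + 1)
        idx = np.nonzero(crit <= 0.05)[0]
        n_called = int(idx.max()) + 1 if idx.size else 0
        print(f"{n_called} genes called differentially expressed")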

  18. Performing statistical analyses on quantitative data in Taverna workflows: An example using R and maxdBrowse to identify differentially-expressed genes from microarray data

    PubMed Central

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-01-01

    Background There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Results Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Conclusion Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data. PMID:18687127

  19. Novel cemented cup-holding technique while performing total hip arthroplasty with navigation system.

    PubMed

    Takai, Hirokazu; Takahashi, Tomoki

    2017-09-01

    Recently, navigation systems have been more widely utilized in total hip arthroplasty. However, almost all of these systems have been developed for cementless cups. In the case of cemented total hip arthroplasty using a navigation system, a special-ordered cemented holder is needed. We propose a novel cemented cup-holding technique for navigation systems using readily available articles. We combine a cementless cup holder with an inverted cementless trial cup. The resulting apparatus is used as a cemented cup holder. The upside-down cup-holding technique is useful and permits cemented cup users to utilize a navigation system for placement of the acetabular component.

  20. MetaNET--a web-accessible interactive platform for biological metabolic network analysis.

    PubMed

    Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael

    2014-01-01

    Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base for implementing models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted, as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open-source, user-friendly, platform-independent and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET is a web-accessible platform that incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include: (i) optimization of an objective function for the wild-type strain and gene/catalyst/reaction knock-out/knock-down analysis using flux balance analysis; (ii) flux variability analysis; (iii) chemical species participation; (iv) identification of cycles and extreme paths; and (v) choke-point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow system and the Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net, provides a user-friendly, rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, as well as the ability to run different tools simultaneously using pre-defined and user-created custom workflows.
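
    The flux balance analysis underlying function (i) is a linear program: maximize an objective flux subject to the steady-state mass balance S v = 0 and flux bounds. A toy three-reaction example, assuming nothing about MetaNET's internals:

        # Toy FBA: A_ext -> A -> B -> biomass, uptake capped at 10.
        import numpy as np
        from scipy.optimize import linprog

        #        v1 (uptake)  v2 (A->B)  v3 (biomass)
        S = np.array([[1.0, -1.0,  0.0],     # metabolite A balance
                      [0.0,  1.0, -1.0]])    # metabolite B balance
        bounds = [(0, 10), (0, None), (0, None)]
        c = np.array([0.0, 0.0, -1.0])       # maximize v3 == minimize -v3

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print(res.x)    # optimal fluxes: [10. 10. 10.]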

  1. Evaluation of Standardization of Transfer of Accountability between Inpatient Pharmacists.

    PubMed

    Tsoi, Vivian; Dewhurst, Norman; Tom, Elaine

    2018-01-01

    A compelling body of evidence supports the notion that transfer of accountability (TOA) improves communication, continuity of care, and patient safety. TOA involves the transmission and receipt of information between clinicians at each transition of care. Without a notification system alerting pharmacists to patient transfers, pharmacists' ability to seek out and complete TOA may be hindered. A standardized policy and process for TOA, with automated workflow, was implemented at the study hospital in 2015, to ensure consistency and timeliness of documentation by pharmacists. To evaluate pharmacists' adherence to and satisfaction with the TOA policy and process. A retrospective audit was conducted, using a random sample of individuals who were inpatients between June 2014 and February 2016. Transition points for TOA were identified, and the computerized pharmacy system was reviewed to determine whether TOA had been documented at each transition point. After the audit, an online survey was distributed to assess pharmacists' response to and satisfaction with the TOA policy and workflow. Before the TOA workflow was implemented, TOA documentation by pharmacists ranged from 11% (10/93) to 43% (48/111) of transitions. Eight months after implementation of the workflow, the rate of TOA documentation was 87% (68/78), exceeding the institution's target of 70%. Of the 32 pharmacists surveyed, most were satisfied with the TOA policy and agreed that the standardized workflow was simple to use, increased the number of TOAs provided and received, and improved the quality of completed TOAs. Respondents also indicated that the TOA workflow had improved patient care (mean score 4.09/5, standard deviation 0.64). The standardized TOA policy and process were well received by pharmacists, and resulted in consistent TOA documentation and a TOA documentation rate that exceeded the institutional target.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheu, R; Ghafar, R; Powers, A

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of an initial physics chart check can be determined by screening all patients on treatment who have received their first fraction but have not yet had their first chart check. Monitoring such “real” events with the in-house software creates a safety net, as its propagation does not rely on individual user input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 warning/>5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, giving the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated, and all physics chart checks were completed on time. Prompt notification of treatment record inconsistencies and machine overrides has decreased the time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks, providing a safer and nearly error-free working environment.
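    The safety-net screen described above can be sketched as a single query. The schema below (patients, treatments, and chart_checks tables and the oncology_db data source) is hypothetical, invented for illustration; it is not Mosaiq's actual schema, and a real deployment would target the vendor's tables through the same ODBC route.

    ```python
    # Find on-treatment patients who have received their first fraction but
    # have no initial physics chart check on record (hypothetical schema).
    import pyodbc

    SQL = """
    SELECT p.patient_id, p.name, MIN(t.treated_at) AS first_fraction
    FROM patients p
    JOIN treatments t ON t.patient_id = p.patient_id
    LEFT JOIN chart_checks c
           ON c.patient_id = p.patient_id AND c.kind = 'initial'
    WHERE c.patient_id IS NULL
    GROUP BY p.patient_id, p.name
    """

    conn = pyodbc.connect("DSN=oncology_db")   # hypothetical ODBC data source
    for patient_id, name, first_fraction in conn.execute(SQL):
        print(f"initial chart check pending: {name} ({patient_id}), "
              f"first fraction {first_fraction}")
    ```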

  3. "torino 1911" Project: a Contribution of a Slam-Based Survey to Extensive 3d Heritage Modeling

    NASA Astrophysics Data System (ADS)

    Chiabrando, F.; Della Coletta, C.; Sammartano, G.; Spanò, A.; Spreafico, A.

    2018-05-01

    In the framework of the digital documentation of complex environments, advanced geomatics research offers integrated solutions and multi-sensor strategies for the accurate 3D reconstruction of stratified structures and articulated volumes in the heritage domain. The use of handheld devices for rapid mapping, both image- and range-based, can support the production of easy-to-use, easy-to-navigate 3D models for documentation projects. This kind of reality-based modelling, with its tailored integration of geometric and radiometric aspects, can support valorisation and communication projects including virtual reconstructions, interactive navigation settings, and immersive reality for dissemination purposes, evoking past places and atmospheres. This research is set within the "Torino 1911" project, led by the University of San Diego (California) in cooperation with PoliTo. The project is conceived as a multi-scale reconstruction of the real and no-longer-existing structures across the park's more than 400,000 m2, for a virtual and immersive visualization of the Turin 1911 International "Fabulous Exposition" held in the Valentino Park. In the research presented here, a 3D metric documentation workflow is proposed and validated that exploits LiDAR mapping with a handheld SLAM-based device, the ZEB REVO Real Time instrument by GeoSLAM (2017 release), in place of consolidated TLS systems. Starting from these kinds of models, the crucial aspects of trajectory performance in the 3D reconstruction and of the radiometric content from imaging approaches are considered, specifically through the compared use of common DSLR cameras and portable sensors.

  4. A Strapdown Inertial Navigation System/Beidou/Doppler Velocity Log Integrated Navigation Algorithm Based on a Cubature Kalman Filter

    PubMed Central

    Gao, Wei; Zhang, Ya; Wang, Jianguo

    2014-01-01

    The integrated navigation system combining a strapdown inertial navigation system (SINS), a Beidou (BD) receiver and a Doppler velocity log (DVL) can be used in marine applications, since the redundant and complementary information from the different sensors can markedly improve system accuracy. However, multi-sensor asynchrony introduces errors into the system. To deal with this problem, the sampling interval is conventionally subdivided, which increases the computational complexity. In this paper, an integrated navigation algorithm based on a cubature Kalman filter (CKF) is proposed instead. A nonlinear system model and observation model for the SINS/BD/DVL integrated system are established to describe the system more accurately. By taking multi-sensor asynchrony into account, a new sampling principle is proposed to make the best use of each sensor's information, and the CKF is introduced to improve filtering accuracy. The performance of the new algorithm has been examined through numerical simulations. The results show that the positional error can be effectively reduced with the new integrated navigation algorithm. Compared with the traditional algorithm based on the EKF, the accuracy of the SINS/BD/DVL integrated navigation system is improved, making the proposed nonlinear integrated navigation algorithm feasible and efficient. PMID:24434842
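    The core of the CKF is the spherical-radial cubature rule: 2n equally weighted points generated from the Cholesky factor of the covariance, propagated through the nonlinear model, and re-averaged. A minimal Python sketch of the prediction step follows; the toy unicycle dynamics stand in for the paper's SINS/BD/DVL models, and the measurement update applies the same rule to the observation model.

    ```python
    # Spherical-radial cubature rule: CKF prediction step on a toy model.
    import numpy as np

    def cubature_points(mean, cov):
        n = mean.size
        S = np.linalg.cholesky(cov)
        offsets = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])  # n x 2n
        return mean[:, None] + S @ offsets                          # n x 2n

    def ckf_predict(mean, cov, f, Q):
        pts = f(cubature_points(mean, cov))    # propagate all 2n points
        m = pts.mean(axis=1)                   # equal weights 1/2n
        d = pts - m[:, None]
        return m, d @ d.T / pts.shape[1] + Q

    # Toy nonlinear dynamics: unicycle-like position/heading propagation.
    def f(x):  # rows of x: [px, py, heading]
        return np.vstack([x[0] + np.cos(x[2]), x[1] + np.sin(x[2]), x[2]])

    mean = np.array([0.0, 0.0, 0.1])
    cov = np.eye(3) * 0.01
    Q = np.eye(3) * 1e-4
    mean, cov = ckf_predict(mean, cov, f, Q)
    print(mean, np.diag(cov))
    ```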

  5. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    PubMed

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis, and data standardization and publication. All of the workflow's methods for these tasks are state-of-the-art or cutting-edge. As shown in previous publications, each of these methods is adequate for its specific task and gives competitive results. The methods included in the workflow are nevertheless continuously reviewed, updated and improved to adapt to new scientific developments. All of these components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides multiple fast communication channels for access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de), users can easily benefit from this service and get support from experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  6. Automatic system testing of a decision support system for insulin dosing using Google Android.

    PubMed

    Spat, Stephan; Höll, Bernhard; Petritsch, Georg; Schaupp, Lukas; Beck, Peter; Pieber, Thomas R

    2013-01-01

    Hyperglycaemia in hospitalized patients is a common and costly health care problem. The GlucoTab system is a mobile workflow and decision support system aiming to facilitate efficient and safe glycaemic control of non-critically ill patients. As a medical device, GlucoTab requires extensive and reproducible testing. A framework for high-volume, reproducible and automated system testing of the GlucoTab system was set up using several open-source tools for test automation and system time handling. The REACTION insulin titration protocol was investigated in a paper-based clinical trial (PBCT). To validate the GlucoTab system, data from this trial were used for simulation and system tests. In total, 1190 decision support action points were identified and simulated. Four data points (0.3%) resulted in a GlucoTab system error caused by an implementation defect. In 144 data points (12.1%), calculation errors made by physicians and nurses in the PBCT were detected. The test framework was able to verify the manual calculation of insulin doses and detected a substantial number of user errors and workflow anomalies in the PBCT data. This shows the high potential of the electronic decision support application to improve the safety of implementing an insulin titration protocol and workflow management system on clinical wards.
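    The replay-style system test described here can be sketched as feeding the recorded trial data through the dosing logic and flagging disagreements with what was documented on paper. The titration rule below is a stand-in, not the actual REACTION protocol, and the data points are invented.

    ```python
    # Replay recorded data points through the dosing logic and flag
    # disagreements with the paper-based documentation.
    def suggest_correction_dose(glucose_mg_dl, sensitivity=50, target=140):
        """Toy rule: one unit per `sensitivity` mg/dL above target (stand-in)."""
        return max(0.0, round((glucose_mg_dl - target) / sensitivity, 1))

    # (glucose reading, dose documented in the paper-based trial)
    recorded_data_points = [(180, 0.8), (140, 0.0), (250, 2.0), (310, 3.4)]

    mismatches = [
        (glucose, documented, suggested)
        for glucose, documented in recorded_data_points
        if (suggested := suggest_correction_dose(glucose)) != documented
    ]
    for glucose, documented, suggested in mismatches:
        print(f"glucose {glucose}: documented {documented} U, computed {suggested} U")
    ```

    Run as written, the 250 mg/dL point is flagged (2.2 U computed vs 2.0 U documented), which is exactly the kind of manual calculation error the framework detected in the PBCT data.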

  7. 33 CFR 62.27 - Safe water marks.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Safe water marks. 62.27 Section... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.27 Safe water marks. Safe water marks indicate that there is navigable water all around the mark. They are often used to indicate...

  8. 33 CFR 62.27 - Safe water marks.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Safe water marks. 62.27 Section... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.27 Safe water marks. Safe water marks indicate that there is navigable water all around the mark. They are often used to indicate...

  9. 33 CFR 62.27 - Safe water marks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Safe water marks. 62.27 Section... UNITED STATES AIDS TO NAVIGATION SYSTEM The U.S. Aids to Navigation System § 62.27 Safe water marks. Safe water marks indicate that there is navigable water all around the mark. They are often used to indicate...

  10. LC-MS Data Processing with MAVEN: A Metabolomic Analysis and Visualization Engine

    PubMed Central

    Clasquin, Michelle F.; Melamud, Eugene; Rabinowitz, Joshua D.

    2014-01-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis. PMID:22389014

  11. LC-MS data processing with MAVEN: a metabolomic analysis and visualization engine.

    PubMed

    Clasquin, Michelle F; Melamud, Eugene; Rabinowitz, Joshua D

    2012-03-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis.
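    The end product of the workflow described in these two records is a table of metabolite-specific peak intensities. A minimal pandas sketch of the downstream step, computing per-metabolite fold changes between two biological conditions, is shown below; the column names and values are illustrative and do not reproduce MAVEN's actual export format.

    ```python
    # Turn an exported long-format peak-intensity table into per-metabolite
    # fold changes between two conditions.
    import pandas as pd

    peaks = pd.DataFrame({
        "metabolite": ["glucose", "glucose", "citrate", "citrate"],
        "condition":  ["control", "treated", "control", "treated"],
        "intensity":  [1.2e6, 3.1e6, 8.4e5, 7.9e5],
    })

    wide = peaks.pivot_table(index="metabolite", columns="condition",
                             values="intensity", aggfunc="mean")
    wide["fold_change"] = wide["treated"] / wide["control"]
    print(wide.sort_values("fold_change", ascending=False))
    ```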

  12. Software for rapid prototyping in the pharmaceutical and biotechnology industries.

    PubMed

    Kappler, Michael A

    2008-05-01

    The automation of drug discovery methods continues to develop, especially techniques that process information, represent workflow and facilitate decision-making. The magnitude of data and the plethora of questions in pharmaceutical and biotechnology research give rise to the need for rapid prototyping software. This review describes the advantages and disadvantages of three solutions: Competitive Workflow, Taverna and Pipeline Pilot. Each of these systems processes large amounts of data, integrates diverse systems and assists novice programmers and human experts in critical decision-making steps.

  13. A network collaboration implementing technology to improve medication dispensing and administration in critical access hospitals.

    PubMed

    Wakefield, Douglas S; Ward, Marcia M; Loes, Jean L; O'Brien, John

    2010-01-01

    We report how seven independent critical access hospitals collaborated with a rural referral hospital to standardize workflow policies and procedures while jointly implementing the same health information technologies (HITs) to enhance medication care processes. The study hospitals implemented the same electronic health record, computerized provider order entry, pharmacy information systems, automated dispensing cabinets (ADC), and barcode medication administration systems. We conducted interviews and examined project documents to explore factors underlying the successful implementation of ADC and barcode medication administration across the network hospitals. These included a shared culture of collaboration; strategic sequencing of HIT component implementation; interface among HIT components; strategic placement of ADCs; disciplined use and sharing of workflow analyses linked with HIT applications; planning for workflow efficiencies; acquisition of adequate supply of HIT-related devices; and establishing metrics to monitor HIT use and outcomes.

  14. A patient workflow management system built on guidelines.

    PubMed Central

    Dazzi, L.; Fassino, C.; Saracco, R.; Quaglini, S.; Stefanelli, M.

    1997-01-01

    To provide high-quality, shared, and distributed medical care, clinical and organizational issues need to be integrated. This work describes a methodology for developing a Patient Workflow Management System based on a detailed model of both the medical work process and the organizational structure. We assume that the medical work process is represented through clinical practice guidelines, and that an ontological description of the organization is available. We therefore developed tools 1) to acquire the medical knowledge contained in a guideline, 2) to translate the formalized guideline into a computational formalism, namely a Petri net, and 3) to maintain different representation levels. The high-level representation guarantees that the patient workflow follows the guideline prescriptions, while the low level takes the specific characteristics of the organization into account and allows resources to be allocated for managing a specific patient in daily practice. PMID:9357606
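    The translation target named here, a Petri net, can be captured in a few lines: places hold tokens that mark the patient's state, and a transition fires when all of its input places are marked. The two-step guideline in this sketch is invented for illustration and is not taken from the paper.

    ```python
    # A tiny Petri net executing a two-step guideline.
    from dataclasses import dataclass, field

    @dataclass
    class PetriNet:
        marking: dict = field(default_factory=dict)      # place -> token count
        transitions: dict = field(default_factory=dict)  # name -> (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) > 0 for p in inputs)

        def fire(self, name):
            inputs, outputs = self.transitions[name]
            if not self.enabled(name):
                raise RuntimeError(f"transition {name!r} not enabled")
            for p in inputs:
                self.marking[p] -= 1
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1

    net = PetriNet(
        marking={"admitted": 1},
        transitions={
            "order_lab_test": (["admitted"], ["awaiting_results"]),
            "review_results": (["awaiting_results"], ["treatment_planned"]),
        },
    )
    net.fire("order_lab_test")
    net.fire("review_results")
    print(net.marking)  # {'admitted': 0, 'awaiting_results': 0, 'treatment_planned': 1}
    ```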

  15. The development of a white cane which navigates the visually impaired.

    PubMed

    Shiizu, Yuriko; Hirahara, Yoshiaki; Yanashima, Kenji; Magatani, Kazushige

    2007-01-01

    In this paper, we describe a navigation system we developed to support independent walking by the visually impaired in indoor spaces. The system is composed of colored navigation lines, RFID tags and an intelligent white cane. Colored marking tape is laid along the walking route to form what we call a navigation line, and RFID tags are placed on this line at each landmark point. The intelligent white cane can sense the color of the navigation line and receive tag information. Through vibration, the cane informs the user that he or she is walking along the navigation line; at landmark points, the system also announces area information via pre-recorded voice. Ten sighted subjects who were blindfolded with an eye mask tested the system. All of them were able to walk along the navigation line, and the performance of the area-information system was good. We therefore conclude that our system will be extremely valuable in supporting the activities of the visually impaired.
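    The cane's behaviour as described reduces to a simple sensing loop: vibrate while the colour sensor sees the navigation line, and play a recording whenever an RFID landmark is read. The sketch below uses hypothetical hardware stubs (read_line_color, read_rfid_tag, vibrate, play_recording) in place of the real sensors and actuators.

    ```python
    # Sensing loop for the intelligent white cane (hardware stubbed out).
    import time

    NAVIGATION_COLOR = "blue"

    def read_line_color():        # hypothetical color-sensor stub
        return NAVIGATION_COLOR

    def read_rfid_tag():          # hypothetical RFID-reader stub; None when no tag
        return None

    def vibrate(on):              # hypothetical vibration-motor stub
        pass

    def play_recording(tag_id):   # hypothetical voice-playback stub
        print(f"announcing area information for landmark {tag_id}")

    for _ in range(100):          # bounded here; a real cane would loop forever
        vibrate(read_line_color() == NAVIGATION_COLOR)  # confirm user is on the line
        tag = read_rfid_tag()
        if tag is not None:
            play_recording(tag)   # pre-recorded voice at each landmark
        time.sleep(0.1)
    ```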

  16. The View from a Few Hundred Feet : A New Transparent and Integrated Workflow for UAV-collected Data

    NASA Astrophysics Data System (ADS)

    Peterson, F. S.; Barbieri, L.; Wyngaard, J.

    2015-12-01

    Unmanned Aerial Vehicles (UAVs) allow scientists and civilians to monitor earth and atmospheric conditions in remote locations. To keep up with the rapid evolution of UAV technology, data workflows must also be flexible, integrated, and introspective. Here, we present our data workflow for a project to assess the feasibility of detecting threshold levels of methane, carbon dioxide, and other aerosols by mounting consumer-grade gas analysis sensors on UAVs. In particular, we highlight our use of Project Jupyter, a set of open-source software tools and documentation designed for developing "collaborative narratives" around scientific workflows. By embracing the GitHub-backed, multi-language systems available in Project Jupyter, we enable interaction and exploratory computation while simultaneously embracing distributed version control. Additionally, the transparency of this method builds trust with civilians and decision-makers and leverages collaboration and communication to resolve problems. The goal of this presentation is to provide a generic data workflow for scientific inquiries involving UAVs and to invite the participation of the AGU community in its improvement and curation.

  17. Navigation technique for MR-endoscope system using a wireless accelerometer-based remote control device.

    PubMed

    Kumamoto, Etsuko; Takahashi, Akihiro; Matsuoka, Yuichiro; Morita, Yoshinori; Kutsumi, Hiromu; Azuma, Takeshi; Kuroda, Kagayaki

    2013-01-01

    The MR-endoscope system can perform magnetic resonance (MR) imaging during endoscopy and display both the endoscopic and MR images. The system can acquire high-spatial-resolution MR images with an intraluminal radiofrequency (RF) coil, and its navigation component shows the scope's location and orientation inside the body and presents MR images from the scope's point of view. For an endoscopic MR procedure to be performed conveniently, the design of the user interface is very important, because it provides useful information to the operator. In this study, we propose a navigation system using a wireless accelerometer-based controller with Bluetooth technology, together with a navigation technique for positioning the intraluminal RF coil. The feasibility of using this wireless controller in the MR shielded room was validated via phantom examinations of its influence on MR procedures and of navigation accuracy. In vitro examinations using an isolated porcine stomach demonstrated the effectiveness of the navigation technique with the wireless remote-control device.

  18. Implementing standards for the interoperability among healthcare providers in the public regionalized Healthcare Information System of the Lombardy Region.

    PubMed

    Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano

    2012-08-01

    Information technologies (ITs) have now entered the everyday workflow of a variety of healthcare providers with a certain degree of independence. This independence may cause difficulty in interoperability between information systems, and it can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, which over the last 10 years has been able to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region and providing full access to clinical and health-related documents independently of the healthcare organization that generated each document. This goal, in a region with almost 10 million citizens, was achieved through a twofold approach: first, a political and operational push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, the provision of a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of these regional interoperability specifications enabled communication among the heterogeneous systems in place at different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles, which refer to HL7 standards, are adopted within hospitals for message exchange and for the definition of integration scenarios: the IHE Patient Administration Management (PAM) profile, with its different workflows, is adopted for patient management, whereas the Scheduled Workflow (SWF), Laboratory Testing Workflow (LTW), and Ambulatory Testing Workflow (ATW) profiles are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180,000 first aid medical reports, and 58,000 discharge summaries. Hence, although work is still in progress, the Lombardy Region healthcare system is a fully interoperable social healthcare system connecting patients, healthcare providers, healthcare organizations, and healthcare professionals in a large and heterogeneous territory through the implementation of international health standards. Copyright © 2012 Elsevier Inc. All rights reserved.
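    HL7 v2, the messaging standard the region pushed hospitals to adopt, encodes messages as pipe-delimited segments with ^-separated components. A minimal sketch of parsing a fabricated ADT (admission) message follows; the hospital names and patient data are invented, and a production system would use a full HL7 library rather than hand-splitting.

    ```python
    # Hand-rolled parse of a fabricated HL7 v2 ADT^A01 message.
    SAMPLE_ADT = "\r".join([
        "MSH|^~\\&|HIS|HOSPITAL_A|REGIONAL_HUB|LOMBARDY|202401011200||ADT^A01|00001|P|2.5",
        "PID|1||123456^^^HOSPITAL_A||ROSSI^MARIO||19700101|M",
    ])

    def parse_hl7(message):
        """Split an HL7 v2 message into {segment_id: [field lists, ...]}."""
        segments = {}
        for line in message.split("\r"):           # segments are CR-separated
            fields = line.split("|")
            segments.setdefault(fields[0], []).append(fields[1:])
        return segments

    msg = parse_hl7(SAMPLE_ADT)
    event = msg["MSH"][0][7]                # 'ADT^A01' (admit notification)
    patient = msg["PID"][0][4].split("^")   # name components: family^given
    print(event, patient)
    ```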

  19. Process improvement for the safe delivery of multidisciplinary-executed treatments-A case in Y-90 microspheres therapy.

    PubMed

    Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E

    To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment, a generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events within the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information-processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments over a period of 21 months, three misses that went undetected until treatment initiation were recorded, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules defining reliable sources of information as inputs into the planning process, as well as new checkpoints to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, following two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow and an adaptable, event-minimizing planning process, despite a complex setting that requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow, using the EWB or a similar platform with a DMAIC-based process improvement approach, could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  20. 33 CFR 169.100 - What mandatory ship reporting systems are established by this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false What mandatory ship reporting systems are established by this subpart? 169.100 Section 169.100 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) PORTS AND WATERWAYS SAFETY SHIP REPORTING SYSTEMS Establishment of Two Mandatory Ship Reporting...
