Sample records for guidance workflow implications

  1. Techniques for Interventional MRI Guidance in Closed-Bore Systems.

    PubMed

    Busse, Harald; Kahn, Thomas; Moche, Michael

    2018-02-01

    Efficient image guidance is the basis for minimally invasive interventions. In comparison with X-ray, computed tomography (CT), or ultrasound imaging, magnetic resonance imaging (MRI) provides the best soft tissue contrast without ionizing radiation and is therefore predestined for procedural control. But MRI is also characterized by spatial constraints, electromagnetic interactions, long imaging times, and resulting workflow issues. Although many technical requirements have been met over the years (most notably magnetic resonance (MR) compatibility of tools, interventional pulse sequences, and powerful processing hardware and software), there is still a large variety of stand-alone devices and systems for specific procedures only. Stereotactic guidance with the table outside the magnet is common and relies on proper registration of the guiding grids or manipulators to the MR images. Instrument tracking, often by optical sensing, can be added to provide the physicians with proper eye-hand coordination during their navigated approach. Only in very short wide-bore systems can needles be advanced, with the arm extended into the bore, under near real-time imaging. In standard magnets, control and workflow may be improved by remote operation using robotic or manual driving elements. This work highlights a number of devices and techniques for different interventional settings with a focus on percutaneous, interstitial procedures in different organ regions. The goal is to identify technical and procedural elements that might be relevant for interventional guidance in a broader context, independent of the clinical application given here. Key challenges remain the seamless integration into the interventional workflow, safe clinical translation, and proper cost effectiveness.

  2. A fully actuated robotic assistant for MRI-guided prostate biopsy and brachytherapy

    NASA Astrophysics Data System (ADS)

    Li, Gang; Su, Hao; Shang, Weijian; Tokuda, Junichi; Hata, Nobuhiko; Tempany, Clare M.; Fischer, Gregory S.

    2013-03-01

    Intra-operative medical imaging enables incorporation of human experience and intelligence in a controlled, closed-loop fashion. Magnetic resonance imaging (MRI) is an ideal modality for surgical guidance of diagnostic and therapeutic procedures, with its ability to perform high resolution, real-time, high soft tissue contrast imaging without ionizing radiation. However, for most current image-guided approaches only static pre-operative images are accessible for guidance, which are unable to provide updated information during a surgical procedure. The high magnetic field, electrical interference, and limited access of closed-bore MRI pose great challenges to developing robotic systems that can perform inside a diagnostic high-field MRI while obtaining interactively updated MR images. To overcome these limitations, we are developing a piezoelectrically actuated robotic assistant for actuated percutaneous prostate interventions under real-time MRI guidance. Utilizing a modular design, the system enables a coherent and straightforward workflow for various percutaneous interventions, including prostate biopsy sampling and brachytherapy seed placement, using various needle driver configurations. The unified workflow comprises: 1) system hardware and software initialization, 2) fiducial frame registration, 3) target selection and motion planning, 4) moving to the target and performing the intervention (e.g. taking a biopsy sample) under live imaging, and 5) visualization and verification. Phantom experiments of prostate biopsy and brachytherapy were executed under MRI guidance to evaluate the feasibility of the workflow. The robot successfully performed fully actuated biopsy sampling and delivery of simulated brachytherapy seeds under live MR imaging, as well as precise delivery of a prostate brachytherapy seed distribution with an RMS accuracy of 0.98 mm.
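
    The five-step unified workflow described above lends itself to a sequential sketch. Below is a minimal Python illustration of that control flow; every function name and the example target are our own illustrative stubs, not the authors' software interface.

      # Hedged sketch: the paper's five-step unified workflow as a sequential
      # pipeline. Every name below is an illustrative stub, not the authors' API.
      def initialize_system():                     # 1) hardware/software initialization
          return {"robot": "ready", "scanner": "ready"}

      def register_fiducial_frame(system):         # 2) fiducial frame registration
          system["T_image_from_robot"] = [[1, 0, 0, 0],
                                          [0, 1, 0, 0],
                                          [0, 0, 1, 0]]  # placeholder rigid transform
          return system

      def plan_motion(system, target_mm):          # 3) target selection and motion planning
          return {"target": target_mm, "path": "straight-line insertion"}

      def execute_intervention(plan):              # 4) move to target under live imaging
          print(f"advancing needle to {plan['target']} under live MRI")

      def verify(plan):                            # 5) visualization and verification
          print(f"verifying needle position at {plan['target']} on control scan")

      system = register_fiducial_frame(initialize_system())
      for target in [(10.0, -4.2, 33.1)]:          # example target, image coordinates (mm)
          plan = plan_motion(system, target)
          execute_intervention(plan)
          verify(plan)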

  3. [Measures to prevent patient identification errors in blood collection/physiological function testing utilizing a laboratory information system].

    PubMed

    Shimazu, Chisato; Hoshino, Satoshi; Furukawa, Taiji

    2013-08-01

    We constructed an integrated personal identification workflow chart using both bar code reading and an all-in-one laboratory information system. The information system not only handles test data but also the information needed for patient guidance in the laboratory department. The reception terminals at the entrance, displays for patient guidance and patient identification tools at blood-sampling booths are all controlled by the information system. The number of patient identification errors was greatly reduced by the system. However, identification errors have not been abolished in the ultrasound department. After re-evaluation of the patient identification process in this department, we recognized that the major reason for the errors was the excessively complicated identification workflow. Ordinarily, an ultrasound test requires patient identification 3 times, because 3 different systems are required during the entire test process, i.e. the ultrasound modality system, the laboratory information system and a reporting system. We are trying to connect the 3 different systems to develop a one-time identification workflow, but it is not a simple task and has not been completed yet. Utilization of the laboratory information system is effective, but is not yet perfect for patient identification. The most fundamental procedure for patient identification is to ask a person's name even today. Everyday checks in the ordinary workflow and everyone's participation in safety-management activity are important for the prevention of patient identification errors.

  4. WE-H-207B-03: MRI Guidance in the Radiation Therapy Clinic: Site-Specific Discussions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C.

    2016-06-15

    In recent years, steady progress has been made towards the implementation of MRI in external beam radiation therapy for processes ranging from treatment simulation to in-room guidance. Novel procedures relying mostly on MR data are currently implemented in the clinic. This session will cover topics such as (a) commissioning and quality control of the MR in-room imagers and simulators specific to RT, (b) treatment planning requirements, constraints and challenges when dealing with various MR data, (c) quantification of organ motion with an emphasis on treatment delivery guidance, and (d) MR-driven strategies for adaptive RT workflows. The content of the session was chosen to address both educational and practical key aspects of MR guidance. Learning Objectives: Good understanding of MR testing recommended for in-room MR imaging as well as image data validation for the RT chain (e.g. image transfer, filtering for consistency, spatial accuracy, manipulation for specific tasks); Familiarity with MR-based planning procedures: motivation, core workflow requirements, current status, challenges; Overview of the current methods for the quantification of organ motion; Discussion on approaches for adaptive treatment planning and delivery. T. Stanescu - License agreement with Modus Medical Devices to develop a phantom for the quantification of MR image system-related distortions; T. Stanescu, N/A.

  5. The radiologist's workflow environment: evaluation of disruptors and potential implications.

    PubMed

    Yu, John-Paul J; Kansagra, Akash P; Mongan, John

    2014-06-01

    Workflow interruptions in the health care delivery environment are a major contributor to medical errors and have been extensively studied within numerous hospital settings, including the nursing environment and the operating room, along with their effects on physician workflow. Less understood, though, is the role of interruptions in other highly specialized clinical domains and subspecialty services, such as diagnostic radiology. The workflow of the on-call radiologist, in particular, is especially susceptible to disruption by telephone calls and other modes of physician-to-physician communication. Herein, the authors describe their initial efforts to quantify the degree of interruption experienced by on-call radiologists and examine its potential implications in patient safety and overall clinical care. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  6. Automation in an Addiction Treatment Research Clinic: Computerized Contingency Management, Ecological Momentary Assessment, and a Protocol Workflow System

    PubMed Central

    Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H.; Preston, Kenzie L.

    2009-01-01

    Issues: A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients’ treatment needs and accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with provision of seamless methods for exporting, mining, and querying the data. Approach: We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialized applications: the Automated Contingency Management (ACM) system for delivery of behavioral interventions, the Transactional Electronic Diary (TED) system for management of behavioral assessments, and the Protocol Workflow System (PWS) for computerized workflow automation and guidance of each participant’s daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorized staff. Key Findings: ACM and TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity having an annual average of 18,000 patient-visits and 7,300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarize participant-safety data for research oversight. Implications and Conclusion: When developed in consultation with end users, automation in treatment-research clinics can enable more efficient operations, better communication among staff, and expansions in research methods. PMID:19320669

  7. WE-H-207B-00: MRgRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15

    In recent years, steady progress has been made towards the implementation of MRI in external beam radiation therapy for processes ranging from treatment simulation to in-room guidance. Novel procedures relying mostly on MR data are currently implemented in the clinic. This session will cover topics such as (a) commissioning and quality control of the MR in-room imagers and simulators specific to RT, (b) treatment planning requirements, constraints and challenges when dealing with various MR data, (c) quantification of organ motion with an emphasis on treatment delivery guidance, and (d) MR-driven strategies for adaptive RT workflows. The content of the session was chosen to address both educational and practical key aspects of MR guidance. Learning Objectives: Good understanding of MR testing recommended for in-room MR imaging as well as image data validation for the RT chain (e.g. image transfer, filtering for consistency, spatial accuracy, manipulation for specific tasks); Familiarity with MR-based planning procedures: motivation, core workflow requirements, current status, challenges; Overview of the current methods for the quantification of organ motion; Discussion on approaches for adaptive treatment planning and delivery. T. Stanescu - License agreement with Modus Medical Devices to develop a phantom for the quantification of MR image system-related distortions; T. Stanescu, N/A.

  8. WE-H-207B-04: Strategies for Adaptive RT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, O.

    2016-06-15

    In recent years, steady progress has been made towards the implementation of MRI in external beam radiation therapy for processes ranging from treatment simulation to in-room guidance. Novel procedures relying mostly on MR data are currently implemented in the clinic. This session will cover topics such as (a) commissioning and quality control of the MR in-room imagers and simulators specific to RT, (b) treatment planning requirements, constraints and challenges when dealing with various MR data, (c) quantification of organ motion with an emphasis on treatment delivery guidance, and (d) MR-driven strategies for adaptive RT workflows. The content of the session was chosen to address both educational and practical key aspects of MR guidance. Learning Objectives: Good understanding of MR testing recommended for in-room MR imaging as well as image data validation for the RT chain (e.g. image transfer, filtering for consistency, spatial accuracy, manipulation for specific tasks); Familiarity with MR-based planning procedures: motivation, core workflow requirements, current status, challenges; Overview of the current methods for the quantification of organ motion; Discussion on approaches for adaptive treatment planning and delivery. T. Stanescu - License agreement with Modus Medical Devices to develop a phantom for the quantification of MR image system-related distortions; T. Stanescu, N/A.

  9. Clinic Workflow Simulations using Secondary EHR Data

    PubMed Central

    Hribar, Michelle R.; Biermann, David; Read-Brown, Sarah; Reznick, Leah; Lombardi, Lorinna; Parikh, Mansi; Chamberlain, Winston; Yackel, Thomas R.; Chiang, Michael F.

    2016-01-01

    Clinicians today face increased patient loads, decreased reimbursements and potential negative productivity impacts of using electronic health records (EHR), but have little guidance on how to improve clinic efficiency. Discrete event simulation models are powerful tools for evaluating clinical workflow and improving efficiency, particularly when they are built from secondary EHR timing data. The purpose of this study is to demonstrate that these simulation models can be used for resource allocation decision making as well as for evaluating novel scheduling strategies in outpatient ophthalmology clinics. Key findings from this study are that: 1) secondary use of EHR timestamp data in simulation models represents clinic workflow, 2) simulations provide insight into the best allocation of resources in a clinic, 3) simulations provide critical information for schedule creation and decision making by clinic managers, and 4) simulation models built from EHR data are potentially generalizable. PMID:28269861
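
    As an illustration of the modeling approach described above, the sketch below runs a minimal discrete event simulation of exam-room allocation. In a real study the service-time and arrival parameters would be estimated from secondary EHR timestamp data; the numbers here are invented placeholders.

      import random

      # Hedged sketch: a tiny discrete event simulation of exam-room allocation.
      # In practice MEAN_EXAM_MIN and the arrival schedule would be estimated
      # from secondary EHR timestamps; these values are invented placeholders.
      random.seed(0)
      MEAN_EXAM_MIN = 22.0                      # assumed mean exam duration
      N_ROOMS = 3                               # resource level under evaluation
      arrivals = [8.0 * i for i in range(12)]   # one scheduled arrival every 8 min

      room_free_at = [0.0] * N_ROOMS
      waits = []
      for t in arrivals:
          room = min(range(N_ROOMS), key=lambda r: room_free_at[r])  # earliest-free room
          start = max(t, room_free_at[room])
          waits.append(start - t)
          room_free_at[room] = start + random.expovariate(1.0 / MEAN_EXAM_MIN)

      print(f"mean patient wait with {N_ROOMS} rooms: {sum(waits) / len(waits):.1f} min")

    Re-running the loop with different values of N_ROOMS is the kind of resource allocation question the study evaluates with far richer, EHR-derived models.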

  10. Data Curation: Improving Environmental Health Data Quality.

    PubMed

    Yang, Lin; Li, Jiao; Hou, Li; Qian, Qing

    2015-01-01

    With the growing recognition of the influence of climate change on human health, scientists are increasingly turning their attention to analyzing the relationship between meteorological factors and adverse health effects. However, the paucity of high-quality integrated data is one of the great challenges, especially when scientific studies rely on data-intensive computing. This paper aims to design an appropriate curation process to address this problem. We present a data curation workflow that: (i) follows the guidance of the DCC Curation Lifecycle Model; (ii) combines manual curation with automatic curation; and (iii) solves the environmental health data curation problem. The workflow was applied to a medical knowledge service system and was shown to be capable of improving work efficiency and data quality.
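
    A curation workflow of this kind, combining automatic checks with routing to manual review, can be sketched as follows. The field names and plausibility rules are hypothetical examples, not those of the cited system.

      # Hedged sketch: automatic plausibility checks with routing to manual
      # review, in the spirit of a combined automatic/manual curation workflow.
      # The field names and rules are hypothetical, not the cited system's.
      def auto_checks(record):
          issues = []
          if not (-90.0 <= record.get("temperature_c", 0.0) <= 60.0):
              issues.append("temperature outside plausible range")
          if record.get("station_id") is None:
              issues.append("missing station identifier")
          return issues

      def curate(records):
          accepted, needs_review = [], []
          for rec in records:
              issues = auto_checks(rec)
              (needs_review if issues else accepted).append((rec, issues))
          return accepted, needs_review          # flagged items go to a human curator

      accepted, review = curate([{"temperature_c": 21.5, "station_id": "A1"},
                                 {"temperature_c": 120.0, "station_id": None}])
      print(len(accepted), "auto-accepted;", len(review), "routed to manual review")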

  11. Ergonomic design for dental offices.

    PubMed

    Ahearn, David J; Sanders, Martha J; Turcotte, Claudia

    2010-01-01

    The increasing complexity of the dental office environment influences productivity and workflow for dental clinicians. Advances in technology, and with it the range of products needed to provide services, have led to sprawl in operatory setups and the potential for awkward postures for dental clinicians during the delivery of oral health services. Although ergonomics often addresses the prevention of musculoskeletal disorders for specific populations of workers, concepts of workflow and productivity are integral to improved practice in work environments. This article provides suggestions for improving workflow and productivity for dental clinicians. The article applies ergonomic principles to dental practice issues such as equipment and supply management, office design, and workflow management. Implications for improved ergonomic processes and future research are explored.

  12. TU-A-201-00: Image Guidance Technologies and Management Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15

    Recent years have seen a widespread proliferation of available in-room image guidance systems for radiation therapy target localization, with many centers having multiple in-room options. In this session, available imaging systems for in-room IGRT will be reviewed, highlighting the main differences in workflow efficiency, targeting accuracy and image quality as it relates to target visualization. Decision-making strategies for integrating these tools into clinical image guidance protocols that are tailored to specific disease sites like H&N, lung, pelvis, and spine SBRT will be discussed. Learning Objectives: Major system characteristics of a wide range of available in-room imaging systems for IGRT; Advantages/disadvantages of different systems for site-specific IGRT considerations; Concepts of targeting accuracy and time efficiency in designing clinical imaging protocols.

  13. Is Career Guidance for the Individual or for the Market? Implications of EU Policy for Career Guidance

    ERIC Educational Resources Information Center

    Bergmo-Prvulovic, Ingela

    2014-01-01

    This paper explores the essential understanding and underlying perspectives of career implicit in EU career guidance policy in the twenty-first century, as well as the possible implications of these for the future mission of guidance. Career theories, models and concepts that serve career guidance are shaped on the twentieth-century industrial…

  14. Announcement: Guidance for U.S. Laboratory Testing for Zika Virus Infection: Implications for Health Care Providers.

    PubMed

    2016-11-25

    CDC has released updated guidance online for U.S. laboratory testing for Zika virus infection. The guidance is available at https://www.cdc.gov/zika/laboratories/lab-guidance.html. Frequently asked questions are addressed at https://www.cdc.gov/zika/laboratories/lab-guidance-faq.html. This guidance updates recommendations for testing of specimens by U.S. laboratories for possible Zika virus infection. Major updates to the guidance with clinical implications for health care providers include the following.

  15. 2D–3D radiograph to cone-beam computed tomography (CBCT) registration for C-arm image-guided robotic surgery

    PubMed Central

    Liu, Wen Pei; Otake, Yoshito; Azizian, Mahdi; Wagner, Oliver J.; Sorger, Jonathan M.; Armand, Mehran; Taylor, Russell H.

    2015-01-01

    Purpose: C-arm radiographs are commonly used for intraoperative image guidance in surgical interventions. Fluoroscopy is a cost-effective real-time modality, although image quality can vary greatly depending on the target anatomy. Cone-beam computed tomography (CBCT) scans are sometimes available, so 2D–3D registration is needed for intra-procedural guidance. C-arm radiographs were registered to CBCT scans and used for 3D localization of peritumor fiducials during a minimally invasive thoracic intervention with a da Vinci Si robot. Methods: Intensity-based 2D–3D registration of intraoperative radiographs to CBCT was performed. The feasible range of X-ray projections achievable by a C-arm positioned around a da Vinci Si surgical robot, configured for robotic wedge resection, was determined using phantom models. Experiments were conducted on synthetic phantoms and animals imaged with an OEC 9600 and a Siemens Artis zeego, representing the spectrum of different C-arm systems currently available for clinical use. Results: The image guidance workflow was feasible using either an optically tracked OEC 9600 or a Siemens Artis zeego C-arm, resulting in an angular difference of Δθ ≈ 30°. The two C-arm systems provided TRE_mean ≤ 2.5 mm and TRE_mean ≤ 2.0 mm, respectively (i.e., comparable to standard clinical intraoperative navigation systems). Conclusions: C-arm 3D localization from dual 2D–3D registered radiographs was feasible and applicable for intraoperative image guidance during da Vinci robotic thoracic interventions using the proposed workflow. Tissue deformation and in vivo experiments are required before clinical evaluation of this system. PMID:25503592
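
    The core of intensity-based 2D–3D registration is searching for the pose whose simulated projection of the 3D volume best matches the acquired radiograph. The Python sketch below illustrates the idea with a toy in-plane shift standing in for the full 6-DOF pose and ray-cast DRR; normalized cross-correlation is one common similarity metric, not necessarily the one used in this paper.

      import numpy as np

      # Hedged sketch of intensity-based 2D-3D registration. A real system
      # ray-casts the CBCT into a digitally reconstructed radiograph (DRR) and
      # optimizes a 6-DOF pose; here a 2D shift of a synthetic image stands in
      # for the projection so the example stays self-contained.
      rng = np.random.default_rng(0)
      drr_identity = rng.random((64, 64))                       # "DRR" at the identity pose
      radiograph = np.roll(np.roll(drr_identity, 3, 0), -2, 1)  # acquired image, offset (3, -2)

      def drr_at(dx, dy):                   # placeholder projection model: in-plane shift
          return np.roll(np.roll(drr_identity, dx, 0), dy, 1)

      def ncc(a, b):                        # normalized cross-correlation similarity
          a = (a - a.mean()) / a.std()
          b = (b - b.mean()) / b.std()
          return float((a * b).mean())

      best = max((ncc(drr_at(dx, dy), radiograph), (dx, dy))
                 for dx in range(-5, 6) for dy in range(-5, 6))
      print("best shift:", best[1], "NCC:", round(best[0], 3))  # recovers (3, -2)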

  16. Navigation concepts for magnetic resonance imaging-guided musculoskeletal interventions.

    PubMed

    Busse, Harald; Kahn, Thomas; Moche, Michael

    2011-08-01

    Image-guided musculoskeletal (MSK) interventions are a widely used alternative to open surgical procedures for various pathological findings in different body regions. They traditionally involve one of the established x-ray imaging techniques (radiography, fluoroscopy, computed tomography) or ultrasound scanning. Over the last decades, magnetic resonance imaging (MRI) has evolved into one of the most powerful diagnostic tools for nearly the whole body and has therefore been increasingly considered for interventional guidance as well. The strength of MRI for MSK applications is a combination of well-known general advantages, such as multiplanar and functional imaging capabilities, wide choice of tissue contrasts, and absence of ionizing radiation, as well as a number of MSK-specific factors, for example, the excellent depiction of soft-tissue tumors, nonosteolytic bone changes, and bone marrow lesions. On the downside, the magnetic resonance-compatible equipment needed, restricted space in the magnet, longer imaging times, and the more complex workflow have so far limited the number of MSK procedures under MRI guidance. Navigation solutions are generally a natural extension of any interventional imaging system, in particular because powerful hardware and software for image processing have become routinely available. They help to identify proper access paths, provide accurate feedback on the instrument positions, facilitate the workflow in an MRI environment, and ultimately contribute to procedural safety and success. The purposes of this work were to describe some basic concepts and devices for MRI guidance of MSK procedures and to discuss technical and clinical achievements and challenges for some selected implementations.

  17. Improving Synthetic Biology Communication: Recommended Practices for Visual Depiction and Digital Submission of Genetic Designs.

    PubMed

    Hillson, Nathan J; Plahar, Hector A; Beal, Jacob; Prithviraj, Ranjini

    2016-06-17

    Research is communicated more effectively and reproducibly when articles depict genetic designs consistently and fully disclose the complete sequences of all reported constructs. ACS Synthetic Biology is now providing authors with updated guidance and piloting a new tool and publication workflow that facilitate compliance with these recommended practices and standards for visual representation and data exchange.

  18. Ultrasound Imaging in Radiation Therapy: From Interfractional to Intrafractional Guidance

    PubMed Central

    Western, Craig; Hristov, Dimitre

    2015-01-01

    External beam radiation therapy (EBRT) is included in the treatment regimen of the majority of cancer patients. With the proliferation of hypofractionated radiotherapy treatment regimens, such as stereotactic body radiation therapy (SBRT), interfractional and intrafractional imaging technologies are becoming increasingly critical to ensure safe and effective treatment delivery. Ultrasound (US)-based image guidance systems offer real-time, markerless, volumetric imaging with excellent soft tissue contrast, overcoming the limitations of traditional X-ray or computed tomography (CT)-based guidance for abdominal and pelvic cancer sites, such as the liver and prostate. Interfractional US guidance systems have been commercially adopted for patient positioning but suffer from systematic positioning errors induced by probe pressure. More recently, several research groups have introduced concepts for intrafractional US guidance systems leveraging robotic probe placement technology and real-time soft tissue tracking software. This paper reviews various commercial and research-level US guidance systems used in radiation therapy, with an emphasis on hardware and software technologies that enable the deployment of US imaging within the radiotherapy environment and workflow. Previously unpublished material on tissue tracking systems and robotic probe manipulators under development by our group is also included. PMID:26180704

  19. Quantifying nursing workflow in medication administration.

    PubMed

    Keohane, Carol A; Bane, Anne D; Featherstone, Erica; Hayes, Judy; Woolf, Seth; Hurley, Ann; Bates, David W; Gandhi, Tejal K; Poon, Eric G

    2008-01-01

    New medication administration systems are showing promise in improving patient safety at the point of care, but adoption of these systems requires significant changes in nursing workflow. To prepare for these changes, the authors report on a time-motion study that measured the proportion of time that nurses spend on various patient care activities, focusing on medication administration-related activities. Implications of their findings are discussed.

  20. Advanced approach for intraoperative MRI guidance and potential benefit for neurosurgical applications.

    PubMed

    Busse, Harald; Schmitgen, Arno; Trantakis, Christos; Schober, Ralf; Kahn, Thomas; Moche, Michael

    2006-07-01

    To present an advanced approach for intraoperative image guidance in an open 0.5 T MRI and to evaluate its effectiveness for neurosurgical interventions by comparison with a dynamic scan-guided localization technique. The built-in scan guidance mode relied on successive interactive MRI scans. The additional advanced mode provided real-time navigation based on reformatted high-quality, intraoperatively acquired MR reference data, allowed multimodal image fusion, and used the successive scans of the built-in mode for quick position verification only. Analysis involved tumor resections and biopsies in either scan guidance (N = 36) or advanced mode (N = 59) by the same three neurosurgeons. Technical, surgical, and workflow aspects were compared. The image quality and hand-eye coordination of the advanced approach were improved. While the average extent of resection, neurologic outcome after functional MRI (fMRI) integration, and diagnostic yield appeared to be slightly better under advanced guidance, particularly for the main surgeon, statistical analysis revealed no significant differences. Resection times were comparable, while biopsies took around 30 minutes longer. The presented approach is safe and provides more detailed images and higher navigation speed at the expense of image currency, because navigation relies on previously acquired reference data. The surgical outcome achieved with advanced guidance is (at least) as good as that obtained with dynamic scan guidance. (c) 2006 Wiley-Liss, Inc.

  1. Navigated MRI-guided liver biopsies in a closed-bore scanner: experience in 52 patients.

    PubMed

    Moche, Michael; Heinig, Susann; Garnov, Nikita; Fuchs, Jochen; Petersen, Tim-Ole; Seider, Daniel; Brandmaier, Philipp; Kahn, Thomas; Busse, Harald

    2016-08-01

    To evaluate the clinical effectiveness and diagnostic efficiency of a navigation device for MR-guided biopsies of focal liver lesions in a closed-bore scanner. In 52 patients, 55 biopsies were performed. An add-on MR navigation system with optical instrument tracking was used for image guidance and biopsy device insertion outside the bore. Fast control imaging allowed visualization of the true needle position at any time. The biopsy workflow and procedure duration were recorded. Histological analysis and clinical course/outcome were used to calculate sensitivity, specificity and diagnostic accuracy. Fifty-four of 55 liver biopsies were performed successfully with the system. No major and four minor complications occurred. Mean tumour size was 23 ± 14 mm and the skin-to-target length ranged from 22 to 177 mm. In 39 cases, the access path was double oblique. Sensitivity, specificity and diagnostic accuracy were 88 %, 100 % and 92 %, respectively. The mean procedure time was 51 ± 12 min, whereas the puncture itself lasted 16 ± 6 min. On average, four control scans were taken. Using this navigation device, biopsies of poorly visible and difficult-to-access liver lesions could be performed safely and reliably in a closed-bore MRI scanner. The system can be easily implemented in the clinical routine workflow. • Targeted liver biopsies could be reliably performed in a closed-bore MRI. • The navigation system allows for image guidance outside of the scanner bore. • Assisted MRI-guided biopsies are helpful for focal lesions with difficult access. • Successful integration of the method in the clinical workflow was shown. • Subsequent system installation in an existing MRI environment is feasible.
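
    For reference, the reported diagnostic metrics follow from the standard confusion-matrix counts, as in the sketch below. The counts used here are invented so that the rounded outputs match the reported percentages; they are not the study's raw data.

      # Hedged sketch: definitions behind the reported sensitivity, specificity
      # and diagnostic accuracy. The counts are invented so the rounded outputs
      # match the reported percentages; they are NOT the study's raw data.
      TP, FP, TN, FN = 29, 0, 20, 4                 # hypothetical confusion-matrix counts

      sensitivity = TP / (TP + FN)                  # malignant lesions correctly detected
      specificity = TN / (TN + FP)                  # benign lesions correctly ruled out
      accuracy = (TP + TN) / (TP + FP + TN + FN)    # all correct calls over all cases

      print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, "
            f"accuracy {accuracy:.0%}")             # -> 88%, 100%, 92%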

  2. Preservation of protein fluorescence in embedded human dendritic cells for targeted 3D light and electron microscopy

    PubMed Central

    HÖHN, K.; FUCHS, J.; FRÖBER, A.; KIRMSE, R.; GLASS, B.; ANDERS‐ÖSSWEIN, M.; WALTHER, P.; KRÄUSSLICH, H.‐G.

    2015-01-01

    Summary: In this study, we present a correlative microscopy workflow to combine detailed 3D fluorescence light microscopy data with ultrastructural information gained by 3D focused ion beam assisted scanning electron microscopy. The workflow is based on an optimized high pressure freezing/freeze substitution protocol that preserves good ultrastructural detail along with retaining the fluorescence signal in the resin embedded specimens. Consequently, cellular structures of interest can readily be identified and imaged by state of the art 3D confocal fluorescence microscopy and are precisely referenced with respect to an imprinted coordinate system on the surface of the resin block. This allows precise guidance of the focused ion beam assisted scanning electron microscopy and limits the volume to be imaged to the structure of interest. This, in turn, minimizes the total acquisition time necessary to conduct the time consuming ultrastructural scanning electron microscope imaging while eliminating the risk to miss parts of the target structure. We illustrate the value of this workflow for targeting virus compartments, which are formed in HIV‐pulsed mature human dendritic cells. PMID:25786567

  3. qF-SSOP: real-time optical property corrected fluorescence imaging

    PubMed Central

    Valdes, Pablo A.; Angelo, Joseph P.; Choi, Hak Soo; Gioux, Sylvain

    2017-01-01

    Fluorescence imaging is well suited to provide image guidance during resections in oncologic and vascular surgery. However, the distorting effects of tissue optical properties on the emitted fluorescence are poorly compensated for on even the most advanced fluorescence image guidance systems, leading to subjective and inaccurate estimates of tissue fluorophore concentrations. Here we present a novel fluorescence imaging technique that performs real-time (i.e., video rate) optical property corrected fluorescence imaging. We perform full field of view simultaneous imaging of tissue optical properties using Single Snapshot of Optical Properties (SSOP) and fluorescence detection. The estimated optical properties are used to correct the emitted fluorescence with a quantitative fluorescence model to provide quantitative fluorescence-Single Snapshot of Optical Properties (qF-SSOP) images with less than 5% error. The technique is rigorous, fast, and quantitative, enabling ease of integration into the surgical workflow with the potential to improve molecular guidance intraoperatively. PMID:28856038
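
    The correction idea can be sketched as dividing the measured fluorescence by a model of how the tissue optical properties attenuate the excitation and emission light. The Python sketch below uses a crude placeholder attenuation model, not the paper's quantitative fluorescence model.

      import numpy as np

      # Hedged sketch of optical property corrected fluorescence: divide the
      # raw image by a model of attenuation from absorption (mu_a) and reduced
      # scattering (mu_s'). The attenuation model is a crude placeholder, not
      # the paper's quantitative fluorescence model.
      rng = np.random.default_rng(1)
      mu_a = 0.05 + 0.05 * rng.random((32, 32))       # per-pixel absorption, 1/mm
      mu_sp = 0.8 + 0.4 * rng.random((32, 32))        # per-pixel reduced scattering, 1/mm
      fluorophore = np.zeros((32, 32))
      fluorophore[12:20, 12:20] = 1.0                 # ground-truth concentration map

      def attenuation(mu_a, mu_sp):
          # placeholder: more absorption -> less signal; scattering rescales it
          return np.exp(-10.0 * mu_a) * (mu_sp / mu_sp.mean())

      raw = fluorophore * attenuation(mu_a, mu_sp)    # what the camera would record
      corrected = raw / attenuation(mu_a, mu_sp)      # qF-style correction
      print("max error after correction:", float(np.abs(corrected - fluorophore).max()))

    In the real technique the per-pixel optical properties come from a single structured-light snapshot (SSOP) rather than being known in advance, which is what makes video-rate correction possible.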

  4. Historical Development of Guidance and Counseling and Implications for the Future

    ERIC Educational Resources Information Center

    Aubrey, Roger F.

    1977-01-01

    The author traces the development of guidance and counseling from the nineteenth century to the present with implications for the future. The impact of the progressive movement, vocational guidance, industrialization, psychometrics, and Carl Rogers are highlighted. The 1950's are singled out as the decade having the greatest effect on counselors.…

  5. System Integration and In Vivo Testing of a Robot for Ultrasound Guidance and Monitoring During Radiotherapy.

    PubMed

    Sen, Hasan Tutkun; Bell, Muyinatu A Lediju; Zhang, Yin; Ding, Kai; Boctor, Emad; Wong, John; Iordachita, Iulian; Kazanzides, Peter

    2017-07-01

    We are developing a cooperatively controlled robot system for image-guided radiation therapy (IGRT) in which a clinician and robot share control of a 3-D ultrasound (US) probe. IGRT involves two main steps: 1) planning/simulation and 2) treatment delivery. The goals of the system are to provide guidance for patient setup and real-time target monitoring during fractionated radiotherapy of soft tissue targets, especially in the upper abdomen. To compensate for soft tissue deformations created by the probe, we present a novel workflow where the robot holds the US probe on the patient during acquisition of the planning computerized tomography image, thereby ensuring that planning is performed on the deformed tissue. The robot system introduces constraints (virtual fixtures) to help to produce consistent soft tissue deformation between simulation and treatment days, based on the robot position, contact force, and reference US image recorded during simulation. This paper presents the system integration and the proposed clinical workflow, validated by an in vivo canine study. The results show that the virtual fixtures enable the clinician to deviate from the recorded position to better reproduce the reference US image, which correlates with more consistent soft tissue deformation and the possibility for more accurate patient setup and radiation delivery.
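
    A virtual fixture in a cooperatively controlled robot can be sketched as an admittance law with a spring-like term pulling the probe toward the pose recorded on the simulation day. The one-dimensional setup and gains below are illustrative assumptions, not the authors' controller.

      # Hedged sketch: a virtual fixture as an admittance law with a spring
      # term pulling the probe toward the pose recorded on the simulation day.
      # One-dimensional, with invented gains; not the authors' controller.
      REF_POS_MM = 0.0      # probe position recorded during planning CT
      K_ADMIT = 0.02        # admittance: mm/s of motion per newton of force
      K_FIX = 5.0           # fixture stiffness: N per mm of deviation

      def commanded_velocity(hand_force_n, pos_mm):
          fixture_force = K_FIX * (REF_POS_MM - pos_mm)   # restoring "spring"
          return K_ADMIT * (hand_force_n + fixture_force)

      pos, dt = 4.0, 0.1    # probe starts 4 mm from the recorded position
      for step in range(5):                       # clinician applies no force:
          v = commanded_velocity(0.0, pos)        # the fixture alone acts
          pos += v * dt
          print(f"step {step}: pos = {pos:6.3f} mm, v = {v:7.4f} mm/s")

    Because the clinician's hand force enters the same law, she can deliberately deviate from the recorded pose by pushing against the fixture, which matches the paper's observation that controlled deviation can improve reproduction of the reference US image.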

  6. Practice patterns of image guided particle therapy in Europe: A 2016 survey of the European Particle Therapy Network (EPTN).

    PubMed

    Bolsi, Alessandra; Peroni, Marta; Amelio, Dante; Dasu, Alexandru; Stock, Markus; Toma-Dasu, Iuliana; Nyström, Petra Witt; Hoffmann, Aswin

    2018-03-28

    Image guidance is critical in achieving accurate and precise radiation delivery in particle therapy, even more than in photon therapy. However, equipment, quality assurance procedures and clinical workflows for image-guided particle therapy (IGPT) may vary substantially between centres due to a lack of standardization. A survey was conducted to evaluate the current practice of IGPT in European particle therapy centres. In 2016, a questionnaire was distributed among 19 particle therapy centres in 12 European countries. The questionnaire consisted of 30 open and 37 closed questions related to image guidance in the general clinical workflow, for moving targets, current research activities and future perspectives of IGPT. All centres completed the questionnaire. The IGPT methods used by the 10 treating centres varied substantially. The 9 non-treating centres were in the process of introducing IGPT. Most centres have developed their own IGPT strategies, which are tightly connected to their specific technical implementation and dose delivery methods. Insight into the current clinical practice of IGPT in European particle therapy centres was obtained. A variety of IGPT practices and procedures was confirmed, which underlines the need for harmonisation of practice parameters and consensus guidelines. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. The Settings, Pros and Cons of the New Surgical Robot da Vinci Xi System for Transoral Robotic Surgery (TORS): A Comparison With the Popular da Vinci Si System.

    PubMed

    Kim, Da Hee; Kim, Hwan; Kwak, Sanghyun; Baek, Kwangha; Na, Gina; Kim, Ji Hoon; Kim, Se Heon

    2016-10-01

    The da Vinci system (da Vinci Surgical System; Intuitive Surgical Inc.) has developed rapidly over several years from the S system to the Si system and now the Xi system. To investigate the surgical feasibility of the newly released system and to provide workflow guidance for it, we used the new da Vinci Xi system for transoral robotic surgery (TORS) on a cadaveric specimen. Bilateral supraglottic partial laryngectomy, hypopharyngectomy, lateral oropharyngectomy, and base of tongue resection were serially performed in search of the optimal procedures with the new system. The new surgical robotic system has been upgraded in all respects. The telescope and camera were incorporated into one system, with a digital end-mounted camera. Overhead boom rotation allows multiquadrant access without axis limitation, and the arms are now thinner and longer with grabbing movements for easy adjustment. The patient clearance button dramatically reduces external collisions. The new surgical robotic system has been optimized for improved anatomic access, with better-equipped appurtenances. This cadaveric study of TORS offers guidance on the best protocol for surgical workflow with the new Xi system, leading to improvements in the functional results of TORS.

  8. Planning, guidance, and quality assurance of pelvic screw placement using deformable image registration

    NASA Astrophysics Data System (ADS)

    Goerres, J.; Uneri, A.; Jacobson, M.; Ramsay, B.; De Silva, T.; Ketcha, M.; Han, R.; Manbachi, A.; Vogt, S.; Kleinszig, G.; Wolinsky, J.-P.; Osgood, G.; Siewerdsen, J. H.

    2017-12-01

    Percutaneous pelvic screw placement is challenging due to narrow bone corridors surrounded by vulnerable structures and difficult visual interpretation of complex anatomical shapes in 2D x-ray projection images. To address these challenges, a system for planning, guidance, and quality assurance (QA) is presented, providing functionality analogous to surgical navigation, but based on robust 3D-2D image registration techniques using fluoroscopy images already acquired in routine workflow. Two novel aspects of the system are investigated: automatic planning of pelvic screw trajectories and the ability to account for deformation of surgical devices (K-wire deflection). Atlas-based registration is used to calculate a patient-specific plan of screw trajectories in preoperative CT. 3D-2D registration aligns the patient to CT within the projective geometry of intraoperative fluoroscopy. Deformable known-component registration (dKC-Reg) localizes the surgical device, and the combination of plan and device location is used to provide guidance and QA. A leave-one-out analysis evaluated the accuracy of automatic planning, and a cadaver experiment compared the accuracy of dKC-Reg to rigid approaches (e.g. optical tracking). Surgical plans conformed within the bone cortex by 3-4 mm for the narrowest corridor (superior pubic ramus) and >5 mm for the widest corridor (tear drop). The dKC-Reg algorithm localized the K-wire tip within 1.1 mm and 1.4° and was consistently more accurate than rigid-body tracking (errors up to 9 mm). The system was shown to automatically compute reliable screw trajectories and accurately localize deformed surgical devices (K-wires). Such capability could improve guidance and QA in orthopaedic surgery, where workflow is impeded by manual planning, conventional tool trackers add complexity and cost, rigid tool assumptions are often inaccurate, and qualitative interpretation of complex anatomy from 2D projections is prone to trial-and-error with extended fluoroscopy time.

  9. Review of ultrasound image guidance in external beam radiotherapy: I. Treatment planning and inter-fraction motion management

    NASA Astrophysics Data System (ADS)

    Fontanarosa, Davide; van der Meer, Skadi; Bamber, Jeffrey; Harris, Emma; O'Shea, Tuathan; Verhaegen, Frank

    2015-02-01

    In modern radiotherapy, verification of the treatment to ensure the target receives the prescribed dose and normal tissues are optimally spared has become essential. Several forms of image guidance are available for this purpose. The most commonly used forms of image guidance are based on kilovolt or megavolt x-ray imaging. Image guidance can also be performed with non-harmful ultrasound (US) waves. This increasingly used technique has the potential to offer both anatomical and functional information. This review presents an overview of the historical and current use of two-dimensional and three-dimensional US imaging for treatment verification in radiotherapy. The US technology and the implementation in the radiotherapy workflow are described. The use of US guidance in the treatment planning process is discussed. The role of US technology in inter-fraction motion monitoring and management is explained, and clinical studies of applications in areas such as the pelvis, abdomen and breast are reviewed. A companion review paper (O’Shea et al 2015 Phys. Med. Biol. submitted) will extensively discuss the use of US imaging for intra-fraction motion quantification and novel applications of US technology to RT.

  10. Review of ultrasound image guidance in external beam radiotherapy: I. Treatment planning and inter-fraction motion management.

    PubMed

    Fontanarosa, Davide; van der Meer, Skadi; Bamber, Jeffrey; Harris, Emma; O'Shea, Tuathan; Verhaegen, Frank

    2015-02-07

    In modern radiotherapy, verification of the treatment to ensure the target receives the prescribed dose and normal tissues are optimally spared has become essential. Several forms of image guidance are available for this purpose. The most commonly used forms of image guidance are based on kilovolt or megavolt x-ray imaging. Image guidance can also be performed with non-harmful ultrasound (US) waves. This increasingly used technique has the potential to offer both anatomical and functional information. This review presents an overview of the historical and current use of two-dimensional and three-dimensional US imaging for treatment verification in radiotherapy. The US technology and the implementation in the radiotherapy workflow are described. The use of US guidance in the treatment planning process is discussed. The role of US technology in inter-fraction motion monitoring and management is explained, and clinical studies of applications in areas such as the pelvis, abdomen and breast are reviewed. A companion review paper (O'Shea et al 2015 Phys. Med. Biol. submitted) will extensively discuss the use of US imaging for intra-fraction motion quantification and novel applications of US technology to RT.

  11. Enhancing population pharmacokinetic modeling efficiency and quality using an integrated workflow.

    PubMed

    Schmidt, Henning; Radivojevic, Andrijana

    2014-08-01

    Population pharmacokinetic (popPK) analyses are at the core of Pharmacometrics and need to be performed regularly. Although these analyses are relatively standard, a large variability can be observed in both the time (efficiency) and the way they are performed (quality). Main reasons for this variability include the level of experience of a modeler, personal preferences and tools. This paper aims to examine how the process of popPK model building can be supported in order to increase its efficiency and quality. The presented approach to the conduct of popPK analyses is centered around three key components: (1) identification of most common and important popPK model features, (2) required information content and formatting of the data for modeling, and (3) methodology, workflow and workflow supporting tools. This approach has been used in several popPK modeling projects and a documented example is provided in the supplementary material. Efficiency of model building is improved by avoiding repetitive coding and other labor-intensive tasks and by putting the emphasis on a fit-for-purpose model. Quality is improved by ensuring that the workflow and tools are in alignment with a popPK modeling guidance which is established within an organization. The main conclusion of this paper is that workflow based approaches to popPK modeling are feasible and have significant potential to ameliorate its various aspects. However, the implementation of such an approach in a pharmacometric organization requires openness towards innovation and change, the key ingredient for evolution of integrative and quantitative drug development in the pharmaceutical industry.

  12. Ultrasound-Based Guidance for Partial Breast Irradiation Therapy

    DTIC Science & Technology

    2011-01-01

    and also are inexpensive. b. Collect US data from patient before the PBI treatment at the same time that CT is collected (months 2-14). We...introduces minimal divergence from the original workflow of PBI treatment. We have an approved institutional review board (IRB) protocol to obtain B...irradiation of only the involved area of the breast, partial breast irradiation (PBI), is as effective as whole breast irradiation [1]. Benefits of PBI

  13. Preservation of protein fluorescence in embedded human dendritic cells for targeted 3D light and electron microscopy.

    PubMed

    Höhn, K; Fuchs, J; Fröber, A; Kirmse, R; Glass, B; Anders-Össwein, M; Walther, P; Kräusslich, H-G; Dietrich, C

    2015-08-01

    In this study, we present a correlative microscopy workflow to combine detailed 3D fluorescence light microscopy data with ultrastructural information gained by 3D focused ion beam assisted scanning electron microscopy. The workflow is based on an optimized high pressure freezing/freeze substitution protocol that preserves good ultrastructural detail along with retaining the fluorescence signal in the resin embedded specimens. Consequently, cellular structures of interest can readily be identified and imaged by state of the art 3D confocal fluorescence microscopy and are precisely referenced with respect to an imprinted coordinate system on the surface of the resin block. This allows precise guidance of the focused ion beam assisted scanning electron microscopy and limits the volume to be imaged to the structure of interest. This, in turn, minimizes the total acquisition time necessary to conduct the time consuming ultrastructural scanning electron microscope imaging while eliminating the risk to miss parts of the target structure. We illustrate the value of this workflow for targeting virus compartments, which are formed in HIV-pulsed mature human dendritic cells. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  14. FluxCTTX: A LIMS-based tool for management and analysis of cytotoxicity assays data

    PubMed Central

    2015-01-01

    Background: Cytotoxicity assays have been used by researchers to screen for cytotoxicity in compound libraries. Researchers can either look for cytotoxic compounds or screen "hits" from initial high-throughput drug screens for unwanted cytotoxic effects before investing in their development as a pharmaceutical. These assays may be used as an alternative to animal experimentation and are becoming increasingly important in modern laboratories. However, the execution of these assays in large scale and in different laboratories requires, among other things, the management of protocols, reagents and cell lines used, as well as of the data produced, which can be a challenge. The management of all this information is greatly improved by the utilization of computational tools to save time and guarantee quality. However, a tool that performs this task designed specifically for cytotoxicity assays is not yet available. Results: In this work, we have used a workflow-based LIMS (the Flux system) and the Together Workflow Editor as a framework to develop FluxCTTX, a tool for management of data from cytotoxicity assays performed at different laboratories. The main work is the development of a workflow that represents all stages of the assay and has been uploaded into Flux. This workflow models the activities of cytotoxicity assays performed as described in the OECD 129 Guidance Document. Conclusions: FluxCTTX presents a solution for the management of the data produced by cytotoxicity assays performed in interlaboratory comparisons. Its adoption will contribute to guaranteeing the quality of activities in the process of cytotoxicity testing and enforce the use of Good Laboratory Practices (GLP). Furthermore, the workflow developed is complete and can be adapted to other contexts and different tests for management of other types of data. PMID:26696462

  15. Preclinical Feasibility of a Technology Framework for MRI-guided Iliac Angioplasty

    PubMed Central

    Rube, Martin A.; Fernandez-Gutierrez, Fabiola; Cox, Benjamin F.; Holbrook, Andrew B.; Houston, J. Graeme; White, Richard D.; McLeod, Helen; Fatahi, Mahsa; Melzer, Andreas

    2015-01-01

    Purpose: Interventional MRI has significant potential for image guidance of iliac angioplasty and related vascular procedures. A technology framework with in-room image display, control, communication and MRI-guided intervention techniques was designed and tested for its potential to provide safe, fast and efficient MRI-guided angioplasty of the iliac arteries. Methods: A 1.5T MRI scanner was adapted for interactive imaging during endovascular procedures using new or modified interventional devices such as guidewires and catheters. A perfused vascular phantom was used for testing. Pre-, intra- and post-procedural visualization and measurement of vascular morphology and flow was implemented. A detailed analysis of X-Ray fluoroscopic angiography workflow was conducted and applied. Two interventional radiologists and one physician in training performed 39 procedures. All procedures were timed and analyzed. Results: MRI-guided iliac angioplasty procedures were successfully performed with progressive adaptation of techniques and workflow. The workflow, setup and protocol enabled a reduction in table time for a dedicated MRI-guided procedure to 6 min 33 s with a mean procedure time of 9 min 2 s, comparable to the mean procedure time of 8 min 42 s for the standard X-Ray guided procedure. Conclusions: MRI-guided iliac vascular interventions were found to be feasible and practical using this framework and optimized workflow. In particular, the real-time flow analysis was found to be helpful for pre- and post-interventional assessments. Design optimization of the catheters and in vivo experiments are required before clinical evaluation. PMID:25102933

  16. TU-A-201-01: Introduction to In-Room Imaging System Characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, J.

    2016-06-15

    Recent years have seen a widespread proliferation of available in-room image guidance systems for radiation therapy target localization, with many centers having multiple in-room options. In this session, available imaging systems for in-room IGRT will be reviewed, highlighting the main differences in workflow efficiency, targeting accuracy and image quality as it relates to target visualization. Decision-making strategies for integrating these tools into clinical image guidance protocols that are tailored to specific disease sites like H&N, lung, pelvis, and spine SBRT will be discussed. Learning Objectives: Major system characteristics of a wide range of available in-room imaging systems for IGRT; Advantages/disadvantages of different systems for site-specific IGRT considerations; Concepts of targeting accuracy and time efficiency in designing clinical imaging protocols.

  17. TU-A-201-02: Treatment Site-Specific Considerations for Clinical IGRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wijesooriya, K.

    2016-06-15

    Recent years have seen a widespread proliferation of available in-room image guidance systems for radiation therapy target localization, with many centers having multiple in-room options. In this session, available imaging systems for in-room IGRT will be reviewed, highlighting the main differences in workflow efficiency, targeting accuracy and image quality as it relates to target visualization. Decision-making strategies for integrating these tools into clinical image guidance protocols that are tailored to specific disease sites like H&N, lung, pelvis, and spine SBRT will be discussed. Learning Objectives: Major system characteristics of a wide range of available in-room imaging systems for IGRT; Advantages/disadvantages of different systems for site-specific IGRT considerations; Concepts of targeting accuracy and time efficiency in designing clinical imaging protocols.

  18. Hybrid surgical guidance based on the integration of radionuclear and optical technologies

    PubMed Central

    Valdés-Olmos, Renato; Buckle, Tessa; Vidal-Sicart, Sergi

    2016-01-01

    With the evolution of imaging technologies and tracers, the applications for nuclear molecular imaging are growing rapidly. For example, nuclear medicine is increasingly being used to guide surgical resections in complex anatomical locations. Here, a future workflow is envisioned that uses a combination of pre-operative diagnostics, navigation and intraoperative guidance. Radioguidance can provide means for pre-operative and intraoperative identification of “hot” lesions, forming the basis of a virtual data set that can be used for navigation. Luminescence guidance has shown great potential in the intraoperative setting by providing optical feedback, in some cases even in real time. Both of these techniques have distinct drawbacks, which include inaccuracy in areas that contain a background signal (radioactivity) or a limited degree of signal penetration (luminescence). We, and others, have reasoned that hybrid/multimodal approaches that integrate the use of these complementary modalities may help overcome their individual weaknesses. Ultimately, this will lead to advancement of the field of interventional molecular imaging/image-guided surgery. In this review, an overview of clinically applied hybrid surgical guidance technologies is given, whereby the focus is placed on tracers and hardware. PMID:26943463

  19. Childhood cancer survivor care: development of the Passport for Care.

    PubMed

    Poplack, David G; Fordis, Michael; Landier, Wendy; Bhatia, Smita; Hudson, Melissa M; Horowitz, Marc E

    2014-12-01

    Survivors of childhood cancer are at risk of long-term adverse effects and late effects of the disease and/or its treatment. In response to national recommendations to improve evidence-based follow-up care, a web-based support system for clinical decision making, the Passport for Care (PFC), was developed for use at the point of care to produce screening recommendations individualized to the survivor. To date, the PFC has been implemented in over half of the nearly 200 clinics affiliated with the Children's Oncology Group across the USA. Most clinician users report that the PFC has been integrated into clinic workflows, and that it fosters improved conversations with survivors about the potential late effects a survivor might experience and about the screening and/or behavioural interventions recommended to improve health status. Furthermore, clinicians using the PFC have indicated that they adhered more closely to follow-up care guidelines. Perspectives on the challenges encountered and lessons learned during the development and deployment of the PFC are reviewed and contrasted with other nationwide approaches to the provision of guidance on survivor follow-up care; furthermore, the implications for the care of childhood cancer survivors are discussed.

  20. Childhood cancer survivor care: development of the Passport for Care

    PubMed Central

    Poplack, David G.; Fordis, Michael; Landier, Wendy; Bhatia, Smita; Hudson, Melissa M.; Horowitz, Marc E.

    2016-01-01

    Survivors of childhood cancer are at risk of long-term adverse effects and late effects of the disease and/or its treatment. In response to national recommendations to improve evidence-based follow-up care, a web-based support system for clinical decision making, the Passport for Care (PFC), was developed for use at the point of care to produce screening recommendations individualized to the survivor. To date, the PFC has been implemented in over half of the nearly 200 clinics affiliated with the Children's Oncology Group across the USA. Most clinician users report that the PFC has been integrated into clinic workflows, and that it fosters improved conversations with survivors about the potential late effects a survivor might experience and about the screening and/or behavioural interventions recommended to improve health status. Furthermore, clinicians using the PFC have indicated that they adhered more closely to follow-up care guidelines. Perspectives on the challenges encountered and lessons learned during the development and deployment of the PFC are reviewed and contrasted with other nationwide approaches to the provision of guidance on survivor follow-up care; furthermore, the implications for the care of childhood cancer survivors are discussed. PMID:25348788

  1. Pervasive access to images and data--the use of computing grids and mobile/wireless devices across healthcare enterprises.

    PubMed

    Pohjonen, Hanna; Ross, Peeter; Blickman, Johan G; Kamman, Richard

    2007-01-01

    Emerging technologies are transforming workflows in healthcare enterprises. Computing grids and handheld mobile/wireless devices are providing clinicians with enterprise-wide access to all patient data and analysis tools on a pervasive basis. In this paper, emerging technologies are presented that provide computing grids and streaming-based access to image and data management functions, along with system architectures that enable pervasive computing on a cost-effective basis. Finally, the implications of such technologies for clinical workflows, and their positive impacts, are investigated.

  2. Toward Intraoperative Image-Guided Transoral Robotic Surgery

    PubMed Central

    Liu, Wen P.; Reaugamornrat, Sureerat; Deguet, Anton; Sorger, Jonathan M.; Siewerdsen, Jeffrey H.; Richmon, Jeremy; Taylor, Russell H.

    2014-01-01

    This paper presents the development and evaluation of video augmentation on the stereoscopic da Vinci S system with intraoperative image guidance for base of tongue tumor resection in transoral robotic surgery (TORS). The proposed workflow for image-guided TORS begins by identifying and segmenting critical oropharyngeal structures (e.g., the tumor and adjacent arteries and nerves) from preoperative computed tomography (CT) and/or magnetic resonance (MR) imaging. These preoperative planning data can be deformably registered to the intraoperative endoscopic view using mobile C-arm cone-beam computed tomography (CBCT) [1, 2]. Augmentation of the TORS endoscopic video to delineate surgical targets and critical structures has the potential to improve navigation, spatial orientation, and confidence in tumor resection. Experiments in animal specimens achieved statistically significant improvement in target localization error when comparing the proposed image guidance system to simulated current practice. PMID:25525474

  3. Integration of Earth System Models and Workflow Management under iRODS for the Northeast Regional Earth System Modeling Project

    NASA Astrophysics Data System (ADS)

    Lengyel, F.; Yang, P.; Rosenzweig, B.; Vorosmarty, C. J.

    2012-12-01

    The Northeast Regional Earth System Model (NE-RESM, NSF Award #1049181) integrates weather research and forecasting models, terrestrial and aquatic ecosystem models, a water balance/transport model, and mesoscale and energy systems input-output economic models developed by an interdisciplinary research team from academia and government with expertise in physics, biogeochemistry, engineering, energy, economics, and policy. NE-RESM is intended to forecast the implications of planning decisions on the region's environment, ecosystem services, energy systems, and economy through the 21st century. Integration of model components and the development of cyberinfrastructure for interacting with the system is facilitated with the integrated Rule-Oriented Data System (iRODS), a distributed data grid that provides archival storage with metadata facilities and a rule-based workflow engine for automating and auditing scientific workflows.

  4. Geomorphic process from topographic form: automating the interpretation of repeat survey data in river valleys

    USGS Publications Warehouse

    Kasprak, Alan; Caster, Joshua J.; Bangen, Sara G.; Sankey, Joel B.

    2017-01-01

    The ability to quantify the processes driving geomorphic change in river valley margins is vital to geomorphologists seeking to understand the relative role of transport mechanisms (e.g. fluvial, aeolian, and hillslope processes) in landscape dynamics. High-resolution, repeat topographic data are becoming readily available to geomorphologists. By contrasting digital elevation models derived from repeat surveys, the transport processes driving topographic changes can be inferred, a method termed ‘mechanistic segregation.’ Unfortunately, mechanistic segregation largely relies on subjective and time-consuming manual classification, which has implications both for its reproducibility and the practical scale of its application. Here we present a novel computational workflow for the mechanistic segregation of geomorphic transport processes in geospatial datasets. We apply the workflow to seven sites along the Colorado River in the Grand Canyon, where geomorphic transport is driven by a diverse suite of mechanisms. The workflow performs well when compared to field observations, with an overall predictive accuracy of 84% across 113 validation points. The approach most accurately predicts changes due to fluvial processes (100% accuracy) and aeolian processes (96%), with reduced accuracy in predictions of alluvial and colluvial processes (64% and 73%, respectively). Our workflow is designed to be applicable to a diversity of river systems and will likely provide a rapid and objective understanding of the processes driving geomorphic change at the reach and network scales. We anticipate that such an understanding will allow insight into the response of geomorphic transport processes to external forcings, such as shifts in climate, land use, or river regulation, with implications for process-based river management and restoration.
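
    The core of such a workflow is differencing co-registered digital elevation models (DEMs) and attaching a process label to each cell of significant change. A minimal Python sketch of that step follows; the detection threshold, the 25-degree slope split, and the class rules are illustrative assumptions, not the published classifier.

```python
# Sketch of DEM differencing plus a toy process classification.
# Thresholds and class rules are illustrative assumptions.
import numpy as np

def dem_of_difference(dem_new, dem_old, min_detect=0.1):
    """DEM of Difference (DoD): elevation change between surveys,
    with change below the minimum level of detection zeroed out."""
    dod = dem_new - dem_old
    dod[np.abs(dod) < min_detect] = 0.0
    return dod

def classify_change(dod, slope_deg, steep_deg=25.0):
    """Toy mechanistic labels: deposition vs. erosion, split by an
    assumed slope threshold separating hillslope from valley-floor
    (fluvial/aeolian) settings."""
    labels = np.full(dod.shape, "no change", dtype=object)
    steep = slope_deg > steep_deg
    labels[(dod > 0) & ~steep] = "fluvial/aeolian deposition"
    labels[(dod > 0) & steep] = "colluvial deposition"
    labels[(dod < 0) & ~steep] = "fluvial/aeolian erosion"
    labels[(dod < 0) & steep] = "hillslope erosion"
    return labels

# Two tiny synthetic surveys (elevations in meters)
old = np.zeros((3, 3))
new = np.array([[0.00, 0.30, 0.00],
                [-0.40, 0.00, 0.05],
                [0.20, -0.20, 0.00]])
slope = np.array([[5.0, 10.0, 30.0]] * 3)  # degrees
print(classify_change(dem_of_difference(new, old), slope))
```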

  5. Connecting proteins with drug-like compounds: Open source drug discovery workflows with BindingDB and KNIME

    PubMed Central

    Berthold, Michael R.; Hedrick, Michael P.; Gilson, Michael K.

    2015-01-01

    Today’s large, public databases of protein–small molecule interaction data are creating important new opportunities for data mining and integration. At the same time, new graphical user interface-based workflow tools offer facile alternatives to custom scripting for informatics and data analysis. Here, we illustrate how the large protein-ligand database BindingDB may be incorporated into KNIME workflows as a step toward the integration of pharmacological data with broader biomolecular analyses. Thus, we describe a collection of KNIME workflows that access BindingDB data via RESTful webservices and, for more intensive queries, via a local distillation of the full BindingDB dataset. We focus in particular on the KNIME implementation of knowledge-based tools to generate informed hypotheses regarding protein targets of bioactive compounds, based on notions of chemical similarity. A number of variants of this basic approach are tested for seven existing drugs with relatively ill-defined therapeutic targets, leading to replication of some previously confirmed results and discovery of new, high-quality hits. Implications for future development are discussed. Database URL: www.bindingdb.org PMID:26384374
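
    The similarity-based hypothesis generation these workflows implement can be illustrated outside KNIME. The sketch below uses RDKit Morgan fingerprints and Tanimoto similarity over a tiny stand-in ligand table; the compounds, the cutoff, and the table itself are placeholders for a local BindingDB extract, not the paper's actual KNIME nodes.

```python
# Rank known ligands by Tanimoto similarity to a query compound and
# propose their annotated protein targets as hypotheses. The ligand
# table is a tiny stand-in for a local BindingDB extract.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

LIGANDS = [  # (SMILES, annotated target) -- illustrative entries
    ("CC(=O)Oc1ccccc1C(=O)O", "COX-1"),        # aspirin
    ("CC(C)Cc1ccc(cc1)C(C)C(=O)O", "COX-2"),   # ibuprofen
    ("CN1CCC[C@H]1c1cccnc1", "nAChR"),         # nicotine
]

def fingerprint(smiles):
    return AllChem.GetMorganFingerprintAsBitVect(
        Chem.MolFromSmiles(smiles), 2, nBits=2048)

def target_hypotheses(query_smiles, cutoff=0.2):
    """Targets of ligands at least `cutoff` similar to the query
    (loose cutoff for this tiny demo), best match first."""
    query_fp = fingerprint(query_smiles)
    hits = [(target, DataStructs.TanimotoSimilarity(query_fp, fingerprint(s)))
            for s, target in LIGANDS]
    return sorted([(t, round(s, 2)) for t, s in hits if s >= cutoff],
                  key=lambda pair: -pair[1])

# Query with naproxen; the NSAID ligands should outrank nicotine
print(target_hypotheses("COc1ccc2cc(ccc2c1)C(C)C(=O)O"))
```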

  6. Factors leading to overutilisation of hospital pathology testing: the junior doctor.

    PubMed

    Ericksson, William; Bothe, Janine; Cheung, Heidi; Zhang, Kate; Kelly, Simone

    2017-05-25

    Objective: Pathology overutilisation is a significant issue affecting the quality and cost of health care. Because junior medical officers (JMOs) order most pathology tests in the hospital setting, the aim of the present study was to identify the main reasons for hospital pathology overutilisation from the perspective of the JMO. Methods: A qualitative method, using focus group methodology, was undertaken. Sixteen JMOs from two hospitals participated in three focus groups. Data were analysed using thematic analysis. Results: Three major themes contributed to overutilisation: the real and perceived expectations of senior colleagues, the level of JMO clinical experience and strategies to manage JMO workload around clinical systems. Within these themes, 12 subthemes were identified. Conclusions: Overutilisation of hospital pathology testing occurs when there are high social costs to JMOs for underordering, with little cost for overordering. Interventions should restore this balance through reframing overutilisation as both a costly and potentially harmful activity, promoting a supportive culture with regular senior guidance, and addressing clinical systems in which missed tests create an excessive workload. What is known about the topic? Mean overutilisation rates of pathology testing are reported to be as high as 44%. Although numerous studies have reported successful efforts to decrease hospital pathology overutilisation, no primary research was identified that examined the JMO perspective on this subject. What does this paper add? Clinical need is not the primary factor guiding the pathology-ordering decisions of junior practitioners; rather, medical team culture, limited JMO experience and systems factors have a significant role. What are the implications for practitioners? The social and behavioural determinants of pathology ordering must be considered to achieve appropriate pathology test utilisation. These include senior medical officer engagement, the guidance of JMOs and clinical workflows.

  7. Modeling workflow to design machine translation applications for public health practice

    PubMed Central

    Turner, Anne M.; Brownstein, Megumu K.; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin

    2014-01-01

    Objective: Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited English proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). Materials and Methods: We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. Results: The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. Discussion: This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary amongst these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. Conclusion: The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice. PMID:25445922

  8. The impact of missing sensor information on surgical workflow management.

    PubMed

    Liebmann, Philipp; Meixensberger, Jürgen; Wiedemann, Peter; Neumuth, Thomas

    2013-09-01

    Sensor systems in the operating room may encounter intermittent data losses that reduce the performance of surgical workflow management systems (SWFMS). Sensor data loss could impact SWFMS-based decision support, device parameterization, and information presentation. The purpose of this study was to understand the robustness of surgical process models when sensor information is partially missing. Changes in SWFMS behavior caused by incorrect or missing data from the sensor system that tracks the progress of a surgical intervention were tested. The individual surgical process models (iSPMs) from 100 cataract procedures performed by 3 ophthalmologic surgeons were used to select a randomized subset and create a generalized surgical process model (gSPM). A disjoint subset was selected from the iSPMs and used to simulate the surgical process against the gSPM. The loss of sensor data was simulated by removing some information from one task in the iSPM. The effect of missing sensor data was measured using several metrics: (a) successful relocation of the path in the gSPM, (b) the number of steps needed to find the converging point, and (c) the perspective with the highest occurrence of unsuccessful path findings. A gSPM built using 30% of the iSPMs successfully found the correct path in 90% of the cases. The most critical sensor data were the information regarding the instrument used by the surgeon. We found that using a gSPM to provide input data for a SWFMS is robust and can be accurate despite missing sensor data. A surgical workflow management system can provide the surgeon with workflow guidance in the OR for most cases. Sensor systems for surgical process tracking can be evaluated based on the stability and accuracy of functional and spatial operative results.
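
    The relocation test at the heart of this evaluation can be sketched as a graph problem: build the gSPM as a set of observed task transitions, delete one observation from a held-out sequence, and ask whether the remaining observations still trace a plausible path. The Python toy below assumes hypothetical cataract-surgery task names and a one-intermediate-step relocation rule; it illustrates the idea, not the authors' simulation code.

```python
# Build a generalized surgical process model (gSPM) as a transition
# graph from training sequences, then test whether a sequence with
# one dropped observation still follows known transitions.
from collections import defaultdict

def build_gspm(training_sequences):
    transitions = defaultdict(set)
    for seq in training_sequences:
        for a, b in zip(seq, seq[1:]):
            transitions[a].add(b)
    return transitions

def relocates(gspm, sequence, dropped_index):
    """Simulate losing the sensor reading at dropped_index and test
    whether the surrounding tasks still form a known transition,
    either directly or via any single intermediate task."""
    obs = sequence[:dropped_index] + sequence[dropped_index + 1:]
    for a, b in zip(obs, obs[1:]):
        direct = b in gspm.get(a, set())
        via_one = any(b in gspm.get(m, set()) for m in gspm.get(a, set()))
        if not (direct or via_one):
            return False
    return True

# Hypothetical cataract-procedure task sequences
train = [["incision", "phaco", "irrigation", "lens", "closing"],
         ["incision", "phaco", "lens", "closing"]]
gspm = build_gspm(train)
test = ["incision", "phaco", "irrigation", "lens", "closing"]
print(relocates(gspm, test, dropped_index=2))  # drop "irrigation" -> True
```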

  9. Streamlining Workflow for Endovascular Mechanical Thrombectomy: Lessons Learned from a Comprehensive Stroke Center.

    PubMed

    Wang, Hongjin; Thevathasan, Arthur; Dowling, Richard; Bush, Steven; Mitchell, Peter; Yan, Bernard

    2017-08-01

    Recently, 5 randomized controlled trials confirmed the superiority of endovascular mechanical thrombectomy (EMT) to intravenous thrombolysis in acute ischemic stroke with large-vessel occlusion. The implication is that our health systems would witness an increasing number of patients treated with EMT. However, in-hospital delays, leading to increased time to reperfusion, are associated with poor clinical outcomes. This review outlines the in-hospital workflow of the treatment of acute ischemic stroke at a comprehensive stroke center and the lessons learned in reduction of in-hospital delays. The in-hospital workflow for acute ischemic stroke was described from prehospital notification to femoral arterial puncture in preparation for EMT. A systematic review of the literature was also performed using PubMed. The implementation of workflow streamlining could result in reduction of in-hospital time delays for patients who were eligible for EMT. In particular, time-critical measures, including prehospital notification, the transfer of patients from door to computed tomography (CT) room, initiation of intravenous thrombolysis in the CT room, and the mobilization of the neurointervention team in parallel with thrombolysis, all contributed to reduction in time delays. We have identified issues resulting in in-hospital time delays and have reported possible solutions to improve workflow efficiencies. We believe that these measures may help stroke centers initiate an EMT service for eligible patients. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zoberi, J.

    Brachytherapy has proven to be an effective treatment option for prostate cancer. Initially, prostate brachytherapy was delivered through permanently implanted low dose rate (LDR) radioactive sources; however, high dose rate (HDR) temporary brachytherapy for prostate cancer is gaining popularity. Needle insertion during prostate brachytherapy is most commonly performed under ultrasound (U/S) guidance; however, treatment planning may be performed utilizing several imaging modalities in either an intra- or post-operative setting. During intra-operative prostate HDR, the needles are imaged during implantation, and planning may be performed in real time. At present, the most common imaging modality utilized for intra-operative prostate HDR is U/S. Alternatively, in the post-operative setting, following needle implantation, patients may be simulated with computed tomography (CT) or magnetic resonance imaging (MRI). Each imaging modality and workflow provides its share of benefits and limitations. Prostate HDR has been adopted in a number of cancer centers across the nation. In this educational session, we will explore the role of U/S, CT, and MRI in HDR prostate brachytherapy. Example workflows and operational details will be shared, and we will discuss how to establish a prostate HDR program in a clinical setting. Learning Objectives: Review prostate HDR techniques based on the imaging modality. Discuss the challenges and pitfalls introduced by the three image-based options for prostate HDR brachytherapy. Review the QA process and learn about the development of clinical workflows for these imaging options at different institutions.

  11. Wireless Mobile Technology to Improve Workflow and Feasibility of MR-Guided Percutaneous Interventions

    PubMed Central

    Rube, Martin A.; Holbrook, Andrew B.; Cox, Benjamin F.; Buciuc, Razvan; Melzer, Andreas

    2015-01-01

    Purpose: A wireless interactive display and control device combined with a platform-independent web-based User Interface (UI) was developed to improve the workflow for interventional Magnetic Resonance Imaging (iMRI). Methods: The iMRI-UI enables image acquisition of up to three independent slices using various pulse sequences with different contrast weighting. Pulse sequence, scan geometry, and related parameters can be changed on the fly via the iMRI-UI using a tablet computer for improved lesion detection and interventional device targeting. The iMRI-UI was validated for core biopsies with a liver phantom (n=40) and Thiel soft-embalmed human cadavers (n=24) in a clinical 1.5T MRI scanner. Results: The iMRI-UI components and setup were tested and found conditionally MRI-safe to use according to current ASTM standards. Despite minor temporary touchscreen interference at a close distance to the bore (<20 cm), no other issues regarding quality or imaging artefacts were observed. The 3D root-mean-square distance error was 2.8±1.0 mm (phantom) / 2.9±0.8 mm (cadaver), and overall procedure times ranged between 12–22 (phantom) / 20–55 minutes (cadaver). Conclusions: The wireless iMRI-UI control setup enabled fast and accurate interventional biopsy needle placements along complex trajectories and improved the workflow for percutaneous interventions under MRI guidance in a preclinical trial. PMID:25179151
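
    For reference, a 3D root-mean-square (RMS) distance error like the figures above is computed as follows; the coordinates in this sketch are invented placements, not study data.

```python
# 3D RMS distance error between planned and achieved needle positions.
import numpy as np

planned = np.array([[10.0, 20.0, 30.0], [15.0, 22.0, 28.0]])   # mm
achieved = np.array([[11.2, 19.1, 30.8], [14.1, 23.4, 27.2]])  # mm

per_needle = np.linalg.norm(achieved - planned, axis=1)  # 3D error per needle
rms = np.sqrt(np.mean(per_needle ** 2))
print(f"3D RMS error: {rms:.2f} mm")
```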

  12. Development of a high frequency single-element ultrasound needle transducer for anesthesia delivery

    NASA Astrophysics Data System (ADS)

    Ameri, Golafsoun; Son, Jungik; Liang, Jingwei; Foster, F. Stuart; Ganapathy, Sugantha; Peters, Terry M.

    2017-03-01

    Epidural anesthesia is one of the most commonly used and yet challenging techniques employed for pain management and anesthesia delivery. The major complications of this procedure are due to accidental dural puncture, with an incidence of 1-3%, which can lead to both temporary and permanent neurological complications. Needle placement under ultrasound (US) guidance has received increasing interest for improving needle placement accuracy. However, poor needle visibility in US, difficulty in displaying relevant anatomical structures such as the dura mater due to attenuation and bone shadowing, and image interpretation variability among users pose significant hurdles for any US guidance system. As a result, US guidance for epidural injections has not been widely adopted for everyday use in the performance of neuraxial blocks. The difficulties in localizing the ligamentum flavum and dura with respect to the needle tip can be addressed by integrating A-mode US, provided by a single-element transducer at the needle tip, into the B-mode US guidance system. We have taken the first steps towards providing such a guidance system. Our goal is to improve the safety of this procedure with minimal changes to the clinical workflow. This work presents the design and development of a 20 MHz single-element US transducer housed at the tip of a 19 G hypodermic tube, which can fit inside an epidural introducer needle. In addition, results from initial transducer characterization tests and performance evaluation of the transducer in a euthanized porcine model are provided.
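
    The A-mode measurement such a transducer provides reduces to a time-of-flight depth estimate, d = c·t/2. A minimal sketch, assuming the conventional 1540 m/s average soft-tissue sound speed:

```python
# Convert an A-mode echo's round-trip time to reflector depth.
C_TISSUE = 1540.0  # m/s, conventional soft-tissue average

def echo_depth_mm(round_trip_s):
    """Depth of a reflector (e.g., ligamentum flavum or dura) in
    millimeters, given the echo round-trip time in seconds."""
    return C_TISSUE * round_trip_s / 2.0 * 1000.0

# A reflector 6 mm ahead of the tip returns an echo after ~7.8 us
print(f"{echo_depth_mm(7.8e-6):.1f} mm")  # ~6.0 mm
```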

  13. Getting Closer: Workplace Guidance for Lifelong Learning

    ERIC Educational Resources Information Center

    Plant, Peter; Turner, Bob

    2005-01-01

    The purposes of this article are twofold. First, it considers the policy links between guidance and lifelong learning, highlighting in particular the implications of findings from a recent study by the Organisation for Economic Co-operation and Development (OECD). Secondly, it critically compares two approaches to workplace guidance about…

  14. Career Guidance and Public Mental Health

    ERIC Educational Resources Information Center

    Robertson, Peter J.

    2013-01-01

    Career guidance may have the potential to promote public health by contributing positively to both the prevention of mental health conditions and to population level well-being. The policy implications of this possibility have received little attention. Career guidance agencies are well placed to reach key target groups. Producing persuasive…

  15. Percutaneous needle placement using laser guidance: a practical solution

    NASA Astrophysics Data System (ADS)

    Xu, Sheng; Kapoor, Ankur; Abi-Jaoudeh, Nadine; Imbesi, Kimberly; Hong, Cheng William; Mazilu, Dumitru; Sharma, Karun; Venkatesan, Aradhana M.; Levy, Elliot; Wood, Bradford J.

    2013-03-01

    In interventional radiology, various navigation technologies have emerged aiming to improve the accuracy of device deployment and potentially the clinical outcomes of minimally invasive procedures. While these technologies' performance has been explored extensively, their impact on daily clinical practice remains undetermined due to the additional cost and complexity, modification of standard devices (e.g. electromagnetic tracking), and different levels of experience among physicians. Taking these factors into consideration, a robotic laser guidance system for percutaneous needle placement is developed. The laser guidance system projects a laser guide line onto the skin entry point of the patient, helping the physician to align the needle with the path planned on the preoperative CT scan. To minimize changes to the standard workflow, the robot is integrated with the CT scanner via optical tracking. As a result, no registration between the robot and CT is needed. The robot can compensate for the motion of the equipment and keep the laser guide line aligned with the biopsy path in real-time. Phantom experiments showed that the guidance system can benefit physicians at different skill levels, while clinical studies showed improved accuracy over conventional freehand needle insertion. The technology is safe, easy to use, and does not involve additional disposable costs. It is our expectation that this technology can be accepted by interventional radiologists for CT guided needle placement procedures.
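
    Geometrically, the projected laser line encodes the needle axis through the skin entry point toward the planned target. A minimal sketch of that computation, with invented scanner-space coordinates:

```python
# Compute the needle axis (and hence the laser guide line) from a
# planned entry point and lesion target. Coordinates are illustrative.
import numpy as np

entry = np.array([120.0, 85.0, 40.0])    # skin entry point (mm)
target = np.array([135.0, 110.0, 75.0])  # planned lesion target (mm)

axis = (target - entry) / np.linalg.norm(target - entry)  # unit direction

def point_on_guide_line(height_above_skin_mm):
    """Point on the laser line 'height' mm above the entry point,
    back along the needle axis (where the needle hub should sit)."""
    return entry - axis * height_above_skin_mm

print("needle axis:", np.round(axis, 3))
print("hub position at 50 mm:", np.round(point_on_guide_line(50.0), 1))
```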

  16. A guide to processing bat acoustic data for the North American Bat Monitoring Program (NABat)

    USGS Publications Warehouse

    Reichert, Brian; Lausen, Cori; Loeb, Susan; Weller, Ted; Allen, Ryan; Britzke, Eric; Hohoff, Tara; Siemers, Jeremy; Burkholder, Braden; Herzog, Carl; Verant, Michelle

    2018-06-14

    The North American Bat Monitoring Program (NABat) aims to improve the state of conservation science for all species of bats shared by the United States, Canada, and Mexico. To accomplish this goal, NABat offers guidance and standardized protocols for acoustic monitoring of bats. In this document, “A Guide to Processing Bat Acoustic Data for the North American Bat Monitoring Program (NABat),” we provide general recommendations and specific workflows for the process of identifying bat species from acoustic files recorded using the NABat stationary point and mobile transect acoustic monitoring protocols.

  17. How I do it-optimizing radiofrequency ablation in spinal metastases using iCT and navigation.

    PubMed

    Kavakebi, Pujan; Freyschlag, C F; Thomé, C

    2017-10-01

    Exact positioning of the radiofrequency ablation (RFA) probe for tumor treatment under fluoroscopic guidance can be difficult because of potentially small, inaccessible lesions; in addition, the radiation dose to the medical staff during RFA and vertebroplasty (VP) can be significantly high. We describe the technique and workflow of RFA in spinal metastases using iCT (intraoperative computed tomography) and 3D-navigation-based probe placement, followed by VP. RFA and VP can be successfully combined with iCT-based navigation, which leads to a reduction of radiation exposure to the staff and optimal probe positioning due to 3D navigation.

  18. Cancer Diagnosis Epigenomics Scientific Workflow Scheduling in the Cloud Computing Environment Using an Improved PSO Algorithm

    PubMed

    Sadhasivam, N; Balamurugan, R; Pandi, M

    2018-01-27

    Objective: Epigenetic modifications involving DNA methylation and histone status are responsible for the stable maintenance of cellular phenotypes. Abnormalities may be causally involved in cancer development and therefore could have diagnostic potential. The field of epigenomics refers to all epigenetic modifications implicated in control of gene expression, with a focus on better understanding of human biology in both normal and pathological states. An epigenomics scientific workflow is essentially a data processing pipeline to automate the execution of various genome sequencing operations or tasks. The cloud platform is a popular computing platform for deploying large-scale epigenomics scientific workflows. Its dynamic environment provides various resources to scientific users on a pay-per-use billing model. Scheduling epigenomics scientific workflow tasks is a complicated problem in the cloud platform. We here focused on application of an improved particle swarm optimization (IPSO) algorithm for this purpose. Methods: The IPSO algorithm was applied to find suitable resources and allocate epigenomics tasks so that the total cost was minimized for detection of epigenetic abnormalities of potential application for cancer diagnosis. Results: The results showed that IPSO-based task-to-resource mapping reduced total cost by 6.83 percent as compared to the traditional PSO algorithm. Conclusion: The results for various cancer diagnosis tasks showed that IPSO-based task-to-resource mapping can achieve better costs when compared to PSO-based mapping for epigenomics scientific application workflows.
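
    The underlying optimization can be sketched with plain (non-improved) PSO: each particle encodes a task-to-resource assignment, and fitness is total execution cost. The loads, speeds, and prices below are invented, and this is generic PSO rather than the paper's IPSO variant:

```python
# Plain PSO for task-to-resource mapping; fitness is total cost.
import random

TASK_LOAD = [8, 3, 5, 9, 2, 7]   # work units per task
RES_SPEED = [1.0, 2.0, 4.0]      # units processed per hour
RES_PRICE = [0.10, 0.25, 0.60]   # dollars per hour

def cost(assign):
    # (a real scheduler would also weigh makespan/deadlines; cost only here)
    return sum(TASK_LOAD[t] / RES_SPEED[r] * RES_PRICE[r]
               for t, r in enumerate(assign))

def decode(x):
    """Round a continuous position to a valid resource index per task."""
    top = len(RES_SPEED) - 1
    return [min(top, max(0, int(round(v)))) for v in x]

def pso(n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    dim = len(TASK_LOAD)
    X = [[random.uniform(0, len(RES_SPEED) - 1) for _ in range(dim)]
         for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    gbest = min(pbest, key=lambda x: cost(decode(x)))[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (pbest[i][d] - X[i][d])
                           + c2 * random.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            if cost(decode(X[i])) < cost(decode(pbest[i])):
                pbest[i] = X[i][:]
                if cost(decode(X[i])) < cost(decode(gbest)):
                    gbest = X[i][:]
    return decode(gbest), cost(decode(gbest))

assignment, total = pso()
print(f"task -> resource: {assignment}, total cost: ${total:.2f}")
```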

  19. NASA Langley Atmospheric Science Data Center (ASDC) Experience with Aircraft Data

    NASA Astrophysics Data System (ADS)

    Perez, J.; Sorlie, S.; Parker, L.; Mason, K. L.; Rinsland, P.; Kusterer, J.

    2011-12-01

    Over the past decade the NASA Langley ASDC has archived and distributed a variety of aircraft mission data sets. These datasets posed unique challenges for archiving, from the rigidity of the archiving system and formats to the lack of metadata. The ASDC developed a state-of-the-art data archive and distribution system to serve the atmospheric sciences data provider and researcher communities. The system, called Archive - Next Generation (ANGe), is designed with a distributed, multi-tier, service-based, message-oriented architecture enabling new methods for searching, accessing, and customizing data. The ANGe system provides the ease and flexibility to ingest and archive aircraft data through an ad hoc workflow or to develop a new workflow to suit the provider's needs. The ASDC will describe the challenges encountered in preparing aircraft data for archiving and distribution. The ASDC is currently providing guidance to the DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality) Earth Venture-1 project on developing collection, granule, and browse metadata as well as supporting the ADAM (Airborne Data For Assessing Models) site.

  20. Economic, Educational, and Personal Implications of Implementing Computerized Guidance Information Systems. Information System for Vocational Decisions.

    ERIC Educational Resources Information Center

    Tiedeman, David V.

    The author asserts that financial support of guidance activities, the job of the counselor, and counselors themselves will all have to change if computerized guidance support systems are to come into widespread use. The potential costs, benefits, and operating economics are discussed. Needed educational reorganization is dealt with on several…

  1. A versatile mathematical work-flow to explore how Cancer Stem Cell fate influences tumor progression.

    PubMed

    Fornari, Chiara; Balbo, Gianfranco; Halawani, Sami M; Ba-Rukab, Omar; Ahmad, Ab Rahman; Calogero, Raffaele A; Cordero, Francesca; Beccuti, Marco

    2015-01-01

    Nowadays, multidisciplinary approaches combining mathematical models with experimental assays are becoming relevant for the study of biological systems. Indeed, in cancer research multidisciplinary approaches are successfully used to understand the crucial aspects implicated in tumor growth. In particular, Cancer Stem Cell (CSC) biology represents an area particularly suited to study through multidisciplinary approaches, and modeling has significantly contributed to pinpointing the crucial aspects implicated in this theory. More generally, to acquire new insights on a biological system it is necessary to have an accurate description of the phenomenon, so that making accurate predictions on its future behaviors becomes more likely. In this context, the identification of the parameters influencing model dynamics can be advantageous to increase model accuracy and to provide hints for designing wet experiments. Different techniques, ranging from statistical methods to analytical studies, have been developed. Their applications depend on case-specific aspects, such as the availability and quality of experimental data and the dimension of the parameter space. The study of a new model of CSC-based tumor progression motivated the design of a new work-flow that helps to characterize possible system dynamics and to identify the parameters influencing such behaviors. In detail, we extended our recent model of CSC dynamics, creating a new system capable of describing tumor growth during the different stages of cancer progression. Indeed, tumor cells appear to progress through lineage stages like those of normal tissues, with their division auto-regulated by internal feedback mechanisms. These new features introduced some non-linearities into the model, making it more difficult to study by analytical techniques alone. Our new work-flow, based on statistical methods, was used to identify the parameters which influence tumor growth. The effectiveness of the presented work-flow was first verified on two well-known models and then applied to investigate our extended CSC model. We propose a new work-flow to study complex systems in a practical and informative way, allowing an easy identification, interpretation, and visualization of the key model parameters. Our methodology is useful to investigate possible model behaviors and to establish the factors driving model dynamics. Analyzing our new CSC model guided by the proposed work-flow, we found that deregulation of CSC asymmetric proliferation contributes to cancer initiation, in accordance with several lines of experimental evidence. Specifically, model results indicated that the probability of CSC symmetric proliferation is responsible for a switching-like behavior that discriminates between tumorigenesis and unsustainable tumor growth.
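
    The screening step of such a work-flow can be illustrated with a toy two-compartment growth model: sweep the CSC symmetric-division probability and watch for the switching-like behavior the authors describe. The equations and rates below are illustrative, not the published model:

```python
# Toy CSC/progenitor model: a dividing CSC either self-renews
# symmetrically (prob p_sym, net +1 CSC) or differentiates
# symmetrically (prob 1 - p_sym, net -1 CSC, +2 progenitors).
def final_size(p_sym, days=200, dt=0.1):
    csc, prog = 10.0, 100.0      # initial cell counts
    div, death = 0.5, 0.4        # per-day division / progenitor death rates
    for _ in range(int(days / dt)):
        d_csc = div * csc * (2 * p_sym - 1)            # net CSC change
        d_prog = div * csc * 2 * (1 - p_sym) - death * prog
        csc = max(csc + d_csc * dt, 0.0)
        prog = max(prog + d_prog * dt, 0.0)
    return csc + prog

# The system switches from bounded to runaway growth around p_sym = 0.5
for p in (0.40, 0.48, 0.50, 0.52, 0.60):
    print(f"p_sym={p:.2f} -> final tumor size {final_size(p):,.0f}")
```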

  2. MO-B-BRC-01: Introduction [Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prisciandaro, J.

    2016-06-15

    Brachytherapy has proven to be an effective treatment option for prostate cancer. Initially, prostate brachytherapy was delivered through permanently implanted low dose rate (LDR) radioactive sources; however, high dose rate (HDR) temporary brachytherapy for prostate cancer is gaining popularity. Needle insertion during prostate brachytherapy is most commonly performed under ultrasound (U/S) guidance; however, treatment planning may be performed utilizing several imaging modalities in either an intra- or post-operative setting. During intra-operative prostate HDR, the needles are imaged during implantation, and planning may be performed in real time. At present, the most common imaging modality utilized for intra-operative prostate HDR is U/S. Alternatively, in the post-operative setting, following needle implantation, patients may be simulated with computed tomography (CT) or magnetic resonance imaging (MRI). Each imaging modality and workflow provides its share of benefits and limitations. Prostate HDR has been adopted in a number of cancer centers across the nation. In this educational session, we will explore the role of U/S, CT, and MRI in HDR prostate brachytherapy. Example workflows and operational details will be shared, and we will discuss how to establish a prostate HDR program in a clinical setting. Learning Objectives: Review prostate HDR techniques based on the imaging modality. Discuss the challenges and pitfalls introduced by the three image-based options for prostate HDR brachytherapy. Review the QA process and learn about the development of clinical workflows for these imaging options at different institutions.

  3. MO-B-BRC-04: MRI-Based Prostate HDR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mourtada, F.

    2016-06-15

    Brachytherapy has proven to be an effective treatment option for prostate cancer. Initially, prostate brachytherapy was delivered through permanently implanted low dose rate (LDR) radioactive sources; however, high dose rate (HDR) temporary brachytherapy for prostate cancer is gaining popularity. Needle insertion during prostate brachytherapy is most commonly performed under ultrasound (U/S) guidance; however, treatment planning may be performed utilizing several imaging modalities in either an intra- or post-operative setting. During intra-operative prostate HDR, the needles are imaged during implantation, and planning may be performed in real time. At present, the most common imaging modality utilized for intra-operative prostate HDR is U/S. Alternatively, in the post-operative setting, following needle implantation, patients may be simulated with computed tomography (CT) or magnetic resonance imaging (MRI). Each imaging modality and workflow provides its share of benefits and limitations. Prostate HDR has been adopted in a number of cancer centers across the nation. In this educational session, we will explore the role of U/S, CT, and MRI in HDR prostate brachytherapy. Example workflows and operational details will be shared, and we will discuss how to establish a prostate HDR program in a clinical setting. Learning Objectives: Review prostate HDR techniques based on the imaging modality. Discuss the challenges and pitfalls introduced by the three image-based options for prostate HDR brachytherapy. Review the QA process and learn about the development of clinical workflows for these imaging options at different institutions.

  4. MO-B-BRC-00: Prostate HDR Treatment Planning - Considering Different Imaging Modalities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15

    Brachytherapy has proven to be an effective treatment option for prostate cancer. Initially, prostate brachytherapy was delivered through permanently implanted low dose rate (LDR) radioactive sources; however, high dose rate (HDR) temporary brachytherapy for prostate cancer is gaining popularity. Needle insertion during prostate brachytherapy is most commonly performed under ultrasound (U/S) guidance; however, treatment planning may be performed utilizing several imaging modalities in either an intra- or post-operative setting. During intra-operative prostate HDR, the needles are imaged during implantation, and planning may be performed in real time. At present, the most common imaging modality utilized for intra-operative prostate HDR is U/S. Alternatively, in the post-operative setting, following needle implantation, patients may be simulated with computed tomography (CT) or magnetic resonance imaging (MRI). Each imaging modality and workflow provides its share of benefits and limitations. Prostate HDR has been adopted in a number of cancer centers across the nation. In this educational session, we will explore the role of U/S, CT, and MRI in HDR prostate brachytherapy. Example workflows and operational details will be shared, and we will discuss how to establish a prostate HDR program in a clinical setting. Learning Objectives: Review prostate HDR techniques based on the imaging modality. Discuss the challenges and pitfalls introduced by the three image-based options for prostate HDR brachytherapy. Review the QA process and learn about the development of clinical workflows for these imaging options at different institutions.

  5. MO-B-BRC-02: Ultrasound Based Prostate HDR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Z.

    2016-06-15

    Brachytherapy has proven to be an effective treatment option for prostate cancer. Initially, prostate brachytherapy was delivered through permanently implanted low dose rate (LDR) radioactive sources; however, high dose rate (HDR) temporary brachytherapy for prostate cancer is gaining popularity. Needle insertion during prostate brachytherapy is most commonly performed under ultrasound (U/S) guidance; however, treatment planning may be performed utilizing several imaging modalities in either an intra- or post-operative setting. During intra-operative prostate HDR, the needles are imaged during implantation, and planning may be performed in real time. At present, the most common imaging modality utilized for intra-operative prostate HDR is U/S. Alternatively, in the post-operative setting, following needle implantation, patients may be simulated with computed tomography (CT) or magnetic resonance imaging (MRI). Each imaging modality and workflow provides its share of benefits and limitations. Prostate HDR has been adopted in a number of cancer centers across the nation. In this educational session, we will explore the role of U/S, CT, and MRI in HDR prostate brachytherapy. Example workflows and operational details will be shared, and we will discuss how to establish a prostate HDR program in a clinical setting. Learning Objectives: Review prostate HDR techniques based on the imaging modality. Discuss the challenges and pitfalls introduced by the three image-based options for prostate HDR brachytherapy. Review the QA process and learn about the development of clinical workflows for these imaging options at different institutions.

  6. Development and validation of a new guidance device for lateral approach stereotactic breast biopsy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, K.; Kornecki, A.; Bax, J.

    2009-06-15

    Stereotactic breast biopsy (SBB) is the gold standard for minimally invasive breast cancer diagnosis. Current systems rely on one of two methods for needle insertion: a vertical approach (perpendicular to the breast compression plate) or a lateral approach (parallel to the compression plate). While the vertical approach is more frequently used, it is not feasible in patients with thin breasts (<3 cm thick after compression) or with superficial lesions. Further, existing SBB guidance hardware provides at most one degree of rotational freedom in the needle trajectory, and as such requires a separate skin incision for each biopsy target. The authors present a new design of lateral guidance device for SBB, which addresses the limitations of the vertical approach and provides improvements over the existing lateral guidance hardware. Specifically, the new device provides (1) an adjustable rigid needle support to minimize needle deflection within the breast and (2) an additional degree of rotational freedom in the needle trajectory, allowing the radiologist to sample multiple targets through a single skin incision. This device was compared to a commercial lateral guidance device in a series of phantom experiments. Needle placement error using each device was measured in agar phantoms for needle insertions at lateral depths of 2 and 5 cm. The biopsy success rate for each device was then estimated by performing biopsy procedures in commercial SBB phantoms. SBB performed with the new lateral guidance device provided reduced needle placement error relative to the commercial lateral guidance device (0.89±0.22 vs 1.75±0.35 mm for targets at 2 cm depth; 1.94±0.20 vs 3.21±0.31 mm for targets at 5 cm depth). The new lateral guidance device also provided improved biopsy accuracy in SBB procedures compared to the commercial lateral guidance device (100% vs 58% success rate). Finally, experiments were performed to demonstrate that the new device can accurately sample lesions within thin breast phantoms and multiple lesions through a single incision point. This device can be incorporated directly into the clinical SBB procedural workflow, with no additional electrical hardware, software, postprocessing, or image analysis.

  7. First clinical use of the EchoTrack guidance approach for radiofrequency ablation of thyroid gland nodules.

    PubMed

    Franz, Alfred Michael; Seitel, Alexander; Bopp, Nasrin; Erbelding, Christian; Cheray, Dominique; Delorme, Stefan; Grünwald, Frank; Korkusuz, Hüdayi; Maier-Hein, Lena

    2017-06-01

    Percutaneous radiofrequency ablation (RFA) of thyroid nodules is an alternative to surgical resection that offers the benefits of minimal scars for the patient, lower complication rates, and shorter treatment times. Ultrasound (US) is the preferred modality for guiding these procedures. The needle is usually kept within the US scanning plane to ensure needle visibility. However, this restricts flexibility in both transducer and needle movement and renders the procedure difficult, especially for inexperienced users. Existing navigation solutions often involve electromagnetic (EM) tracking, which requires placement of an external field generator (FG) in close proximity of the intervention site in order to avoid distortion of the EM field. This complicates the clinical workflow as placing the FG while ensuring that it neither restricts the physician's workspace nor affects tracking accuracy is awkward and time-consuming. The EchoTrack concept overcomes these issues by combining the US probe and the EM FG in one modality, simultaneously providing both real-time US and tracking data without requiring the placement of an external FG for tracking. We propose a system and workflow to use EchoTrack for RFA of thyroid nodules. According to our results, the overall error of the EchoTrack system resulting from errors related to tracking and calibration is below 2 mm. Navigated thyroid RFA with the proposed concept is clinically feasible. Motion of internal critical structures relative to external markers can be up to several millimeters in extreme cases. The EchoTrack concept with its simple setup, flexibility, improved needle visualization, and additional guidance information has high potential to be clinically used for thyroid RFA.

  8. Future of medical physics: Real-time MRI-guided proton therapy.

    PubMed

    Oborn, Bradley M; Dowdell, Stephen; Metcalfe, Peter E; Crozier, Stuart; Mohan, Radhe; Keall, Paul J

    2017-08-01

    With the recent clinical implementation of real-time MRI-guided x-ray beam therapy (MRXT), attention is turning to the concept of combining real-time MRI guidance with proton beam therapy: MRI-guided proton beam therapy (MRPT). MRI guidance for proton beam therapy is expected to offer a compelling improvement to the current treatment workflow, arguably even more so than for x-ray beam therapy. This argument is born of the fact that proton therapy toxicity outcomes are similar to those of the most advanced IMRT treatments, despite the proton being a fundamentally superior particle for cancer treatment. In this Future of Medical Physics article, we describe the various software and hardware aspects of potential MRPT systems and the corresponding treatment workflow. Significant software developments, particularly focused around adaptive MRI-based planning, will be required. The magnetic interaction between the MRI and the proton beamline components will be a key area of focus, for example, the modeling and potential redesign of a magnetically compatible gantry to allow for beam delivery from multiple angles towards a patient located within the bore of an MRI scanner. Further to this, the accuracy of pencil beam scanning and beam monitoring in the presence of an MRI fringe field will require modeling, testing, and potentially further development to ensure that highly targeted radiotherapy is maintained. Looking forward, we envisage a clear and accelerated path for hardware development, leveraging lessons learnt from MRXT development. Within a few years, simple prototype systems will likely exist, and in a decade we could envisage coupled systems with integrated gantries. Such milestones will be key in the development of a more efficient, more accurate, and more successful form of proton beam therapy for many common cancer sites. © 2017 American Association of Physicists in Medicine.
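
    The scale of the beam-field interaction can be estimated from textbook physics: a proton's radius of curvature in a transverse field is r = p/(qB), with relativistic momentum p. The sketch below evaluates this for an assumed 200 MeV beam in a 0.5 T field; the numbers are illustrative, not a system specification:

```python
# Radius of curvature r = p/(qB) for a therapeutic proton in a
# transverse magnetic field, using relativistic momentum.
import math

M_P_MEV = 938.272                     # proton rest energy, MeV
Q = 1.602176634e-19                   # elementary charge, C
MEV_C_TO_SI = 1e6 * Q / 299792458.0   # MeV/c -> kg*m/s

def gyroradius_m(kinetic_mev, b_tesla):
    p_mev_c = math.sqrt((kinetic_mev + M_P_MEV) ** 2 - M_P_MEV ** 2)
    return p_mev_c * MEV_C_TO_SI / (Q * b_tesla)

# A 200 MeV proton in a 0.5 T transverse field curves on a ~4.3 m
# radius, so even fringe fields deflect a pencil beam enough to matter.
print(f"r = {gyroradius_m(200.0, 0.5):.2f} m")
```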

  9. Implications of Weltanschauungen for Value Formulation by Guidance Personnel

    ERIC Educational Resources Information Center

    Weinstock, Henry R.; O'Dowd, Peter S.

    1970-01-01

    Examines two divergent world-views (empiricism and rationalism) of human values, presenting definitions from many philosophers. Anticipates possibility of confluence of beliefs on nature of man which may offer guidance personnel more accurate tools for direction. (CJ)

  10. Visual tracking for multi-modality computer-assisted image guidance

    NASA Astrophysics Data System (ADS)

    Basafa, Ehsan; Foroughi, Pezhman; Hossbach, Martin; Bhanushali, Jasmine; Stolka, Philipp

    2017-03-01

    With optical cameras, many interventional navigation tasks previously relying on EM, optical, or mechanical guidance can be performed robustly, quickly, and conveniently. We developed a family of novel guidance systems based on wide-spectrum cameras and vision algorithms for real-time tracking of interventional instruments and multi-modality markers. These navigation systems support the localization of anatomical targets, support placement of imaging probe and instruments, and provide fusion imaging. The unique architecture - low-cost, miniature, in-hand stereo vision cameras fitted directly to imaging probes - allows for an intuitive workflow that fits a wide variety of specialties such as anesthesiology, interventional radiology, interventional oncology, emergency medicine, urology, and others, many of which see increasing pressure to utilize medical imaging and especially ultrasound, but have yet to develop the requisite skills for reliable success. We developed a modular system, consisting of hardware (the Optical Head containing the mini cameras) and software (components for visual instrument tracking with or without specialized visual features, fully automated marker segmentation from a variety of 3D imaging modalities, visual observation of meshes of widely separated markers, instant automatic registration, and target tracking and guidance on real-time multi-modality fusion views). From these components, we implemented a family of distinct clinical and pre-clinical systems (for combinations of ultrasound, CT, CBCT, and MRI), most of which have international regulatory clearance for clinical use. We present technical and clinical results on phantoms, ex- and in-vivo animals, and patients.
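
    The "instant automatic registration" such systems perform is, at its core, a rigid point-set alignment between marker positions segmented from the 3D image and the same markers seen by the cameras. A minimal Kabsch-style sketch with synthetic marker coordinates (the method is a standard one, not necessarily this particular product's algorithm):

```python
# Least-squares rigid registration (Kabsch) of image-space markers
# to camera-space markers. Marker coordinates are synthetic.
import numpy as np

def kabsch(src, dst):
    """Rigid transform (R, t) minimizing ||R @ src_i + t - dst_i||."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Markers segmented from CT (mm), and the same markers seen by the
# cameras after an unknown rotation + translation:
markers_ct = np.array([[0., 0., 0.], [50., 0., 0.], [0., 40., 0.], [0., 0., 30.]])
theta = np.radians(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.],
                   [np.sin(theta),  np.cos(theta), 0.],
                   [0., 0., 1.]])
markers_cam = markers_ct @ R_true.T + np.array([5., -3., 12.])

R, t = kabsch(markers_ct, markers_cam)
residual = np.linalg.norm(markers_ct @ R.T + t - markers_cam, axis=1).max()
print(f"max registration residual: {residual:.2e} mm")  # ~0 for exact data
```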

  11. A 3D visualization and guidance system for handheld optical imaging devices

    NASA Astrophysics Data System (ADS)

    Azar, Fred S.; de Roquemaurel, Benoit; Cerussi, Albert; Hajjioui, Nassim; Li, Ang; Tromberg, Bruce J.; Sauer, Frank

    2007-03-01

    We have developed a novel 3D visualization and guidance system for handheld optical imaging devices. In this paper, the system is applied to measurements of breast/cancerous tissue optical properties using a handheld diffuse optical spectroscopy (DOS) instrument. The combined guidance system/DOS instrument becomes particularly useful for monitoring neoadjuvant chemotherapy in breast cancer patients and for longitudinal studies where measurement reproducibility is critical. The system uses relatively inexpensive hardware components and comprises a 6 degrees-of-freedom (DOF) magnetic tracking device including a DC field generator, three sensors, and a PCI card running on a PC workstation. A custom-built virtual environment combined with a well-defined workflow provides the means for image-guided measurements, improved longitudinal studies of breast optical properties, 3D reconstruction of optical properties within the anatomical map, and serial data registration. The DOS instrument characterizes tissue function such as water, lipid, and total hemoglobin concentration. The patient lies on her back at a 45-degree angle. Each spectral measurement requires consistent contact with the skin and lasts about 5-10 seconds. Therefore a limited number of positions may be studied. In a reference measurement session, the physician acquires surface points on the breast. A Delaunay-based triangulation algorithm is used to build the virtual breast surface from the acquired points. 3D locations of all DOS measurements are recorded. All subsequently acquired surfaces are automatically registered to the reference surface, thus allowing measurement reproducibility through image guidance using the reference measurements.
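
    The surface-building step can be sketched with SciPy's Delaunay triangulation over the acquired points, here projected to the xy plane as for a roughly dome-like surface; the points are synthetic, not patient data:

```python
# Build a triangulated surface mesh from tracked stylus points.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
xy = rng.uniform(-60, 60, size=(40, 2))             # acquired points (mm)
z = 30.0 * np.exp(-(xy ** 2).sum(axis=1) / 2000.0)  # dome-like surface height
points = np.column_stack([xy, z])

tri = Delaunay(points[:, :2])   # triangulate in the xy plane
print(f"{len(points)} surface points -> {len(tri.simplices)} triangles")
# tri.simplices indexes rows of 'points'; together they define the
# reference mesh to which later measurement sessions are registered.
```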

  12. Information Security: Governmentwide Guidance Needed to Assist Agencies in Implementing Cloud Computing

    DTIC Science & Technology

    2010-07-01

    Cloud computing, an emerging form of computing in which users have access to scalable, on-demand capabilities that are provided through Internet... cloud computing, (2) the information security implications of using cloud computing services in the Federal Government, and (3) federal guidance and efforts to address information security when using cloud computing. The complete report is titled Information Security: Federal Guidance Needed to

  13. Interventional-Cardiovascular MR: Role of the Interventional MR Technologist

    PubMed Central

    Mazal, Jonathan R; Rogers, Toby; Schenke, William H; Faranesh, Anthony Z; Hansen, Michael; O’Brien, Kendall; Ratnayaka, Kanishka; Lederman, Robert J

    2016-01-01

    Background: Interventional-cardiovascular magnetic resonance (iCMR) is a promising clinical tool for adults and children who need a comprehensive hemodynamic catheterization of the heart. Magnetic resonance (MR) imaging-guided cardiac catheterization offers radiation-free examination with increased soft tissue contrast and unconstrained imaging planes for catheter guidance. The interventional MR technologist plays an important role in the care of patients undergoing such procedures. It is therefore helpful for technologists to understand the unique iCMR preprocedural preparation, procedural and imaging workflows, and management of emergencies. The authors report their team's experience from the National Institutes of Health Clinical Center and a collaborating pediatric site. PMID:26721838

  14. Seven [Data] Habits of Highly Successful Researchers

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Shepherd, A.; Saito, M. A.; Wiebe, P. H.; Ake, H.; Biddle, M.; Copley, N. J.; Rauch, S.; Switzer, M. E.; York, A.

    2017-12-01

    Navigating the landscape of open science and data sharing can be daunting for the long-tail scientist. From satisfying funder requirements, and ensuring proper attribution for their work, to determining the best repository for data management and archive, there are several facets to be considered. Yet, there is no single source of guidance for investigators who may be using multiple research funding models. What role can existing repositories play to help facilitate a more effective data sharing workflow? The Biological and Chemical Oceanographic Data Management Office (BCO-DMO) is a domain-specific repository occupying the niche between funder and investigator. The office works closely with its stakeholders to develop and provide guidance, services, and tools that assist researchers in meeting their data sharing needs, from determining if BCO-DMO is the appropriate repository to manage an investigator's project data, to ensuring that the investigator is able to fulfill funder requirements. The goal is to relieve the investigator of the more difficult aspects of data management and data sharing, while simultaneously educating them in better data management practices that will streamline the process of conducting open research in the future. This presentation will provide an overview of the BCO-DMO repository, highlighting some of the services and guidance the office provides to its community.

  15. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    PubMed Central

    Ho, Jonhan; Aridor, Orly; Parwani, Anil V.

    2012-01-01

    Background: For decades, anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users’ needs and is utilized for collecting, interpreting, and aggregating detailed aspects of work. Objective: Contextual inquiry was utilized to document current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists’ needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. (1998). Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchical organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories, including technology, communication, synthesis/preparation, organization, and workflow. Current AP workflow was labor intensive and lacked scalability. A large number of processes that may improve following the introduction of digital pathology systems were identified. These work processes included case management, case examination and review, and final case reporting. Furthermore, a digital slide system should integrate with the anatomic pathology laboratory information system. Conclusions: To our knowledge, this is the first study that utilized the contextual inquiry method to document AP workflow. Findings were used to establish key requirements for the design of digital pathology systems. PMID:23243553

  16. Critical care physician cognitive task analysis: an exploratory study

    PubMed Central

    Fackler, James C; Watts, Charles; Grome, Anna; Miller, Thomas; Crandall, Beth; Pronovost, Peter

    2009-01-01

    Introduction For better or worse, the imposition of work-hour limitations on house-staff has imperiled continuity and/or improved decision-making. Regardless, the workflow of every physician team in every academic medical centre has been irrevocably altered. We explored the use of cognitive task analysis (CTA) techniques, most commonly used in other high-stress and time-sensitive environments, to analyse key cognitive activities in critical care medicine. The study objective was to assess the usefulness of CTA as an analytical tool in order that physician cognitive tasks may be understood and redistributed within the work-hour limited medical decision-making teams. Methods After approval from each Institutional Review Board, two intensive care units (ICUs) within major university teaching hospitals served as data collection sites for CTA observations and interviews of critical care providers. Results Five broad categories of cognitive activities were identified: pattern recognition; uncertainty management; strategic vs. tactical thinking; team coordination and maintenance of common ground; and creation and transfer of meaning through stories. Conclusions CTA within the framework of Naturalistic Decision Making is a useful tool to understand the critical care process of decision-making and communication. The separation of strategic and tactical thinking has implications for workflow redesign. Given the global push for work-hour limitations, such workflow redesign is occurring. Further work with CTA techniques will provide important insights toward rational, rather than random, workflow changes. PMID:19265517

  17. Critical care physician cognitive task analysis: an exploratory study.

    PubMed

    Fackler, James C; Watts, Charles; Grome, Anna; Miller, Thomas; Crandall, Beth; Pronovost, Peter

    2009-01-01

    For better or worse, the imposition of work-hour limitations on house-staff has imperiled continuity and/or improved decision-making. Regardless, the workflow of every physician team in every academic medical centre has been irrevocably altered. We explored the use of cognitive task analysis (CTA) techniques, most commonly used in other high-stress and time-sensitive environments, to analyse key cognitive activities in critical care medicine. The study objective was to assess the usefulness of CTA as an analytical tool in order that physician cognitive tasks may be understood and redistributed within the work-hour limited medical decision-making teams. After approval from each Institutional Review Board, two intensive care units (ICUs) within major university teaching hospitals served as data collection sites for CTA observations and interviews of critical care providers. Five broad categories of cognitive activities were identified: pattern recognition; uncertainty management; strategic vs. tactical thinking; team coordination and maintenance of common ground; and creation and transfer of meaning through stories. CTA within the framework of Naturalistic Decision Making is a useful tool to understand the critical care process of decision-making and communication. The separation of strategic and tactical thinking has implications for workflow redesign. Given the global push for work-hour limitations, such workflow redesign is occurring. Further work with CTA techniques will provide important insights toward rational, rather than random, workflow changes.

  18. Automation in an addiction treatment research clinic: computerised contingency management, ecological momentary assessment and a protocol workflow system.

    PubMed

    Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H; Preston, Kenzie L

    2009-01-01

    A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and to accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with the provision of seamless methods for exporting, mining and querying the data. We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialised applications: the Automated Contingency Management (ACM) system for the delivery of behavioural interventions, the transactional electronic diary (TED) system for the management of behavioural assessments and the Protocol Workflow System (PWS) for computerised workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorised staff. ACM and the TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity, having an annual average of 18,000 patient visits and 7,300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarise participant safety data for research oversight. When developed in consultation with end users, automation in treatment research clinics can enable more efficient operations, better communication among staff and expansions in research methods.
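
    To make the role of a protocol workflow system concrete, here is a minimal sketch of a protocol-driven visit workflow: an ordered checklist that gates each participant's daily activities. The step names and the ordering rule are hypothetical illustrations, not the published PWS design.

      # Minimal protocol workflow: steps must be completed in protocol order.
      # Step names are invented for illustration.
      class VisitWorkflow:
          STEPS = ["check_in", "urine_collection", "momentary_assessment",
                   "dosing", "contingency_payout", "check_out"]

          def __init__(self, participant_id):
              self.participant_id = participant_id
              self.completed = []

          def next_step(self):
              """Return the next required activity, or None when done."""
              for step in self.STEPS:
                  if step not in self.completed:
                      return step
              return None

          def complete(self, step):
              if step != self.next_step():
                  raise ValueError(f"{step} attempted out of protocol order")
              self.completed.append(step)

      visit = VisitWorkflow("P-001")
      visit.complete("check_in")
      print(visit.next_step())  # -> urine_collection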

  19. Developing Local Lifelong Guidance Strategies.

    ERIC Educational Resources Information Center

    Watts, A. G.; Hawthorn, Ruth; Hoffbrand, Jill; Jackson, Heather; Spurling, Andrea

    1997-01-01

    Outlines the background, rationale, methodology, and outcomes of developing local lifelong guidance strategies in four geographic areas. Analyzes the main components of the strategies developed and addresses a number of issues relating to the process of strategy development. Explores implications for parallel work in other localities. (RJM)

  20. Netrins and UNC5 receptors in angiogenesis.

    PubMed

    Freitas, Catarina; Larrivée, Bruno; Eichmann, Anne

    2008-01-01

    Both neuronal and vascular development require guidance to establish a precise branching pattern of these systems in the vertebrate body. Several molecules implicated in axon navigation have also been shown to regulate vessel sprouting. Among these guidance cues, Netrins constitute a family of diffusible molecules with a bifunctional role in axon pathfinding. Recent findings implicate Netrins in other developmental processes, including vascular development. Here we review recent studies and discuss the possible dual function of Netrins and their receptors during branching of blood vessels in developmental and pathological angiogenesis.

  1. Career Trajectories of Older Women: Implications for Career Guidance

    ERIC Educational Resources Information Center

    Bimrose, Jenny; McMahon, Mary; Watson, Mark

    2013-01-01

    As work and employment transitions become more frequent and difficult, the demand for formal career guidance increases. Women are likely to experience structural labour market disadvantage and may benefit from formal support that is sympathetic to their particular needs. Yet the traditional psychological paradigms that dominate career guidance…

  2. Flexibility "and" Security? "Flexicurity" and its Implications for Lifelong Guidance

    ERIC Educational Resources Information Center

    Sultana, Ronald G.

    2013-01-01

    This article sets out to trigger research and policy attention among the career guidance community to the increasingly important notion of "flexicurity". It first explores the different meanings of the term, particularly as these have evolved in discussions across the European Union. It then goes on to consider why…

  3. A robotic C-arm cone beam CT system for image-guided proton therapy: design and performance.

    PubMed

    Hua, Chiaho; Yao, Weiguang; Kidani, Takao; Tomida, Kazuo; Ozawa, Saori; Nishimura, Takenori; Fujisawa, Tatsuya; Shinagawa, Ryousuke; Merchant, Thomas E

    2017-11-01

    A ceiling-mounted robotic C-arm cone beam CT (CBCT) system was developed for use with a 190° proton gantry system and a 6-degree-of-freedom robotic patient positioner. We report on the mechanical design, system accuracy, image quality, image guidance accuracy, imaging dose, workflow, safety and collision avoidance. The robotic CBCT system couples a rotating C-ring to the C-arm concentrically with a kV X-ray tube and a flat-panel imager mounted to the C-ring. CBCT images are acquired with flex correction and maximally 360° rotation for a 53 cm field of view. The system was designed for clinical use with three imaging locations. Anthropomorphic phantoms were imaged to evaluate the image guidance accuracy. The position accuracy and repeatability of the robotic C-arm was high (<0.5 mm), as measured with a high-accuracy laser tracker. The isocentric accuracy of the C-ring rotation was within 0.7 mm. The coincidence of CBCT imaging and radiation isocentre was better than 1 mm. The average image guidance accuracy was within 1 mm and 1° for the anthropomorphic phantoms tested. Daily volumetric imaging for proton patient positioning was specified for routine clinical practice. Our novel gantry-independent robotic CBCT system provides high-accuracy volumetric image guidance for proton therapy. Advances in knowledge: Ceiling-mounted robotic CBCT provides a viable alternative to CT on-rails for partial gantry and fixed-beam proton systems, with the added advantage of acquiring images at the treatment isocentre.

  4. Pneumatically Operated MRI-Compatible Needle Placement Robot for Prostate Interventions

    PubMed Central

    Fischer, Gregory S.; Iordachita, Iulian; Csoma, Csaba; Tokuda, Junichi; Mewes, Philip W.; Tempany, Clare M.; Hata, Nobuhiko; Fichtinger, Gabor

    2011-01-01

    Magnetic Resonance Imaging (MRI) has potential to be a superior medical imaging modality for guiding and monitoring prostatic interventions. The strong magnetic field prevents the use of conventional mechatronics and the confined physical space makes it extremely challenging to access the patient. We have designed a robotic assistant system that overcomes these difficulties and promises safe and reliable intra-prostatic needle placement inside closed high-field MRI scanners. The robot performs needle insertion under real-time 3T MR image guidance; workspace requirements, MR compatibility, and workflow have been evaluated on phantoms. The paper explains the robot mechanism and controller design and presents results of preliminary evaluation of the system. PMID:21686038

  5. Pneumatically Operated MRI-Compatible Needle Placement Robot for Prostate Interventions.

    PubMed

    Fischer, Gregory S; Iordachita, Iulian; Csoma, Csaba; Tokuda, Junichi; Mewes, Philip W; Tempany, Clare M; Hata, Nobuhiko; Fichtinger, Gabor

    2008-06-13

    Magnetic Resonance Imaging (MRI) has potential to be a superior medical imaging modality for guiding and monitoring prostatic interventions. The strong magnetic field prevents the use of conventional mechatronics and the confined physical space makes it extremely challenging to access the patient. We have designed a robotic assistant system that overcomes these difficulties and promises safe and reliable intra-prostatic needle placement inside closed high-field MRI scanners. The robot performs needle insertion under real-time 3T MR image guidance; workspace requirements, MR compatibility, and workflow have been evaluated on phantoms. The paper explains the robot mechanism and controller design and presents results of preliminary evaluation of the system.

  6. Predict-first experimental analysis using automated and integrated magnetohydrodynamic modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyons, B. C.; Paz-Soldan, C.; Meneghini, O.

    An integrated-modeling workflow has been developed in this paper for the purpose of performing predict-first analysis of transient-stability experiments. Starting from an existing equilibrium reconstruction from a past experiment, the workflow couples together the EFIT Grad-Shafranov solver [L. Lao et al., Fusion Sci. Technol. 48, 968 (2005)], the EPED model for the pedestal structure [P. B. Snyder et al., Phys. Plasmas 16, 056118 (2009)], and the NEO drift-kinetic-equation solver [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)] (for bootstrap current calculations) in order to generate equilibria with self-consistent pedestal structures as the plasma shape and various scalar parameters (e.g., normalized β, pedestal density, and edge safety factor [q95]) are changed. These equilibria are then analyzed using automated M3D-C1 extended-magnetohydrodynamic modeling [S. C. Jardin et al., Comput. Sci. Discovery 5, 014002 (2012)] to compute the plasma response to three-dimensional magnetic perturbations. This workflow was created in conjunction with a DIII-D experiment examining the effect of triangularity on the 3D plasma response. Several versions of the workflow were developed, and the initial ones were used to help guide experimental planning (e.g., determining the plasma current necessary to maintain the constant edge safety factor in various shapes). Subsequent validation with the experimental results was then used to revise the workflow, ultimately resulting in the complete model presented here. We show that quantitative agreement was achieved between the M3D-C1 plasma response calculated for equilibria generated by the final workflow and equilibria reconstructed from experimental data. A comparison of results from earlier workflows is used to show the importance of properly matching certain experimental parameters in the generated equilibria, including the normalized β, pedestal density, and q95. On the other hand, the details of the pedestal current did not significantly impact the plasma response in these equilibria. A comparison to the experimentally measured plasma response shows mixed agreement, indicating that while the equilibria are predicted well, additional analysis tools may be needed. In conclusion, we note the implications that these results have for the success of future predict-first studies, particularly the need for scans of uncertain parameters and for close collaboration between experimentalists and theorists.

  7. Predict-first experimental analysis using automated and integrated magnetohydrodynamic modeling

    DOE PAGES

    Lyons, B. C.; Paz-Soldan, C.; Meneghini, O.; ...

    2018-05-07

    An integrated-modeling workflow has been developed in this paper for the purpose of performing predict-first analysis of transient-stability experiments. Starting from an existing equilibrium reconstruction from a past experiment, the workflow couples together the EFIT Grad-Shafranov solver [L. Lao et al., Fusion Sci. Technol. 48, 968 (2005)], the EPED model for the pedestal structure [P. B. Snyder et al., Phys. Plasmas 16, 056118 (2009)], and the NEO drift-kinetic-equation solver [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)] (for bootstrap current calculations) in order to generate equilibria with self-consistent pedestal structures as the plasma shape and various scalar parameters (e.g., normalized β, pedestal density, and edge safety factor [q95]) are changed. These equilibria are then analyzed using automated M3D-C1 extended-magnetohydrodynamic modeling [S. C. Jardin et al., Comput. Sci. Discovery 5, 014002 (2012)] to compute the plasma response to three-dimensional magnetic perturbations. This workflow was created in conjunction with a DIII-D experiment examining the effect of triangularity on the 3D plasma response. Several versions of the workflow were developed, and the initial ones were used to help guide experimental planning (e.g., determining the plasma current necessary to maintain the constant edge safety factor in various shapes). Subsequent validation with the experimental results was then used to revise the workflow, ultimately resulting in the complete model presented here. We show that quantitative agreement was achieved between the M3D-C1 plasma response calculated for equilibria generated by the final workflow and equilibria reconstructed from experimental data. A comparison of results from earlier workflows is used to show the importance of properly matching certain experimental parameters in the generated equilibria, including the normalized β, pedestal density, and q95. On the other hand, the details of the pedestal current did not significantly impact the plasma response in these equilibria. A comparison to the experimentally measured plasma response shows mixed agreement, indicating that while the equilibria are predicted well, additional analysis tools may be needed. In conclusion, we note the implications that these results have for the success of future predict-first studies, particularly the need for scans of uncertain parameters and for close collaboration between experimentalists and theorists.

  8. Predict-first experimental analysis using automated and integrated magnetohydrodynamic modeling

    NASA Astrophysics Data System (ADS)

    Lyons, B. C.; Paz-Soldan, C.; Meneghini, O.; Lao, L. L.; Weisberg, D. B.; Belli, E. A.; Evans, T. E.; Ferraro, N. M.; Snyder, P. B.

    2018-05-01

    An integrated-modeling workflow has been developed for the purpose of performing predict-first analysis of transient-stability experiments. Starting from an existing equilibrium reconstruction from a past experiment, the workflow couples together the EFIT Grad-Shafranov solver [L. Lao et al., Fusion Sci. Technol. 48, 968 (2005)], the EPED model for the pedestal structure [P. B. Snyder et al., Phys. Plasmas 16, 056118 (2009)], and the NEO drift-kinetic-equation solver [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)] (for bootstrap current calculations) in order to generate equilibria with self-consistent pedestal structures as the plasma shape and various scalar parameters (e.g., normalized β, pedestal density, and edge safety factor [q95]) are changed. These equilibria are then analyzed using automated M3D-C1 extended-magnetohydrodynamic modeling [S. C. Jardin et al., Comput. Sci. Discovery 5, 014002 (2012)] to compute the plasma response to three-dimensional magnetic perturbations. This workflow was created in conjunction with a DIII-D experiment examining the effect of triangularity on the 3D plasma response. Several versions of the workflow were developed, and the initial ones were used to help guide experimental planning (e.g., determining the plasma current necessary to maintain the constant edge safety factor in various shapes). Subsequent validation with the experimental results was then used to revise the workflow, ultimately resulting in the complete model presented here. We show that quantitative agreement was achieved between the M3D-C1 plasma response calculated for equilibria generated by the final workflow and equilibria reconstructed from experimental data. A comparison of results from earlier workflows is used to show the importance of properly matching certain experimental parameters in the generated equilibria, including the normalized β, pedestal density, and q95. On the other hand, the details of the pedestal current did not significantly impact the plasma response in these equilibria. A comparison to the experimentally measured plasma response shows mixed agreement, indicating that while the equilibria are predicted well, additional analysis tools may be needed. Finally, we note the implications that these results have for the success of future predict-first studies, particularly the need for scans of uncertain parameters and for close collaboration between experimentalists and theorists.
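
    The coupling logic of this workflow can be summarized in a few lines of pseudocode. The run_* functions below are stand-ins for EFIT, EPED, NEO, and M3D-C1, which expose no such Python API; real couplings exchange full equilibrium files rather than dictionaries.

      # Stub stand-ins so the sketch runs; each represents an external code.
      def run_efit(eq, **kw):   return {**eq, **kw}           # Grad-Shafranov solve
      def run_eped(eq):         return {"width": 0.05}        # pedestal structure
      def run_neo(eq, ped):     return {"j_bootstrap": 0.0}   # bootstrap current
      def run_m3dc1(eq, n=3):   return {"response": 0.0}      # 3D plasma response

      def self_consistent_equilibrium(base_eq, shape, beta_n, q95, n_iter=3):
          """Iterate EFIT with EPED/NEO until the pedestal is self-consistent."""
          eq = run_efit(base_eq, shape=shape, beta_n=beta_n, q95=q95)
          for _ in range(n_iter):
              pedestal = run_eped(eq)
              bootstrap = run_neo(eq, pedestal)
              eq = run_efit(base_eq, shape=shape, beta_n=beta_n, q95=q95,
                            pedestal=pedestal, bootstrap=bootstrap)
          return eq

      # Scan shape and scalar parameters, then compute the 3D response.
      for shape in ("low-triangularity", "high-triangularity"):
          eq = self_consistent_equilibrium({}, shape, beta_n=2.0, q95=4.0)
          print(shape, run_m3dc1(eq, n=3))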

  9. An Exploration of Female Travellers' Experiences of Guidance Counselling in Adult Education

    ERIC Educational Resources Information Center

    Doyle, Anne; Hearne, Lucy

    2012-01-01

    The proposed changes in the further education sector, including the rationalisation of the VEC into Local Education and Training Boards (LETBs) and the closures of the Senior Traveller Training Centres (STTCs), have implications for guidance counselling provision to the Traveller community. This article discusses female Travellers' experiences of…

  10. Key Competencies: How School Guidance Counsellors Contribute to Student Learning

    ERIC Educational Resources Information Center

    Crocket, Kathie; Kotzé, Elmarie; Hughes, Colin; Graham, Judith; Burke, Alison

    2014-01-01

    Schools are currently working through the implications of the New Zealand Curriculum and its translations into practice. To date there has been little discussion of the contributions of school guidance counseling. For learning and teaching to become a collective, whole-school endeavour, Cowie et al. (2011) suggested, "cross-fertilisation of…

  11. An automated optimization tool for high-dose-rate (HDR) prostate brachytherapy with divergent needle pattern

    NASA Astrophysics Data System (ADS)

    Borot de Battisti, M.; Maenhout, M.; de Senneville, B. Denis; Hautvast, G.; Binnekamp, D.; Lagendijk, J. J. W.; van Vulpen, M.; Moerland, M. A.

    2015-10-01

    Focal high-dose-rate (HDR) brachytherapy for prostate cancer has gained increasing interest as an alternative to whole gland therapy as it may contribute to the reduction of treatment related toxicity. For focal treatment, optimal needle guidance and placement is warranted. This can be achieved under MR guidance. However, MR-guided needle placement is currently not possible due to space restrictions in the closed MR bore. To overcome this problem, an MR-compatible, single-divergent needle-implant robotic device is under development at the University Medical Centre, Utrecht: placed between the legs of the patient inside the MR bore, this robot will tap the needle in a divergent pattern from a single rotation point into the tissue. This rotation point is just beneath the perineal skin to have access to the focal prostate tumor lesion. Currently, there is no treatment planning system commercially available which allows optimization of the dose distribution with such a needle arrangement. The aim of this work is to develop an automatic inverse dose planning optimization tool for focal HDR prostate brachytherapy with needle insertions in a divergent configuration. A complete optimizer workflow is proposed which includes the determination of (1) the position of the center of rotation, (2) the needle angulations and (3) the dwell times. Unlike most currently used optimizers, no prior selection or adjustment of input parameters such as minimum or maximum dose or weight coefficients for treatment region and organs at risk is required. To test this optimizer, a planning study was performed on ten patients (treatment volumes ranged from 8.5 cm3 to 23.3 cm3) by using 2-14 needle insertions. The total computation time of the optimizer workflow was below 20 min and a clinically acceptable plan was reached on average using only four needle insertions.
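
    For readers unfamiliar with inverse planning, the dwell-time stage alone can be phrased as a nonnegative least-squares problem once a dose-rate kernel is available. The sketch below shows only that stage with synthetic numbers; the optimizer described above additionally selects the rotation point and the needle angulations, which are omitted here.

      import numpy as np
      from scipy.optimize import nnls

      # Synthetic dose-rate kernel: dose to each target voxel per unit dwell
      # time at each candidate source position (not real TG-43 data).
      rng = np.random.default_rng(0)
      n_voxels, n_dwells = 200, 40
      kernel = rng.uniform(0.0, 1.0, size=(n_voxels, n_dwells))
      prescription = np.full(n_voxels, 19.0)   # e.g. a 19 Gy focal target dose

      # Nonnegativity of dwell times is enforced by the NNLS solver.
      dwell_times, residual = nnls(kernel, prescription)
      dose = kernel @ dwell_times

      print(f"active dwell positions: {(dwell_times > 1e-6).sum()}/{n_dwells}")
      print(f"voxels at or above prescription: {(dose >= 19.0).mean():.1%}")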

  12. Validation of high-throughput single cell analysis methodology.

    PubMed

    Devonshire, Alison S; Baradez, Marc-Olivier; Morley, Gary; Marshall, Damian; Foy, Carole A

    2014-05-01

    High-throughput quantitative polymerase chain reaction (qPCR) approaches enable profiling of multiple genes in single cells, bringing new insights to complex biological processes and offering opportunities for single cell-based monitoring of cancer cells and stem cell-based therapies. However, workflows with well-defined sources of variation are required for clinical diagnostics and testing of tissue-engineered products. In a study of neural stem cell lines, we investigated the performance of lysis, reverse transcription (RT), preamplification (PA), and nanofluidic qPCR steps at the single cell level in terms of efficiency, precision, and limit of detection. We compared protocols using a separate lysis buffer with cell capture directly in RT-PA reagent. The two methods were found to have similar lysis efficiencies, whereas the direct RT-PA approach showed improved precision. Digital PCR was used to relate preamplified template copy numbers to Cq values and reveal where low-quality signals may affect the analysis. We investigated the impact of calibration and data normalization strategies as a means of minimizing the impact of inter-experimental variation on gene expression values and found that both approaches can improve data comparability. This study provides validation and guidance for the application of high-throughput qPCR workflows for gene expression profiling of single cells. Copyright © 2014 Elsevier Inc. All rights reserved.
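
    One of the step-level performance metrics mentioned above, amplification efficiency, is conventionally estimated from a standard curve: Cq is regressed against log10 of the input copy number, and the efficiency is 10^(-1/slope) - 1, with a slope near -3.32 indicating perfect doubling. The dilution series below is synthetic, for illustration only.

      import numpy as np

      # Synthetic dilution series: input copies and measured Cq values.
      copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
      cq = np.array([17.1, 20.5, 23.9, 27.2, 30.6])

      # Linear fit of Cq vs log10(copies); efficiency from the slope.
      slope, intercept = np.polyfit(np.log10(copies), cq, 1)
      efficiency = 10 ** (-1.0 / slope) - 1.0

      print(f"slope = {slope:.2f} (ideal ~ -3.32)")
      print(f"efficiency = {efficiency:.1%} (ideal 100%)")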

  13. Design considerations for a novel MRI compatible manipulator for prostate cryoablation.

    PubMed

    Abdelaziz, S; Esteveny, L; Renaud, P; Bayle, B; Barbé, L; De Mathelin, M; Gangi, A

    2011-11-01

    Prostate carcinoma is a commonly diagnosed cancer in men. Nonsurgical treatment of early stage prostate cancer is an important alternative. The use of MRI for tumor cryoablation is of particular interest: it offers lower morbidity compared with other localized techniques. However, the current manual procedure is very time-consuming and has limited accuracy. A novel robotic assistant is therefore designed for prostate cancer cryotherapy treatment under MRI guidance to improve efficiency and accuracy. Gesture definition was achieved based on actions of interventional radiologists at University Hospital of Strasbourg. A transperineal approach with a semiautonomous prostatic cryoprobe localization procedure was developed where the needle axis is automatically positioned before manual insertion. The workflow was developed simultaneously with the robotic assistant used for needle positioning. The design and the associated workflow of an original wire-driven manipulator were developed. The device is compact and has a low weight: its overall dimensions in the scanner are 100 × 100 × 40 mm with a weight of 120 g. Very good MRI compatibility was demonstrated. A novel cryoablation procedure based on the use of a robotic assistant is proposed. The device design was presented with demonstration of MRI compatibility. Further developments include automatic registration and in vivo experimental testing.

  14. Augmented Reality Based Navigation for Computer Assisted Hip Resurfacing: A Proof of Concept Study.

    PubMed

    Liu, He; Auvinet, Edouard; Giles, Joshua; Rodriguez Y Baena, Ferdinando

    2018-05-23

    Implantation accuracy has a great impact on the outcomes of hip resurfacing such as recovery of hip function. Computer assisted orthopedic surgery has demonstrated clear advantages for the patients, with improved placement accuracy and fewer outliers, but the intrusiveness, cost, and added complexity have limited its widespread adoption. To provide seamless computer assistance with improved immersion and a more natural surgical workflow, we propose an augmented-reality (AR) based navigation system for hip resurfacing. The operative femur is registered by processing depth information from the surgical site with a commercial depth camera. By coupling depth data with robotic assistance, obstacles that may obstruct the femur can be tracked and avoided automatically to reduce the chance of disruption to the surgical workflow. Using the registration result and the pre-operative plan, intra-operative surgical guidance is provided through a commercial AR headset so that the user can perform the operation without additional physical guides. To assess the accuracy of the navigation system, experiments of guide hole drilling were performed on femur phantoms. The position and orientation of the drilled holes were compared with the pre-operative plan, and the mean errors were found to be approximately 2 mm and 2°, results which are in line with commercial computer assisted orthopedic systems today.
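
    The reported accuracy metrics are straightforward to compute from planned and measured hole poses: a Euclidean distance between entry points and the angle between hole axes. The coordinates below are made-up values in a phantom frame, chosen to land near the reported ~2 mm / 2° errors.

      import numpy as np

      planned_entry = np.array([12.0, 40.5, 8.0])    # mm, phantom frame
      planned_axis = np.array([0.0, 0.0, 1.0])       # unit drilling direction
      drilled_entry = np.array([13.1, 41.8, 8.4])
      drilled_axis = np.array([0.02, 0.03, 0.999])

      # Position error: distance between planned and drilled entry points.
      pos_err = np.linalg.norm(drilled_entry - planned_entry)

      # Orientation error: angle between the two hole axes.
      cos_angle = np.dot(planned_axis, drilled_axis) / (
          np.linalg.norm(planned_axis) * np.linalg.norm(drilled_axis))
      ang_err = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

      print(f"position error: {pos_err:.2f} mm, "
            f"orientation error: {ang_err:.2f} deg")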

  15. [Intelligent operating room suite : From passive medical devices to the self-thinking cognitive surgical assistant].

    PubMed

    Kenngott, H G; Wagner, M; Preukschas, A A; Müller-Stich, B P

    2016-12-01

    Modern operating room (OR) suites are mostly digitally connected but until now the primary focus was on the presentation, transfer and distribution of images. Device information and processes within the operating theaters are barely considered. Cognitive assistance systems have triggered a fundamental rethinking in the automotive industry as well as in logistics. In principle, tasks in the OR, some of which are highly repetitive, also have great potential to be supported by automated cognitive assistance via a self-thinking system. This includes the coordination of the entire workflow in the perioperative process in both the operating theater and the whole hospital. With corresponding data from hospital information systems, medical devices and appropriate models of the surgical process, intelligent systems could optimize the workflow in the operating theater in the near future and support the surgeon. Preliminary results on the use of device information and automatically controlled OR suites are already available. Such systems include, for example, the guidance of laparoscopic camera systems. Nevertheless, cognitive assistance systems that make use of knowledge about patients, processes and other pieces of information to improve surgical treatment are not yet available in the clinical routine but are urgently needed in order to automatically assist the surgeon in situation-related activities and thus substantially improve patient care.

  16. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    PubMed

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks recur frequently, e.g., geometry optimizations and benchmarking series. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes them almost impossible to share. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined research domain specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.
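
    The atomic/meta-workflow distinction lends itself to a simple data model: an atomic workflow is a published unit with named ports, and a meta-workflow is an ordered embedding of such units. The sketch below is a guess at the kind of structure involved, not the SHIWA Repository's actual schema.

      from dataclasses import dataclass, field

      @dataclass
      class AtomicWorkflow:
          name: str          # e.g. "geometry_optimization"
          engine: str        # workflow system that executes it
          inputs: list       # named input ports
          outputs: list      # named output ports

      @dataclass
      class MetaWorkflow:
          name: str
          stages: list = field(default_factory=list)

          def embed(self, wf):
              # A real orchestrator would verify that wf.inputs are satisfied
              # by the outputs of earlier stages before accepting it.
              self.stages.append(wf)

      geo = AtomicWorkflow("geometry_optimization", "WS-PGRADE",
                           ["structure"], ["geometry"])
      bench = AtomicWorkflow("benchmark_series", "WS-PGRADE",
                             ["geometry"], ["energies"])

      meta = MetaWorkflow("optimize_then_benchmark")
      meta.embed(geo)
      meta.embed(bench)
      print([stage.name for stage in meta.stages])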

  17. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    NASA Astrophysics Data System (ADS)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services as a virtualisation layer providing a user interface that researchers can use. There are many scientific workflow systems but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Considering this effort, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability concept (CGI). It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely pre-deployed workflow engines or submits workflow engines with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within the project consortium (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond it (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.
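
    The essence of the CGI concept, that workflows stay in their native language and a per-engine adapter is chosen at submission time, can be sketched as a dispatch table. Adapter internals (staging, certificates, engine deployment) are elided, and nothing here is the actual SHIWA Submission Service API.

      # Each adapter knows how to execute one workflow system's bundles.
      class EngineAdapter:
          def execute(self, bundle, inputs):
              raise NotImplementedError

      class TavernaAdapter(EngineAdapter):
          def execute(self, bundle, inputs):
              return f"Taverna executed {bundle} with {inputs}"

      class MoteurAdapter(EngineAdapter):
          def execute(self, bundle, inputs):
              return f"MOTEUR executed {bundle} with {inputs}"

      ADAPTERS = {"Taverna": TavernaAdapter(), "MOTEUR": MoteurAdapter()}

      def submit(bundle, engine_name, inputs):
          """Dispatch a non-native workflow to its pre-deployed engine."""
          return ADAPTERS[engine_name].execute(bundle, inputs)

      print(submit("variant_calling.t2flow", "Taverna", {"query": "reads.fa"}))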

  18. ROS-based ground stereo vision detection: implementation and experiments.

    PubMed

    Hu, Tianjiang; Zhao, Boxin; Tang, Dengqing; Zhang, Daibing; Kong, Weiwei; Shen, Lincheng

    This article concentrates on an open-source implementation of flying-object detection in cluttered scenes, which is of significance for ground stereo-aided autonomous landing of unmanned aerial vehicles. The ground stereo vision guidance system is presented with details on system architecture and workflow. The Chan-Vese detection algorithm is further considered and implemented in the Robot Operating System (ROS) environment. A data-driven interactive scheme is developed to collect datasets for parameter tuning and performance evaluation. Outdoor flying-vehicle experiments captured the sequential stereo image dataset and recorded the simultaneous data from the pan-and-tilt unit, onboard sensors and differential GPS. Experimental results using the collected dataset validate the effectiveness of the published ROS-based detection algorithm.
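
    The segmentation step can be reproduced outside ROS with scikit-image's morphological Chan-Vese implementation; here a stock test image stands in for a rectified stereo frame from the camera topic. The iteration count is passed positionally because its keyword name differs across scikit-image versions.

      from skimage import data, img_as_float
      from skimage.segmentation import morphological_chan_vese

      # A stock grayscale image stands in for a stereo camera frame.
      frame = img_as_float(data.camera())

      # 50 iterations of morphological Chan-Vese level-set evolution.
      mask = morphological_chan_vese(frame, 50,
                                     init_level_set="checkerboard",
                                     smoothing=2)

      print(f"foreground fraction: {mask.mean():.2%}")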

  19. Implications of an emerging EHR monoculture for hospitals and healthcare systems.

    PubMed

    Koppel, Ross; Lehmann, Christoph U

    2015-03-01

    In many hospitals and health systems, a 'new' electronic health record means a shift to one vendor: Epic, a vendor that dominates in large and medium hospital markets and continues its success with smaller institutions and ambulatory practices. Our paper examines the implications of this emerging monoculture: its advantages and disadvantages for physicians and hospitals and its role in innovation, professional autonomy, implementation difficulties, workflow, flexibility, cost, data standards, interoperability, and interactions with other information technology (IT) systems. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. The Paradigm Shift of Vocational Guidance and Career Counseling and Its Implications for Turkey: An Evaluation from Past to Future

    ERIC Educational Resources Information Center

    Yesilyaprak, Binnur

    2012-01-01

    Globalization has driven economic and social change, and the new paradigms accompanying these changes have made vocational guidance and career counseling services an increasingly important worldwide sociopolitical instrument. To use this instrument effectively and responsibly, both individually and socially, it is required to understand…

  1. The LHCb software and computing upgrade for Run 3: opportunities and challenges

    NASA Astrophysics Data System (ADS)

    Bozzi, C.; Roiser, S.; LHCb Collaboration

    2017-10-01

    The LHCb detector will be upgraded for the LHC Run 3 and will be read out at 30 MHz, corresponding to the full inelastic collision rate, with major implications for the full software trigger and offline computing. If the current computing model and software framework are kept, the data storage capacity and computing power required to process data at this rate, and to generate and reconstruct equivalent samples of simulated events, will exceed the current capacity by at least one order of magnitude. A redesign of the software framework, including scheduling, the event model, the detector description and the conditions database, is needed to fully exploit the computing power of multi- and many-core architectures and coprocessors. Data processing and the analysis model will also change towards an early streaming of different data types, in order to limit storage resources, with further implications for the data analysis workflows. Fast simulation options will allow a reasonable parameterization of the detector response to be obtained in considerably less computing time. Finally, the upgrade of LHCb will be a good opportunity to review and implement changes in the domains of software design, test and review, and analysis workflow and preservation. In this contribution, activities and recent results in all the above areas are presented.

  2. TU-E-BRB-03: Overview of Proposed TG-132 Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brock, K.

    2015-06-15

    Deformable image registration (DIR) is developing rapidly and is poised to substantially improve dose fusion accuracy for adaptive and retreatment planning and motion management and PET fusion to enhance contour delineation for treatment planning. However, DIR dose warping accuracy is difficult to quantify, in general, and particularly difficult to do so on a patient-specific basis. As clinical DIR options become more widely available, there is an increased need to understand the implications of incorporating DIR into clinical workflow. Several groups have assessed DIR accuracy in clinically relevant scenarios, but no comprehensive review material is yet available. This session will also discuss aspects of the official report of AAPM Task Group 132 on the Use of Image Registration and Data Fusion Algorithms and Techniques in Radiotherapy Treatment Planning, which provides recommendations for DIR clinical use. We will summarize and compare various commercial DIR software options, outline successful clinical techniques, show specific examples with discussion of appropriate and inappropriate applications of DIR, discuss the clinical implications of DIR, provide an overview of current DIR error analysis research, review QA options and research phantom development, and present TG-132 recommendations. Learning Objectives: (1) compare/contrast commercial DIR software and QA options; (2) overview clinical DIR workflow for retreatment; (3) understand uncertainties introduced by DIR; (4) review TG-132 proposed recommendations.

  3. Agile parallel bioinformatics workflow management using Pwrake.

    PubMed

    Mishima, Hiroyuki; Sasaki, Kensaku; Tanaka, Masahiro; Tatebe, Osamu; Yoshiura, Koh-Ichiro

    2011-09-08

    In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows.

  4. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles may facilitate sharing workflows among the scientific community. Workflows for GATK and Dindel are available at http://github.com/misshie/Workflows. PMID:21899774
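
    Pwrake workflows are Ruby rakefiles, so the snippet below is only a Python analogy of the two-phase split the authors describe: task definitions that stay fixed across runs, and a parameter block that is edited during the adjustment phase. The command names are placeholders, not real tools.

      # Parameter-adjustment phase: the only part edited between runs.
      params = {
          "reference": "genome.fa",
          "min_confidence": 30.0,
      }

      # Workflow-definition phase: task templates stay fixed while tuning.
      def task(command_template):
          def run(**files):
              command = command_template.format(**files, **params)
              print(f"would run: {command}")  # a real runner would shell out
          return run

      align = task("aligner {reference} {reads} > {aligned}")
      call = task("callvariants {aligned} --min-conf {min_confidence} > {vcf}")

      align(reads="sample.fq", aligned="sample.sam")
      call(aligned="sample.sam", vcf="sample.vcf")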

  5. The New Statutory Requirements in Careers Guidance in England and the Implications for Careers Provision under the Coalition Government

    ERIC Educational Resources Information Center

    Chadderton, Charlotte

    2015-01-01

    The Education Act 2011 passed responsibility for careers guidance in England from local authorities to schools, providing no extra funding or staff training. This paper reports on a project conducted in two schools in East London, which aimed to enhance careers work in response to the new requirements. It argues that whilst schools can enhance…

  6. UAS Pilot Evaluations of Suggestive Guidance on Detect-and-Avoid Displays

    NASA Technical Reports Server (NTRS)

    Monk, Kevin; Roberts, Zachary

    2016-01-01

    Minimum display requirements for Detect-and-Avoid (DAA) systems are being developed in order to support the expansion of Unmanned Aircraft Systems (UAS) into the National Airspace System (NAS). The present study examines UAS pilots' subjective assessments of four DAA display configurations with varying forms of maneuver guidance. For each configuration, pilots rated the intuitiveness of the display and how well it supported their ability to perform the DAA task. Responses revealed a clear preference for the DAA displays that presented suggestive maneuver guidance in the form of "banding" compared to an Information Only display, which lacked any maneuver guidance. Implications for DAA display requirements, as well as the relation between the subjective evaluations and the objective performance data from previous studies, are discussed.

  7. UAS Pilot Evaluations of Suggestive Guidance on Detect-and-Avoid Displays

    NASA Technical Reports Server (NTRS)

    Monk, Kevin J.; Roberts, Zachary

    2016-01-01

    Minimum display requirements for Detect-and-Avoid (DAA) systems are being developed in order to support the expansion of Unmanned Aircraft Systems (UAS) into the National Airspace System (NAS). The present study examines UAS pilots' subjective assessments of four DAA display configurations with varying forms of maneuver guidance. For each configuration, pilots rated the intuitiveness of the display and how well it supported their ability to perform the DAA task. Responses revealed a clear preference for the DAA displays that presented suggestive maneuver guidance in the form of "banding" compared to an Information Only display, which lacked any maneuver guidance. Implications for DAA display requirements, as well as the relation between the subjective evaluations and the objective performance data from previous studies, are discussed.

  8. Interpreting guidance on prosecution for assisted dying for district nurses.

    PubMed

    Griffith, Richard

    2014-11-01

    Following a ruling by the House of Lords in 2009, the Director of Public Prosecutions issued guidance setting out the circumstances that would be likely to lead to the prosecution of a person for encouraging or assisting suicide under the Suicide Act 1961, section 2. In that guidance, a district nurse assisting a person to commit suicide would be one of the circumstances that would lead to prosecution. The Director of Public Prosecutions recently unexpectedly amended her guidance in relation to health professionals. This article discusses the implications of the amendment and argues that it will cause confusion among district nurses and give rise to an unrealistic expectation about the role a district nurse can lawfully take in assisting a person to die.

  9. Optimized small molecule antibody labeling efficiency through continuous flow centrifugal diafiltration.

    PubMed

    Cappione, Amedeo; Mabuchi, Masaharu; Briggs, David; Nadler, Timothy

    2015-04-01

    Protein immuno-detection encompasses a broad range of analytical methodologies, including western blotting, flow cytometry, and microscope-based applications. These assays, which detect, quantify, and/or localize expression of one or more proteins in complex biological samples, are reliant upon fluorescent or enzyme-tagged target-specific antibodies. While small molecule labeling kits are available with a range of detection moieties, the workflow is hampered by a requirement for multiple dialysis-based buffer exchange steps that are both time-consuming and subject to sample loss. In a previous study, we briefly described an alternative method for small-scale protein labeling with small molecule dyes whereby all phases of the conjugation workflow could be performed in a single centrifugal diafiltration device. Here, we expand on this foundational work, addressing the functionality of the device at each step in the workflow (sample cleanup, labeling, unbound dye removal, and buffer exchange/concentration) and the implications for optimizing labeling efficiency. When compared to other common buffer exchange methodologies, centrifugal diafiltration offered superior performance as measured by four key parameters (process time, desalting capacity, protein recovery, and retention of functional integrity). Originally designed for resin-based affinity purification, the device also provides a platform for up-front antibody purification or albumin carrier removal. Most significantly, by exploiting the rapid kinetics of NHS-based labeling reactions, the process of continuous diafiltration minimizes reaction time and long exposure to excess dye, guaranteeing maximal target labeling while limiting the risks associated with over-labeling. Overall, the device offers a simplified workflow with reduced processing time and hands-on requirements, without sacrificing labeling efficiency, final yield, or conjugate performance. Copyright © 2015 Elsevier B.V. All rights reserved.
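
    As a usage note on checking labeling efficiency after dye removal, the degree of labeling (DOL) is conventionally computed from a UV-Vis scan of the conjugate. The extinction coefficients and correction factor below are typical vendor figures for an IgG labeled with an Alexa Fluor 488-class NHS dye; values for an actual dye lot should be taken from its certificate of analysis.

      # Measured absorbances of the purified conjugate (1 cm path length).
      a280, a_dye = 0.85, 0.60

      eps_protein = 210_000   # M^-1 cm^-1, approximate for IgG at 280 nm
      eps_dye = 71_000        # M^-1 cm^-1, dye at its absorbance maximum
      cf280 = 0.11            # dye's fractional absorbance at 280 nm

      # Correct A280 for the dye's contribution, then compute molar ratios.
      protein_molar = (a280 - cf280 * a_dye) / eps_protein
      dol = a_dye / (eps_dye * protein_molar)

      print(f"protein: {protein_molar * 1e6:.2f} uM, DOL: {dol:.1f} dyes/protein")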

  10. Improved genomic resources and new bioinformatic workflow for the carcinogenic parasite Clonorchis sinensis: Biotechnological implications.

    PubMed

    Wang, Daxi; Korhonen, Pasi K; Gasser, Robin B; Young, Neil D

    Clonorchis sinensis (family Opisthorchiidae) is an important foodborne parasite that has a major socioeconomic impact on ~35 million people predominantly in China, Vietnam, Korea and the Russian Far East. In humans, infection with C. sinensis causes clonorchiasis, a complex hepatobiliary disease that can induce cholangiocarcinoma (CCA), a malignant cancer of the bile ducts. Central to understanding the epidemiology of this disease is knowledge of genetic variation within and among populations of this parasite. Although most published molecular studies seem to suggest that C. sinensis represents a single species, evidence of karyotypic variation within C. sinensis and cryptic species within a related opisthorchiid fluke (Opisthorchis viverrini) emphasise the importance of studying and comparing the genes and genomes of geographically distinct isolates of C. sinensis. Recently, we sequenced, assembled and characterised a draft nuclear genome of a C. sinensis isolate from Korea and compared it with a published draft genome of a Chinese isolate of this species using a bioinformatic workflow established for comparing draft genome assemblies and their gene annotations. We identified that 50.6% and 51.3% of the Korean and Chinese C. sinensis genomic scaffolds were syntenic, respectively. Within aligned syntenic blocks, the genomes had a high level of nucleotide identity (99.1%) and encoded 15 variable proteins likely to be involved in diverse biological processes. Here, we review current technical challenges of using draft genome assemblies to undertake comparative genomic analyses to quantify genetic variation between isolates of the same species. Using a workflow that overcomes these challenges, we report on a high-quality draft genome for C. sinensis from Korea and comparative genomic analyses, as a basis for future investigations of the genetic structures of C. sinensis populations, and discuss the biotechnological implications of these explorations. Copyright © 2018 Elsevier Inc. All rights reserved.
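
    The headline comparison numbers, the syntenic fraction and within-block identity, reduce to simple weighted sums over aligned blocks. The block lengths and identities below are invented toy values, not the C. sinensis data.

      # (block_length_bp, nucleotide_identity) for each aligned syntenic block.
      blocks = [
          (1_200_000, 0.992),
          (850_000, 0.990),
          (430_000, 0.991),
      ]
      assembly_bp = 5_000_000  # total scaffold length of one assembly (toy)

      aligned_bp = sum(length for length, _ in blocks)
      weighted_identity = sum(l * ident for l, ident in blocks) / aligned_bp

      print(f"syntenic fraction: {aligned_bp / assembly_bp:.1%}")
      print(f"length-weighted identity: {weighted_identity:.1%}")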

  11. Biomedical Informatics on the Cloud: A Treasure Hunt for Advancing Cardiovascular Medicine.

    PubMed

    Ping, Peipei; Hermjakob, Henning; Polson, Jennifer S; Benos, Panagiotis V; Wang, Wei

    2018-04-27

    In the digital age of cardiovascular medicine, the rate of biomedical discovery can be greatly accelerated by the guidance and resources required to unearth potential collections of knowledge. A unified computational platform leverages metadata to not only provide direction but also empower researchers to mine a wealth of biomedical information and forge novel mechanistic insights. This review takes the opportunity to present an overview of the cloud-based computational environment, including the functional roles of metadata, the architecture schema of indexing and search, and the practical scenarios of machine learning-supported molecular signature extraction. By introducing several established resources and state-of-the-art workflows, we share with our readers a broadly defined informatics framework to phenotype cardiovascular health and disease. © 2018 American Heart Association, Inc.

  12. Flexible workflow sharing and execution services for e-scientists

    NASA Astrophysics Data System (ADS)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes: 1. the SHIWA Repository, a database where workflows and metadata about workflows can be stored; the database is a central repository to discover and share workflows within and among communities. 2. the SHIWA Portal, a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop, a desktop environment that provides access capabilities similar to those of the SHIWA Portal; however, it runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already integrated with the execution engine of the SHIWA Portal; other engines can be added when required. Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. The Portal, via third party workflow engines, provides support for the most widely used academic workflow engines and can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows, even when those are written in different languages and require different interpreters for execution. Through the workflow repository and the portal, individual scientists and scientific collaborations can share and offer workflows for reuse and execution. Given the integrated nature of the SHIWA Simulation Platform, the shared workflows can be executed online, without installing any special client environment or downloading workflows. The FP7 "Building a European Research Community through Interoperable Workflows and Data" (ER-flow) project disseminates the achievements of the SHIWA project and uses these achievements to build workflow user communities across Europe. ER-flow provides application support to research communities within and beyond the project consortium to develop, share and run workflows with the SHIWA Simulation Platform.

  13. Scientist-Centered Workflow Abstractions via Generic Actors, Workflow Templates, and Context-Awareness for Groundwater Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.

    2011-07-04

    A drawback of existing scientific workflow systems is the lack of support for domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level, and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.
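
    The generic-actor idea can be illustrated with a short sketch: one abstract interface, several concrete bindings, and a template that fixes the step pattern while the binding varies. All names below are illustrative, not the cited prototype's API.

```python
# Sketch of a "generic actor": scientists compose workflows against the
# abstraction (submit a job) rather than a specific backend tool.
from abc import ABC, abstractmethod

class JobSubmitter(ABC):
    """Generic actor: 'submit a simulation job' independent of the backend."""
    @abstractmethod
    def submit(self, input_deck: str) -> str: ...

class LocalSubmitter(JobSubmitter):
    def submit(self, input_deck: str) -> str:
        return f"local-run:{input_deck}"

class ClusterSubmitter(JobSubmitter):
    def submit(self, input_deck: str) -> str:
        return f"batch-job:{input_deck}"

def groundwater_template(submitter: JobSubmitter, decks: list[str]) -> list[str]:
    """Workflow template: the step pattern is fixed, the actor binding varies."""
    return [submitter.submit(d) for d in decks]

print(groundwater_template(LocalSubmitter(), ["site-a.in", "site-b.in"]))
```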

  14. Architectures Toward Reusable Science Data Systems

    NASA Astrophysics Data System (ADS)

    Moses, J. F.

    2014-12-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building ground systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research, NOAA's weather satellites and USGS's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today. System functions such as ingest, product generation and distribution need to be configured and performed in a consistent and repeatable way with an emphasis on scalability. This paper will examine the key architectural elements of several NASA satellite data processing systems currently in operation and under development that make them suitable for scaling and reuse. Examples of architectural elements that have become attractive include virtual machine environments, standard data product formats, metadata content and file naming, workflow and job management frameworks, data acquisition, search, and distribution protocols. By highlighting key elements and implementation experience the goal is to recognize architectures that will outlast their original application and be readily adaptable for new applications. Concepts and principles are explored that lead to sound guidance for SDS developers and strategists.
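
    A minimal sketch of the reuse argument: when ingest, product generation and distribution are standard, configurable stages, standing up a new data system is mostly configuration. The stage names and the granule record below are illustrative assumptions, not any specific NASA system.

```python
# Sketch of a reusable SDS pipeline: standard stages, configured in order,
# so the same workflow runs consistently and repeatably for new missions.
from typing import Callable

def ingest(granule: dict) -> dict:
    granule["status"] = "ingested"
    return granule

def generate_product(granule: dict) -> dict:
    # illustrative: raw telemetry becomes a standard-format product file
    granule["product"] = granule["file"].replace(".raw", ".nc")
    return granule

def distribute(granule: dict) -> dict:
    granule["status"] = "distributed"
    return granule

# A repeatable workflow is just configuration over standard stages.
PIPELINE: list[Callable[[dict], dict]] = [ingest, generate_product, distribute]

def run(granule: dict) -> dict:
    for stage in PIPELINE:
        granule = stage(granule)
    return granule

print(run({"file": "MODIS_20141201.raw"}))
```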

  15. Agile Data Curation at a State Geological Survey

    NASA Astrophysics Data System (ADS)

    Hills, D. J.

    2015-12-01

    State agencies, including geological surveys, are often the gatekeepers for myriad data products essential for scientific research and economic development. For example, the Geological Survey of Alabama (GSA) is mandated to explore for, characterize, and report Alabama's mineral, energy, water, and biological resources in support of economic development, conservation, management, and public policy for the betterment of Alabama's citizens, communities, and businesses. As part of that mandate, the GSA has increasingly been called upon to make our data more accessible to stakeholders. Even as demand for greater data accessibility grows, budgets for such efforts are often small, meaning that agencies must do more for less. Agile software development has yielded efficient, effective products, most often at lower cost and in shorter time. Taking guidance from the agile software development model, the GSA is working towards more agile data management and curation. To date, the GSA's work has been focused primarily on data rescue. By using workflows that maximize clear communication while encouraging simplicity (e.g., maximizing the amount of work not done or that can be automated), the GSA is bringing decades of dark data into the light. Regular checks by the data rescuer with the data provider (or their proxy) provides quality control without adding an overt burden on either party. Moving forward, these workflows will also allow for more efficient and effective data management.

  16. Architectures Toward Reusable Science Data Systems

    NASA Technical Reports Server (NTRS)

    Moses, John

    2015-01-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAAs Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today. System functions such as ingest, product generation and distribution need to be configured and performed in a consistent and repeatable way with an emphasis on scalability. This paper will examine the key architectural elements of several NASA satellite data processing systems currently in operation and under development that make them suitable for scaling and reuse. Examples of architectural elements that have become attractive include virtual machine environments, standard data product formats, metadata content and file naming, workflow and job management frameworks, data acquisition, search, and distribution protocols. By highlighting key elements and implementation experience we expect to find architectures that will outlast their original application and be readily adaptable for new applications. Concepts and principles are explored that lead to sound guidance for SDS developers and strategists.

  17. Total Force Restructuring Under Sequestration and Austere Budget Reductions

    DTIC Science & Technology

    2013-03-01

    jointly released guidance on January 16, 2013 that addresses near-term expenditure reductions in an attempt to mitigate future risks. ... Guidance ... implication for the Army is that force structure decisions are still forthcoming and will bear much risk directly related to future levels of ... their benefits and risks. The alternatives are also underpinned by assumptions that are designed to enhance their scope and not provide limitations ...

  18. DEWEY: the DICOM-enabled workflow engine system.

    PubMed

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

    Workflow is a widely used term to describe the sequence of steps required to accomplish a task. The use of workflow technology in medicine, and in medical imaging in particular, is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department, designed for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow, and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.
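
    A hedged sketch of the core mechanism, routing an arriving DICOM object to a workflow step based on its header, is shown below. pydicom is a real library; the routing table and preparation steps are hypothetical stand-ins for the engine's tracked tasks.

```python
# Minimal sketch of a DICOM-enabled workflow step: inspect an incoming
# object's header and dispatch the matching preparation workflow.
from pathlib import Path
import pydicom

def prepare_ct(ds):  print("CT prep for accession", ds.AccessionNumber)
def prepare_mr(ds):  print("MR prep for accession", ds.AccessionNumber)

ROUTES = {"CT": prepare_ct, "MR": prepare_mr}   # modality -> workflow step

def on_arrival(dicom_file: Path) -> None:
    # header-only read; pixel data is not needed for routing
    ds = pydicom.dcmread(dicom_file, stop_before_pixels=True)
    step = ROUTES.get(ds.Modality)
    if step is None:
        print("no workflow registered for modality", ds.Modality)
    else:
        step(ds)  # a real engine would enqueue a tracked, retryable task

for f in Path("incoming").glob("*.dcm"):
    on_arrival(f)
```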

  19. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    PubMed

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a cloud-enabled web-interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org.
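
    The hierarchical-workflow concept, in which a sub-workflow authored elsewhere composes as a single node of the enclosing workflow, can be sketched briefly. The wrapper and task names below are illustrative, not Tavaxy's implementation.

```python
# Sketch of hierarchical workflows: a (possibly foreign) pipeline is wrapped
# so it behaves as one node inside a larger workflow.
from typing import Callable

Task = Callable[[dict], dict]

def sequence(*tasks: Task) -> Task:
    """Workflow pattern: run tasks in order, threading the data through."""
    def run(data: dict) -> dict:
        for t in tasks:
            data = t(data)
        return data
    return run

def trim(d: dict) -> dict:
    d["reads"] = d["reads"] + ":trimmed"; return d

def align(d: dict) -> dict:
    d["bam"] = d["reads"] + ".bam"; return d

# A sub-workflow (imagine it was authored for another system) is one node:
imported_subworkflow = sequence(trim, align)
pipeline = sequence(imported_subworkflow, lambda d: {**d, "done": True})
print(pipeline({"reads": "sample1.fastq"}))
```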

  20. SU-F-J-110: MRI-Guided Single-Session Simulation, Online Adaptation, and Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, P; Geurts, M; Mittauer, K

    Purpose: To develop a combined simulation and treatment workflow for MRI-guided radiation therapy using the ViewRay treatment planning and delivery system. Methods: Several features of the ViewRay MRIdian planning and treatment workflows are used to simulate and treat patients who require emergent radiotherapy. A simple “pre-plan” is created on diagnostic imaging retrieved from radiology PACS, where conformal fields are created to target a volume defined by a physician based on review of the diagnostic images and chart notes. After initial consult in radiation oncology, the patient is brought to the treatment room, immobilized, and imaged in treatment position with a volumetric MR. While the patient rests on the table, the pre-plan is applied to the treatment planning MR and dose is calculated in the treatment geometry. After physician review, modification of the plan may include updating the target definition, redefining fields, or re-balancing beam weights. Once an acceptable treatment plan is finalized and approved, the patient is treated. Results: Careful preparation and judicious choices in the online planning process allow conformal treatment plans to be created and delivered in a single, thirty-minute session. Several advantages have been identified using this process as compared to conventional urgent CT simulation and delivery. Efficiency gains are notable, as physicians appreciate the predictable time commitment and patient waiting time for treatment is decreased. MR guidance in a treatment position offers both enhanced contrast for target delineation and reduction of setup uncertainties. The MRIdian system tools designed for adaptive radiotherapy are particularly useful, enabling plan changes to be made in minutes. Finally, the resulting plans, typically 6 conformal beams, are delivered as quickly as more conventional AP/PA beam arrangements with comparatively superior dose distributions. Conclusion: The ViewRay treatment planning software and delivery system can accommodate a fast simulation and treatment workflow.

  1. Inferring Clinical Workflow Efficiency via Electronic Medical Record Utilization

    PubMed Central

    Chen, You; Xie, Wei; Gunter, Carl A; Liebovitz, David; Mehrotra, Sanjay; Zhang, He; Malin, Bradley

    2015-01-01

    Complexity in clinical workflows can lead to inefficiency in making diagnoses, ineffectiveness of treatment plans and uninformed management of healthcare organizations (HCOs). Traditional strategies to manage workflow complexity are based on measuring the gaps between workflows defined by HCO administrators and the actual processes followed by staff in the clinic. However, existing methods tend to neglect the influences of EMR systems on the utilization of workflows, which could be leveraged to optimize workflows facilitated through the EMR. In this paper, we introduce a framework to infer clinical workflows through the utilization of an EMR and show how such workflows roughly partition into four types according to their efficiency. Our framework infers workflows at several levels of granularity through data mining technologies. We study four months of EMR event logs from a large medical center, including 16,569 inpatient stays, and illustrate that approximately 95% of workflows are efficient and that 80% of patients are on such workflows. At the same time, we show that the remaining 5% of workflows may be inefficient due to a variety of factors, such as complex patients. PMID:26958173
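
    The basic inference move, deriving per-stay event sequences from the utilization log and counting transitions between actions, can be sketched as follows. The event fields and toy log are illustrative; the paper's full framework mines workflows at several granularities.

```python
# Sketch of workflow inference from an EMR event log: group events by stay,
# order them in time, and count action-to-action transitions.
from collections import Counter
from itertools import pairwise

events = [  # (stay_id, timestamp, EMR action) -- toy event log
    (1, 1, "admit"), (1, 2, "order-labs"), (1, 3, "review-results"),
    (2, 1, "admit"), (2, 2, "order-labs"), (2, 3, "order-imaging"),
]

by_stay: dict[int, list[str]] = {}
for stay, ts, action in sorted(events):          # sorts by stay, then time
    by_stay.setdefault(stay, []).append(action)

transitions = Counter(
    pair for seq in by_stay.values() for pair in pairwise(seq)
)
for (a, b), n in transitions.most_common():
    print(f"{a} -> {b}: {n}")                    # edges of the inferred flow
```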

  2. TU-E-BRB-00: Deformable Image Registration: Is It Right for Your Clinic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    Deformable image registration (DIR) is developing rapidly and is poised to substantially improve dose fusion accuracy for adaptive and retreatment planning, motion management, and PET fusion to enhance contour delineation for treatment planning. However, DIR dose warping accuracy is difficult to quantify in general, and particularly difficult to quantify on a patient-specific basis. As clinical DIR options become more widely available, there is an increased need to understand the implications of incorporating DIR into clinical workflow. Several groups have assessed DIR accuracy in clinically relevant scenarios, but no comprehensive review material is yet available. This session will also discuss aspects of the official report of AAPM Task Group 132 on the Use of Image Registration and Data Fusion Algorithms and Techniques in Radiotherapy Treatment Planning, which provides recommendations for DIR clinical use. We will summarize and compare various commercial DIR software options, outline successful clinical techniques, show specific examples with discussion of appropriate and inappropriate applications of DIR, discuss the clinical implications of DIR, provide an overview of current DIR error analysis research, review QA options and research phantom development, and present TG-132 recommendations. Learning Objectives: 1. Compare and contrast commercial DIR software and QA options. 2. Review clinical DIR workflow for retreatment. 3. Understand uncertainties introduced by DIR. 4. Review TG-132 proposed recommendations.

  3. Workarounds to Intended Use of Health Information Technology: A Narrative Review of the Human Factors Engineering Literature.

    PubMed

    Patterson, Emily S

    2018-05-01

    Objective To integrate and synthesize insights from recent studies of workarounds to the intended use of health information technology (HIT) by health care professionals. Background Systems are safest when the documentation of how work is done in policies and procedures closely matches what people actually do when they are working. Proactively identifying and managing workarounds to the intended use of technology, including deviations from expected workflows, can improve system safety. Method A narrative review of studies of workarounds with HIT was conducted to identify themes in the literature. Results Three themes were identified: (1) Users circumvented new additional steps in the workflow when using HIT, (2) interdisciplinary team members communicated via HIT in text fields that were intended for other purposes, and (3) locally developed paper-based and manual whiteboard systems were used instead of HIT to support situation awareness of individuals and groups; an example of a locally developed system was handwritten notes about a patient on a piece of paper folded up and carried in a nurse's pocket. Conclusion Workarounds were employed to avoid changes to workflow, enable interdisciplinary communication, coordinate activities, and have real-time portable access to summarized and synthesized information. Application Implications for practice include providing summary overview displays, explicitly supporting role-based communication and coordination through HIT, and reducing the risk to reputation due to electronic monitoring of individual performance.

  4. Workflow management systems in radiology

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim

    1998-07-01

    In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, emphasizing the separation of workflow management systems from application systems and the consequences this separation has for the architecture of workflow oriented information systems, including an appropriate workflow terminology and the definition of standard interfaces for workflow aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
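
    The separation argued for here can be sketched as an enactment service that reaches applications only through a standard interface. The interface and the reporting-application stand-in below are illustrative, not a proposal from the paper.

```python
# Sketch: the enactment service owns process routing and knows nothing about
# application internals; applications only implement the standard interface.
from typing import Protocol

class WorkflowAwareApp(Protocol):
    def perform(self, work_item: dict) -> None: ...

class EnactmentService:
    """Autonomous WfMS core: registers clients, dispatches work items."""
    def __init__(self) -> None:
        self.clients: dict[str, WorkflowAwareApp] = {}

    def register(self, role: str, app: WorkflowAwareApp) -> None:
        self.clients[role] = app

    def dispatch(self, work_item: dict) -> None:
        self.clients[work_item["role"]].perform(work_item)

class ReportingApp:
    def perform(self, work_item: dict) -> None:
        print("dictating report for study", work_item["study"])

wfms = EnactmentService()
wfms.register("radiologist", ReportingApp())
wfms.dispatch({"role": "radiologist", "study": "CT-123"})
```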

  5. New federal guidelines for physician-pharmaceutical industry relations: the politics of policy formation.

    PubMed

    Chimonas, Susan; Rothman, David J

    2005-01-01

    In October 2002 the federal government issued a draft "Compliance Program Guidance for Pharmaceutical Manufacturers." The draft Guidance questioned the legality of many arrangements heretofore left to the discretion of physicians and drug companies, including industry-funded educational and research grants, consultancies, and gifts. Medical organizations and drug manufacturers proposed major revisions to the draft, arguing that current practices were in everyone's best interest. To evaluate the impact of their responses, we compare the draft, the changes requested by industry and organized medicine, and the final Guidance document (issued in April 2003). We also explore the implications--some intended, others unanticipated--of the final document.

  6. Multiple cytoskeletal pathways and PI3K signaling mediate CDC-42-induced neuronal protrusion in C. elegans.

    PubMed

    Alan, Jamie K; Struckhoff, Eric C; Lundquist, Erik A

    2013-01-01

    Rho GTPases are key regulators of cellular protrusion and are involved in many developmental events including axon guidance during nervous system development. Rho GTPase pathways display functional redundancy in developmental events, including axon guidance. Therefore, their roles can often be masked when using simple loss-of-function genetic approaches. As a complement to loss-of-function genetics, we constructed a constitutively activated CDC-42(G12V) expressed in C. elegans neurons. CDC-42(G12V) drove the formation of ectopic lamellipodial and filopodial protrusions in the PDE neurons, which resembled protrusions normally found on migrating growth cones of axons. We then used a candidate gene approach to identify molecules that mediate CDC-42(G12V)-induced ectopic protrusions by determining if loss of function of the genes could suppress CDC-42(G12V). Using this approach, we identified 3 cytoskeletal pathways previously implicated in axon guidance, the Arp2/3 complex, UNC-115/abLIM, and UNC-34/Ena. We also identified the Nck-interacting kinase MIG-15/NIK and p21-activated kinases (PAKs), also implicated in axon guidance. Finally, PI3K signaling was required, specifically the Rictor/mTORC2 branch but not the mTORC1 branch that has been implicated in other aspects of PI3K signaling including stress and aging. Our results indicate that multiple pathways can mediate CDC-42-induced neuronal protrusions that might be relevant to growth cone protrusions during axon pathfinding. Each of these pathways involves Rac GTPases, which might serve to integrate the pathways and coordinate the multiple CDC-42 pathways. These pathways might be relevant to developmental events such as axon pathfinding as well as disease states such as metastatic melanoma.

  7. Multiple cytoskeletal pathways and PI3K signaling mediate CDC-42-induced neuronal protrusion in C. elegans

    PubMed Central

    Alan, Jamie K; Struckhoff, Eric C; Lundquist, Erik A

    2013-01-01

    Rho GTPases are key regulators of cellular protrusion and are involved in many developmental events including axon guidance during nervous system development. Rho GTPase pathways display functional redundancy in developmental events, including axon guidance. Therefore, their roles can often be masked when using simple loss-of-function genetic approaches. As a complement to loss-of-function genetics, we constructed a constitutively activated CDC-42(G12V) expressed in C. elegans neurons. CDC-42(G12V) drove the formation of ectopic lamellipodial and filopodial protrusions in the PDE neurons, which resembled protrusions normally found on migrating growth cones of axons. We then used a candidate gene approach to identify molecules that mediate CDC-42(G12V)-induced ectopic protrusions by determining if loss of function of the genes could suppress CDC-42(G12V). Using this approach, we identified 3 cytoskeletal pathways previously implicated in axon guidance, the Arp2/3 complex, UNC-115/abLIM, and UNC-34/Ena. We also identified the Nck-interacting kinase MIG-15/NIK and p21-activated kinases (PAKs), also implicated in axon guidance. Finally, PI3K signaling was required, specifically the Rictor/mTORC2 branch but not the mTORC1 branch that has been implicated in other aspects of PI3K signaling including stress and aging. Our results indicate that multiple pathways can mediate CDC-42-induced neuronal protrusions that might be relevant to growth cone protrusions during axon pathfinding. Each of these pathways involves Rac GTPases, which might serve to integrate the pathways and coordinate the multiple CDC-42 pathways. These pathways might be relevant to developmental events such as axon pathfinding as well as disease states such as metastatic melanoma. PMID:24149939

  8. Bioinformatics workflows and web services in systems biology made easy for experimentalists.

    PubMed

    Jimenez, Rafael C; Corpas, Manuel

    2013-01-01

    Workflows are useful to perform data analysis and integration in systems biology. Workflow management systems can help users create workflows without any previous knowledge of programming or web services. However, the computational skills required to build such workflows are usually above the level most biological experimentalists are comfortable with. In this chapter we introduce workflow management systems that reuse existing workflows instead of creating them, making it easier for experimentalists to perform computational tasks.

  9. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    PubMed Central

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a cloud-enabled web-interface or downloaded and installed to run within the user's local environment. All resources related to Tavaxy are available at http://www.tavaxy.org. PMID:22559942

  10. Performances of the PIPER scalable child human body model in accident reconstruction

    PubMed Central

    Giordano, Chiara; Kleiven, Svein

    2017-01-01

    Human body models (HBMs) have the potential to provide significant insights into the pediatric response to impact. This study describes a scalable/posable approach to perform child accident reconstructions using the Position and Personalize Advanced Human Body Models for Injury Prediction (PIPER) scalable child HBM of different ages and in different positions obtained by the PIPER tool. Overall, the PIPER scalable child HBM managed reasonably well to predict the injury severity and location of the children involved in real-life crash scenarios documented in the medical records. The developed methodology and workflow are essential for future work to determine child injury tolerances based on the full Child Advanced Safety Project for European Roads (CASPER) accident reconstruction database. With the workflow presented in this study, the open-source PIPER scalable HBM combined with the PIPER tool is also foreseen to have implications for improved safety designs for better protection of children in traffic accidents. PMID:29135997

  11. Advanced medical imaging protocol workflow-a flexible electronic solution to optimize process efficiency, care quality and patient safety in the National VA Enterprise.

    PubMed

    Medverd, Jonathan R; Cross, Nathan M; Font, Frank; Casertano, Andrew

    2013-08-01

    Radiologists routinely make decisions with only limited information when assigning protocol instructions for the performance of advanced medical imaging examinations. Opportunity exists to simultaneously improve the safety, quality and efficiency of this workflow through the application of an electronic solution leveraging health system resources to provide concise, tailored information and decision support in real-time. Such a system has been developed using an open source, open standards design for use within the Veterans Health Administration. The Radiology Protocol Tool Recorder (RAPTOR) project identified key process attributes as well as inherent weaknesses of paper processes and electronic emulators of paper processes to guide the development of its optimized electronic solution. The design provides a kernel that can be expanded to create an integrated radiology environment. RAPTOR has implications relevant to the greater health care community, and serves as a case model for modernization of legacy government health information systems.
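
    A hedged sketch of this style of real-time decision support, surfacing tailored patient context and a suggested protocol at the moment of protocoling, follows. The rules and record fields are hypothetical and are not RAPTOR's actual logic.

```python
# Sketch: gather the context a protocoling radiologist needs and apply a
# simple, illustrative rule to suggest a protocol. Fields and thresholds
# are invented for illustration only; they are not clinical guidance.
def suggest_protocol(order: dict) -> dict:
    context = {
        "egfr": order.get("egfr"),               # renal function for contrast
        "allergies": order.get("allergies", []),
        "priors": order.get("prior_exams", []),
    }
    use_contrast = (
        context["egfr"] is not None and context["egfr"] >= 30
        and "iodinated contrast" not in context["allergies"]
    )
    protocol = ("CT abdomen/pelvis with contrast" if use_contrast
                else "CT abdomen/pelvis without contrast")
    return {"protocol": protocol, "context": context}

order = {"egfr": 25, "allergies": [], "prior_exams": ["CT 2012"]}
print(suggest_protocol(order))
```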

  12. Architecture of a high-performance surgical guidance system based on C-arm cone-beam CT: software platform for technical integration and clinical translation

    NASA Astrophysics Data System (ADS)

    Uneri, Ali; Schafer, Sebastian; Mirota, Daniel; Nithiananthan, Sajendra; Otake, Yoshito; Reaungamornrat, Sureerat; Yoo, Jongheun; Stayman, J. Webster; Reh, Douglas; Gallia, Gary L.; Khanna, A. Jay; Hager, Gregory; Taylor, Russell H.; Kleinszig, Gerhard; Siewerdsen, Jeffrey H.

    2011-03-01

    Intraoperative imaging modalities are becoming more prevalent in recent years, and the need for integration of these modalities with surgical guidance is rising, creating new possibilities as well as challenges. In the context of such emerging technologies and new clinical applications, a software architecture for cone-beam CT (CBCT) guided surgery has been developed with emphasis on binding open-source surgical navigation libraries and integrating intraoperative CBCT with novel, application-specific registration and guidance technologies. The architecture design is focused on accelerating translation of task-specific technical development in a wide range of applications, including orthopaedic, head-and-neck, and thoracic surgeries. The surgical guidance system is interfaced with a prototype mobile C-arm for high-quality CBCT, and through a modular software architecture, integration of different tools and devices consistent with surgical workflow in each of these applications is realized. Specific modules are developed according to the surgical task, such as: 3D-3D rigid or deformable registration of preoperative images, surgical planning data, and up-to-date CBCT images; 3D-2D registration of planning and image data in real-time fluoroscopy and/or digitally reconstructed radiographs (DRRs); compatibility with infrared, electromagnetic, and video-based trackers used individually or in hybrid arrangements; augmented overlay of image and planning data in endoscopic or in-room video; real-time "virtual fluoroscopy" computed from GPU-accelerated DRRs; and multi-modality image display. The platform aims to minimize offline data processing by exposing quantitative tools that analyze and communicate factors of geometric precision. The system was translated to preclinical phantom and cadaver studies for assessment of fiducial (FRE) and target registration error (TRE), showing sub-mm accuracy in targeting and video overlay within intraoperative CBCT. The work culminates in the development of a CBCT guidance system (reported here for the first time) that leverages the technical developments in C-arm CBCT and associated technologies to realize a high-performance system for translation to clinical studies.
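
    The fiducial registration step that FRE quantifies can be illustrated with a standard SVD-based (Kabsch) rigid alignment. The point values below are made up, and the code is a generic sketch rather than the platform's implementation.

```python
# Sketch: least-squares rigid 3D-3D registration (Kabsch method) and the
# resulting fiducial registration error (FRE).
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Rotation R and translation t minimizing ||R @ src + t - dst||."""
    c_src, c_dst = src.mean(0), dst.mean(0)
    H = (src - c_src).T @ (dst - c_dst)           # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# made-up fiducial positions: dst is src rotated 90 degrees and translated
src = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10.0]])
dst = src @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]]).T + [5, 2, 1]

R, t = rigid_register(src, dst)
fre = np.sqrt(np.mean(np.sum((src @ R.T + t - dst) ** 2, axis=1)))
print(f"FRE = {fre:.3f} (same units as the input points)")
```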

  13. Kwf-Grid workflow management system for Earth science applications

    NASA Astrophysics Data System (ADS)

    Tran, V.; Hluchy, L.

    2009-04-01

    In this paper, we present a workflow management tool for Earth science applications in EGEE. The workflow management tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features, such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the workflow management tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within the "Knowledge-based Workflow System for Grid Applications" project under the 6th Framework Programme. The workflow management system is intended to: semi-automatically compose a workflow of Grid services; execute the composed workflow application in a Grid computing environment; monitor the performance of the Grid infrastructure and the Grid applications; analyze the resulting monitoring information; capture the knowledge that is contained in the information by means of intelligent agents; and finally reuse the joined knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g. GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, which allows the system to manage and execute gLite jobs in the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite allows EGEE users to use the system and benefit from its advanced features. The system is primarily tested and evaluated with applications from ES clusters.

  14. Real-time Fluorescence Image-Guided Oncologic Surgery

    PubMed Central

    Mondal, Suman B.; Gao, Shengkui; Zhu, Nan; Liang, Rongguang; Gruev, Viktor; Achilefu, Samuel

    2014-01-01

    Medical imaging plays a critical role in cancer diagnosis and planning. Many cancer patients rely on surgical intervention for curative outcomes. This requires a careful identification of the primary and microscopic tumors, and the complete removal of cancer. Although there have been efforts to adapt traditional imaging modalities for intraoperative image guidance, they suffer from several constraints such as large hardware footprint, high operation cost, and disruption of the surgical workflow. Because of the ease of image acquisition, relatively low cost devices and intuitive operation, optical imaging methods have received tremendous interest for use in real-time image-guided surgery. To improve imaging depth under low interference by tissue autofluorescence, many of these applications utilize light in the near-infrared (NIR) wavelengths, which is invisible to human eyes. With the availability of a wide selection of tumor-avid contrast agents, and advancements in imaging sensors, electronic and optical designs, surgeons are able to combine different attributes of NIR optical imaging techniques to improve treatment outcomes. The emergence of diverse commercial and experimental image guidance systems, which are in various stages of clinical translation, attests to the potential high impact of intraoperative optical imaging methods to improve the speed of oncologic surgery with high accuracy and minimal margin positivity. PMID:25287689

  15. Interventional magnetic resonance imaging-guided cell transplantation into the brain with radially branched deployment.

    PubMed

    Silvestrini, Matthew T; Yin, Dali; Martin, Alastair J; Coppes, Valerie G; Mann, Preeti; Larson, Paul S; Starr, Philip A; Zeng, Xianmin; Gupta, Nalin; Panter, S S; Desai, Tejal A; Lim, Daniel A

    2015-01-01

    Intracerebral cell transplantation is being pursued as a treatment for many neurological diseases, and effective cell delivery is critical for clinical success. To facilitate intracerebral cell transplantation at the scale and complexity of the human brain, we developed a platform technology that enables radially branched deployment (RBD) of cells to multiple target locations at variable radial distances and depths along the initial brain penetration tract with real-time interventional magnetic resonance image (iMRI) guidance. iMRI-guided RBD functioned as an "add-on" to standard neurosurgical and imaging workflows, and procedures were performed in a commonly available clinical MRI scanner. Multiple deposits of superparamagnetic iron oxide beads were safely delivered to the striatum of live swine, and distribution to the entire putamen was achieved via a single cannula insertion in human cadaveric heads. Human embryonic stem cell-derived dopaminergic neurons were biocompatible with the iMRI-guided RBD platform and successfully delivered with iMRI guidance into the swine striatum. Thus, iMRI-guided RBD overcomes some of the technical limitations inherent to the use of straight cannulas and standard stereotactic targeting. This platform technology could have a major impact on the clinical translation of a wide range of cell therapeutics for the treatment of many neurological diseases.

  16. Structure of Work as Purposeful Activity

    ERIC Educational Resources Information Center

    Quey, Richard L.

    1971-01-01

    Work is purposeful activity linking the present to the future through manipulation of objects, symbols, and experience and possessing broad implications for education, vocational guidance and human work satisfaction. (Author)

  17. Successful Completion of FY18/Q1 ASC L2 Milestone 6355: Electrical Analysis Calibration Workflow Capability Demonstration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copps, Kevin D.

    The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts’ use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today’s SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to the electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.

  18. Applying human rights to maternal health: UN Technical Guidance on rights-based approaches.

    PubMed

    Yamin, Alicia Ely

    2013-05-01

    In the last few years there have been several critical milestones in acknowledging the centrality of human rights to sustainably addressing the scourge of maternal death and morbidity around the world, including from the United Nations Human Rights Council. In 2012, the Council adopted a resolution welcoming a Technical Guidance on rights-based approaches to maternal mortality and morbidity, and calling for a report on its implementation in 2 years. The present paper provides an overview of the contents and significance of the Guidance. It reviews how the Guidance can assist policymakers in improving women's health and their enjoyment of rights by setting out the implications of adopting a human rights-based approach at each step of the policy cycle, from planning and budgeting, to ensuring implementation, to monitoring and evaluation, to fostering accountability mechanisms. The Guidance should also prove useful to clinicians in understanding rights frameworks as applied to maternal health. Copyright © 2013 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  19. Open access: changing global science publishing.

    PubMed

    Gasparyan, Armen Yuri; Ayvazyan, Lilit; Kitas, George D

    2013-08-01

    The article reflects on open access as a strategy of changing the quality of science communication globally. Successful examples of open-access journals are presented to highlight implications of archiving in open digital repositories for the quality and citability of research output. Advantages and downsides of gold, green, and hybrid models of open access operating in diverse scientific environments are described. It is assumed that open access is a global trend which influences the workflow in scholarly journals, changing their quality, credibility, and indexability.

  20. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    PubMed

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.
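
    The label-data capture step can be sketched with off-the-shelf OCR. pytesseract and Pillow are real libraries; the folder layout and record shape below are illustrative, not the Edinburgh workflow's actual software.

```python
# Sketch: OCR over specimen images, staging raw label text for later
# curation alongside the rapidly captured curatorial data.
from pathlib import Path
from PIL import Image
import pytesseract

def capture_label_data(image_dir: str) -> list[dict]:
    records = []
    for img_path in sorted(Path(image_dir).glob("*.jpg")):
        text = pytesseract.image_to_string(Image.open(img_path))
        records.append({
            "barcode": img_path.stem,        # curatorial data (fast capture)
            "label_text_raw": text.strip(),  # label data (OCR, needs review)
        })
    return records

for rec in capture_label_data("herbarium_images"):
    print(rec["barcode"], "->", rec["label_text_raw"][:40])
```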

  1. Distribution of guidance models for cardiac resynchronization therapy in the setting of multi-center clinical trials

    NASA Astrophysics Data System (ADS)

    Rajchl, Martin; Abhari, Kamyar; Stirrat, John; Ukwatta, Eranga; Cantor, Diego; Li, Feng P.; Peters, Terry M.; White, James A.

    2014-03-01

    Multi-center trials provide the unique ability to investigate novel techniques across a range of geographical sites with sufficient statistical power, while the inclusion of multiple operators establishes feasibility under a wider array of clinical environments and workflows. For this purpose, we introduce a new means of distributing pre-procedural cardiac models for image-guided interventions across a large-scale multi-center trial. In this method, a single core facility is responsible for image processing, employing a novel web-based interface for model visualization and distribution. The requirements for such an interface, being WebGL-based, are minimal and well within the realms of accessibility for participating centers. We then demonstrate the accuracy of our approach using a single-center pacemaker lead implantation trial with generic planning models.

  2. MRI-only treatment planning: benefits and challenges

    NASA Astrophysics Data System (ADS)

    Owrangi, Amir M.; Greer, Peter B.; Glide-Hurst, Carri K.

    2018-03-01

    Over the past decade, the application of magnetic resonance imaging (MRI) has increased, and there is growing evidence to suggest that improvements in the accuracy of target delineation in MRI-guided radiation therapy may improve clinical outcomes in a variety of cancer types. However, some considerations should be recognized, including patient motion during image acquisition and the geometric accuracy of the images. Moreover, MR-compatible immobilization devices need to be used when acquiring images in the treatment position, while minimizing patient motion during the scan time. Finally, synthetic CT images (i.e. electron density maps) and digitally reconstructed radiograph images should be generated from the MRI images for dose calculation and image guidance prior to treatment. A short review of the concepts and techniques that have been developed for the implementation of MRI-only workflows in radiation therapy is provided in this document.
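
    One simple route to a synthetic CT, bulk density assignment of nominal HU values to segmented tissue classes, can be sketched as follows. The class labels and HU values are illustrative, and bulk assignment is only one of several published approaches to synthetic CT generation.

```python
# Sketch of bulk density assignment: each segmented tissue class on the MR
# image receives a nominal Hounsfield unit value, yielding an electron-
# density surrogate for dose calculation.
import numpy as np

# toy 3-class segmentation of an MR volume: 0=air, 1=soft tissue, 2=bone
segmentation = np.zeros((4, 4, 4), dtype=np.int8)
segmentation[1:3, 1:3, 1:3] = 1
segmentation[2, 2, 2] = 2

BULK_HU = {0: -1000, 1: 0, 2: 700}   # illustrative nominal HU per class

synthetic_ct = np.vectorize(BULK_HU.get)(segmentation).astype(np.int16)
print(synthetic_ct[2])               # one axial slice of the synthetic CT
```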

  3. Incidental findings found in “healthy” volunteers during imaging performed for research: current legal and ethical implications

    PubMed Central

    Booth, T C; Jackson, A; Wardlaw, J M; Taylor, S A; Waldman, A D

    2010-01-01

    Incidental findings found in “healthy” volunteers during research imaging are common and have important implications for study design and performance, particularly in the areas of informed consent, subjects' rights, clinical image analysis and disclosure. In this study, we aimed to determine current practice and regulations concerning information that should be given to research subjects when obtaining consent, reporting of research images, who should be informed about any incidental findings and the method of disclosure. We reviewed all UK, European and international humanitarian, legal and ethical agencies' guidance. We found that the guidance on what constitutes incidental pathology, how to recognise it and what to do about it is inconsistent between agencies, difficult to find and less complete in the UK than elsewhere. Where given, guidance states that volunteers should be informed during the consent process about how research images will be managed, whether a mechanism exists for identifying incidental findings, arrangements for their disclosure, the potential benefit or harm and therapeutic options. The effects of incidentally discovered pathology on the individual can be complex and far-reaching. Radiologist involvement in analysis of research images varies widely; many incidental findings might therefore go unrecognised. In conclusion, guidance on the management of research imaging is inconsistent, limited and does not address the interests of volunteers. Improved standards to guide management of research images and incidental findings are urgently required. PMID:20335427

  4. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
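
    The Atom-based description of workflow resources can be illustrated by parsing a small catalogue feed. The feed content below is invented for illustration; only the resource-oriented pattern is the point.

```python
# Sketch: workflows published as entries in an Atom feed, each carrying a
# link to the resource that represents (and can trigger) the workflow.
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"
feed = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><title>NO2 plume analysis</title>
         <link href="https://example.org/workflows/no2-plume"/></entry>
</feed>"""

root = ET.fromstring(feed)
for entry in root.findall(f"{ATOM}entry"):
    title = entry.find(f"{ATOM}title").text
    href = entry.find(f"{ATOM}link").attrib["href"]
    print(f"workflow resource: {title} -> {href}")
```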

  5. Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state-of-the-art in scientific workflows have focused on the following areas; progress in each is described in subsequent sections. Workflow development: the development of a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows. Generic workflow components and templates: the development of generic actors (i.e. workflow components and processes) which can be broadly applied to scientific problems. Provenance collection and analysis: the design of a flexible provenance collection and analysis infrastructure within the workflow environment. Workflow reliability and fault tolerance: the improvement of the reliability and fault-tolerance of workflow environments.
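
    Provenance collection of the kind described, recording what ran, with which inputs, and what it produced, can be sketched with a decorator. The record structure below is an illustration, not Kepler's actual provenance schema.

```python
# Sketch: capture a provenance record for each workflow step as it executes.
import functools
import json
import time

PROVENANCE: list[dict] = []

def traced(step):
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        t0 = time.time()
        result = step(*args, **kwargs)
        PROVENANCE.append({
            "step": step.__name__,
            "inputs": repr((args, kwargs)),
            "output": repr(result),
            "seconds": round(time.time() - t0, 3),
        })
        return result
    return wrapper

@traced
def normalize(values):
    peak = max(values)
    return [v / peak for v in values]

normalize([2.0, 4.0, 8.0])
print(json.dumps(PROVENANCE, indent=2))   # the collected provenance trail
```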

  6. Drosophila photoreceptor axon guidance and targeting requires the dreadlocks SH2/SH3 adapter protein.

    PubMed

    Garrity, P A; Rao, Y; Salecker, I; McGlade, J; Pawson, T; Zipursky, S L

    1996-05-31

    Mutations in the Drosophila gene dreadlocks (dock) disrupt photoreceptor cell (R cell) axon guidance and targeting. Genetic mosaic analysis and cell-type-specific expression of dock transgenes demonstrate dock is required in R cells for proper innervation. Dock protein contains one SH2 and three SH3 domains, implicating it in tyrosine kinase signaling, and is highly related to the human proto-oncogene Nck. Dock expression is detected in R cell growth cones in the target region. We propose Dock transmits signals in the growth cone in response to guidance and targeting cues. These findings provide an important step for dissection of signaling pathways regulating growth cone motility.

  7. Domain requirements for the Dock adapter protein in growth-cone signaling.

    PubMed

    Rao, Y; Zipursky, S L

    1998-03-03

    Tyrosine phosphorylation has been implicated in growth-cone guidance through genetic, biochemical, and pharmacological studies. Adapter proteins containing src homology 2 (SH2) domains and src homology 3 (SH3) domains provide a means of linking guidance signaling through phosphotyrosine to downstream effectors regulating growth-cone motility. The Drosophila adapter, Dreadlocks (Dock), the homolog of mammalian Nck containing three N-terminal SH3 domains and a single SH2 domain, is highly specialized for growth-cone guidance. In this paper, we demonstrate that Dock can couple signals in either an SH2-dependent or an SH2-independent fashion in photoreceptor (R cell) growth cones, and that Dock displays different domain requirements in different neurons.

  8. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    PubMed Central

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow. PMID:22859881

  9. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to and architecture of workflow code for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
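
    The asynchronous pattern described here (submit a request, receive a status resource, then poll while remaining free to do other work) can be sketched as follows. The service endpoints and status fields are hypothetical stand-ins for an OGC-style processing service.

```python
# Sketch: asynchronous request/poll interaction with a long-running
# geospatial processing workflow.
import time
import requests

BASE = "https://ows.example.org"                  # hypothetical service

resp = requests.post(f"{BASE}/processes/flood-extent/jobs",
                     json={"scene": "landsat-123"})
resp.raise_for_status()
status_url = resp.headers["Location"]             # job resource to poll

for _ in range(120):                              # poll for up to 10 minutes
    job = requests.get(status_url).json()         # assumed: {"status": ...}
    if job["status"] in ("succeeded", "failed"):
        break
    time.sleep(5)                                 # client stays responsive

print("job finished:", job["status"])
```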

  10. Guest Editor's Introduction

    NASA Astrophysics Data System (ADS)

    Chrysanthis, Panos K.

    1996-12-01

    Computer Science Department, University of Pittsburgh, Pittsburgh, PA 15260, USA

    This special issue focuses on current efforts to represent and support workflows that integrate information systems and human resources within a business or manufacturing enterprise. Workflows may also be viewed as an emerging computational paradigm for effective structuring of cooperative applications involving human users and access to diverse data types not necessarily maintained by traditional database management systems. A workflow is an automated organizational process (also called business process) which consists of a set of activities or tasks that need to be executed in a particular controlled order over a combination of heterogeneous database systems and legacy systems. Within workflows, tasks are performed cooperatively by either human or computational agents in accordance with their roles in the organizational hierarchy. The challenge in facilitating the implementation of workflows lies in developing efficient workflow management systems. A workflow management system (also called workflow server, workflow engine or workflow enactment system) provides the necessary interfaces for coordination and communication among human and computational agents to execute the tasks involved in a workflow, and controls the execution orderings of tasks as well as the flow of data that these tasks manipulate. That is, the workflow management system is responsible for correctly and reliably supporting the specification, execution, and monitoring of workflows. The six papers selected (out of the twenty-seven submitted for this special issue of Distributed Systems Engineering) address different aspects of these three functional components of a workflow management system.

    In the first paper, `Correctness issues in workflow management', Kamath and Ramamritham discuss the important issue of correctness in workflow management, which constitutes a prerequisite for the use of workflows in the automation of critical organizational/business processes. In particular, this paper examines the issues of execution atomicity and failure atomicity, differentiating between the correctness requirements of system failures and logical failures, and surveys techniques that can be used to ensure data consistency in workflow management systems.

    While the first paper is concerned with correctness assuming transactional workflows, in which selective transactional properties are associated with individual tasks or the entire workflow, the second paper, `Scheduling workflows by enforcing intertask dependencies' by Attie et al., assumes that the tasks can be either transactions or other activities involving legacy systems. This second paper describes the modelling and specification of conditions involving events and dependencies among tasks within a workflow using temporal logic and finite state automata. It also presents a scheduling algorithm that enforces all stated dependencies by executing, at any given time, only those events that are allowed by all the dependency automata, and in an order as specified by the dependencies.

    In any system with decentralized control, there is a need to effectively cope with the tension that exists between autonomy and consistency requirements. In `A three-level atomicity model for decentralized workflow management systems', Ben-Shaul and Heineman focus on the specific requirement of enforcing failure atomicity in decentralized, autonomous and interacting workflow management systems. Their paper describes a model in which each workflow manager must be able to specify the sequence of tasks that comprise an atomic unit for the purposes of correctness, and the degrees of local and global atomicity for the purpose of cooperation with other workflow managers. The paper also discusses a realization of this model in which treaties and summits provide an agreement mechanism, while underlying transaction managers are responsible for maintaining failure atomicity.

    The fourth and fifth papers are experience papers describing a workflow management system and a large-scale workflow application, respectively. Schill and Mittasch, in `Workflow management systems on top of OSF DCE and OMG CORBA', describe a decentralized workflow management system and discuss its implementation using two standardized middleware platforms, namely OSF DCE and OMG CORBA. The system supports a new approach to workflow management, introducing several new concepts such as data type management for integrating various types of data and quality of service for the various services provided by servers.

    A problem common to both database applications and workflows is the handling of missing and incomplete information. This is particularly pervasive in an `electronic market' with a huge number of retail outlets producing and exchanging volumes of data, the application discussed in `Information flow in the DAMA project beyond database managers: information flow managers'. Motivated by the need for a method that allows a task to proceed in a timely manner if not all data produced by other tasks are available by its deadline, Russell et al. propose an architectural framework and a language that can be used to detect, approximate and, later on, adjust missing data if necessary.

    The final paper, `The evolution towards flexible workflow systems' by Nutt, is complementary to the other papers and surveys issues and work related to both the workflow and computer supported collaborative work (CSCW) areas. In particular, the paper provides a model and a categorization of the dimensions which workflow management and CSCW systems share.

    Besides summarizing recent advances towards efficient workflow management, the papers in this special issue suggest areas open to investigation, and it is our hope that they will also provide the stimulus for further research and development in the area of workflow management systems.
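
    To make the automata-based dependency enforcement described for the second paper concrete, here is a small Python sketch of a scheduler that emits only events permitted by every dependency automaton. The class, event names, and the single commit-order dependency are our own hypothetical illustration, not the formalism or algorithm of Attie et al.

    ```python
    # Toy scheduler that enforces intertask dependencies expressed as
    # finite-state automata; event names and the dependency are invented.

    class DependencyAutomaton:
        """A finite-state automaton that accepts or rejects workflow events."""
        def __init__(self, transitions, start):
            self.transitions = transitions  # {(state, event): next_state}
            self.state = start

        def allows(self, event):
            # An event is allowed if the automaton can take it from its
            # current state, or is indifferent to it (never mentions it).
            mentioned = any(e == event for (_, e) in self.transitions)
            return (self.state, event) in self.transitions or not mentioned

        def advance(self, event):
            self.state = self.transitions.get((self.state, event), self.state)

    def schedule(pending_events, automata):
        """Greedily emit events that every dependency automaton permits."""
        executed, progress = [], True
        while pending_events and progress:
            progress = False
            for event in list(pending_events):
                if all(a.allows(event) for a in automata):
                    for a in automata:
                        a.advance(event)
                    executed.append(event)
                    pending_events.remove(event)
                    progress = True
        return executed

    # "commit(t2) may only happen after commit(t1)" as a two-state automaton
    dep = DependencyAutomaton({("s0", "commit_t1"): "s1",
                               ("s1", "commit_t2"): "s2"}, "s0")
    print(schedule(["commit_t2", "commit_t1"], [dep]))
    # -> ['commit_t1', 'commit_t2']
    ```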

  11. Handling Metadata in a Neurophysiology Laboratory

    PubMed Central

    Zehl, Lyuba; Jaillet, Florent; Stoewer, Adrian; Grewe, Jan; Sobolev, Andrey; Wachtler, Thomas; Brochier, Thomas G.; Riehle, Alexa; Denker, Michael; Grün, Sonja

    2016-01-01

    To date, the non-reproducibility of neurophysiological research is a matter of intense discussion in the scientific community. A crucial component in enhancing reproducibility is to comprehensively collect and store metadata, that is, all information about the experiment, the data, and the applied preprocessing steps on the data, such that they can be accessed and shared in a consistent and simple manner. However, the complexity of experiments, the highly specialized analysis workflows, and a lack of knowledge on how to make use of supporting software tools often make it too burdensome for researchers to perform such detailed documentation. For this reason, the collected metadata are often incomplete, incomprehensible to outsiders, or ambiguous. Based on our research experience in dealing with diverse datasets, we here provide conceptual and technical guidance for overcoming the challenges associated with the collection, organization, and storage of metadata in a neurophysiology laboratory. Through the concrete example of managing the metadata of a complex experiment that yields multi-channel recordings from monkeys performing a behavioral motor task, we practically demonstrate the implementation of these approaches and solutions, with the intention that they may be generalized to other projects. Moreover, we detail five use cases that demonstrate the resulting benefits of constructing a well-organized metadata collection when processing or analyzing the recorded data, in particular when these are shared between laboratories in a modern scientific collaboration. Finally, we suggest an adaptable workflow to accumulate, structure and store metadata from different sources using, by way of example, the odML metadata framework. PMID:27486397
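
    As a rough illustration of what a structured, hierarchical metadata collection looks like, the following plain-Python sketch nests sections and properties in the spirit of odML's section/property model. The structure and all values are invented; this is not the odML API itself.

    ```python
    # Hierarchical experiment metadata as nested sections with properties,
    # echoing odML's model; all names and values below are invented.
    metadata = {
        "Experiment": {
            "properties": {"task": "behavioral motor task", "species": "monkey"},
            "Recording": {
                "properties": {"channels": 96, "sampling_rate_hz": 30000},
            },
            "Preprocessing": {
                "properties": {"filter": "bandpass 0.5-300 Hz", "notch_hz": 50},
            },
        },
    }

    def walk(section, path=""):
        """Flatten nested sections into 'path/name: value' lines for inspection."""
        for key, val in section.items():
            if key == "properties":
                for name, value in val.items():
                    print(f"{path}/{name}: {value}")
            else:
                walk(val, f"{path}/{key}")

    walk(metadata)
    # /Experiment/task: behavioral motor task ... /Experiment/Recording/channels: 96
    ```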

  12. Handling Metadata in a Neurophysiology Laboratory.

    PubMed

    Zehl, Lyuba; Jaillet, Florent; Stoewer, Adrian; Grewe, Jan; Sobolev, Andrey; Wachtler, Thomas; Brochier, Thomas G; Riehle, Alexa; Denker, Michael; Grün, Sonja

    2016-01-01

    To date, the non-reproducibility of neurophysiological research is a matter of intense discussion in the scientific community. A crucial component in enhancing reproducibility is to comprehensively collect and store metadata, that is, all information about the experiment, the data, and the applied preprocessing steps on the data, such that they can be accessed and shared in a consistent and simple manner. However, the complexity of experiments, the highly specialized analysis workflows, and a lack of knowledge on how to make use of supporting software tools often make it too burdensome for researchers to perform such detailed documentation. For this reason, the collected metadata are often incomplete, incomprehensible to outsiders, or ambiguous. Based on our research experience in dealing with diverse datasets, we here provide conceptual and technical guidance for overcoming the challenges associated with the collection, organization, and storage of metadata in a neurophysiology laboratory. Through the concrete example of managing the metadata of a complex experiment that yields multi-channel recordings from monkeys performing a behavioral motor task, we practically demonstrate the implementation of these approaches and solutions, with the intention that they may be generalized to other projects. Moreover, we detail five use cases that demonstrate the resulting benefits of constructing a well-organized metadata collection when processing or analyzing the recorded data, in particular when these are shared between laboratories in a modern scientific collaboration. Finally, we suggest an adaptable workflow to accumulate, structure and store metadata from different sources using, by way of example, the odML metadata framework.

  13. MRI-guided procedures in various regions of the body using a robotic assistance system in a closed-bore scanner: preliminary clinical experience and limitations.

    PubMed

    Moche, Michael; Zajonz, Dirk; Kahn, Thomas; Busse, Harald

    2010-04-01

    To present the clinical setup and workflow of a robotic assistance system for image-guided interventions in a conventional magnetic resonance imaging (MRI) environment and to report our preliminary clinical experience with percutaneous biopsies in various body regions. The MR-compatible, servo-pneumatically driven robotic device (Innomotion) fits into the 60-cm bore of a standard MR scanner. The accuracy of needle placement (n = 25) was estimated by measuring the 3D deviation between the needle tip and the prescribed target point in a phantom. Percutaneous biopsies in six patients and different body regions were planned by graphically selecting entry and target points on intraoperatively acquired roadmap MR data. For insertion depths between 29 and 95 mm, the average 3D needle deviation was 2.2 +/- 0.7 mm (range 0.9-3.8 mm). Patients with a body mass index of up to approximately 30 kg/m(2) fitted into the bore with the device. Clinical work steps and limitations are reported for the various applications. All biopsies were diagnostic and could be completed without any major complications. Median planning and intervention times were 25 (range 20-36) and 44 (36-68) minutes, respectively. Preliminary clinical results in a standard MRI environment suggest that the presented robotic device provides accurate guidance for percutaneous procedures in various body regions. Shorter procedure times may be achievable by optimizing technical and workflow aspects.

  14. Classical workflow nets and workflow nets with reset arcs: using Lyapunov stability for soundness verification

    NASA Astrophysics Data System (ADS)

    Clempner, Julio B.

    2017-01-01

    This paper presents a novel analytical method for the soundness verification of workflow nets and reset workflow nets, using the well-known Lyapunov stability results for Petri nets. We also prove that the soundness property is decidable for workflow nets and reset workflow nets. In addition, we establish several results related to properties such as boundedness, liveness, reversibility and blocking using stability. Our approach is validated theoretically and by a numerical example related to traffic signal-control synchronisation.
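
    To make the soundness property itself concrete, the toy script below brute-forces the reachability graph of a two-transition workflow net and checks that the final marking remains reachable from every reachable marking. This simplified enumeration (it omits the no-dead-transitions clause) only illustrates the property; it is not the paper's analytical Lyapunov-based method.

    ```python
    # Brute-force check of (simplified) classical soundness on a tiny
    # workflow net: places i -> (t1) -> p -> (t2) -> o, markings (i, p, o).
    transitions = {
        "t1": ({"i": 1}, {"p": 1}),   # (consumes, produces)
        "t2": ({"p": 1}, {"o": 1}),
    }
    places = ["i", "p", "o"]

    def enabled(marking, t):
        consume, _ = transitions[t]
        return all(marking[places.index(pl)] >= n for pl, n in consume.items())

    def fire(marking, t):
        consume, produce = transitions[t]
        m = list(marking)
        for pl, n in consume.items():
            m[places.index(pl)] -= n
        for pl, n in produce.items():
            m[places.index(pl)] += n
        return tuple(m)

    def reachable(start):
        seen, frontier = {start}, [start]
        while frontier:
            m = frontier.pop()
            for t in transitions:
                if enabled(m, t):
                    m2 = fire(m, t)
                    if m2 not in seen:
                        seen.add(m2)
                        frontier.append(m2)
        return seen

    start, final = (1, 0, 0), (0, 0, 1)
    states = reachable(start)
    # Option to complete: from every reachable marking, the final marking
    # (one token in o, none elsewhere) must still be reachable.
    sound = all(final in reachable(m) for m in states)
    print("reachable:", states, "sound:", sound)
    ```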

  15. Biowep: a workflow enactment portal for bioinformatics applications.

    PubMed

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers without such skills. A portal enabling these researchers to benefit from new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed within the Laboratory of Interdisciplinary Technologies in Bioinformatics - LITBIO.

  16. Biowep: a workflow enactment portal for bioinformatics applications

    PubMed Central

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-01-01

    Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers without such skills. A portal enabling these researchers to benefit from new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed within the Laboratory of Interdisciplinary Technologies in Bioinformatics – LITBIO. PMID:17430563

  17. Neuronal clues to vascular guidance.

    PubMed

    Suchting, Steven; Bicknell, Roy; Eichmann, Anne

    2006-03-10

    The development of the vertebrate vascular system into a highly ordered and stereotyped network requires precise control over the branching and growth of new vessels. Recent research has highlighted the important role of genetic programs in regulating vascular patterning and, in particular, has established a crucial role for families of molecules previously described in controlling neuronal guidance. Like neurons, new vessels are guided along the correct path by integrating attractive and repulsive cues from the external environment. This is achieved by specialised endothelial cells at the leading tip of vessel sprouts, which express receptor proteins that couple extracellular guidance signals with the cytoskeletal changes necessary to alter cell direction. Here, we review the genetic and in vitro evidence implicating four families of ligand-receptor signalling systems common to both neuronal and vessel guidance: the Ephrins and Eph receptors; Semaphorins, Neuropilins and Plexin receptors; Netrin and Unc5 receptors; and Slits and Robo receptors.

  18. Virtual Sensors in a Web 2.0 Digital Watershed

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hill, D. J.; Marini, L.; Kooper, R.; Rodriguez, A.; Myers, J. D.

    2008-12-01

    The lack of rainfall data in many watersheds is one of the major barriers to modeling and studying many environmental and hydrological processes and to supporting decision making: there are simply not enough rain gages on the ground. To overcome this data scarcity, a Web 2.0 digital watershed has been developed at NCSA (National Center for Supercomputing Applications), where users can point-and-click on a web-based Google Maps interface and create new precipitation virtual sensors at any location within the same coverage region as a NEXRAD station. A set of scientific workflows is implemented to perform spatial, temporal and thematic transformations on the near-real-time NEXRAD Level II data. Such workflows can be triggered by the users' actions and generate either rainfall-rate or rainfall-accumulation streaming data at a user-specified time interval. We will discuss some underlying components of this digital watershed, which consists of a semantic content management middleware, a semantically enhanced streaming data toolkit, virtual sensor management functionality, and a RESTful (REpresentational State Transfer) web service that can trigger the workflow execution. Such a loosely coupled architecture presents a generic framework for constructing a Web 2.0 style digital watershed. An implementation of this architecture in the Upper Illinois River Basin will be presented. We will also discuss the implications of the virtual sensor concept for the broader environmental observatory community and how such a concept will help us move towards a participatory digital watershed.
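
    The trigger pattern, a REST call that registers a virtual sensor and kicks off the transformation workflow, can be sketched as follows. The endpoint, payload fields, and workflow stub are hypothetical and are not the NCSA implementation.

    ```python
    # Sketch of a RESTful trigger for virtual-sensor creation; the endpoint,
    # payload fields, and workflow call are illustrative assumptions only.
    from flask import Flask, request, jsonify
    import uuid

    app = Flask(__name__)
    sensors = {}  # virtual-sensor registry: id -> definition

    def run_nexrad_workflow(lat, lon, interval_s):
        """Placeholder for the spatial/temporal/thematic transformation
        workflow over near-real-time NEXRAD Level II data."""
        return {"status": "started"}

    @app.route("/sensors", methods=["POST"])
    def create_sensor():
        spec = request.get_json()
        sensor_id = str(uuid.uuid4())
        sensors[sensor_id] = spec
        # The user's point-and-click creates the sensor; creation triggers
        # the streaming workflow emitting rainfall rate or accumulation.
        result = run_nexrad_workflow(spec["lat"], spec["lon"], spec["interval_s"])
        return jsonify({"sensor_id": sensor_id, **result}), 201

    if __name__ == "__main__":
        app.run(port=8080)
    ```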

  19. Exploring the impact of an automated prescription-filling device on community pharmacy technician workflow.

    PubMed

    Walsh, Kristin E; Chui, Michelle Anne; Kieser, Mara A; Williams, Staci M; Sutter, Susan L; Sutter, John G

    2011-01-01

    To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign.

  20. Generic worklist handler for workflow-enabled products

    NASA Astrophysics Data System (ADS)

    Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas

    1999-07-01

    Workflow management (WfM) is an emerging field of medical information technology. It appears as a promising key technology to model, optimize and automate processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic worklist handler: a standardized interface between a workflow enactment service and application systems. Application systems with embedded worklist handlers will be called 'Workflow Enabled Application Systems'. In this paper we discuss the functional requirements of worklist handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards as defined by the Workflow Management Coalition are briefly reviewed.
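
    A worklist handler of the kind proposed can be pictured as a narrow interface between the enactment service and an application. The sketch below is our own minimal rendering of that idea; the method names loosely echo WfMC worklist concepts and are not a published standard API.

    ```python
    # Minimal sketch of a generic worklist-handler interface; illustrative
    # only, not the paper's specification or a WfMC standard API.
    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class WorkItem:
        item_id: str
        activity: str        # workflow activity this item belongs to
        payload: dict        # application data the task needs

    class WorklistHandler(ABC):
        """Interface between a workflow enactment service and an application."""

        @abstractmethod
        def fetch_work_items(self, participant: str) -> list[WorkItem]:
            """Return open work items assigned to a participant."""

        @abstractmethod
        def start_work_item(self, item_id: str) -> None:
            """Tell the enactment service that work on an item has begun."""

        @abstractmethod
        def complete_work_item(self, item_id: str, result: dict) -> None:
            """Report completion and hand result data back to the engine."""
    ```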

  1. The Genetics of Axon Guidance and Axon Regeneration in Caenorhabditis elegans

    PubMed Central

    Chisholm, Andrew D.; Hutter, Harald; Jin, Yishi; Wadsworth, William G.

    2016-01-01

    The correct wiring of neuronal circuits depends on the outgrowth and guidance of neuronal processes during development. In the past two decades, great progress has been made in understanding the molecular basis of axon outgrowth and guidance. Genetic analysis in Caenorhabditis elegans has played a key role in elucidating conserved pathways regulating axon guidance, including Netrin signaling, the Slit/Robo pathway, Wnt signaling, and others. Axon guidance factors were first identified by screens for mutations affecting animal behavior, and by direct visual screens for axon guidance defects. Genetic analysis of these pathways has revealed the complex and combinatorial nature of guidance cues, and has delineated how cues guide growth cones via receptor activity and cytoskeletal rearrangement. Several axon guidance pathways also affect directed migrations of non-neuronal cells in C. elegans, with implications for normal and pathological cell migrations in situations such as tumor metastasis. The small number of neurons and highly stereotyped axonal architecture of the C. elegans nervous system allow analysis of axon guidance at the level of single identified axons, and permit in vivo tests of prevailing models of axon guidance. C. elegans axons also have a robust capacity to undergo regenerative regrowth after precise laser injury (axotomy). Although such axon regrowth shares some similarities with developmental axon outgrowth, screens for regrowth mutants have revealed regeneration-specific pathways and factors that were not identified in developmental screens. Several areas remain poorly understood, including how major axon tracts are formed in the embryo, and the function of axon regeneration in the natural environment. PMID:28114100

  2. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms.

    PubMed

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2014-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best uses of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies.
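
    As a flavor of what such a heuristic can look like, the sketch below greedily places a workflow on the cheapest VM type that still meets its deadline. The VM types and numbers are invented, and this is not one of the paper's four algorithms.

    ```python
    # Toy cost/deadline heuristic in the spirit of a price/performance
    # trade-off; VM types, prices, and speedups are invented.
    from dataclasses import dataclass

    @dataclass
    class VMType:
        name: str
        cost_per_hour: float
        speedup: float  # relative execution speed

    VM_TYPES = [VMType("small", 0.05, 1.0),
                VMType("medium", 0.10, 1.9),
                VMType("large", 0.20, 3.5)]

    def schedule(workload_hours: float, deadline_hours: float):
        """Pick the cheapest VM type whose runtime fits the deadline."""
        feasible = [vm for vm in VM_TYPES
                    if workload_hours / vm.speedup <= deadline_hours]
        return min(feasible, key=lambda vm: vm.cost_per_hour, default=None)

    print(schedule(4.0, 2.5))
    # -> medium: 4/1.9 ≈ 2.1 h meets the deadline and is cheaper than large
    ```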

  3. Examining the implementation of NICE guidance: cross-sectional survey of the use of NICE interventional procedures guidance by NHS Trusts.

    PubMed

    Lowson, Karin; Jenks, Michelle; Filby, Alexandra; Carr, Louise; Campbell, Bruce; Powell, John

    2015-06-30

    In the UK, NHS hospitals receive large amounts of evidence-based recommendations for care delivery from the National Institute for Health and Care Excellence (NICE) and other organisations. Little is known about how NHS organisations implement such guidance or about best practice for doing so. This study was therefore designed to examine the dissemination, decision-making, and monitoring processes for NICE interventional procedures (IP) guidance and to investigate the barriers and enablers to the implementation of such guidance. A cross-sectional survey questionnaire was developed and distributed to individuals responsible for managing the processes around NICE guidance in all 181 acute NHS hospitals in England, Scotland, Wales and Northern Ireland. A review of acute NHS hospital policies for implementing NICE guidance was also undertaken using information available in the public domain and from organisations' websites. The response rate to the survey was 75%, with 135 completed surveys received. Additionally, policies from 25% of acute NHS hospitals were identified and analysed. NHS acute hospitals typically had detailed processes in place to implement NICE guidance, although organisations recognised barriers to implementation, including organisational process barriers, clinical engagement and poor targeting given the large volume of guidance issued. Examples of enablers to, and good practice for, the implementation of guidance were found, most notably the value of shared learning experiences between NHS hospitals. Implications for NICE were also identified. These included making improvements to the layout of guidance, signposting on the website and making better use of its shared learning platform. Most organisations have robust processes in place to deal with implementing guidance. However, resource limitations and the scope of guidance received by organisations create barriers relating to organisational processes, clinician engagement and the financing of new procedures. Guidance implementation can be facilitated through the encouragement of shared learning by organisations such as NICE and open knowledge transfer between organisations.

  4. Comparison and assessment of semi-automatic image segmentation in computed tomography scans for image-guided kidney surgery.

    PubMed

    Glisson, Courtenay L; Altamar, Hernan O; Herrell, S Duke; Clark, Peter; Galloway, Robert L

    2011-11-01

    Image segmentation is integral to implementing intraoperative guidance for kidney tumor resection. Results seen in computed tomography (CT) data are affected by target organ physiology as well as by the segmentation algorithm used. This work studies variables involved in using level set methods found in the Insight Toolkit to segment kidneys from CT scans and applies the results to an image guidance setting. A composite algorithm drawing on the strengths of multiple level set approaches was built using the Insight Toolkit. This algorithm requires image contrast state and seed points to be identified as input, and functions independently thereafter, selecting and altering method and variable choice as needed. Semi-automatic results were compared to expert hand segmentation results directly and by the use of the resultant surfaces for registration of intraoperative data. Direct comparison using the Dice metric showed average agreement of 0.93 between semi-automatic and hand segmentation results. Use of the segmented surfaces in closest point registration of intraoperative laser range scan data yielded average closest point distances of approximately 1 mm. Application of both inverse registration transforms from the previous step to all hand segmented image space points revealed that the distance variability introduced by registering to the semi-automatically segmented surface versus the hand segmented surface was typically less than 3 mm both near the tumor target and at distal points, including subsurface points. Use of the algorithm shortened user interaction time and provided results which were comparable to the gold standard of hand segmentation. Further, the use of the algorithm's resultant surfaces in image registration provided comparable transformations to surfaces produced by hand segmentation. These data support the applicability and utility of such an algorithm as part of an image guidance workflow.
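
    A typical ITK-style level set pipeline of the kind the composite algorithm draws on can be sketched via SimpleITK's Python wrapping, as below. The file name, seed point, and all parameter values are illustrative, and this single geodesic-active-contour pass is not the authors' composite, self-adjusting algorithm.

    ```python
    # Sketch of a seed-driven geodesic active contour segmentation of a CT
    # kidney using SimpleITK; file name, seed, and parameters are assumptions.
    import SimpleITK as sitk

    ct = sitk.ReadImage("kidney_ct.nii.gz", sitk.sitkFloat32)  # hypothetical

    # Feature image: smooth, take the gradient magnitude, and map it through
    # a sigmoid so speed is ~1 in homogeneous tissue and ~0 at edges.
    smoothed = sitk.CurvatureAnisotropicDiffusion(ct, timeStep=0.0625,
                                                  numberOfIterations=5)
    grad = sitk.GradientMagnitudeRecursiveGaussian(smoothed, sigma=1.5)
    feature = sitk.Sigmoid(grad, alpha=-10.0, beta=60.0,
                           outputMinimum=0.0, outputMaximum=1.0)

    # Initial level set: signed distance to a small blob around a seed point
    # (ITK convention: negative inside the contour).
    seed = (112, 140, 45)  # illustrative voxel index inside the kidney
    seed_img = sitk.Image(ct.GetSize(), sitk.sitkUInt8)
    seed_img.CopyInformation(ct)
    seed_img[seed] = 1
    seed_img = sitk.BinaryDilate(seed_img, [3, 3, 3])
    init_ls = sitk.SignedMaurerDistanceMap(seed_img, insideIsPositive=False,
                                           squaredDistance=False,
                                           useImageSpacing=True)

    gac = sitk.GeodesicActiveContourLevelSetImageFilter()
    gac.SetPropagationScaling(1.0)
    gac.SetCurvatureScaling(0.8)
    gac.SetAdvectionScaling(1.0)
    gac.SetMaximumRMSError(0.01)
    gac.SetNumberOfIterations(500)
    level_set = gac.Execute(init_ls, feature)

    # The interior of the zero level set is the segmented kidney.
    mask = sitk.Cast(level_set < 0, sitk.sitkUInt8)
    sitk.WriteImage(mask, "kidney_mask.nii.gz")
    ```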

  5. Point-of-Care Autofluorescence Imaging for Real-Time Sampling and Treatment Guidance of Bioburden in Chronic Wounds: First-in-Human Results

    PubMed Central

    DaCosta, Ralph S.; Kulbatski, Iris; Lindvere-Teene, Liis; Starr, Danielle; Blackmore, Kristina; Silver, Jason I.; Opoku, Julie; Wu, Yichao Charlie; Medeiros, Philip J.; Xu, Wei; Xu, Lizhen; Wilson, Brian C.; Rosen, Cheryl; Linden, Ron

    2015-01-01

    Background Traditionally, chronic wound infection is diagnosed by visual inspection under white light and microbiological sampling, which are subjective and suboptimal, respectively, thereby delaying diagnosis and treatment. To address this, we developed a novel handheld, fluorescence imaging device (PRODIGI) that enables non-contact, real-time, high-resolution visualization and differentiation of key pathogenic bacteria through their endogenous autofluorescence, as well as connective tissues in wounds. Methods and Findings This was a two-part Phase I, single center, non-randomized trial of chronic wound patients (male and female, ≥18 years; UHN REB #09-0015-A for part 1; UHN REB #12-5003 for part 2; clinicaltrials.gov Identifier: NCT01378728 for part 1 and NCT01651845 for part 2). Part 1 (28 patients; 54% diabetic foot ulcers, 46% non-diabetic wounds) established the feasibility of autofluorescence imaging to accurately guide wound sampling, validated against blinded, gold standard swab-based microbiology. Part 2 (12 patients; 83.3% diabetic foot ulcers, 16.7% non-diabetic wounds) established the feasibility of autofluorescence imaging to guide wound treatment and quantitatively assess treatment response. We showed that PRODIGI can be used to guide and improve microbiological sampling and debridement of wounds in situ, enabling diagnosis, treatment guidance and response assessment in patients with chronic wounds. PRODIGI is safe, easy to use and integrates into the clinical workflow. Clinically significant bacterial burden can be detected in seconds, quantitatively tracked over days-to-months and their biodistribution mapped within the wound bed, periphery, and other remote areas. Conclusions PRODIGI represents a technological advancement in wound sampling and treatment guidance for clinical wound care at the point-of-care. Trial Registration ClinicalTrials.gov NCT01651845; ClinicalTrials.gov NCT01378728 PMID:25790480

  6. Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems

    DOE PAGES

    Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...

    2016-07-21

    The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad-hoc, they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates i.e., sequence, parallel, split, merge, that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows showing Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
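
    The template idea is easy to picture with a generic sketch: small higher-order functions for sequence, parallel, and split/merge composition. The functions below are our own illustration and not the actual Tigres API.

    ```python
    # Generic illustration of pipeline templates (sequence, parallel,
    # split/merge); function names are ours, not the Tigres API.
    from concurrent.futures import ThreadPoolExecutor

    def sequence(data, *tasks):
        """Run tasks one after another, piping each output to the next."""
        for task in tasks:
            data = task(data)
        return data

    def parallel(chunks, task):
        """Apply the same task to many inputs concurrently."""
        with ThreadPoolExecutor() as pool:
            return list(pool.map(task, chunks))

    def split_merge(data, splitter, task, merger):
        """Split the input, process parts in parallel, merge the results."""
        return merger(parallel(splitter(data), task))

    # Toy pipeline: normalize, then a split/parallel-square/merge-sum stage
    result = sequence(
        [1, 2, 3, 4],
        lambda xs: [x / max(xs) for x in xs],
        lambda xs: split_merge(xs, lambda d: d, lambda x: x * x, sum),
    )
    print(result)  # 0.0625 + 0.25 + 0.5625 + 1.0 = 1.875
    ```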

  7. Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrix, Valerie; Fox, James; Ghoshal, Devarshi

    The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad-hoc, they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates i.e., sequence, parallel, split, merge, that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows showing Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.

  8. Standardizing clinical trials workflow representation in UML for international site comparison.

    PubMed

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-11-09

    With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring the full commitment of clinical research coordinators (CRCs), transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter-duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows.
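
    To show what the exported workflow information enables, here is a toy discrete-event simulation of a trial visit in SimPy. The roles, capacities, and durations are invented, not the paper's time-motion data.

    ```python
    # Toy discrete-event simulation of a clinical-trial visit workflow;
    # roles, capacities, and durations are illustrative assumptions.
    import random
    import simpy

    def patient_visit(env, name, nurse, crc):
        arrive = env.now
        with nurse.request() as req:           # nurses do many short tasks
            yield req
            yield env.timeout(random.uniform(5, 15))   # vitals, samples
        with crc.request() as req:             # CRC tasks need full commitment
            yield req
            yield env.timeout(random.uniform(20, 40))  # case report form entry
        print(f"{name} done, cycle time {env.now - arrive:.1f} min")

    env = simpy.Environment()
    nurse = simpy.Resource(env, capacity=2)
    crc = simpy.Resource(env, capacity=1)
    for i in range(5):
        env.process(patient_visit(env, f"patient-{i}", nurse, crc))
    env.run()
    ```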

  9. Standardizing Clinical Trials Workflow Representation in UML for International Site Comparison

    PubMed Central

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M. O.; Rodrigues, Maria J.; Shah, Jatin; Loures, Marco R.; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-01-01

    Background With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Methods Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Results Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring the full commitment of clinical research coordinators (CRCs), transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter-duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. Conclusions This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows. PMID:21085484

  10. An intraorganizational model for developing and spreading quality improvement innovations

    PubMed Central

    Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.

    2017-01-01

    Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788

  11. Consideration of health inequalities in systematic reviews: a mapping review of guidance.

    PubMed

    Maden, Michelle

    2016-11-28

    Interventions shown to be effective in improving the health of a population may actually widen the health inequalities gap, while others reduce it; it is therefore imperative that all systematic reviewers consider how the findings of their reviews may impact (reduce or increase) the health inequality gap. This study reviewed existing guidance on incorporating considerations of health inequalities in systematic reviews in order to examine the extent to which it can help reviewers to incorporate such issues. A mapping review was undertaken to identify guidance documents that purported to inform reviewers on whether and how to incorporate considerations of health inequalities. Searches were undertaken in Medline, CINAHL and The Cochrane Library Methodology Register. Review guidance manuals prepared by international organisations engaged in undertaking systematic reviews, and their associated websites, were scanned. Studies were included if they provided an overview or discussed the development and testing of guidance for dealing with the incorporation of considerations of health inequalities in evidence synthesis. Results are summarised in narrative and tabular forms. Twenty guidance documents published between 2009 and 2016 were included. Guidance has been produced to inform considerations of health inequalities at different stages of the systematic review process. The Campbell and Cochrane Equity Group have been instrumental in developing and promoting such guidance. Definitions of health inequalities and guidance differed across the included studies. All but one guidance document were transparent in their method of production. Formal methods of evaluation were reported for six guidance documents. Most of the guidance was operationalised in the form of examples taken from published systematic reviews. The number of guidance items to operationalise ranges from 3 up to 26, with considerable overlap noted. Adhering to the guidance will require more work from reviewers. It requires a deeper understanding of how reviewers can operationalise the guidance, taking into consideration the barriers and facilitators involved. This has implications not only for understanding the usefulness and burden of the guidance but also for the uptake of guidance and its ultimate goal of improving the consideration of health inequalities in systematic reviews.

  12. Quantitative workflow based on NN for weighting criteria in landfill suitability mapping

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul

    2017-10-01

    Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new workflow is presented to form suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network is employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final learned network is used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training dataset and testing dataset, respectively. The workflow was found to be capable of reducing human interference to generate highly reliable maps. The proposed workflow reveals the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
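
    One way to extract criterion weights from a trained feed-forward network, sketched below with scikit-learn, is permutation importance on held-out data. The synthetic stand-in criteria and the specific weighting step are our assumptions, not necessarily the authors' exact procedure.

    ```python
    # Sketch: derive normalized criterion weights from a trained MLP via
    # permutation importance; data and network size are invented stand-ins.
    import numpy as np
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))          # 5 stand-in criteria (paper: 34)
    y = (0.6 * X[:, 0] + 0.3 * X[:, 2] + 0.1 * rng.normal(size=1000)) > 0

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)

    imp = permutation_importance(net, X_te, y_te, n_repeats=20, random_state=0)
    importances = np.clip(imp.importances_mean, 0, None)  # drop noise negatives
    weights = importances / importances.sum()
    print("normalized criterion weights:", np.round(weights, 3))
    ```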

  13. Pathway concepts experiment for head-down synthetic vision displays

    NASA Astrophysics Data System (ADS)

    Prinzel, Lawrence J., III; Arthur, Jarvis J., III; Kramer, Lynda J.; Bailey, Randall E.

    2004-08-01

    Eight 757 commercial airline captains flew 22 approaches using the Reno Sparks 16R Visual Arrival under simulated Category I conditions. Approaches were flown using a head-down synthetic vision display to evaluate four tunnel ("minimal", "box", "dynamic pathway", "dynamic crow's feet") and three guidance ("ball", "tadpole", "follow-me aircraft") concepts and compare their efficacy to a baseline condition (i.e., no tunnel, ball guidance). The results showed that the tunnel concepts significantly improved pilot performance and situation awareness and lowered workload compared to the baseline condition. The dynamic crow's feet tunnel and follow-me aircraft guidance concepts were found to be the best candidates for future synthetic vision head-down displays. These results are discussed with implications for synthetic vision display design and future research.

  14. Domain requirements for the Dock adapter protein in growth- cone signaling

    PubMed Central

    Rao, Yong; Zipursky, S. Lawrence

    1998-01-01

    Tyrosine phosphorylation has been implicated in growth-cone guidance through genetic, biochemical, and pharmacological studies. Adapter proteins containing src homology 2 (SH2) domains and src homology 3 (SH3) domains provide a means of linking guidance signaling through phosphotyrosine to downstream effectors regulating growth-cone motility. The Drosophila adapter, Dreadlocks (Dock), the homolog of mammalian Nck containing three N-terminal SH3 domains and a single SH2 domain, is highly specialized for growth-cone guidance. In this paper, we demonstrate that Dock can couple signals in either an SH2-dependent or an SH2-independent fashion in photoreceptor (R cell) growth cones, and that Dock displays different domain requirements in different neurons. PMID:9482841

  15. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms

    PubMed Central

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2017-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best uses of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237

  16. Scientific Data Management (SDM) Center for Enabling Technologies. Final Report, 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Our contributions to advancing the state of the art in scientific workflows have focused on the following areas: workflow development; generic workflow components and templates; provenance collection and analysis; and workflow reliability and fault tolerance.

  17. Exploring the impact of an automated prescription-filling device on community pharmacy technician workflow

    PubMed Central

    Walsh, Kristin E.; Chui, Michelle Anne; Kieser, Mara A.; Williams, Staci M.; Sutter, Susan L.; Sutter, John G.

    2012-01-01

    Objective To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. Methods At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Results Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Conclusion Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign. PMID:21896459

  18. Research and Implementation of Key Technologies in Multi-Agent System to Support Distributed Workflow

    NASA Astrophysics Data System (ADS)

    Pan, Tianheng

    2018-01-01

    In recent years, the combination of workflow management systems and multi-agent technology has become a hot research field. The lack of flexibility in workflow management systems can be mitigated by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which avoids the fragility of the traditional centralized workflow structure. In this paper, the agents of a distributed workflow management system are divided according to their functions, the execution process of each type of agent is analyzed, and key technologies such as process execution and resource management are discussed.

  19. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    NASA Astrophysics Data System (ADS)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  20. An Internal Standard for Assessing Phosphopeptide Recovery from Metal Ion/Oxide Enrichment Strategies

    NASA Astrophysics Data System (ADS)

    Paulo, Joao A.; Navarrete-Perea, Jose; Erickson, Alison R.; Knott, Jeffrey; Gygi, Steven P.

    2018-04-01

    Phosphorylation-mediated signaling pathways have major implications in cellular regulation and disease. However, proteins with roles in these pathways are frequently less abundant, and phosphorylation is often sub-stoichiometric. As such, the efficient enrichment, and subsequent recovery, of phosphorylated peptides is vital. Mass spectrometry-based proteomics is a well-established approach for quantifying thousands of phosphorylation events in a single experiment. We designed a peptide internal standard-based assay directed toward sample preparation strategies for mass spectrometry analysis to better understand phosphopeptide recovery from enrichment strategies. We coupled mass-differential tandem mass tag (mTMT) reagents (specifically, TMTzero and TMTsuper-heavy), nine mass spectrometry-amenable phosphopeptides (phos9), and peak area measurements from extracted ion chromatograms to determine phosphopeptide recovery. We showcase this mTMT/phos9 recovery assay by evaluating three phosphopeptide enrichment workflows. Our assay provides data on the recovery of phosphopeptides, which complement other metrics, namely the number of identified phosphopeptides and enrichment specificity. Our mTMT/phos9 assay is applicable to any enrichment protocol in a typical experimental workflow, irrespective of sample origin or labeling strategy.
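
    The core arithmetic of such a recovery assay is simple: compare extracted-ion-chromatogram peak areas of the spiked standards before and after enrichment, with the two conditions distinguished by the two mTMT labels. The peak areas below are invented for illustration.

    ```python
    # Recovery = post-enrichment / pre-enrichment peak area per spiked
    # standard; all peak-area numbers here are invented for illustration.
    pre_enrichment = {"phos1": 2.1e7, "phos2": 1.8e7, "phos3": 2.4e7}
    post_enrichment = {"phos1": 1.6e7, "phos2": 1.2e7, "phos3": 2.0e7}

    recoveries = {pep: post_enrichment[pep] / pre_enrichment[pep]
                  for pep in pre_enrichment}
    mean_recovery = sum(recoveries.values()) / len(recoveries)
    for pep, r in sorted(recoveries.items()):
        print(f"{pep}: {100 * r:.0f}% recovered")
    print(f"mean recovery: {100 * mean_recovery:.0f}%")
    ```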

  1. Space: the final frontier in the learning of science?

    NASA Astrophysics Data System (ADS)

    Milne, Catherine

    2014-03-01

    In Space, relations, and the learning of science, Wolff-Michael Roth and Pei-Ling Hsu use ethnomethodology to explore high school interns learning shopwork and shoptalk in a research lab that is located in a world-class facility for water quality analysis. Using interaction analysis they identify how spaces, like a research laboratory, can be structured as smart spaces to create a workflow (learning flow) so that shoptalk and shopwork can projectively organize the actions of interns even in new and unfamiliar settings. Using these findings they explore implications for the design of curriculum and learning spaces more broadly. The Forum papers of Erica Blatt and Cassie Quigley complement this analysis. Blatt expands the discussion of space as an active component of learning with an examination of teaching settings, beyond laboratory spaces, as active participants in education. Quigley examines smart spaces as authentic learning spaces while acknowledging how internship experiences embody all empirical elements of authentic learning, including open-ended inquiry and empowerment. In this paper I synthesize these ideas and propose that a narrative structure might better support workflow, student agency and democratic decision making.

  2. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

    Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study a method of developing a workflow-based information system, using the radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated by converting non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and that it enhanced process management in the department. It can also provide a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more functionality for process management and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology in healthcare.

  3. Information Issues and Contexts that Impair Team Based Communication Workflow: A Palliative Sedation Case Study.

    PubMed

    Cornett, Alex; Kuziemsky, Craig

    2015-01-01

    Implementing team-based workflows can be complex because of the scope of providers involved and the extent of information exchange and communication that needs to occur. While a workflow may represent the ideal structure of communication that needs to occur, information issues and contextual factors may impact how the workflow is implemented in practice. Understanding these issues will help us better design systems to support team-based workflows. In this paper we use a case study of palliative sedation therapy (PST) to model a PST workflow and then use it to identify purposes of communication, information issues and contextual factors that impact them. We then suggest how our findings could inform health information technology (HIT) design to support team-based communication workflows.

  4. A Workflow to Improve the Alignment of Prostate Imaging with Whole-mount Histopathology.

    PubMed

    Yamamoto, Hidekazu; Nir, Dror; Vyas, Lona; Chang, Richard T; Popert, Rick; Cahill, Declan; Challacombe, Ben; Dasgupta, Prokar; Chandra, Ashish

    2014-08-01

    Evaluation of prostate imaging tests against whole-mount histology specimens requires accurate alignment between radiologic and histologic data sets. Misalignment results in false-positive and -negative zones as assessed by imaging. We describe a workflow for three-dimensional alignment of prostate imaging data against whole-mount prostatectomy reference specimens and assess its performance against a standard workflow. Ethical approval was granted. Patients underwent motorized transrectal ultrasound (Prostate Histoscanning) to generate a three-dimensional image of the prostate before radical prostatectomy. The test workflow incorporated steps for axial alignment between imaging and histology, size adjustments following formalin fixation, and use of custom-made parallel cutters and digital caliper instruments. The control workflow comprised freehand cutting and assumed homogeneous block thicknesses at the same relative angles between pathology and imaging sections. Thirty radical prostatectomy specimens were histologically and radiologically processed, either by the alignment-optimized workflow (n = 20) or the control workflow (n = 10). The optimized workflow generated tissue blocks of heterogeneous thicknesses but with no significant drifting in the cutting plane. The control workflow resulted in significantly nonparallel blocks, with only one in four histology blocks accurately matched to its respective imaging data. The image-to-histology alignment accuracy was 20% greater in the optimized workflow (P < .0001), with higher sensitivity (85% vs. 69%) and specificity (94% vs. 73%) for margin prediction in a 5 × 5-mm grid analysis. A significantly better alignment was observed in the optimized workflow. Evaluation of prostate imaging biomarkers using whole-mount histology references should include a test-to-reference spatial alignment workflow. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  5. Conceptual-level workflow modeling of scientific experiments using NMR as a case study

    PubMed Central

    Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R

    2007-01-01

    Background: Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results: We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion: Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation identified only minor omissions. In addition, the models provide an accurate visual description of the control flow of the biomolecular analysis experiment using NMR spectroscopy. PMID:17263870

  6. FAST: A fully asynchronous and status-tracking pattern for geoprocessing services orchestration

    NASA Astrophysics Data System (ADS)

    Wu, Huayi; You, Lan; Gui, Zhipeng; Gao, Shuang; Li, Zhenqiang; Yu, Jingmin

    2014-09-01

    Geoprocessing service orchestration (GSO) provides a unified and flexible way to implement cross-application, long-lived, and multi-step geoprocessing service workflows by coordinating geoprocessing services collaboratively. Usually, geoprocessing services and geoprocessing service workflows are data- and/or computing-intensive, which can make the execution of a workflow time-consuming. Because it initiates an execution request without blocking other interactions on the client side, an asynchronous mechanism is especially appropriate for GSO workflows. Many critical problems remain unsolved in existing asynchronous patterns for GSO, including difficulties in improving performance, tracking status, and clarifying the workflow structure. These problems make it challenging to orchestrate workflows efficiently, to make statuses instantly available, and to construct clearly structured GSO workflows. A Fully Asynchronous and Status-Tracking (FAST) pattern that adopts asynchronous interactions throughout the whole communication tier of a workflow is proposed for GSO. The proposed FAST pattern includes a mechanism that actively pushes the latest status to clients instantly and economically. An independent proxy was designed to isolate the status-tracking logic from the geoprocessing business logic, which helps keep the GSO workflow structure clear. A workflow was implemented in the FAST pattern to simulate the flooding process in the Poyang Lake region. Experimental results show that the proposed FAST pattern can efficiently tackle data/computing-intensive geoprocessing tasks. The performance of all collaborative partners was improved by the asynchronous mechanism throughout the communication tier. The status-tracking mechanism helps users retrieve the latest running status of a GSO workflow efficiently and instantly. The clear structure of the GSO workflow lowers the barriers for geospatial domain experts and model designers to compose asynchronous GSO workflows. Most importantly, it provides better support for locating and diagnosing potential exceptions.
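
    The core of the FAST pattern, asynchronous execution combined with a proxy that pushes status changes to clients, can be sketched briefly. The following Python asyncio sketch illustrates the idea only; the class and task names are invented, not taken from the paper's implementation.

      # A minimal sketch of the FAST idea: fully asynchronous execution with
      # a proxy that pushes status updates to subscribed clients.

      import asyncio

      class StatusProxy:
          """Isolates status tracking from the geoprocessing business logic
          and pushes every status change to subscribed clients."""
          def __init__(self):
              self.subscribers = []

          def subscribe(self, callback):
              self.subscribers.append(callback)

          def push(self, task_name, status):
              for notify in self.subscribers:
                  notify(task_name, status)

      async def geoprocessing_task(name, seconds, proxy):
          proxy.push(name, "running")
          await asyncio.sleep(seconds)        # stand-in for the real computation
          proxy.push(name, "finished")

      async def orchestrate():
          proxy = StatusProxy()
          proxy.subscribe(lambda task, status: print(f"[client] {task}: {status}"))
          # The client is never blocked: tasks run concurrently and statuses
          # are pushed as they change.
          await asyncio.gather(
              geoprocessing_task("flood_extent", 1.0, proxy),
              geoprocessing_task("terrain_analysis", 0.5, proxy),
          )

      asyncio.run(orchestrate())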

  7. Conceptual-level workflow modeling of scientific experiments using NMR as a case study.

    PubMed

    Verdi, Kacy K; Ellis, Heidi Jc; Gryk, Michael R

    2007-01-30

    Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation identified only minor omissions. In addition, the models provide an accurate visual description of the control flow of the biomolecular analysis experiment using NMR spectroscopy.
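
    The three modeling aspects named above (structural, data flow, control flow) can be made concrete with a small data structure. The Python sketch below is an illustrative assumption about how such a conceptual model might be recorded, not the authors' notation.

      # A minimal sketch of capturing a workflow at the conceptual level
      # with the three aspects named above. The representation and the
      # step names are illustrative assumptions.

      from dataclasses import dataclass, field
      from typing import List, Tuple

      @dataclass
      class ConceptualWorkflow:
          steps: List[str] = field(default_factory=list)                      # structural aspect
          data_flow: List[Tuple[str, str, str]] = field(default_factory=list)  # (producer, datum, consumer)
          control_flow: List[Tuple[str, str]] = field(default_factory=list)    # (before, after)

      # A tiny fragment of an NMR structure-determination workflow.
      nmr = ConceptualWorkflow()
      nmr.steps = ["acquire_spectra", "process_spectra", "assign_peaks"]
      nmr.data_flow = [("acquire_spectra", "FID data", "process_spectra"),
                       ("process_spectra", "frequency spectrum", "assign_peaks")]
      nmr.control_flow = [("acquire_spectra", "process_spectra"),
                          ("process_spectra", "assign_peaks")]

      # Verify each intermediate-result handoff before implementation.
      for producer, datum, consumer in nmr.data_flow:
          print(f"{producer} -> {datum} -> {consumer}")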

  8. How are service users instructed to measure home furniture for provision of minor assistive devices?

    PubMed

    Atwal, Anita; Mcintyre, Anne; Spiliotopoulou, Georgia; Money, Arthur; Paraskevopulos, Ioannis

    2017-02-01

    Measurements play a vital role in providing devices that meet the individual needs of users. There is increasing evidence of devices being abandoned. The reasons for this are complex, but one key factor in the non-use of equipment is a lack of fit between the device, the environment, and the person. In addition, the abandonment of devices can be seen as a waste of public money. The aim of this paper is to examine the type, readability, and content of existing guidance on measuring home furniture. We conducted an online national survey involving health and social care trusts in the UK, together with a synthesis of leaflets associated with the measurement of furniture, to identify existing guidance. The content and readability of this guidance were then evaluated. From the 325 responses received, 64 therapists reported using guidance. Of the 13 leaflets analysed, 8 met Level 3 Adult Literacy Standards (age 9-11). The leaflets differed in how furniture items were to be measured, and no measurement guidance was reported for baths. There is a need to standardize guidance to ensure that measurements are reliable. Implications for Rehabilitation: Our research has highlighted the need to confirm and agree measurement techniques for home furniture in the provision of assistive devices. Inaccurate guidance can lead to abandonment of devices. Inaccurate guidance could also prevent service users from participating in the self-assessment process for devices.

  9. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS

    PubMed Central

    Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2016-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971

  10. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    PubMed

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.
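
    A Common Workflow Language description is plain text that any conformant executor can run, which is the interoperability point made above. The sketch below writes a minimal CWL tool and runs it with the reference runner cwltool (assumed to be installed); Rabix consumes the same kind of description, but this is not Rabix-specific usage.

      # A minimal sketch of describing and running a tool with the Common
      # Workflow Language. Assumes the reference runner `cwltool` is
      # installed (pip install cwltool).

      import pathlib
      import subprocess

      tool = """\
      cwlVersion: v1.0
      class: CommandLineTool
      baseCommand: echo
      inputs:
        message:
          type: string
          inputBinding:
            position: 1
      outputs: []
      """

      # Strip the indentation added for display here before writing.
      cwl_text = "\n".join(line[6:] for line in tool.splitlines()) + "\n"
      pathlib.Path("echo_tool.cwl").write_text(cwl_text)

      # Any CWL-conformant executor can run the same description.
      subprocess.run(["cwltool", "echo_tool.cwl", "--message", "hello workflows"],
                     check=True)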

  11. The future of scientific workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Peterka, Tom; Altintas, Ilkay

    Today's computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE's science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, workflow needs and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.

  12. Development of a user customizable imaging informatics-based intelligent workflow engine system to enhance rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Martinez, Clarisa; Wang, Jing; Liu, Ye; Liu, Brent

    2014-03-01

    Clinical trials usually need to collect, track, and analyze multimedia data according to the workflow. Currently, clinical trial data management requirements are normally addressed with custom-built systems, and challenges arise in designing workflows for different trials. A traditional pre-defined, custom-built system is usually limited to a specific clinical trial and normally requires time-consuming and resource-intensive software development. To provide a solution, we present a user-customizable, imaging informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials. The intelligent workflow engine provides flexibility in building and tailoring the workflow in various stages of clinical trials. By providing a solution to tailor and automate the workflow, the system will save time and reduce errors for clinical trials. Although our system is designed for rehabilitation clinical trials, it may be extended to other imaging-based clinical trials as well.

  13. New portable voice guidance device for the manual wheelchair transfer: a pilot study in patients with hemiplegia.

    PubMed

    Yoshida, Taiki; Otaka, Yohei; Osu, Rieko; Kita, Kahori; Sakata, Sachiko; Kondo, Kunitsugu

    2017-05-01

    Older and/or cognitively impaired patients require verbal guidance to prevent accidents during wheelchair operation, thus increasing the burden on caregivers. This study aimed to develop a new portable voice guidance device for manual wheelchairs and examine its clinical usefulness. We developed a portable voice guidance device to monitor the statuses of wheelchair brakes and footrests and automatically provide voice guidance for operation. The device comprises a microcomputer, four magnets and magnetic sensors, a speaker, and a battery. Device operation was assessed during the transfer from a wheelchair to bed six times per day over three days, for a total of 90 transfers, in five stroke patients (mean age: 79.6 years) who required verbal guidance to direct wheelchair operation. Device usability was also assessed using a questionnaire. The device performed perfectly during all attempted transfers (100%). To ensure safety, the assessor needed to add verbal guidance during 33 of 90 attempted transfers (36.6%). Overall, the device usability was favourable, although some assessors were unsatisfied with the volume of the device voice, the guidance timing, and the degree of burden reduction. Our device could facilitate wheelchair operation and might be used to reduce fall risk in stroke patients and the burden on caregivers. Implications for Rehabilitation: The acquisition of transfer independence is an important step in the rehabilitation of patients with mobility issues. Many patients require supervision and guidance regarding the operation of brakes and footrests on manual wheelchairs. This newly developed voice guidance device for manual wheelchair transfers worked well in patients with hemiplegia and might help reduce fall risks and the burden of care.

  14. The standard-based open workflow system in GeoBrain (Invited)

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geoprocessing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, workflows exist in the form of data and service types defined by ontologies; these conceptual workflows are called geoprocessing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance-capturing service has been implemented to generate ISO 19115-compliant product provenance metadata before and after workflow execution. Generating provenance metadata before execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119 (transparent, translucent, and opaque) are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of products after proper peer review of the models. Automated workflow composition based on ontologies and artificial intelligence technology has also been demonstrated successfully. The GeoBrain workflow system has been used in multiple Earth science applications, including the monitoring of global agricultural drought, the assessment of flood damage, the derivation of national crop condition and progress information, and the detection of nuclear proliferation facilities and events.
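
    The two-level idea, instantiating a conceptual workflow over data and service types into a concrete service chain, can be illustrated with a short Python sketch. All type names and endpoints below are hypothetical; GeoBrain itself encodes both levels in BPEL.

      # A minimal sketch of conceptual-to-concrete instantiation: a workflow
      # over *types* is bound to concrete service endpoints when a user
      # requests a matching virtual product. Names are invented.

      conceptual_workflow = [            # virtual product type: "drought_index"
          ("ReflectanceData", "compute_ndvi"),
          ("NDVIData", "compute_drought_index"),
      ]

      service_registry = {               # concrete services per service type
          "compute_ndvi": "http://example.org/wps/ndvi",
          "compute_drought_index": "http://example.org/wps/vhi",
      }

      def instantiate(conceptual, data_instance):
          """Bind each service type to a concrete endpoint for one request."""
          steps = []
          for input_type, service_type in conceptual:
              endpoint = service_registry[service_type]
              steps.append((data_instance, endpoint))
              data_instance = f"output-of-{service_type}"
          return steps

      for inputs, endpoint in instantiate(conceptual_workflow, "MODIS-2013-07-01"):
          print(f"invoke {endpoint} on {inputs}")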

  15. Pre-marital and Marital Counselling: Implications for the School Guidance Counsellor

    ERIC Educational Resources Information Center

    Schlesinger, Benjamin

    1978-01-01

    One of the foremost tasks of young people contemplating marriage is the discovery of their basic selfhood and their continued growth as people; this is the first goal in pre-marital counselling. (Author)

  16. 77 FR 9946 - Draft Guidance for Industry on Drug Interaction Studies-Study Design, Data Analysis, Implications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-21

    ... and magnitude of drug-drug interactions for several reasons. Concomitant medications, dietary supplements, and some foods, such as grapefruit juice, may alter metabolism and/or drug transport abruptly in...

  17. Augmented reality-assisted bypass surgery: embracing minimal invasiveness.

    PubMed

    Cabrilo, Ivan; Schaller, Karl; Bijlenga, Philippe

    2015-04-01

    The overlay of virtual images on the surgical field, defined as augmented reality, has been used for image guidance during various neurosurgical procedures. Although this technology could conceivably address certain inherent problems of extracranial-to-intracranial bypass procedures, this potential has not been explored to date. We evaluate the usefulness of an augmented reality-based setup, which could help in harvesting donor vessels through their precise localization in real time, in performing tailored craniotomies, and in identifying preoperatively selected recipient vessels for the purpose of anastomosis. Our method was applied to 3 patients with moyamoya disease who underwent superficial temporal artery-to-middle cerebral artery anastomoses and 1 patient who underwent an occipital artery-to-posteroinferior cerebellar artery bypass because of a dissecting aneurysm of the vertebral artery. Patients' heads, skulls, and extracranial and intracranial vessels were segmented preoperatively from 3-dimensional image data sets (3-dimensional digital subtraction angiography, angio-magnetic resonance imaging, angio-computed tomography), and injected intraoperatively into the operating microscope's eyepiece for image guidance. In each case, the described setup helped in precisely localizing donor and recipient vessels and in tailoring craniotomies to the injected images. The presented system based on augmented reality can optimize the workflow of extracranial-to-intracranial bypass procedures by providing essential anatomical information, entirely integrated to the surgical field, and help to perform minimally invasive procedures. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Accuracy assessment of fluoroscopy-transesophageal echocardiography registration

    NASA Astrophysics Data System (ADS)

    Lang, Pencilla; Seslija, Petar; Bainbridge, Daniel; Guiraudon, Gerard M.; Jones, Doug L.; Chu, Michael W.; Holdsworth, David W.; Peters, Terry M.

    2011-03-01

    This study assesses the accuracy of a new transesophageal (TEE) ultrasound (US) fluoroscopy registration technique designed to guide percutaneous aortic valve replacement. In this minimally invasive procedure, a valve is inserted into the aortic annulus via a catheter. Navigation and positioning of the valve is guided primarily by intra-operative fluoroscopy. Poor anatomical visualization of the aortic root region can result in incorrect positioning, leading to heart valve embolization, obstruction of the coronary ostia and acute kidney injury. The use of TEE US images to augment intra-operative fluoroscopy provides significant improvements to image-guidance. Registration is achieved using an image-based TEE probe tracking technique and US calibration. TEE probe tracking is accomplished using a single-perspective pose estimation algorithm. Pose estimation from a single image allows registration to be achieved using only images collected in standard OR workflow. Accuracy of this registration technique is assessed using three models: a point target phantom, a cadaveric porcine heart with implanted fiducials, and in-vivo porcine images. Results demonstrate that registration can be achieved with an RMS error of less than 1.5mm, which is within the clinical accuracy requirements of 5mm. US-fluoroscopy registration based on single-perspective pose estimation demonstrates promise as a method for providing guidance to percutaneous aortic valve replacement procedures. Future work will focus on real-time implementation and a visualization system that can be used in the operating room.
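
    The accuracy figure reported here is a root-mean-square error over registered points. As a brief illustration, the Python sketch below computes the RMS error between registered fiducial positions and ground truth; the point sets are made up.

      # A small sketch of the kind of accuracy assessment reported above:
      # the RMS of per-point distances between registered fiducial
      # positions and their ground-truth locations.

      import numpy as np

      def rms_error(registered, truth):
          """RMS of the per-point Euclidean distances (same units as input)."""
          diffs = np.linalg.norm(registered - truth, axis=1)
          return float(np.sqrt(np.mean(diffs ** 2)))

      truth = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 5.0]])
      registered = truth + np.array([[0.4, -0.3, 0.2],
                                     [-0.5, 0.2, 0.1],
                                     [0.3, 0.4, -0.2]])

      print(f"RMS error: {rms_error(registered, truth):.2f} mm")  # ~0.5 mm here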

  19. An attempt to calculate in silico disintegration time of tablets containing mefenamic acid, a low water-soluble drug.

    PubMed

    Kimura, Go; Puchkov, Maxim; Leuenberger, Hans

    2013-07-01

    Based on a Quality by Design (QbD) approach, it is important to follow the International Conference on Harmonization (ICH) guidance Q8 (R2) recommendations to explore the design space. The application of an experimental design alone is, however, not sufficient, because it is necessary to take into account the effects of percolation theory. For this purpose, adequate software needs to be applied, capable of detecting percolation thresholds as a function of the distribution of the functional powder particles. Formulation computer-aided design (F-CAD), originally designed to calculate in silico the drug dissolution profiles of a tablet formulation, is, for example, suitable software for this purpose. The study shows that F-CAD can calculate a good estimate of the disintegration time of a tablet formulation containing mefenamic acid. More importantly, F-CAD is capable of replacing expensive laboratory work by performing in silico experiments for the exploration of the formulation design space according to ICH guidance Q8 (R2). As a consequence, a workflow that already exists as best practice in the automotive and aircraft industries can be adopted by the pharmaceutical industry: the drug delivery vehicle can first be fully designed and tested in silico, which will improve the quality of the marketed formulation and save time and money. Copyright © 2013 Wiley Periodicals, Inc.

  20. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over recent years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists of the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  1. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over recent years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists of the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  2. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    DOE PAGES

    Malawski, Maciej; Figiela, Kamil; Bubak, Marian; ...

    2015-01-01

    This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows, from general-purpose cloud benchmarks, and from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
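
    The optimization problem can be illustrated with a toy brute-force version: choose a VM type and instance count per workflow level so that cost is minimized subject to the deadline. The paper's actual models are mathematical programs in AMPL/CMPL; the prices, speeds, and workloads below are invented, and hourly billing is simplified to fractional hours.

      # An illustrative brute-force version of the deadline-constrained
      # cost model sketched above. All numbers are made up.

      from itertools import product

      vm_types = {                       # hourly price, relative speed
          "small": (0.10, 1.0),
          "large": (0.40, 3.5),
      }
      levels = [1200.0, 300.0]           # task-hours of identical tasks per level
      deadline = 400.0                   # hours
      max_instances = 8

      best = None
      for choice in product(vm_types.items(), repeat=len(levels)):
          for counts in product(range(1, max_instances + 1), repeat=len(levels)):
              time = cost = 0.0
              for (name, (price, speed)), n, work in zip(choice, counts, levels):
                  hours = work / (speed * n)     # level runtime on n instances
                  time += hours                  # levels run sequentially
                  cost += price * n * hours      # billing simplified to fractional hours
              if time <= deadline and (best is None or cost < best[0]):
                  best = (cost, [c[0] for c in choice], counts)

      print(best)   # (minimal cost, chosen VM types, instance counts)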

  3. Metaworkflows and Workflow Interoperability for Heliophysics

    NASA Astrophysics Data System (ADS)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited to workflows. This approach has been investigated in the HELIO project, which developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, which forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three layers. The first layer is Basic Workflows, developed in both the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges, and they implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows, usually developed in TAVERNA. They implement Science Cases (the definition of a scientific challenge) by composing different Basic Workflows. The third and last layer, Iterative Science Workflows, is developed in WS-PGRADE. It executes sub-workflows (either Basic or Science Workflows) as parameter sweep jobs to investigate Science Cases on large multiple data sets. So far, this approach has proven fruitful for three Science Cases, of which one has been completed and two are still being tested.

  4. Prevention of gross setup errors in radiotherapy with an efficient automatic patient safety system.

    PubMed

    Yan, Guanghua; Mittauer, Kathryn; Huang, Yin; Lu, Bo; Liu, Chihray; Li, Jonathan G

    2013-11-04

    Treatment of the wrong body part due to incorrect setup is among the leading types of errors in radiotherapy. The purpose of this paper is to report an efficient automatic patient safety system (PSS) to prevent gross setup errors. The system consists of a pair of charge-coupled device (CCD) cameras mounted in the treatment room, a single infrared reflective marker (IRRM) affixed to the patient or immobilization device, and a set of in-house developed software. Patients are CT scanned with a CT BB placed on their surface close to the intended treatment site. The coordinates of the CT BB relative to the treatment isocenter are used as the reference for tracking. The CT BB is replaced with an IRRM before treatment starts. The PSS evaluates setup accuracy by comparing the real-time IRRM position with the reference position. To automate the system workflow, the PSS synchronizes with the record-and-verify (R&V) system in real time and automatically loads the reference data for the patient under treatment. Special IRRMs, which can permanently stick to the patient's face mask or body mold throughout the course of treatment, were designed to minimize the therapist's workload. The accuracy of the system was examined on an anthropomorphic phantom with a designed end-to-end test. Its performance was also evaluated on head-and-neck as well as abdominal-pelvic patients, using cone-beam CT (CBCT) as the standard. The PSS achieved a seamless clinical workflow by synchronizing with the R&V system. By permanently mounting specially designed IRRMs on patient immobilization devices, therapist intervention is eliminated or minimized. Overall results showed that the PSS has sufficient accuracy to catch gross setup errors greater than 1 cm in real time. An efficient automatic PSS with sufficient accuracy has been developed to prevent gross setup errors in radiotherapy. The system can be applied to all treatment sites for independent positioning verification, and it can be an ideal complement to complex image-guidance systems owing to its continuous tracking ability, zero radiation dose, and fully automated clinical workflow.
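
    The gross-error check itself reduces to a distance comparison between the live marker position and its reference. A minimal Python sketch, with illustrative coordinates and the 1 cm threshold from the abstract:

      # Compare the tracked marker position with its reference position
      # and alarm when the deviation exceeds a tolerance. Coordinates
      # below are illustrative.

      import math

      TOLERANCE_MM = 10.0   # 1 cm gross-setup-error threshold

      def setup_ok(reference_mm, tracked_mm, tol=TOLERANCE_MM):
          distance = math.dist(reference_mm, tracked_mm)
          return distance <= tol, distance

      reference = (12.3, -4.1, 250.0)     # CT BB position relative to isocenter
      tracked = (13.1, -3.7, 262.5)       # live IRRM position from CCD cameras

      ok, d = setup_ok(reference, tracked)
      print(f"deviation {d:.1f} mm -> {'proceed' if ok else 'HOLD BEAM'}")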

  5. Dosimetric and workflow evaluation of first commercial synthetic CT software for clinical use in pelvis

    NASA Astrophysics Data System (ADS)

    Tyagi, Neelam; Fontenla, Sandra; Zhang, Jing; Cloutier, Michelle; Kadbi, Mo; Mechalakos, Jim; Zelefsky, Michael; Deasy, Joe; Hunt, Margie

    2017-04-01

    To evaluate a commercial synthetic CT (syn-CT) software for use in prostate radiotherapy. Twenty-five prostate patients underwent CT and MR simulation scans in treatment position on a 3T MR scanner. A commercially available MR protocol was used that included a T2w turbo spin-echo sequence for soft-tissue contrast and a dual-echo 3D mDIXON fast field echo (FFE) sequence for generating the syn-CT. A dual-echo 3D FFE B0 map was used for patient-induced susceptibility distortion analysis, and a new 3D balanced-FFE sequence was evaluated for identification of implanted gold fiducial markers and subsequent image guidance during radiotherapy delivery. Tissues were classified as air, adipose, water, trabecular/spongy bone, and compact/cortical bone and assigned bulk HU values. The accuracy of the syn-CT for treatment planning was analyzed by transferring the structures and plan from the planning CT to the syn-CT and recalculating the dose. Accuracy of localization at the treatment machine was evaluated by comparing registration of kV radiographs to either digitally reconstructed radiographs (DRRs) generated from the syn-CT or traditional DRRs generated from the planning CT. Similarly, the accuracy of setup using CBCT and syn-CT was compared to that using the planning CT. Finally, an MR-only simulation workflow was established and end-to-end testing was completed on five patients undergoing MR-only simulation. Dosimetric comparison between the original CT and syn-CT plans agreed within 0.5% on average for all structures. The de-novo optimized plans on the syn-CT met institutional clinical objectives for target and normal structures. Patient-induced susceptibility distortion based on B0 maps was within 1 mm and 0.5 mm in the body and prostate, respectively. DRR and CBCT localization based on MR-localized fiducials showed a standard deviation of <1 mm. End-to-end testing and the MR simulation workflow were successfully validated. MRI-derived synthetic CT can be successfully used for MR-only planning and treatment in prostate radiotherapy.
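
    The bulk-HU assignment step can be illustrated in a few lines: classify each voxel and substitute a representative HU value per tissue class. The labels and HU numbers in the Python sketch below are illustrative assumptions, not the vendor's calibration.

      # A small sketch of voxel-wise bulk HU substitution for the five
      # tissue classes named above. Numbers are illustrative.

      import numpy as np

      BULK_HU = {
          "air": -1000,
          "adipose": -90,
          "water": 0,
          "trabecular_bone": 300,
          "cortical_bone": 1000,
      }

      # Toy label map from an MR-based tissue classification (0..4 per voxel).
      labels = np.array([[0, 2, 2],
                         [1, 2, 3],
                         [1, 3, 4]])

      lookup = np.array([BULK_HU["air"], BULK_HU["adipose"], BULK_HU["water"],
                         BULK_HU["trabecular_bone"], BULK_HU["cortical_bone"]])

      synthetic_ct = lookup[labels]      # voxel-wise bulk HU substitution
      print(synthetic_ct)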

  6. Using EHR audit trail logs to analyze clinical workflow: A case study from community-based ambulatory clinics.

    PubMed

    Wu, Danny T Y; Smart, Nikolas; Ciemins, Elizabeth L; Lanham, Holly J; Lindberg, Curt; Zheng, Kai

    2017-01-01

    To develop a workflow-supported clinical documentation system, it is a critical first step to understand clinical workflow. While Time and Motion studies have been regarded as the gold standard of workflow analysis, this method can be resource-consuming and its data may be biased due to the cognitive limitations of human observers. In this study, we aimed to evaluate the feasibility and validity of using EHR audit trail logs to analyze clinical workflow. Specifically, we compared three known workflow changes from our previous study with the corresponding EHR audit trail logs of the study participants. The results showed that EHR audit trail logs can be a valid source for clinical workflow analysis and can provide an objective view of clinicians' behaviors, multi-dimensional comparisons, and a highly extensible analysis framework.
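
    As a concrete illustration of this kind of analysis, the Python sketch below reconstructs an action sequence and per-action durations from audit trail events; the log format is a hypothetical simplification of real EHR audit logs.

      # Reconstruct a clinician's action sequence and per-action durations
      # from (timestamp, user, action) audit events. Data is invented.

      from datetime import datetime

      log = [
          ("2017-01-05 09:00:12", "drwu", "open_chart"),
          ("2017-01-05 09:01:40", "drwu", "review_labs"),
          ("2017-01-05 09:06:02", "drwu", "write_note"),
          ("2017-01-05 09:14:30", "drwu", "close_chart"),
      ]

      events = [(datetime.strptime(t, "%Y-%m-%d %H:%M:%S"), user, action)
                for t, user, action in log]

      # Duration of each action = time until the next logged event.
      for (t0, _, action), (t1, _, _) in zip(events, events[1:]):
          print(f"{action:<12} {(t1 - t0).total_seconds():>6.0f} s")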

  7. Observing health professionals' workflow patterns for diabetes care - First steps towards an ontology for EHR services.

    PubMed

    Schweitzer, M; Lasierra, N; Hoerbst, A

    2015-01-01

    Increasing flexibility from a user perspective and enabling workflow-based interaction facilitate an easy, user-friendly utilization of EHRs in healthcare professionals' daily work. To offer such versatile EHR functionality, our approach is based on the execution of clinical workflows by means of a composition of semantic web services. The backbone of such an architecture is an ontology which enables the representation of clinical workflows and facilitates the selection of suitable services. In this paper we present the methods and results of observations of routine diabetes consultations, conducted in order to identify those workflows and the relations among the included tasks. The observed workflows were first modeled in BPMN and then generalized. As a following step in our study, interviews will be conducted with clinical personnel to validate the modeled workflows.

  8. Feature-based respiratory motion tracking in native fluoroscopic sequences for dynamic roadmaps during minimally invasive procedures in the thorax and abdomen

    NASA Astrophysics Data System (ADS)

    Wagner, Martin G.; Laeseke, Paul F.; Schubert, Tilman; Slagowski, Jordan M.; Speidel, Michael A.; Mistretta, Charles A.

    2017-03-01

    Fluoroscopic image guidance for minimally invasive procedures in the thorax and abdomen suffers from respiratory and cardiac motion, which can cause severe subtraction artifacts and inaccurate image guidance. This work proposes novel techniques for respiratory motion tracking in native fluoroscopic images as well as a model-based estimation of vessel deformation. This allows compensation for respiratory motion during the procedure and therefore simplifies the workflow for minimally invasive procedures such as liver embolization. The method first establishes dynamic motion models for both the contrast-enhanced vasculature and curvilinear background features, based on a native (non-contrast) and a contrast-enhanced image sequence acquired prior to device manipulation under free-breathing conditions. The model of vascular motion is generated by applying the diffeomorphic demons algorithm to an automatic segmentation of the subtraction sequence. The model of curvilinear background features is based on feature tracking in the native sequence. The two models establish the relationship between the respiratory state, which is inferred from curvilinear background features, and the vascular morphology during that same respiratory state. During subsequent fluoroscopy, curvilinear feature detection is applied to determine the appropriate vessel mask to display. The result is a dynamic motion-compensated vessel mask superimposed on the fluoroscopic image. Quantitative evaluation of the proposed methods was performed using a digital 4D CT phantom (XCAT), which provides realistic human anatomy including sophisticated respiratory and cardiac motion models. Four groups of datasets were generated, in which different parameters (cycle length, maximum diaphragm motion, and maximum chest expansion) were varied within each image sequence. Each group contains 4 datasets consisting of the initial native and contrast-enhanced sequences as well as a sequence in which the respiratory motion is tracked. The respiratory motion tracking error was between 1.00% and 1.09%. The estimated dynamic vessel masks yielded a Sørensen-Dice coefficient between 0.94 and 0.96. Finally, the accuracy of the vessel contours was measured in terms of the 99th percentile of the error, which ranged between 0.64 and 0.96 mm. The presented results show that the approach is feasible for respiratory motion tracking and compensation and could therefore considerably improve the workflow of minimally invasive procedures in the thorax and abdomen.
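
    The overlap metric reported here, the Sørensen-Dice coefficient, is straightforward to compute. A minimal Python sketch with toy masks:

      # Sørensen-Dice coefficient between an estimated vessel mask and
      # the ground-truth mask: 2*|A and B| / (|A| + |B|).

      import numpy as np

      def dice(mask_a, mask_b):
          a, b = mask_a.astype(bool), mask_b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      truth = np.zeros((8, 8), dtype=bool)
      truth[2:6, 3:5] = True                 # ground-truth vessel
      estimate = np.roll(truth, 1, axis=0)   # estimate shifted by one pixel

      print(f"Dice = {dice(estimate, truth):.2f}")   # 0.75 for this toy case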

  9. Pathway Concepts Experiment for Head-Down Synthetic Vision Displays

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Arthur, Jarvis J., III; Kramer, Lynda J.; Bailey, Randall E.

    2004-01-01

    Eight 757 commercial airline captains flew 22 approaches using the Reno Sparks 16R Visual Arrival under simulated Category I conditions. Approaches were flown using a head-down synthetic vision display to evaluate four tunnel ("minimal", "box", "dynamic pathway", "dynamic crow's feet") and three guidance ("ball", "tadpole", "follow-me aircraft") concepts and compare their efficacy to a baseline condition (i.e., no tunnel, ball guidance). The results showed that the tunnel concepts significantly improved pilot performance and situation awareness and lowered workload compared to the baseline condition. The dynamic crow's feet tunnel and follow-me aircraft guidance concepts were found to be the best candidates for future synthetic vision head-down displays. These results are discussed with implications for synthetic vision display design and future research.

  10. Physician activity during outpatient visits and subjective workload.

    PubMed

    Calvitti, Alan; Hochheiser, Harry; Ashfaq, Shazia; Bell, Kristin; Chen, Yunan; El Kareh, Robert; Gabuzda, Mark T; Liu, Lin; Mortensen, Sara; Pandey, Braj; Rick, Steven; Street, Richard L; Weibel, Nadir; Weir, Charlene; Agha, Zia

    2017-05-01

    We describe methods for capturing and analyzing EHR use and the clinical workflow of physicians during outpatient encounters, and for relating activity to physicians' self-reported workload. We collected temporally resolved activity data including audio, video, EHR activity, and eye gaze, along with post-visit assessments of workload. These data are then analyzed through a combination of manual content analysis and computational techniques to temporally align streams, providing a range of process measures of EHR usage, clinical workflow, and physician-patient communication. Data were collected from primary care and specialty clinics at the Veterans Administration San Diego Healthcare System and UCSD Health, which use the Electronic Health Record (EHR) platforms CPRS and Epic, respectively. Grouping visit activity by physician, site, specialty, and patient status enables rank-ordering activity factors by their correlation with physicians' subjective workload as captured by the NASA Task Load Index survey. We developed a coding scheme that enabled us to compare timing studies between CPRS and Epic and to extract patient and visit complexity profiles. We identified similar patterns of EHR use and navigation at the 2 sites despite differences in functions, user interfaces, and the consequent coded representations. Both sites displayed similar proportions of EHR function use and navigation, and similar distributions of visit length, proportion of time physicians attended to the EHR (gaze), and subjective workload as measured by the task load survey. We found that visit activity was highly variable across individual physicians, and the observed activity metrics ranged widely as correlates of subjective workload. We discuss the implications of our study for methodology, clinical workflow, and EHR redesign. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Design and implementation of workflow engine for service-oriented architecture

    NASA Astrophysics Data System (ADS)

    Peng, Shuqing; Duan, Huining; Chen, Deyun

    2009-04-01

    As computer networks develop rapidly and enterprise applications become increasingly distributed, traditional workflow engines show deficiencies such as complex structure, poor stability, poor portability, little reusability, and difficult maintenance. In this paper, in order to improve the stability, scalability, and flexibility of workflow management systems, a four-layer workflow engine architecture based on SOA is put forward in accordance with the XPDL standard of the Workflow Management Coalition. The route control mechanism of the control model is realized, a scheduling strategy for cyclic and acyclic routing is designed, and the workflow engine is implemented using technologies such as XML, JSP, and EJB.

  12. Maintain and Regain Well Clear: Maneuver Guidance Designs for Pilots Performing the Detect-and-Avoid Task

    NASA Technical Reports Server (NTRS)

    Monk, Kevin J.; Roberts, Zachary

    2017-01-01

    In order to support the future expansion and integration of Unmanned Aircraft Systems (UAS), ongoing research efforts have sought to produce findings that inform the minimum display information elements required for acceptable UAS pilot response times and traffic avoidance. Previous simulations have revealed performance benefits associated with DAA displays containing predictive information and suggestive maneuver guidance tools in the form of banding. The present study investigated the impact of various maneuver guidance display configurations on detect-and-avoid (DAA) task performance in a simulated airspace environment. UAS pilots' ability to maintain DAA well clear was compared between displays with either the presence or absence of green DAA bands, which indicate conflict-free flight regions. Additional display comparisons assessed pilots' ability to regain DAA well clear with two different guidance presentations designed to aid in DAA well clear recovery during critical encounters. Performance implications and display considerations for future UAS DAA systems are discussed.

  13. Artificial intelligence: Learning to play Go from scratch

    NASA Astrophysics Data System (ADS)

    Singh, Satinder; Okun, Andy; Jackson, Andrew

    2017-10-01

    An artificial-intelligence program called AlphaGo Zero has mastered the game of Go without any human data or guidance. A computer scientist and two members of the American Go Association discuss the implications. See Article p.354

  14. A three-level atomicity model for decentralized workflow management systems

    NASA Astrophysics Data System (ADS)

    Ben-Shaul, Israel Z.; Heineman, George T.

    1996-12-01

    A workflow management system (WFMS) employs a workflow manager (WM) to execute and automate the various activities within a workflow. To protect the consistency of data, the WM encapsulates each activity with a transaction; a transaction manager (TM) then guarantees the atomicity of activities. Since workflows often group several activities together, the TM is responsible for guaranteeing the atomicity of these units. There are scalability issues, however, with centralized WFMSs. Decentralized WFMSs provide an architecture for multiple autonomous WFMSs to interoperate, thus accommodating multiple workflows and geographically-dispersed teams. When atomic units are composed of activities spread across multiple WFMSs, however, there is a conflict between global atomicity and local autonomy of each WFMS. This paper describes a decentralized atomicity model that enables workflow administrators to specify the scope of multi-site atomicity based upon the desired semantics of multi-site tasks in the decentralized WFMS. We describe an architecture that realizes our model and execution paradigm.

  15. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: among other gaps, they do not support real-time co-design, cannot track how a workflow evolves over time through changing designs contributed by multiple Earth scientists, and cannot capture and retrieve collaboration knowledge about workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  16. How a surgeon becomes superman by visualization of intelligently fused multi-modalities

    NASA Astrophysics Data System (ADS)

    Erat, Okan; Pauly, Olivier; Weidert, Simon; Thaller, Peter; Euler, Ekkehard; Mutschler, Wolf; Navab, Nassir; Fallavollita, Pascal

    2013-03-01

    Motivation: The existing visualization of the Camera augmented mobile C-arm (CamC) system does not provide enough depth cues and presents the anatomical information in a confusing way to surgeons. Methods: We propose a method that segments anatomical information from the X-ray and then augments it on the video images. To provide depth cues, pixels belonging to the video images are classified into skin and object classes. The augmentation of anatomical information from the X-ray is performed only where pixels have a larger probability of belonging to the skin class. Results: We tested our algorithm by displaying the new visualization to 2 expert surgeons and 1 medical student during three surgical workflow steps of the intramedullary nail interlocking procedure, namely: skin incision, center punching, and drilling. Via a survey questionnaire, they were asked to assess the new visualization compared to the current alpha-blending overlay image displayed by CamC. The participants all agreed (100%) that occlusion and instrument tip position detection were immediately improved with our technique. When asked if our visualization has the potential to replace the existing alpha-blending overlay during interlocking procedures, all participants did not hesitate to suggest an immediate integration of the visualization for the correct navigation and guidance of the procedure. Conclusion: Current alpha-blending visualizations lack proper depth cues and can be a source of confusion for surgeons when performing surgery. Our visualization concept shows great potential in alleviating occlusion and facilitating clinician understanding during specific workflow steps of the intramedullary nailing procedure.

  17. Novel System for Real-Time Integration of 3-D Echocardiography and Fluoroscopy for Image-Guided Cardiac Interventions: Preclinical Validation and Clinical Feasibility Evaluation.

    PubMed

    Arujuna, Aruna V; Housden, R James; Ma, Yingliang; Rajani, Ronak; Gao, Gang; Nijhof, Niels; Cathier, Pascal; Bullens, Roland; Gijsbers, Geert; Parish, Victoria; Kapetanakis, Stamatis; Hancock, Jane; Rinaldi, C Aldo; Cooklin, Michael; Gill, Jaswinder; Thomas, Martyn; O'neill, Mark D; Razavi, Reza; Rhode, Kawal S

    2014-01-01

    Real-time imaging is required to guide minimally invasive catheter-based cardiac interventions. While transesophageal echocardiography allows for high-quality visualization of cardiac anatomy, X-ray fluoroscopy provides excellent visualization of devices. We have developed a novel image fusion system that allows real-time integration of 3-D echocardiography and the X-ray fluoroscopy. The system was validated in the following two stages: 1) preclinical to determine function and validate accuracy; and 2) in the clinical setting to assess clinical workflow feasibility and determine overall system accuracy. In the preclinical phase, the system was assessed using both phantom and porcine experimental studies. Median 2-D projection errors of 4.5 and 3.3 mm were found for the phantom and porcine studies, respectively. The clinical phase focused on extending the use of the system to interventions in patients undergoing either atrial fibrillation catheter ablation (CA) or transcatheter aortic valve implantation (TAVI). Eleven patients were studied with nine in the CA group and two in the TAVI group. Successful real-time view synchronization was achieved in all cases with a calculated median distance error of 2.2 mm in the CA group and 3.4 mm in the TAVI group. A standard clinical workflow was established using the image fusion system. These pilot data confirm the technical feasibility of accurate real-time echo-fluoroscopic image overlay in clinical practice, which may be a useful adjunct for real-time guidance during interventional cardiac procedures.

  18. Safety and feasibility of STAT RAD: Improvement of a novel rapid tomotherapy-based radiation therapy workflow by failure mode and effects analysis.

    PubMed

    Jones, Ryan T; Handsfield, Lydia; Read, Paul W; Wilson, David D; Van Ausdal, Ray; Schlesinger, David J; Siebers, Jeffrey V; Chen, Quan

    2015-01-01

    The clinical challenge of radiation therapy (RT) for painful bone metastases requires clinicians to consider both treatment efficacy and patient prognosis when selecting a radiation therapy regimen. The traditional RT workflow requires several weeks for common palliative RT schedules of 30 Gy in 10 fractions or 20 Gy in 5 fractions. At our institution, we have created a new RT workflow termed "STAT RAD" that allows clinicians to perform computed tomographic (CT) simulation, planning, and highly conformal single-fraction treatment delivery within 2 hours. In this study, we evaluate the safety and feasibility of the STAT RAD workflow. A failure mode and effects analysis (FMEA) was performed on the STAT RAD workflow, including development of a process map; identification of potential failure modes; description of the cause and effect, temporal occurrence, and team member involvement for each failure mode; and examination of existing safety controls. A risk priority number (RPN) was calculated for each failure mode. As necessary, workflow adjustments were then made to safeguard failure modes with significant RPN values. After the workflow alterations, RPNs were recomputed. A total of 72 potential failure modes were identified in the pre-FMEA STAT RAD workflow, of which 22 met the RPN threshold for clinical significance. Workflow adjustments included the addition of a team member checklist, changing simulation from megavoltage CT to kilovoltage CT, alteration of patient-specific quality assurance testing, and allocating increased time for critical workflow steps. After these modifications, only one failure mode maintained RPN significance: patient motion after alignment or during treatment. Performing the FMEA for the STAT RAD workflow before clinical implementation has significantly strengthened its safety and feasibility. The FMEA proved a valuable evaluation tool, identifying potential problem areas so that we could create a safer workflow. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
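
    FMEA scoring follows a standard recipe: each failure mode receives severity (S), occurrence (O), and detectability (D) scores, and RPN = S * O * D. The Python sketch below applies it to invented failure modes; the threshold is likewise assumed, not the paper's.

      # A short sketch of FMEA risk-priority-number scoring. The failure
      # modes, scores, and threshold below are illustrative assumptions.

      failure_modes = [
          # (description, severity, occurrence, detectability) on 1-10 scales
          ("wrong patient plan loaded",        9, 2, 3),
          ("patient motion after alignment",   7, 4, 6),
          ("incorrect couch shift entered",    8, 3, 2),
      ]

      RPN_THRESHOLD = 100   # assumed cutoff for clinically significant modes

      for description, s, o, d in sorted(
              failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True):
          rpn = s * o * d
          flag = "REVIEW" if rpn >= RPN_THRESHOLD else "ok"
          print(f"RPN {rpn:>4}  {flag:<6} {description}")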

  19. 76 FR 71928 - Defense Federal Acquisition Regulation Supplement; Updates to Wide Area WorkFlow (DFARS Case 2011...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-21

    ... Defense Federal Acquisition Regulation Supplement; Updates to Wide Area WorkFlow (DFARS Case 2011-D027... Wide Area WorkFlow (WAWF) and TRICARE Encounter Data System (TEDS). WAWF, which electronically... civil emergencies, when access to Wide Area WorkFlow by those contractors is not feasible; (4) Purchases...

  20. An Auto-management Thesis Program WebMIS Based on Workflow

    NASA Astrophysics Data System (ADS)

    Chang, Li; Jie, Shi; Weibo, Zhong

    An auto-management WebMIS based on workflow for a bachelor thesis program is presented in this paper. A workflow-dispatching module is designed and implemented with MySQL and J2EE according to the working principles of a workflow engine. The module automatically dispatches the workflow according to the system date, the login information, and the work status of the user. The WebMIS moves management from manual paperwork to computer-based processing, which not only standardizes the thesis program but also keeps the data and documents clean and consistent.
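
    A minimal sketch of the date- and status-driven dispatching idea follows; the stage names, deadlines, and dispatch rule are hypothetical, since the paper does not spell them out.

    ```python
    from datetime import date

    # Hypothetical thesis-program schedule: (stage, deadline) pairs.
    SCHEDULE = [
        ("topic_selection", date(2024, 3, 1)),
        ("proposal_review", date(2024, 4, 15)),
        ("final_defense",   date(2024, 6, 10)),
    ]

    def dispatch(today: date, completed: set) -> str:
        """Return the next stage to dispatch, based on the system date
        and the user's work status (the stages already completed)."""
        for stage, deadline in SCHEDULE:
            if stage not in completed and today <= deadline:
                return stage
        return "no_active_stage"

    print(dispatch(date(2024, 3, 20), {"topic_selection"}))  # proposal_review
    ```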

  1. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    NASA Astrophysics Data System (ADS)

    Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.

    Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  2. New era in drug interaction evaluation: US Food and Drug Administration update on CYP enzymes, transporters, and the guidance process.

    PubMed

    Huang, Shiew-Mei; Strong, John M; Zhang, Lei; Reynolds, Kellie S; Nallani, Srikanth; Temple, Robert; Abraham, Sophia; Habet, Sayed Al; Baweja, Raman K; Burckart, Gilbert J; Chung, Sang; Colangelo, Philip; Frucht, David; Green, Martin D; Hepp, Paul; Karnaukhova, Elena; Ko, Hon-Sum; Lee, Jang-Ik; Marroum, Patrick J; Norden, Janet M; Qiu, Wei; Rahman, Atiqur; Sobel, Solomon; Stifano, Toni; Thummel, Kenneth; Wei, Xiao-Xiong; Yasuda, Sally; Zheng, Jenny H; Zhao, Hong; Lesko, Lawrence J

    2008-06-01

    Predicting clinically significant drug interactions during drug development is a challenge for the pharmaceutical industry and regulatory agencies. Since the publication of the US Food and Drug Administration's (FDA's) first in vitro and in vivo drug interaction guidance documents in 1997 and 1999, researchers and clinicians have gained a better understanding of drug interactions. This knowledge has enabled the FDA and the industry to progress and begin to overcome these challenges. The FDA has continued its efforts to evaluate methodologies to study drug interactions and communicate recommendations regarding the conduct of drug interaction studies, particularly for CYP-based and transporter-based drug interactions, to the pharmaceutical industry. A drug interaction Web site was established to document the FDA's current understanding of drug interactions (http://www.fda.gov/cder/drug/drugInteractions/default.htm). This report provides an overview of the evolution of the drug interaction guidances, includes a synopsis of the steps taken by the FDA to revise the original drug interaction guidance documents, and summarizes and highlights updated sections in the current guidance document, Drug Interaction Studies-Study Design, Data Analysis, and Implications for Dosing and Labeling.

  3. Sibling Group Play Therapy: An Effective Alternative with an Elective Mute Child.

    ERIC Educational Resources Information Center

    Barlow, Karen; And Others

    1986-01-01

    Presents the case study of an elective mute child. Describes the effects of sibling play therapy and lists implications for school counselors who might use group or sibling play therapy in their developmental guidance programs. (ABB)

  4. Disinfectant Residual: Representative Monitoring and Minimum Residual Implications

    EPA Science Inventory

    In this presentation we will: review history of distribution system chlorine monitoring siting, review State regulations and guidance, present a case study demonstrating a chlorine monitoring locations evaluation, and present an evaluation of Six–Year Review 3 (SYR3) disinfectant...

  5. Structuring clinical workflows for diabetes care: an overview of the OntoHealth approach.

    PubMed

    Schweitzer, M; Lasierra, N; Oberbichler, S; Toma, I; Fensel, A; Hoerbst, A

    2014-01-01

    Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study led us to identify basic building blocks, namely actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view.
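
    The three building blocks can be pictured as composable workflow nodes. The sketch below encodes them as plain Python classes and walks a two-step, diabetes-flavoured workflow; the class shapes, the glucose threshold, and the step names are illustrative assumptions, not the OntoHealth model itself.

    ```python
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class DataElement:      # a named piece of EHR data
        name: str
        value: object = None

    @dataclass
    class Action:           # a task that reads/writes data elements
        name: str
        run: Callable[[dict], None]

    @dataclass
    class Decision:         # a branch point over data elements
        name: str
        predicate: Callable[[dict], bool]

    def fetch_glucose(ctx):
        # Stand-in for an EHR query; value and unit are illustrative.
        ctx["glucose"] = DataElement("glucose_mmol_l", 11.2)

    workflow = [
        Action("query EHR for latest glucose", fetch_glucose),
        Decision("hyperglycemia?", lambda ctx: ctx["glucose"].value > 10.0),
    ]

    ctx = {}
    for step in workflow:
        if isinstance(step, Action):
            step.run(ctx)
        else:
            print(step.name, "->", step.predicate(ctx))
    ```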

  6. Structuring Clinical Workflows for Diabetes Care

    PubMed Central

    Lasierra, N.; Oberbichler, S.; Toma, I.; Fensel, A.; Hoerbst, A.

    2014-01-01

    Background Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift the present EHR usage to a more comprehensive integration concerning complete clinical workflows. Objectives The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. Methods A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. Results This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study led us to identify basic building blocks, namely actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. Conclusions The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view. PMID:25024765

  7. Modelling and analysis of workflow for lean supply chains

    NASA Astrophysics Data System (ADS)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumptions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the LSC context. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and a verification approach to the soundness of LTWNs and CLTWNs. Finally, the article illustrates the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for the LSC. This study initiates a new perspective of research on cross-organisational workflow management and promotes operation management of the LSC in real-world settings.
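
    To make the formal objects concrete, the sketch below encodes a toy labelled time Petri net: a transition carries a label and a static firing interval, is enabled when all of its input places hold tokens, and may fire only at an elapsed time inside its interval. The net, labels, and interval are illustrative assumptions; the paper's full LTWN/CLTWN machinery (soundness checks, OR-silent nets) is not reproduced here.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Transition:
        label: str                     # label from the labelled Petri net
        inputs: list                   # input places
        outputs: list                  # output places
        interval: tuple                # (earliest, latest) firing time

    def enabled(marking: dict, t: Transition) -> bool:
        return all(marking.get(p, 0) > 0 for p in t.inputs)

    def fire(marking: dict, t: Transition, elapsed: float) -> None:
        lo, hi = t.interval
        assert enabled(marking, t) and lo <= elapsed <= hi, "cannot fire"
        for p in t.inputs:
            marking[p] -= 1
        for p in t.outputs:
            marking[p] = marking.get(p, 0) + 1

    # Order confirmation between two organisations (illustrative).
    m = {"order_received": 1}
    t = Transition("confirm", ["order_received"], ["order_confirmed"], (0.0, 48.0))
    fire(m, t, elapsed=24.0)
    print(m)  # {'order_received': 0, 'order_confirmed': 1}
    ```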

  8. myExperiment: a repository and social network for the sharing of bioinformatics workflows

    PubMed Central

    Goble, Carole A.; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David

    2010-01-01

    myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment has grown to over 3500 registered users and contains more than 1000 workflows. The social aspect to the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment, including its REST web service, is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org. PMID:20501605

  9. Traversing the many paths of workflow research: developing a conceptual framework of workflow terminology through a systematic literature review

    PubMed Central

    Novak, Laurie L; Johnson, Kevin B; Lorenzi, Nancy M

    2010-01-01

    The objective of this review was to describe methods used to study and model workflow. The authors included studies set in a variety of industries using qualitative, quantitative and mixed methods. Of the 6221 matching abstracts, 127 articles were included in the final corpus. The authors collected data from each article on researcher perspective, study type, methods type, specific methods, approaches to evaluating quality of results, definition of workflow and dependent variables. Ethnographic observation and interviews were the most frequently used methods. Long study durations revealed the large time commitment required for descriptive workflow research. The most frequently discussed technique for evaluating quality of study results was triangulation. The definition of the term “workflow” and choice of methods for studying workflow varied widely across research areas and researcher perspectives. The authors developed a conceptual framework of workflow-related terminology for use in future research and present this model for use by other researchers. PMID:20442143

  10. Digitization workflows for flat sheets and packets of plants, algae, and fungi

    PubMed Central

    Nelson, Gil; Sweeney, Patrick; Wallace, Lisa E.; Rabeler, Richard K.; Allard, Dorothy; Brown, Herrick; Carter, J. Richard; Denslow, Michael W.; Ellwood, Elizabeth R.; Germain-Aubrey, Charlotte C.; Gilbert, Ed; Gillespie, Emily; Goertzen, Leslie R.; Legler, Ben; Marchant, D. Blaine; Marsico, Travis D.; Morris, Ashley B.; Murrell, Zack; Nazaire, Mare; Neefus, Chris; Oberreiter, Shanna; Paul, Deborah; Ruhfel, Brad R.; Sasek, Thomas; Shaw, Joey; Soltis, Pamela S.; Watson, Kimberly; Weeks, Andrea; Mast, Austin R.

    2015-01-01

    Effective workflows are essential components in the digitization of biodiversity specimen collections. To date, no comprehensive, community-vetted workflows have been published for digitizing flat sheets and packets of plants, algae, and fungi, even though latest estimates suggest that only 33% of herbarium specimens have been digitally transcribed, 54% of herbaria use a specimen database, and 24% are imaging specimens. In 2012, iDigBio, the U.S. National Science Foundation’s (NSF) coordinating center and national resource for the digitization of public, nonfederal U.S. collections, launched several working groups to address this deficiency. Here, we report the development of 14 workflow modules with 7–36 tasks each. These workflows represent the combined work of approximately 35 curators, directors, and collections managers representing more than 30 herbaria, including 15 NSF-supported plant-related Thematic Collections Networks and collaboratives. The workflows are provided for download as Portable Document Format (PDF) and Microsoft Word files. Customization of these workflows for specific institutional implementation is encouraged. PMID:26421256

  11. U.S. Army Leader’s Handbook: Trauma in the Unit

    DTIC Science & Technology

    2007-01-01

    Implications For Leadership ... Implications For Working With Families ... Part VI. Understanding The Impact Of War Zone Deployments ... 6.1 Stressors Of...unit families. Arrange Public Affairs Office (PAO) to provide guidance to unit families on dealing with the media. Talk with families about the ...a level of functioning equal to or greater than before the stressor event. To achieve this, an individual must be able to be flexible and stay

  12. Disruption of Radiologist Workflow.

    PubMed

    Kansagra, Akash P; Liu, Kevin; Yu, John-Paul J

    2016-01-01

    The effect of disruptions has been studied extensively in surgery and emergency medicine, and a number of solutions, such as preoperative checklists, have been implemented to enforce the integrity of critical safety-related workflows. Disruptions of the highly complex and cognitively demanding workflow of modern clinical radiology have only recently attracted attention as a potential safety hazard. In this article, we describe the variety of disruptions that arise in the reading room environment, review approaches that other specialties have taken to mitigate workflow disruption, and suggest possible solutions for workflow improvement in radiology. Copyright © 2015 Mosby, Inc. All rights reserved.

  13. Workflow based framework for life science informatics.

    PubMed

    Tiwari, Abhishek; Sekhar, Arvind K T

    2007-10-01

    Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications, and different services), facilitating knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry, and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas such as drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.

  14. Workflows for microarray data processing in the Kepler environment.

    PubMed

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.

  15. Dynamic reusable workflows for ocean science

    USGS Publications Warehouse

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic notebooks across the geoscience domains.
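
    The catalog-search step of such a workflow can be sketched with OWSLib, one of the Python tools commonly used in these notebooks; the CSW endpoint URL and the search term below are illustrative assumptions, not the exact ones used in the published notebooks.

    ```python
    # Search an OGC CSW catalog and list the service endpoints
    # (e.g. SOS, OPeNDAP) advertised in the matching records.
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    csw = CatalogueServiceWeb("https://data.ioos.us/csw")  # assumed endpoint
    query = PropertyIsLike("csw:AnyText", "%sea_water_temperature%")
    csw.getrecords2(constraints=[query], maxrecords=10)

    for rec_id, rec in csw.records.items():
        print(rec.title)
        for ref in rec.references:
            # Each reference pairs an access scheme with a service URL.
            print("  ", ref.get("scheme"), ref.get("url"))
    ```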

  16. Deploying and sharing U-Compare workflows as web services.

    PubMed

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia

    2013-02-18

    U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.
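
    Because the exported services speak plain REST or SOAP, any HTTP client can drive them. A minimal REST invocation sketch with the Python requests library follows; the endpoint URL and payload shape are hypothetical, since the paper does not publish a concrete interface.

    ```python
    import requests

    # Hypothetical endpoint for an exported text mining workflow.
    ENDPOINT = "http://example.org/ucompare/workflows/ner"

    resp = requests.post(
        ENDPOINT,
        json={"text": "BRCA1 is a tumor suppressor."},  # assumed payload
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json())  # annotated output from the workflow service
    ```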

  17. Deploying and sharing U-Compare workflows as web services

    PubMed Central

    2013-01-01

    Background U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare’s components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. Results We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. Conclusions The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform. PMID:23419017

  18. Health information exchange technology on the front lines of healthcare: workflow factors and patterns of use

    PubMed Central

    Johnson, Kevin B; Lorenzi, Nancy M

    2011-01-01

    Objective The goal of this study was to develop an in-depth understanding of how a health information exchange (HIE) fits into clinical workflow at multiple clinical sites. Materials and Methods The ethnographic qualitative study was conducted over a 9-month period in six emergency departments (ED) and eight ambulatory clinics in Memphis, Tennessee, USA. Data were collected using direct observation, informal interviews during observation, and formal semi-structured interviews. The authors observed for over 180 h, during which providers used the exchange 130 times. Results HIE-related workflow was modeled for each ED site and ambulatory clinic group and substantial site-to-site workflow differences were identified. Common patterns in HIE-related workflow were also identified across all sites, leading to the development of two role-based workflow models: nurse based and physician based. The workflow elements framework was applied to the two role-based patterns. An in-depth description was developed of how providers integrated HIE into existing clinical workflow, including prompts for HIE use. Discussion Workflow differed substantially among sites, but two general role-based HIE usage models were identified. Although providers used HIE to improve continuity of patient care, patient–provider trust played a significant role. Types of information retrieved related to roles, with nurses seeking to retrieve recent hospitalization data and more open-ended usage by nurse practitioners and physicians. User and role-specific customization to accommodate differences in workflow and information needs may increase the adoption and use of HIE. Conclusion Understanding end users' perspectives towards HIE technology is crucial to the long-term success of HIE. By applying qualitative methods, an in-depth understanding of HIE usage was developed. PMID:22003156

  19. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.

    PubMed

    Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A

    2005-04-07

    Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
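
    The reusable operators identified by the meta-analysis can be expressed as higher-order functions over pipeline steps. The sketch below shows two of them, conditional execution and iteration; this encoding illustrates the idea only and is not GPIPE's actual XML-based representation.

    ```python
    from typing import Callable, Iterable

    # A pipeline step is any callable from one value to another.
    Step = Callable[[object], object]

    def conditional(pred: Callable[[object], bool],
                    then: Step, otherwise: Step) -> Step:
        # Generic conditional operator: branch per input value.
        return lambda x: then(x) if pred(x) else otherwise(x)

    def iterate(step: Step) -> Callable[[Iterable], list]:
        # Generic iteration operator: map a step over a collection.
        return lambda xs: [step(x) for x in xs]

    # Illustrative use: align long sequences, pass short ones through.
    align = lambda s: f"aligned({s})"
    skip = lambda s: s
    process = iterate(conditional(lambda s: len(s) > 5, align, skip))
    print(process(["ACGT", "ACGTACGTACGT"]))
    ```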

  20. Provenance-Powered Automatic Workflow Generation and Composition

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from this recorded user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri net-based verification instrument for provenance-based automatic workflow generation and recommendation.

  1. Support for Taverna workflows in the VPH-Share cloud platform.

    PubMed

    Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F

    2017-07-01

    To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical & bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. The main outcomes are: 1) seamless integration of VPH-Share with other components and systems; 2) an extended range of different tools for workflows; 3) successful integration of scientific workflows from other VPH projects; and 4) execution speed improvements for medical applications. The presented workflow integration provides VPH-Share users with a wide range of possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, and remote execution. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. A navigation system for percutaneous needle interventions based on PET/CT images: design, workflow and error analysis of soft tissue and bone punctures.

    PubMed

    Oliveira-Santos, Thiago; Klaeser, Bernd; Weitzel, Thilo; Krause, Thomas; Nolte, Lutz-Peter; Peterhans, Matthias; Weber, Stefan

    2011-01-01

    Percutaneous needle intervention based on PET/CT images is effective, but exposes the patient to unnecessary radiation due to the increased number of CT scans required. Computer assisted intervention can reduce the number of scans, but requires handling, matching and visualization of two different datasets. While one dataset is used for target definition according to metabolism, the other is used for instrument guidance according to anatomical structures. No navigation systems capable of handling such data and performing PET/CT image-based procedures while following clinically approved protocols for oncologic percutaneous interventions are available. The need for such systems is emphasized in scenarios where the target can be located in different types of tissue such as bone and soft tissue. These two tissues require different clinical protocols for puncturing and may therefore give rise to different problems during the navigated intervention. Studies comparing the performance of navigated needle interventions targeting lesions located in these two types of tissue are not often found in the literature. Hence, this paper presents an optical navigation system for percutaneous needle interventions based on PET/CT images. The system provides viewers for guiding the physician to the target with real-time visualization of PET/CT datasets, and is able to handle targets located in both bone and soft tissue. The navigation system and the required clinical workflow were designed taking into consideration clinical protocols and requirements, and the system is thus operable by a single person, even during transition to the sterile phase. Both the system and the workflow were evaluated in an initial set of experiments simulating 41 lesions (23 located in bone tissue and 18 in soft tissue) in swine cadavers. We also measured and decomposed the overall system error into distinct error sources, which allowed for the identification of particularities involved in the process as well as highlighting the differences between bone and soft tissue punctures. An overall average error of 4.23 mm and 3.07 mm for bone and soft tissue punctures, respectively, demonstrated the feasibility of using this system for such interventions. The proposed system workflow was shown to be effective in separating the preparation from the sterile phase, as well as in keeping the system manageable by a single operator. Among the distinct sources of error, the user error based on the system accuracy (defined as the distance from the planned target to the actual needle tip) appeared to be the most significant. Bone punctures showed higher user error, whereas soft tissue punctures showed higher tissue deformation error.
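
    The user-error component described here, the distance from the planned target to the actual needle tip, reduces to a per-puncture Euclidean distance that can then be summarized per tissue group. A minimal sketch follows; the coordinates are illustrative, not data from the study.

    ```python
    import numpy as np

    def target_errors(planned: np.ndarray, tips: np.ndarray) -> np.ndarray:
        # Euclidean distance per puncture, in mm (rows are 3-D points).
        return np.linalg.norm(planned - tips, axis=1)

    planned = np.array([[10.0, 5.0, 3.0], [22.0, 8.0, 1.0]])
    tips = np.array([[12.1, 5.5, 2.0], [24.5, 9.0, 2.5]])
    err = target_errors(planned, tips)
    print(f"mean targeting error = {err.mean():.2f} mm")
    ```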

  3. Identifying impact of software dependencies on replicability of biomedical workflows.

    PubMed

    Miksa, Tomasz; Rauber, Andreas; Mina, Eleni

    2016-12-01

    Complex data-driven experiments form the basis of biomedical research. Recent findings warn that the context in which the software is run, that is, the infrastructure and the third-party dependencies, can have a crucial impact on the final results delivered by a computational experiment. This implies that in order to replicate the same result, not only must the same data be used, but the analysis must also be run on an equivalent software stack. In this paper we present the VFramework, which enables assessing the replicability of workflows. It identifies whether any differences in software dependencies exist between two executions of the same workflow and whether they have an impact on the produced results. We also conduct a case study in which we investigate the impact of software dependencies on the replicability of Taverna workflows used in biomedical research on Huntington's disease. We re-execute the analysed workflows in environments differing in operating system distribution and configuration. The results show that the VFramework can be used to identify the impact of software dependencies on the replicability of biomedical workflows. Furthermore, we observe that despite the fact that the workflows are executed in a controlled environment, they still depend on specific tools installed in the environment. The context model used by the VFramework improves the deficiencies of provenance traces and documents such tools as well. Based on our findings we define guidelines for workflow owners that enable them to improve the replicability of their workflows. Copyright © 2016 Elsevier Inc. All rights reserved.
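
    The core comparison, checking whether two executions of the same workflow ran on equivalent software stacks, can be sketched as a diff over recorded package-version maps; the function shape and the example packages are illustrative, not the VFramework's actual context model.

    ```python
    def diff_dependencies(run_a: dict, run_b: dict) -> dict:
        """Compare two recorded software stacks (package -> version)."""
        return {
            "missing_in_b": sorted(set(run_a) - set(run_b)),
            "missing_in_a": sorted(set(run_b) - set(run_a)),
            "version_mismatch": {
                pkg: (run_a[pkg], run_b[pkg])
                for pkg in set(run_a) & set(run_b)
                if run_a[pkg] != run_b[pkg]
            },
        }

    # Illustrative stacks from two executions of the same workflow.
    a = {"blast": "2.9.0", "python": "3.8.2", "samtools": "1.9"}
    b = {"blast": "2.12.0", "python": "3.8.2"}
    print(diff_dependencies(a, b))
    ```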

  4. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, coding sequences (CDS) continue to be discovered, and larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports bootstrapping replicate counts from 1,000 to 20,000. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; both SOAP and Java Web Service (JWS) deployments provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrapping replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  5. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-03-01

    At present, coding sequences (CDS) continue to be discovered, and larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports bootstrapping replicate counts from 1,000 to 20,000. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; both SOAP and Java Web Service (JWS) deployments provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrapping replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  6. A Model of Workflow Composition for Emergency Management

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    Commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. A software system for the construction and composition of business process resources is implemented and integrated into the Emergency Plan Management Application System.
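
    The abstract names four composition operations but does not define them; the sketch below assumes one common operation (sequencing) guarded by a resource-based constraint rule, purely to illustrate constrained composition of workflow segments. The segment names and resources are hypothetical.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Segment:
        name: str
        requires: set  # resources the segment needs before it can run
        provides: set  # resources it makes available afterwards

    def can_follow(a: Segment, b: Segment) -> bool:
        # Assumed constraint rule: b's requirements must be met by a.
        return b.requires <= a.provides

    def sequence(a: Segment, b: Segment) -> Segment:
        assert can_follow(a, b), "constraint rule violated"
        return Segment(f"{a.name};{b.name}", a.requires,
                       a.provides | b.provides)

    alert = Segment("raise_alert", set(), {"alert_issued"})
    deploy = Segment("deploy_team", {"alert_issued"}, {"team_on_site"})
    print(sequence(alert, deploy).name)  # raise_alert;deploy_team
    ```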

  7. A Framework for Modeling Workflow Execution by an Interdisciplinary Healthcare Team.

    PubMed

    Kezadri-Hamiaz, Mounira; Rosu, Daniela; Wilk, Szymon; Kuziemsky, Craig; Michalowski, Wojtek; Carrier, Marc

    2015-01-01

    The use of business workflow models in healthcare is limited because of insufficient capture of complexities associated with behavior of interdisciplinary healthcare teams that execute healthcare workflows. In this paper we present a novel framework that builds on the well-founded business workflow model formalism and related infrastructures and introduces a formal semantic layer that describes selected aspects of team dynamics and supports their real-time operationalization.

  8. Epiviz: a view inside the design of an integrated visual analysis software for genomics

    PubMed Central

    2015-01-01

    Background Computational and visual data analysis for genomics has traditionally involved a combination of tools and resources, of which the most ubiquitous consist of genome browsers, focused mainly on integrative visualization of large numbers of big datasets, and computational environments, focused on data modeling of a small number of moderately sized datasets. Workflows that involve the integration and exploration of multiple heterogeneous data sources, small and large, public and user specific have been poorly addressed by these tools. In our previous work, we introduced Epiviz, which bridges the gap between the two types of tools, simplifying these workflows. Results In this paper we expand on the design decisions behind Epiviz, and introduce a series of new advanced features that further support the type of interactive exploratory workflow we have targeted. We discuss three ways in which Epiviz advances the field of genomic data analysis: 1) it brings code to interactive visualizations at various different levels; 2) takes the first steps in the direction of collaborative data analysis by incorporating user plugins from source control providers, as well as by allowing analysis states to be shared among the scientific community; 3) combines established analysis features that have never before been available simultaneously in a genome browser. In our discussion section, we present security implications of the current design, as well as a series of limitations and future research steps. Conclusions Since many of the design choices of Epiviz are novel in genomics data analysis, this paper serves both as a document of our own approaches with lessons learned, as well as a start point for future efforts in the same direction for the genomics community. PMID:26328750

  9. Design and implementation of a secure workflow system based on PKI/PMI

    NASA Astrophysics Data System (ADS)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have the following weaknesses in privilege management: low privilege management efficiency, an overburdened administrator, and the lack of a trusted authority. After an in-depth study of the security requirements of workflow systems, a secure workflow model based on PKI/PMI is proposed. This model can achieve static and dynamic authorization by verifying the user's identity through a public key certificate (PKC) and validating the user's privilege information using an attribute certificate (AC) in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security but also ensures the integrity, confidentiality, availability, and non-repudiation of the data in the system.

  10. Implication of high dynamic range and wide color gamut content distribution

    NASA Astrophysics Data System (ADS)

    Lu, Taoran; Pu, Fangjun; Yin, Peng; Chen, Tao; Husak, Walt

    2015-09-01

    High Dynamic Range (HDR) and Wide Color Gamut (WCG) content represents a greater range of luminance levels and a more complete reproduction of the colors found in real-world scenes. The current video distribution environments deliver a Standard Dynamic Range (SDR) signal. There may therefore be significant implications for today's end-to-end ecosystem, from content creation to distribution and finally to consumption. For SDR content, the common practice is to apply compression to Y'CbCr 4:2:0 signals using a gamma transfer function and non-constant-luminance 4:2:0 chroma subsampling. For HDR and WCG content, it is desirable to examine whether such a signal format still works well for compression, and whether the overall system performance can be further improved by exploring different signal formats and processing workflows. In this paper, we provide some of our insight into these problems.
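
    The SDR practice mentioned here can be sketched end to end: gamma-encode linear RGB, form non-constant-luminance Y'CbCr using the BT.709 luma coefficients, then 4:2:0-subsample the chroma planes by 2x2 averaging. The simple power-law gamma below stands in for the full piecewise BT.709 transfer function, so this is an illustration of the signal chain, not a broadcast-accurate converter.

    ```python
    import numpy as np

    KR, KB = 0.2126, 0.0722  # BT.709 luma coefficients
    GAMMA = 1 / 2.4          # illustrative power-law encode

    def rgb_to_ycbcr_420(rgb: np.ndarray):
        rgbp = np.clip(rgb, 0, 1) ** GAMMA              # gamma-encoded R'G'B'
        r, g, b = rgbp[..., 0], rgbp[..., 1], rgbp[..., 2]
        y = KR * r + (1 - KR - KB) * g + KB * b          # non-constant-luminance Y'
        cb = (b - y) / (2 * (1 - KB))
        cr = (r - y) / (2 * (1 - KR))
        # 4:2:0 subsampling: average each 2x2 block of the chroma planes.
        sub = lambda c: c.reshape(c.shape[0] // 2, 2,
                                  c.shape[1] // 2, 2).mean(axis=(1, 3))
        return y, sub(cb), sub(cr)

    y, cb, cr = rgb_to_ycbcr_420(np.random.rand(4, 4, 3))
    print(y.shape, cb.shape, cr.shape)  # (4, 4) (2, 2) (2, 2)
    ```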

  11. Implications of the Fukushima Nuclear Disaster: Man-Made Hazards, Vulnerability Factors, and Risk to Environmental Health.

    PubMed

    Eddy, Christopher; Sase, Eriko

    2015-01-01

    The objective of this article was to examine the environmental health implications of the 2011 Fukushima nuclear disaster from an all-hazards perspective. The authors performed a literature review that included Japanese and international nuclear guidance and policy, scientific papers, and reports on the Chernobyl and Three Mile Island disasters, while also considering all-hazards preparedness rubrics in the U.S. The examination of the literature resulted in the following: a) the authors' "All-Hazards Planning Reference Model", which distinguishes three planning categories: Disaster Trigger Event, Man-Made Hazards, and Vulnerability Factors; b) the generalization of their model to other countries; and c) advocacy for environmental health end fate to be considered in planning phases to minimize risk to environmental health. This article discusses inconsistencies in disaster planning and nomenclature that exist in the studied materials and international guidance, and proposes new opportunities for developing predisaster risk assessment, risk communication, and prevention capacity building.

  12. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  13. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Management tools that deal with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing one, for example, to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, which is stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the quality information retrieved either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with a visualization that points out, on the workflow graph itself, where the workflow needs to be improved with better data or better processes. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5. [2] Leibovici, DG, Pourabdollah, A (2010) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France. [3] OGC (2011) www.opengeospatial.org. [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008. [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria. [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK.
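
    The meta-propagation idea, estimating output quality from input and process quality without running the workflow, can be sketched over a small processing DAG. The multiplicative combination rule and the scores below are assumptions for illustration, not the rule defined in [5].

    ```python
    import math

    # Workflow DAG: node -> list of input nodes; leaves are datasets.
    graph = {"slope": ["dem"], "risk_map": ["slope", "landuse"]}
    data_quality = {"dem": 0.9, "landuse": 0.8}          # dataset scores in [0, 1]
    process_quality = {"slope": 0.95, "risk_map": 0.9}   # per-process scores

    def propagate(node: str) -> float:
        # Leaves return their dataset quality directly.
        if node in data_quality:
            return data_quality[node]
        # Assumed rule: product of input qualities times process quality.
        inputs = math.prod(propagate(p) for p in graph[node])
        return inputs * process_quality[node]

    print(round(propagate("risk_map"), 3))  # meta-propagated output quality
    ```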

  14. Geometric reconstruction using tracked ultrasound strain imaging

    NASA Astrophysics Data System (ADS)

    Pheiffer, Thomas S.; Simpson, Amber L.; Ondrake, Janet E.; Miga, Michael I.

    2013-03-01

    The accurate identification of tumor margins during neurosurgery is a primary concern for the surgeon in order to maximize resection of malignant tissue while preserving normal function. The use of preoperative imaging for guidance is standard of care, but tumor margins are not always clear even when contrast agents are used, and so margins are often determined intraoperatively by visual and tactile feedback. Ultrasound strain imaging creates a quantitative representation of tissue stiffness which can be used in real time. The information offered by strain imaging can be placed within a conventional image-guidance workflow by tracking the ultrasound probe and calibrating the image plane, which facilitates interpretation of the data by placing it within a common coordinate space with preoperative imaging. Tumor geometry in strain imaging is then directly comparable to the geometry in preoperative imaging. This paper presents a tracked ultrasound strain imaging system capable of co-registering with preoperative tomograms and of reconstructing a 3D surface from the border of the strain lesion. In a preliminary study using four phantoms with subsurface tumors, tracked strain imaging was registered to preoperative image volumes and tumor surfaces were reconstructed using contours extracted from strain image slices. The volumes of the phantom tumors reconstructed from tracked strain imaging ranged from approximately 1.5 to 2.4 cm³, similar to the CT volumes of 1.0 to 2.3 cm³. Future work will be done to robustly characterize the reconstruction accuracy of the system.
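
    Volume reconstruction from stacked contours can be approximated slice by slice: the shoelace area of each contour times the slice spacing. A minimal sketch follows, with an illustrative contour stack rather than the paper's actual surface-reconstruction method.

    ```python
    import numpy as np

    def polygon_area(xy: np.ndarray) -> float:
        # Shoelace formula over an (N, 2) array of contour vertices.
        x, y = xy[:, 0], xy[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

    def stack_volume(contours: list, spacing_mm: float) -> float:
        # Sum of per-slice areas times slice spacing, in mm^3.
        return sum(polygon_area(c) for c in contours) * spacing_mm

    # Two square 10 mm x 10 mm contours, 2 mm apart -> 0.4 cm^3.
    sq = np.array([[0, 0], [10, 0], [10, 10], [0, 10]], dtype=float)
    print(stack_volume([sq, sq], spacing_mm=2.0) / 1000, "cm^3")
    ```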

  15. Ethical considerations in forensic genetics research on tissue samples collected post-mortem in Cape Town, South Africa.

    PubMed

    Heathfield, Laura J; Maistry, Sairita; Martin, Lorna J; Ramesar, Raj; de Vries, Jantina

    2017-11-29

    The use of tissue collected at a forensic post-mortem for forensic genetics research purposes remains of ethical concern, as the process involves obtaining informed consent from grieving family members. Two forensic genetics research studies using tissue collected from a forensic post-mortem were recently initiated at our institution and were the first of their kind to be conducted in Cape Town, South Africa. This article discusses some of the ethical challenges that were encountered in these research projects. Among these challenges was the adaptation of research workflows to fit an exceptionally busy service-delivery environment operating with limited resources. While seeking guidance from the literature regarding research on deceased populations, we noted that the next of kin of decedents are not formally recognised as a vulnerable group in the existing ethical and legal frameworks in South Africa. The authors recommend that research in the forensic mortuary setting be approached using guidance for vulnerable groups, and that the benefit-to-risk standard be strongly justified. Lastly, when planning forensic genetics research, consideration must be given to the potential of uncovering incidental findings, to funding to validate these findings, and to the feedback of results to family members; the latter is recommended to occur through a genetic counsellor. It is hoped that these experiences will contribute towards a formal framework for conducting forensic genetic research in medico-legal mortuaries in South Africa.

  16. A comparison study of atlas-based 3D cardiac MRI segmentation: global versus global and local transformations

    NASA Astrophysics Data System (ADS)

    Daryanani, Aditya; Dangi, Shusil; Ben-Zikri, Yehuda Kfir; Linte, Cristian A.

    2016-03-01

    Magnetic Resonance Imaging (MRI) is a standard-of-care imaging modality for cardiac function assessment and guidance of cardiac interventions thanks to its high image quality and lack of exposure to ionizing radiation. Cardiac health parameters such as left ventricular volume, ejection fraction, myocardial mass, thickness, and strain can be assessed by segmenting the heart from cardiac MRI images. Furthermore, the segmented pre-operative anatomical heart models can be used to precisely identify regions of interest to be treated during minimally invasive therapy. Hence, the use of accurate and computationally efficient segmentation techniques is critical, especially for intra-procedural guidance applications that rely on the peri-operative segmentation of subject-specific datasets without delaying the procedure workflow. Atlas-based segmentation incorporates prior knowledge of the anatomy of interest from expertly annotated image datasets. Typically, the ground truth atlas label is propagated to a test image using a combination of global and local registration. The high computational cost of non-rigid registration motivated us to obtain an initial segmentation using global transformations based on an atlas of the left ventricle from a population of patient MRI images and to refine it using a well-developed technique based on graph cuts. Here we quantitatively compare the segmentations obtained from the global and global-plus-local atlases, refined using graph cut-based techniques, with the expert segmentations according to several similarity metrics, including the Dice similarity coefficient, Jaccard coefficient, Hausdorff distance, and mean absolute distance error.
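
    For reference, the overlap and distance metrics named above can be computed for two binary segmentations with NumPy and SciPy; the arrays below are toy data, not the study's datasets.

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      def dice(a, b):
          return 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      def jaccard(a, b):
          return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

      seg = np.zeros((64, 64), bool); seg[20:40, 20:40] = True   # automatic result (toy)
      ref = np.zeros((64, 64), bool); ref[22:42, 18:38] = True   # expert ground truth (toy)

      pts_a, pts_b = np.argwhere(seg), np.argwhere(ref)
      # Symmetric Hausdorff distance = max of the two directed distances
      hd = max(directed_hausdorff(pts_a, pts_b)[0], directed_hausdorff(pts_b, pts_a)[0])
      print(f"Dice={dice(seg, ref):.3f}  Jaccard={jaccard(seg, ref):.3f}  Hausdorff={hd:.1f}px")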

  17. Cognitive load of navigating without vision when guided by virtual sound versus spatial language.

    PubMed

    Klatzky, Roberta L; Marston, James R; Giudice, Nicholas A; Golledge, Reginald G; Loomis, Jack M

    2006-12-01

    A vibrotactile N-back task was used to generate cognitive load while participants were guided along virtual paths without vision. As participants stepped in place, they moved along a virtual path of linear segments. Information was provided en route about the direction of the next turning point, by spatial language ("left," "right," or "straight") or virtual sound (i.e., the perceived azimuth of the sound indicated the target direction). The authors hypothesized that virtual sound, being processed at direct perceptual levels, would have lower load than even simple language commands, which require cognitive mediation. As predicted, whereas the guidance modes did not differ significantly in the no-load condition, participants showed shorter distance traveled and less time to complete a path when performing the N-back task while navigating with virtual sound as guidance. Virtual sound also produced better N-back performance than spatial language. By indicating the superiority of virtual sound for guidance when cognitive load is present, as is characteristic of everyday navigation, these results have implications for guidance systems for the visually impaired and others.

  18. Worklist handling in workflow-enabled radiological application systems

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

    For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part, for end-users, of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user-friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists, if present at all, are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
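
    The contrast between the two approaches can be sketched in a few lines of Python; all table fields, task names and roles below are hypothetical.

      # Data-driven: a worklist is just a filtered view on the application database.
      studies = [
          {"id": 1, "status": "acquired", "assigned_to": "radiologist"},
          {"id": 2, "status": "reported", "assigned_to": "radiologist"},
      ]
      data_driven_worklist = [s for s in studies if s["status"] == "acquired"]

      # Process-driven: an autonomous workflow service derives work items from an
      # explicit process model, independent of any one application's tables.
      process_model = {"acquired": "read_study", "read": "verify_report"}

      def work_items(studies, model, role_of_task=lambda task: "radiologist"):
          return [{"study": s["id"], "task": model[s["status"]],
                   "role": role_of_task(model[s["status"]])}
                  for s in studies if s["status"] in model]

      print(data_driven_worklist)
      print(work_items(studies, process_model))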

  19. Nanocuration workflows: Establishing best practices for identifying, inputting, and sharing data to inform decisions on nanomaterials

    PubMed Central

    Powers, Christina M; Mills, Karmann A; Morris, Stephanie A; Klaessig, Fred; Gaheen, Sharon; Lewinski, Nastassja

    2015-01-01

    Summary There is a critical opportunity in the field of nanoscience to compare and integrate information across diverse fields of study through informatics (i.e., nanoinformatics). This paper is one in a series of articles on the data curation process in nanoinformatics (nanocuration). Other articles in this series discuss key aspects of nanocuration (temporal metadata, data completeness, database integration), while the focus of this article is on the nanocuration workflow, or the process of identifying, inputting, and reviewing nanomaterial data in a data repository. In particular, the article discusses: 1) the rationale and importance of a defined workflow in nanocuration, 2) the influence of organizational goals or purpose on the workflow, 3) established workflow practices in other fields, 4) current workflow practices in nanocuration, 5) key challenges for workflows in emerging fields like nanomaterials, 6) examples to make these challenges more tangible, and 7) recommendations to address the identified challenges. Throughout the article, there is an emphasis on illustrating key concepts and current practices in the field. Data on current practices in the field are from a group of stakeholders active in nanocuration. In general, the development of workflows for nanocuration is nascent, with few individuals formally trained in data curation or utilizing available nanocuration resources (e.g., ISA-TAB-Nano). Additional emphasis on the potential benefits of cultivating nanomaterial data via nanocuration processes (e.g., capability to analyze data from across research groups) and providing nanocuration resources (e.g., training) will likely prove crucial for the wider application of nanocuration workflows in the scientific community. PMID:26425437

  20. HHS guidance on synthetic DNA is the right step.

    PubMed

    Gronvall, Gigi Kwik

    2010-12-01

    Synthetic biology has advanced to the point where some pathogens can be manufactured from scratch. This technical leap has beneficent implications for medical research and vaccine design, but it also raises concerns that the technology could be used to produce a deadly pathogen for nefarious use. Addressing these concerns, the Department of Health and Human Services (HHS) released their Screening Framework Guidance for Providers of Synthetic Double-Stranded DNA on October 13, 2010. They took the right approach: The oversight framework for gene synthesis companies included in this guidance is adaptable to new technical developments and changing risks, it can be implemented immediately, it can be readily adopted by other countries, and it will cost little. Though there have been some calls to increase the regulatory controls on synthetic biology, these should be resisted. For now, at least, the oversight is appropriate to the risks.

  1. Reprint of: environmental information for military planning.

    PubMed

    Doherty, Victoria; Croft, Darryl; Knight, Ashley

    2013-11-01

    A study was conducted to consider the implications of presenting Environmental Information (EI; information on current environmental features including weather, topography and visibility maps) for military planning to the growing audience of non-technical users; to provide guidance for ensuring usability and for development of a suitable EI interface, and to produce an EI concept interface mock-up to demonstrate initial design ideas. Knowledge was elicited from current EI users and providers regarding anticipated use of EI by non-specialists. This was combined with human factors and cognition expertise to produce guidance for data usability and development of an EI interface. A simple mock-up of an EI concept interface was developed. Recommendations for further development were made including application of the guidance derived, identification of a user test-bed and development of business processes. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  2. Environmental information for military planning.

    PubMed

    Doherty, Victoria; Croft, Darryl; Knight, Ashley

    2013-07-01

    A study was conducted to consider the implications of presenting Environmental Information (EI; information on current environmental features including weather, topography and visibility maps) for military planning to the growing audience of non-technical users; to provide guidance for ensuring usability and for development of a suitable EI interface, and to produce an EI concept interface mock-up to demonstrate initial design ideas. Knowledge was elicited from current EI users and providers regarding anticipated use of EI by non-specialists. This was combined with human factors and cognition expertise to produce guidance for data usability and development of an EI interface. A simple mock-up of an EI concept interface was developed. Recommendations for further development were made including application of the guidance derived, identification of a user test-bed and development of business processes. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  3. Flamingo, a seven-pass transmembrane cadherin, cooperates with Netrin/Frazzled in Drosophila midline guidance.

    PubMed

    Organisti, Cristina; Hein, Irina; Grunwald Kadow, Ilona C; Suzuki, Takashi

    2015-01-01

    During central nervous system development, several guidance cues and receptors, as well as cell adhesion molecules, are required for guiding axons across the midline and along the anterior-posterior axis. In Drosophila, commissural axons sense the midline attractants Netrin A and B (Net) through Frazzled (Fra) receptors. Despite their importance, loss of Net or fra affects only some commissures, suggesting that additional molecules can fulfill this function. Recently, planar cell polarity (PCP) proteins have been implicated in midline axon guidance in both vertebrate and invertebrate systems. Here, we report that the atypical cadherin and PCP molecule Flamingo/Starry night (Fmi/Stan) acts jointly with Net/Fra signaling during midline development. Additional removal of fmi strongly increases the guidance defects in Net/fra mutants. Rescue and domain deletion experiments suggest that Fmi signaling facilitates commissural pathfinding, potentially by mediating axonal fasciculation in a partly homophilic manner. Altogether, our results indicate that contact-mediated cell adhesion via Fmi acts in addition to the Net/Fra guidance system during axon pathfinding across the midline, underlining the importance of PCP molecules during vertebrate and invertebrate midline development. © 2014 The Authors Genes to Cells © 2014 by the Molecular Biology Society of Japan and Wiley Publishing Asia Pty Ltd.

  4. Implementing bioinformatic workflows within the bioextract server

    USDA-ARS?s Scientific Manuscript database

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  5. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    NASA Astrophysics Data System (ADS)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations by orchestrating semi-structured analysis pipelines, which involves executing large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case: a workflow which requires the execution of a continuum ice-flow model and a discrete-element-based calving model in an iterative manner. Apart from model execution, this workflow also contains data format conversion tasks that link the ice-flow and calving steps as the workflow passes through sequential, nested and iterative stages. The management and monitoring of all the processing tasks, including data management and transfer, therefore becomes complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. As more scripts and modifications were introduced to meet user requirements, debugging and validating the results became increasingly cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all the above-mentioned processes can be handled in an efficient and usable manner. We decided to use the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high-level scientific workflow middleware makes the reproduction of results more convenient and also provides a reusable and portable workflow template that can be deployed across different computing infrastructures. Acknowledgements: This work was kindly supported by NordForsk as part of the Nordic Center of Excellence (NCoE) eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) and the Top-level Research Initiative NCoE SVALI (Stability and Variation of Arctic Land Ice).
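
    The iterative coupling can be pictured schematically as follows; the placeholder functions stand in for the Elmer/Ice and HiDEM batch jobs (submitted, e.g., through UNICORE) and the conversion step, and are assumptions for illustration only.

      def run_ice_flow(geometry):          # continuum model step (placeholder)
          return {"geometry": geometry, "velocity": "field"}

      def to_particles(state):             # mesh-to-particle format conversion (placeholder)
          return {"particles": state["geometry"]}

      def run_calving(particles):          # discrete-element calving step (placeholder)
          return particles["particles"] + "_calved"

      geometry = "front_0"
      for cycle in range(3):               # each pass is one coupling iteration
          state = run_ice_flow(geometry)
          geometry = run_calving(to_particles(state))
          print(f"cycle {cycle}: new front = {geometry}")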

  6. Targeting Accuracy, Procedure Times and User Experience of 240 Experimental MRI Biopsies Guided by a Clinical Add-On Navigation System.

    PubMed

    Busse, Harald; Riedel, Tim; Garnov, Nikita; Thörmer, Gregor; Kahn, Thomas; Moche, Michael

    2015-01-01

    MRI is of great clinical utility for the guidance of special diagnostic and therapeutic interventions. The majority of such procedures are performed iteratively ("in-and-out") in standard, closed-bore MRI systems, with control imaging inside the bore and needle adjustments outside the bore. The fundamental limitations of such an approach have led to the development of various assistance techniques, from simple guidance tools to advanced navigation systems. The purpose of this work was to thoroughly assess the targeting accuracy, workflow and usability of a clinical add-on navigation solution on 240 simulated biopsies by different medical operators. Navigation relied on a virtual 3D MRI scene with real-time overlay of the optically tracked biopsy needle. Smart reference markers on a freely adjustable arm ensured proper registration. Twenty-four operators, attending radiologists (AR), resident radiologists (RR) and medical students (MS), performed well-controlled biopsies of 10 embedded model targets (mean diameter: 8.5 mm, insertion depths: 17-76 mm). Targeting accuracy, procedure times and 13 Likert scores on system performance were determined (strong agreement: 5.0). Differences in diagnostic success rates (AR: 93%, RR: 88%, MS: 81%) were not significant. In contrast, between-group differences in biopsy times (AR: 4:15, RR: 4:40, MS: 5:06 min:sec) were significant (p<0.01). The mean overall rating was 4.2. The average operator would use the system again (4.8) and stated that the outcome justifies the extra effort (4.4). The lowest agreement was reported for robustness against external perturbations (2.8). The described combination of optical tracking technology with automatic MRI registration appears to be sufficiently accurate for instrument guidance in a standard (closed-bore) MRI environment. High targeting accuracy and usability were demonstrated on a relatively large number of procedures and operators. Between groups with different expertise there were significant differences in experimental procedure times but not in the number of successful biopsies.

  7. Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.

    PubMed

    Hartman, Douglas J

    2015-06-01

    Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to enhance/improve pathologist workflow. These include voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in the diagnostic coding and the more prominent role of centralized electronic health records represent potential areas for increased ways to enhance/improve the workflow for surgical pathologists. Additional unforeseen changes to the pathologist workflow may accompany the introduction of whole-slide imaging technology to the routine diagnostic work. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.

    PubMed

    Hartman, Douglas J

    2016-03-01

    Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to enhance/improve pathologist workflow. These include voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in the diagnostic coding and the more prominent role of centralized electronic health records represent potential areas for increased ways to enhance/improve the workflow for surgical pathologists. Additional unforeseen changes to the pathologist workflow may accompany the introduction of whole-slide imaging technology to the routine diagnostic work. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. [Intraoperative augmented reality visualization. Current state of development and initial experiences with the CamC].

    PubMed

    Weidert, S; Wang, L; von der Heide, A; Navab, N; Euler, E

    2012-03-01

    The intraoperative application of augmented reality (AR) has so far mainly taken place in the field of endoscopy. Here, the camera image of the endoscope is augmented with computer graphics derived mostly from preoperative imaging. Due to the complex setup and operation of the devices, they have not yet become part of routine clinical practice. The Camera Augmented Mobile C-arm (CamC), which extends a classic C-arm with a video camera and mirror construction, is characterized by its uncomplicated handling. It combines its live video stream, in geometrically correct registration, with the acquired X-ray image. The clinical application of the device in 43 cases showed its strengths in positioning for X-ray acquisition, incision placement, K-wire placement, and instrument guidance. With its new functionality and its easy integration into the OR workflow of any procedure that requires X-ray imaging, the CamC has the potential to become the first widely used AR technology for orthopedic and trauma surgery.
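
    Once the video and X-ray views are geometrically co-registered, the visible overlay reduces to blending two aligned images. A minimal sketch with toy frames; the CamC's actual registration via the mirror construction is not modeled here.

      import numpy as np

      def overlay(video_frame, xray_image, alpha=0.5):
          """Blend a co-registered X-ray image onto the live video frame."""
          return (alpha * video_frame + (1 - alpha) * xray_image).astype(np.uint8)

      video = np.full((480, 640), 120, np.uint8)                # grayscale video frame (toy)
      xray = np.zeros((480, 640), np.uint8); xray[200:280, 300:340] = 255  # toy X-ray
      print(overlay(video, xray).shape)                          # blended frame for display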

  10. Melinda – A custom search engine that provides access to locally-developed content using the HL7 Infobutton standard

    PubMed Central

    Wan, Yik-Ki J.; Staes, Catherine J.

    2016-01-01

    Healthcare organizations use care pathways to standardize care, but once developed, adoption rates often remain low. One challenge concerns clinicians' difficulty in accessing guidance when it is most needed. Although the HL7 'Infobutton Standard' allows clinicians easier access to external references, access to locally-developed resources often requires clinicians to deviate from their normal electronic health record (EHR) workflow to use another application. To address this gap between internal and external resources, we reviewed the literature and existing practices at the University of Utah Health Care. We identify the requirements to meet the needs of a healthcare enterprise and clinicians, describe the design and development of a prototype that aggregates both internal and external resources from within or outside the EHR, and evaluate the strengths and limitations of the prototype. The system is functional but not implemented in a live EHR environment. We suggest next steps and enhancements. PMID:28269964
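
    For illustration, an Infobutton-style knowledge request is a parameterized URL. The endpoint below is hypothetical, and the parameter names follow our reading of the HL7 Infobutton URL convention; consult the HL7 specification for the authoritative list.

      from urllib.parse import urlencode

      params = {
          "mainSearchCriteria.v.c": "I50.9",                     # problem code (toy example)
          "mainSearchCriteria.v.cs": "2.16.840.1.113883.6.90",   # code system OID (ICD-10-CM)
          "mainSearchCriteria.v.dn": "Heart failure",            # display name
          "taskContext.c.c": "PROBLISTREV",                      # task: problem list review
      }
      print("https://example.org/infobutton?" + urlencode(params))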

  11. Integrating prediction, provenance, and optimization into high energy workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schram, M.; Bansal, V.; Friese, R. D.

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.
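
    A toy sketch of the scheduling component: each task goes to the resource that minimizes its predicted completion time under a simple performance model. All numbers, task names and resource names are invented for illustration.

      predicted_runtime = {("simulate", "cluster"): 10, ("simulate", "cloud"): 14,
                           ("analyze", "cluster"): 6, ("analyze", "cloud"): 5}
      available_at = {"cluster": 0.0, "cloud": 0.0}   # when each resource frees up

      for task in ["simulate", "simulate", "analyze", "analyze"]:
          # greedy choice: earliest predicted completion time
          best = min(available_at,
                     key=lambda r: available_at[r] + predicted_runtime[(task, r)])
          available_at[best] += predicted_runtime[(task, best)]
          print(f"{task} -> {best} (busy until t={available_at[best]})")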

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Is Non-invasive Image-Guided Breast Brachytherapy Good? – Jess Hiatt, MS: Non-invasive Image-Guided Breast Brachytherapy (NIBB) is an emerging therapy for breast boost treatments as well as Accelerated Partial Breast Irradiation (APBI) using HDR surface breast brachytherapy. NIBB allows for smaller treatment volumes while maintaining optimal target coverage. Considering the real-time image-guidance and immobilization provided by the NIBB modality, minimal margins around the target tissue are necessary. Accelerated Partial Breast Irradiation in brachytherapy: is shorter better? – Dorin Todor, PhD, VCU: A review of balloon and strut devices will be provided together with the origins of APBI: the interstitial multi-catheter implant. A dosimetric and radiobiological perspective will help point out the evolution in breast brachytherapy, both in terms of devices and the protocols/clinical trials under which these devices are used. Improvements in imaging, delivery modalities and convenience are among the factors driving the ultrashort fractionation schedules, but our understanding of both local control and the toxicities associated with various treatments is lagging. A comparison between various schedules, from a radiobiological perspective, will be given together with a critical analysis of the issues. Learning objectives: to review and understand the evolution and development of APBI using brachytherapy methods; to understand the basis and limitations of radio-biological 'equivalence' between fractionation schedules; to review commonly used and proposed fractionation schedules. Intra-operative breast brachytherapy: is one-stop shopping best? – Bruce Libby, PhD, University of Virginia: A review of intraoperative breast brachytherapy will be presented, including the Targit-A and other trials that have used electronic brachytherapy. More modern approaches, in which the lumpectomy procedure is integrated into an APBI workflow, will also be discussed. Learning objectives: to review past and current clinical trials for IORT; to discuss the lumpectomy-scan-plan-treat workflow for IORT.

  13. Proposed Standards for Variable Harmonization Documentation and Referencing: A Case Study Using QuickCharmStats 1.1

    PubMed Central

    Winters, Kristi; Netscher, Sebastian

    2016-01-01

    Comparative statistical analyses often require data harmonization, yet the social sciences do not have clear operationalization frameworks that guide and homogenize variable coding decisions across disciplines. When faced with a need to harmonize variables, researchers often look for guidance from various international studies that employ output harmonization, such as the Comparative Survey of Election Studies, which offer recoding structures for the same variable (e.g. marital status). More problematically, there are no agreed documentation standards or journal requirements for reporting variable harmonization to facilitate a transparent replication process. We propose a conceptual and data-driven digital solution that creates harmonization documentation standards for publication and scholarly citation: QuickCharmStats 1.1. It is free and open-source software that allows for the organizing, documenting and publishing of data harmonization projects. QuickCharmStats starts at the conceptual level and its workflow ends with a variable recoding syntax. It is therefore flexible enough to reflect a variety of theoretical justifications for variable harmonization. Using the socio-demographic variable 'marital status', we demonstrate how the CharmStats workflow collates metadata while being guided by the scientific standards of transparency and replication. It encourages researchers to publish their harmonization work by providing those who complete the peer-review process with a permanent identifier. Those who contribute original data harmonization work to their discipline can now be credited through citations. Finally, we propose peer-review standards for harmonization documentation, describe a route to online publishing, and provide a referencing format to cite harmonization projects. Although CharmStats products are designed for social scientists, our adherence to the scientific method ensures that our products can be used by researchers across the sciences. PMID:26859494
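
    The core of such harmonization work is a documented mapping from each source's codes onto one target scheme. A minimal sketch with invented codes; this is an illustration of the idea, not CharmStats syntax.

      # Target scheme for 'marital status' (codes invented for illustration)
      TARGET = {1: "married/partnered", 2: "previously married", 3: "never married"}

      RECODE = {
          "survey_A": {1: 1, 2: 1, 3: 2, 4: 2, 5: 3},    # A's five categories -> target
          "survey_B": {"M": 1, "D": 2, "W": 2, "S": 3},  # B's letter codes -> target
      }

      def harmonize(source, value):
          return RECODE[source].get(value)   # None documents an unmapped code

      print(TARGET[harmonize("survey_A", 3)])    # -> previously married
      print(TARGET[harmonize("survey_B", "S")])  # -> never married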

  14. Real-world visual search is dominated by top-down guidance.

    PubMed

    Chen, Xin; Zelinsky, Gregory J

    2006-11-01

    How do bottom-up and top-down guidance signals combine to guide search behavior? Observers searched for a target either with or without a preview (top-down manipulation) or a color singleton (bottom-up manipulation) among the display objects. With a preview, reaction times were faster and more initial eye movements were guided to the target; the singleton failed to attract initial saccades under these conditions. Only in the absence of a preview did subjects preferentially fixate the color singleton. We conclude that the search for realistic objects is guided primarily by top-down control. Implications for saliency map models of visual search are discussed.

  15. Big Data Challenges in Global Seismic 'Adjoint Tomography' (Invited)

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Smith, J.

    2013-12-01

    The challenge of imaging Earth's interior on a global scale is closely linked to the challenge of handling large data sets. The related iterative workflow involves five distinct phases, namely, 1) data gathering and culling, 2) synthetic seismogram calculations, 3) pre-processing (time-series analysis and time-window selection), 4) data assimilation and adjoint calculations, 5) post-processing (pre-conditioning, regularization, model update). In order to implement this workflow on modern high-performance computing systems, a new seismic data format is being developed. The Adaptable Seismic Data Format (ASDF) is designed to replace currently used data formats with a more flexible format that allows for fast parallel I/O. The metadata is divided into abstract categories, such as "source" and "receiver", along with provenance information for complete reproducibility. The structure of ASDF is designed keeping in mind three distinct applications: earthquake seismology, seismic interferometry, and exploration seismology. Existing time-series analysis tool kits, such as SAC and ObsPy, can be easily interfaced with ASDF so that seismologists can use robust, previously developed software packages. ASDF accommodates an automated, efficient workflow for global adjoint tomography. Manually managing the large number of simulations associated with the workflow can rapidly become a burden, especially with increasing numbers of earthquakes and stations. Therefore, it is of importance to investigate the possibility of automating the entire workflow. Scientific Workflow Management Software (SWfMS) allows users to execute workflows almost routinely. SWfMS provides additional advantages. In particular, it is possible to group independent simulations in a single job to fit the available computational resources. They also give a basic level of fault resilience as the workflow can be resumed at the correct state preceding a failure. Some of the best candidates for our particular workflow are Kepler and Swift, and the latter appears to be the most serious candidate for a large-scale workflow on a single supercomputer, remaining sufficiently simple to accommodate further modifications and improvements.
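
    The kind of container ASDF describes can be pictured as hierarchical groups with metadata attached alongside the waveforms. A simplified sketch using h5py; this layout is an illustration, not the published ASDF schema (see the pyasdf project for the real implementation).

      import numpy as np
      import h5py

      with h5py.File("toy_asdf.h5", "w") as f:
          wf = f.create_group("Waveforms/IU.ANMO")                # network.station
          trace = wf.create_dataset("BHZ", data=np.random.randn(1000))
          trace.attrs["sampling_rate_hz"] = 20.0                  # per-trace metadata
          trace.attrs["starttime"] = "2013-01-01T00:00:00"
          f.create_group("Provenance").attrs["history"] = "deconvolved; bandpassed"

      with h5py.File("toy_asdf.h5", "r") as f:                    # fast, structured access
          print(f["Waveforms/IU.ANMO/BHZ"].attrs["sampling_rate_hz"])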

  16. Structuring research methods and data with the research object model: genomics workflows as a case study.

    PubMed

    Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco

    2014-01-01

    One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide the necessary metadata for a scientist to understand and recreate the results of an experiment. To support this, we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?", and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428. The Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.
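
    The quoted queries can be pictured as SPARQL over the RO's RDF aggregation. A toy sketch with rdflib, using an invented vocabulary that stands in for the actual Wf4Ever ontologies:

      from rdflib import Graph, Namespace, Literal

      EX = Namespace("http://example.org/ro#")   # toy vocabulary (assumption)
      g = Graph()
      g.add((EX.dataset1, EX.wasInputTo, EX.workflow1))
      g.add((EX.workflow1, EX.testedHypothesis, Literal("SNP X affects metabolite Y")))

      q = """SELECT ?data WHERE {
               ?data <http://example.org/ro#wasInputTo> <http://example.org/ro#workflow1> .
            }"""
      for row in g.query(q):
          print(row.data)   # -> http://example.org/ro#dataset1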

  17. Comparison of manual and automated AmpliSeq™ workflows in the typing of a Somali population with the Precision ID Identity Panel.

    PubMed

    van der Heijden, Suzanne; de Oliveira, Susanne Juel; Kampmann, Marie-Louise; Børsting, Claus; Morling, Niels

    2017-11-01

    The Precision ID Identity Panel was used to type 109 Somali individuals in order to obtain allele frequencies for the Somali population. These frequencies were used to establish a Somali HID-SNP database, which will be used for biostatistical calculations in family and immigration cases. Genotypes obtained with the Precision ID Identity Panel were found to be almost in complete concordance with genotypes obtained with the SNPforID PCR-SBE-CE assay. In seven SNP loci, silent alleles were identified, most of which had previously been described in the literature. The project also set out to compare different AmpliSeq™ workflows to investigate the possibility of using automated library building in forensic genetic case work. To do so, the SNP typing of the Somalis was performed using three different workflows: 1) manual library building and sequencing on the Ion PGM™, 2) automated library building using the Biomek® 3000 and sequencing on the Ion PGM™, and 3) automated library building using the Ion Chef™ and sequencing on the Ion S5™. The AmpliSeq™ workflows were compared based on coverage, locus balance, noise, and heterozygote balance. Overall, the Ion Chef™/Ion S5™ workflow was found to give the best results and required the least hands-on time in the laboratory. However, the Ion Chef™/Ion S5™ workflow was also the most expensive. The number of libraries that may be constructed in one Ion Chef™ library-building run was limited to eight, which is too few for high-throughput workflows. The Biomek® 3000/Ion PGM™ workflow was found to perform similarly to the manual/Ion PGM™ workflow. This argues for the use of automated library building in forensic genetic case work. Automated library building decreases the workload of the laboratory staff, decreases the risk of pipetting errors, and simplifies the daily workflow in forensic genetic laboratories. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits: identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, and therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free, structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
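
    A structured, platform-free tool description of the kind the Common Tool Descriptor documents provide might be generated as follows; the element and attribute names here are illustrative, not the exact CTD schema.

      import xml.etree.ElementTree as ET

      # Describe a command-line tool: its inputs, parameters, and outputs.
      tool = ET.Element("tool", name="PeptideSearch", version="1.0")
      params = ET.SubElement(tool, "parameters")
      ET.SubElement(params, "input", name="spectra", type="input-file")
      ET.SubElement(params, "param", name="tolerance", type="float", value="10.0")
      ET.SubElement(params, "output", name="hits", type="output-file")

      print(ET.tostring(tool, encoding="unicode"))   # serialized, engine-neutral descriptor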

  19. A scientific workflow framework for 13C metabolic flux analysis.

    PubMed

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with 13C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of 13C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand 13C MFA workflows. 13C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by 13C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. Copyright © 2015 Elsevier B.V. All rights reserved.
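
    One of the named features, transparent provenance collection, can be pictured as every workflow step recording its inputs and timing so a run can be audited later. A minimal sketch; the names are illustrative, not the SWF or 13CFLUX2 API.

      import time, json

      provenance = []

      def step(name, func, **inputs):
          """Run one workflow step and record its provenance."""
          t0 = time.time()
          result = func(**inputs)
          provenance.append({"step": name, "inputs": inputs,
                             "seconds": round(time.time() - t0, 3)})
          return result

      fluxes = step("fit_fluxes",
                    lambda labeling, rates: {"v1": 1.2},   # placeholder flux fit
                    labeling=[0.2, 0.5], rates={"growth": 0.1})
      print(fluxes)
      print(json.dumps(provenance, indent=2))             # auditable record of the run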

  20. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow abstracts away the low-level implementation details of the Grid and hence enables users to focus on higher levels of the application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, and can support the modelling, construction and implementation of large-scale, complex applications in remote sensing science. The validation of a workflow is important in order to support large-scale, sophisticated scientific computation processes with enhanced performance and to minimize the potential waste of time and resources. To examine the semantic correctness of user-defined workflows, we propose in this paper a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and its metadata. Through detailed analysis, we then discuss the method of extracting domain tacit knowledge and expressing that knowledge with an ontology. Additionally, we construct the domain ontology with Protégé. Through an experimental study, we verify the validity of this method in two ways, namely data-source consistency error validation and parameter-matching error validation.
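
    The two error classes can be illustrated with a toy check in which a plain dictionary stands in for the Protégé-built domain ontology; all step and type names are invented.

      ontology = {
          "AerosolRetrieval": {"requires": "TOA_radiance", "produces": "AOD"},
          "Validation": {"requires": "AOD", "produces": "report"},
      }

      def validate(workflow):
          """Reject a workflow whose consecutive steps exchange incompatible types."""
          for a, b in zip(workflow, workflow[1:]):
              if ontology[a]["produces"] != ontology[b]["requires"]:
                  return f"parameter-matching error: {a} -> {b}"
          return "workflow is semantically consistent"

      print(validate(["AerosolRetrieval", "Validation"]))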

  1. BPELPower—A BPEL execution engine for geospatial web services

    NASA Astrophysics Data System (ADS)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As a special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling the Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. The study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.
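
    A workflow step in such an engine ultimately issues OGC Web service requests. For illustration, a WPS Execute call in key-value-pair form along the lines of WPS 1.0.0; the endpoint, process identifier and inputs are hypothetical, so consult the OGC specification for the authoritative syntax.

      from urllib.parse import urlencode

      query = urlencode({
          "service": "WPS",
          "version": "1.0.0",
          "request": "Execute",
          "identifier": "Buffer",                                   # hypothetical process
          "datainputs": "distance=100;geometry=http://example.org/roads.gml",
      })
      print("http://example.org/wps?" + query)   # one step of a geospatial workflow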

  2. Independent signaling by Drosophila insulin receptor for axon guidance and growth.

    PubMed

    Li, Caroline R; Guo, Dongyu; Pick, Leslie

    2013-01-01

    The Drosophila insulin receptor (DInR) regulates a diverse array of biological processes including growth, axon guidance, and sugar homeostasis. Growth regulation by DInR is mediated by Chico, the Drosophila homolog of vertebrate insulin receptor substrate proteins IRS1-4. In contrast, DInR regulation of photoreceptor axon guidance in the developing visual system is mediated by the SH2-SH3 domain adaptor protein Dreadlocks (Dock). In vitro studies by others identified five NPXY motifs, one in the juxtamembrane region and four in the signaling C-terminal tail (C-tail), important for interaction with Chico. Here we used yeast two-hybrid assays to identify regions in the DInR C-tail that interact with Dock. These Dock binding sites were in separate portions of the C-tail from the previously identified Chico binding sites. To test whether these sites are required for growth or axon guidance in whole animals, a panel of DInR proteins, in which the putative Chico and Dock interaction sites had been mutated individually or in combination, were tested for their ability to rescue viability, growth and axon guidance defects of dinr mutant flies. Sites required for viability were identified. Unexpectedly, mutation of both putative Dock binding sites, either individually or in combination, did not lead to defects in photoreceptor axon guidance. Thus, either sites also required for viability are necessary for DInR function in axon guidance and/or there is redundancy built into the DInR/Dock interaction such that Dock is able to interact with multiple regions of DInR. We also found that simultaneous mutation of all five NPXY motifs implicated in Chico interaction drastically decreased growth in both male and female adult flies. These animals resembled chico mutants, supporting the notion that DInR interacts directly with Chico in vivo to control body size. Mutation of these five NPXY motifs did not affect photoreceptor axon guidance, segregating the roles of DInR in the processes of growth and axon guidance.

  3. Performance of an Automated Versus a Manual Whole-Body Magnetic Resonance Imaging Workflow.

    PubMed

    Stocker, Daniel; Finkenstaedt, Tim; Kuehn, Bernd; Nanz, Daniel; Klarhoefer, Markus; Guggenberger, Roman; Andreisek, Gustav; Kiefer, Berthold; Reiner, Caecilia S

    2018-04-24

    The aim of this study was to evaluate the performance of an automated workflow for whole-body magnetic resonance imaging (WB-MRI), which reduces user interaction compared with the manual WB-MRI workflow. This prospective study was approved by the local ethics committee. Twenty patients underwent WB-MRI for myopathy evaluation on a 3 T MRI scanner. Ten patients (7 women; age, 52 ± 13 years; body weight, 69.9 ± 13.3 kg; height, 173 ± 9.3 cm; body mass index, 23.2 ± 3.0) were examined with a prototypical automated WB-MRI workflow, which automatically segments the whole body, and 10 patients (6 women; age, 35.9 ± 12.4 years; body weight, 72 ± 21 kg; height, 169.2 ± 10.4 cm; body mass index, 24.9 ± 5.6) with a manual scan. Overall image quality (IQ; 5-point scale: 5, excellent; 1, poor) and coverage of the study volume were assessed by 2 readers for each sequence (coronal T2-weighted turbo inversion recovery magnitude [TIRM] and axial contrast-enhanced T1-weighted [ce-T1w] gradient dual-echo sequence). Interreader agreement was evaluated with intraclass correlation coefficients. Examination time, number of user interactions, and MR technicians' acceptance rating (1, highest; 10, lowest) were compared between both groups. Total examination time was significantly shorter for automated WB-MRI workflow versus manual WB-MRI workflow (30.0 ± 4.2 vs 41.5 ± 3.4 minutes, P < 0.0001) with significantly shorter planning time (2.5 ± 0.8 vs 14.0 ± 7.0 minutes, P < 0.0001). Planning took 8% of the total examination time with automated versus 34% with manual WB-MRI workflow (P < 0.0001). The number of user interactions with automated WB-MRI workflow was significantly lower compared with manual WB-MRI workflow (10.2 ± 4.4 vs 48.2 ± 17.2, P < 0.0001). Planning efforts were rated significantly lower by the MR technicians for the automated WB-MRI workflow than for the manual WB-MRI workflow (2.20 ± 0.92 vs 4.80 ± 2.39, respectively; P = 0.005). Overall IQ was similar between automated and manual WB-MRI workflow (TIRM: 4.00 ± 0.94 vs 3.45 ± 1.19, P = 0.264; ce-T1w: 4.20 ± 0.88 vs 4.55 ± 0.55, P = 0.423). Interreader agreement for overall IQ was excellent for TIRM and ce-T1w with an intraclass correlation coefficient of 0.95 (95% confidence interval, 0.86-0.98) and 0.88 (95% confidence interval, 0.70-0.95). Incomplete coverage of the thoracic compartment in the ce-T1w sequence occurred more often in the automated WB-MRI workflow (P = 0.008) for reader 2. No other significant differences in the study volume coverage were found. In conclusion, the automated WB-MRI scanner workflow showed a significant reduction of the examination time and the user interaction compared with the manual WB-MRI workflow. Image quality and the coverage of the study volume were comparable in both groups.

  4. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902
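
    The declarative, metadata-preserving style described above can be pictured as one configuration driving model, experiment and analysis, with the metadata travelling alongside the results. An illustrative miniature of the idea, not the Mozaik API:

      config = {
          "model": {"layers": 2, "neurons_per_layer": 100},
          "experiment": {"stimulus": "drifting_grating", "trials": 5},
          "analysis": ["firing_rate", "tuning_curve"],
      }

      def run(config):
          """Execute the declared analyses and keep the full context with the data."""
          results = {a: f"{a} computed for {config['experiment']['stimulus']}"
                     for a in config["analysis"]}
          return {"results": results, "metadata": config}   # metadata travels with data

      record = run(config)
      print(record["metadata"]["model"], "->", record["results"]["firing_rate"])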

  5. Create, run, share, publish, and reference your LC-MS, FIA-MS, GC-MS, and NMR data analysis workflows with the Workflow4Metabolomics 3.0 Galaxy online infrastructure for metabolomics.

    PubMed

    Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A

    2017-12-01

    Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high-value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Integrated workflows for spiking neuronal network simulations.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.

  7. The Moderating Role of Gender in Shaping Entrepreneurial Intentions: Implications for Vocational Guidance

    ERIC Educational Resources Information Center

    Bagheri, Afsaneh; Pihie, Zaidatol Akmaliah Lope

    2014-01-01

    This study examines the relationships among attitude toward entrepreneurship, entrepreneurial self-efficacy, subjective norms, social valuation of entrepreneurship, and entrepreneurial intentions and how gender affects the relationships. Structural equation modeling was used to analyze the responses obtained from 719 Malaysian students across five…

  8. Improving adaptive/responsive signal control performance : implications of non-invasive detection and legacy timing practices : final report.

    DOT National Transportation Integrated Search

    2017-02-01

    This project collected and analyzed event based vehicle detection data from multiple technologies at four different sites across Oregon to provide guidance for deployment of non-invasive detection for use in adaptive control, as well as develop a tru...

  9. Longitudinal Mercury Monitoring Within the Japanese and Korean Communities (United States): Implications for Exposure Determination and Public Health Protection

    EPA Science Inventory

    Background: Estimates of exposure to toxicants are predominantly obtained from single timepoint data. Fish consumption guidance based on these data may be incomplete, as recommendations are unlikely to consider impacts from factors such as intraindividual variability, seasonal dif...

  10. Workers' Experiences of RPL in South Africa: Some Implications for Redress, Equity and Transformation.

    ERIC Educational Resources Information Center

    Lugg, Rosie; Mabitla, Aubrey; Louw, Gordon; Angelis, Desi

    1998-01-01

    The Congress of South African Trade Unions sought to include prior learning assessment in education and training as a mechanism for redressing inequities. Issues that arose included worker representation in implementation, linkage to accreditation and training opportunities, accessibility, and support and guidance for workers. (SK)

  11. The Interpersonal Metafunction Analysis of Barack Obama's Victory Speech

    ERIC Educational Resources Information Center

    Ye, Ruijuan

    2010-01-01

    This paper presents a tentative interpersonal metafunction analysis of Barack Obama's victory speech, which aims to help readers understand and evaluate the speech regarding its suitability, and thus to provide some guidance for making better speeches. This study has promising implications for speeches as…

  12. Recent Case Law Regarding Functional Behavioral Assessments: Implications for Practice

    ERIC Educational Resources Information Center

    Losinski, Mickey L.; Katsiyannis, Antonis; Ryan, Joseph B.

    2014-01-01

    While functional behavioral assessments (FBAs) are currently federally mandated requirements, public schools have not been provided clear federal guidance concerning what constitutes an acceptable FBA through the Individuals with Disabilities Education Act or related regulations. The purpose of this article is to examine recent rulings regarding FBAs…

  13. The "Good Enough" Parent: Implications for Child Protection

    ERIC Educational Resources Information Center

    Choate, Peter W.; Engstrom, Sandra

    2014-01-01

    Child protection workers must determine under what conditions a child should be sustained within the family system. A standard that is often referred to is "good enough" parenting or minimal parenting competence. Research and clinical literature fails to offer workers guidance on the practical application of this terminology. Such…

  14. Restructuring the Guidance Delivery System: Implications for High School Counselors.

    ERIC Educational Resources Information Center

    Greer, Richard M.; Richardson, Michael D.

    1992-01-01

    Notes that a large portion of high school counselors' clientele, working parents, are not available during regular school hours. Suggests a model program using flexible scheduling for high school counselors, designed to address the issues of a changing clientele, a changing society, and changing expectations of counselors and schools. (NB)

  15. Text mining meets workflow: linking U-Compare with Taverna

    PubMed Central

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690

  16. wft4galaxy: a workflow testing tool for galaxy.

    PubMed

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container-the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.
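
    wft4galaxy's own test definitions are file-based; purely to illustrate the pattern it automates (run a workflow on known inputs, then compare the outputs to expected files), here is a hedged bioblend sketch. The IDs, paths, and hash comparison are invented for the example, and waiting for the invocation to finish is elided for brevity.

        # Not wft4galaxy's API: a bioblend sketch of the pattern it automates.
        import hashlib
        from bioblend.galaxy import GalaxyInstance

        def sha1(path):
            with open(path, "rb") as f:
                return hashlib.sha1(f.read()).hexdigest()

        gi = GalaxyInstance(url="http://localhost:8080", key="YOUR_API_KEY")
        history = gi.histories.create_history(name="workflow-test")

        upload = gi.tools.upload_file("tests/input.tabular", history["id"])
        dataset_id = upload["outputs"][0]["id"]

        wf = gi.workflows.get_workflows(name="workflow-under-test")[0]
        gi.workflows.invoke_workflow(wf["id"],
                                     inputs={"0": {"id": dataset_id,
                                                   "src": "hda"}},
                                     history_id=history["id"])

        # ...once the run completes, fetch the last output and compare it
        # against the expected result (wft4galaxy handles waiting/reporting).
        out = gi.histories.show_history(history["id"], contents=True)[-1]
        gi.datasets.download_dataset(out["id"],
                                     file_path="tests/observed.tabular",
                                     use_default_filename=False)
        assert sha1("tests/observed.tabular") == sha1("tests/expected.tabular")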

  17. Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics

    PubMed Central

    Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A.; Caron, Christophe

    2015-01-01

    Summary: The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated whatever the language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. Availability and implementation: http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). Contact: contact@workflow4metabolomics.org PMID:25527831

  19. A practical workflow for making anatomical atlases for biological research.

    PubMed

    Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles

    2012-01-01

    The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.

  20. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
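
    As a toy illustration of the structured analytical modeling ingredient, the sketch below estimates a workflow's makespan as the longest weighted path through its task graph; the task names and runtimes are invented, and PANORAMA itself couples such models with testbed measurements and simulation.

        # Toy analytical model: makespan of a workflow DAG as its longest
        # weighted path, given per-task runtime estimates (invented numbers).
        import networkx as nx

        runtimes = {"stage_in": 30, "simulate": 600,
                    "analyze": 120, "stage_out": 45}
        g = nx.DiGraph([("stage_in", "simulate"),
                        ("simulate", "analyze"),
                        ("analyze", "stage_out")])

        def makespan(graph, cost):
            # Relax finish times in topological order (DAG longest path).
            finish = {}
            for node in nx.topological_sort(graph):
                preds = [finish[p] for p in graph.predecessors(node)]
                finish[node] = (max(preds) if preds else 0) + cost[node]
            return max(finish.values())

        print(makespan(g, runtimes))  # 795 seconds for this linear chain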

  1. Separating Business Logic from Medical Knowledge in Digital Clinical Workflows Using Business Process Model and Notation and Arden Syntax.

    PubMed

    de Bruin, Jeroen S; Adlassnig, Klaus-Peter; Leitich, Harald; Rappelsberger, Andrea

    2018-01-01

    Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. Our objective was to implement a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. We used the Business Process Model and Notation language system Activiti for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B, in which institution-specific medical decision-making processes could be adapted without modifying the workflow business logic. Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.
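
    The architectural point, sketched below in plain Python rather than BPMN or Arden Syntax, is that the workflow only routes work while the medical decision lives in a swappable rule module; the rule function and field names are invented for illustration.

        # The workflow (business logic) only routes work; the medical
        # decision lives in a swappable rule, standing in here for an Arden
        # Syntax medical logic module. Rule and field names are invented.
        def hbv_prophylaxis_rule(mother):
            """Institution-specific medical knowledge, replaceable per site."""
            if mother["hbsag_positive"]:
                return ["administer HBIG to newborn", "start HBV vaccination"]
            return ["routine newborn care"]

        def run_workflow(mother, decide=hbv_prophylaxis_rule):
            # Fixed sequence of steps, independent of the medical rule.
            print("step 1: register maternal lab results")
            for action in decide(mother):  # delegate the medical decision
                print("step 2: order ->", action)
            print("step 3: document and notify pediatrics")

        run_workflow({"hbsag_positive": True})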

  2. CamBAfx: Workflow Design, Implementation and Application for Neuroimaging

    PubMed Central

    Ooi, Cinly; Bullmore, Edward T.; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs. PMID:19826470

  4. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using "service casts" and "interest casts" (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH's Mining Workflow Composer and the open-source Active BPEL engine, and JPL's SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicate ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the "sociological" problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  5. Improving adherence to the Epic Beacon ambulatory workflow.

    PubMed

    Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana

    2017-06-01

    Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System of computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance with the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance with this workflow in the ambulatory oncology setting. This study was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate with the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, which was education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multi-disciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the determined intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study indicates that a pharmacist-initiated educational intervention can improve compliance with an ambulatory oncology infusion workflow.
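
    For readers who want to reproduce this kind of before/after comparison, the sketch below runs a two-proportion z-test with statsmodels; the encounter counts are hypothetical stand-ins, since the abstract reports only the 38% and 83% rates.

        # Two-proportion z-test for a before/after compliance comparison.
        # Encounter counts are hypothetical; only the 38% and 83% rates
        # are reported in the abstract.
        from statsmodels.stats.proportion import proportions_ztest

        compliant = [76, 166]  # compliant encounters: before, after
        reviewed = [200, 200]  # encounters reviewed per period (assumed)

        stat, pvalue = proportions_ztest(compliant, reviewed)
        print(f"z = {stat:.2f}, p = {pvalue:.2g}")  # p << 0.05 for this effect size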

  6. Exploring Dental Providers’ Workflow in an Electronic Dental Record Environment

    PubMed Central

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N.; Ye, Zhan

    2016-01-01

    Background: A workflow is defined as a predefined set of work steps and partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at point of care to assess breakdown in the workflow which could contribute to better technology designs. Objective: The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. Methods: A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Results: Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). Conclusions: On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR. PMID:27437058

  7. Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.

    PubMed

    Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher

    2012-12-01

    As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around the clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution by 94 % from 2008 to 2011. The impact of unplanned downtimes was reduced by 72 % while the impact of planned downtimes was reduced by 99.66 % over the same period. Additionally more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 utilizing the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized if not eliminated entirely.

  8. 78 FR 22880 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-17

    ... between Health IT and Ambulatory Care Workflow Redesign.'' In accordance with the Paperwork Reduction Act... Understand the Relationship between Health IT and Ambulatory Care Workflow Redesign. The Agency for... Methods to Better Understand the Relationship between Health IT and Ambulatory Care Workflow Redesign...

  9. Analysis of how people with intellectual disabilities organize information using computerized guidance.

    PubMed

    Lussier-Desrochers, Dany; Sauzéon, Hélène; Consel, Charles; Roux, Jeannie; Balland, Émilie; Godin-Tremblay, Valérie; N'Kaoua, Bernard; Lachapelle, Yves

    2017-04-01

    Access to residential settings for people with intellectual disabilities (ID) contributes to their social participation, but presents particular challenges. Assistive technologies can help people perform activities of daily living. However, the majority of the computerized solutions offered use guidance modes with a fixed, unchanging sequencing that leaves little room for self-determination to emerge. The objective of the project was to develop a flexible guidance mode and to test it with participants, to describe their information organization methods. This research used a descriptive exploratory design and conducted a comparison between five participants with ID and five participants with no ID. The results showed a difference in the information organization methods for both categories of participants. The people with ID used more diversified organization methods (categorical, schematic, action-directed) than the neurotypical participants (visual, action-directed). These organization methods varied depending on the people, but also on the characteristics of the requested task. Furthermore, several people with ID presented difficulties when switching from virtual to real mode. These results demonstrate the importance of developing flexible guidance modes adapted to the users' cognitive strategies, to maximize their benefits. Studies using experimental designs will have to be conducted to determine the impacts of more-flexible guidance modes. Implications for rehabilitation: Intervention approaches favouring self-determination, decision making, action analysis and results anticipation must be promoted with people with intellectual disabilities. A fixed and rigid technological guidance mode, like those currently favoured in interventions, is appropriate for only some people's profiles or may depend on the nature of the task. It seems that people with ID use a wide spectrum of organization strategies and that adapting guidance modes to all these strategies is relevant.

  10. Military Interoperable Digital Hospital Testbed (MIDHT)

    DTIC Science & Technology

    2015-12-01

    Research and analyze the resulting technological impact of pharmacy robotics implementation on medication errors, pharmacist productivity, nurse satisfaction/workflow, and patient satisfaction.

  11. Provenance Storage, Querying, and Visualization in PBase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kianmajd, Parisa; Ludascher, Bertram; Missier, Paolo

    2015-01-01

    We present PBase, a repository for scientific workflows and their corresponding provenance information that facilitates the sharing of experiments among the scientific community. PBase is interoperable since it uses ProvONE, a standard provenance model for scientific workflows. Workflows and traces are stored in RDF, and with the support of SPARQL and the tree cover encoding, the repository provides a scalable infrastructure for querying the provenance data. Furthermore, through its user interface, it is possible to: visualize workflows and execution traces; visualize reachability relations within these traces; issue SPARQL queries; and visualize query results.
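
    To give a flavor of the SPARQL querying PBase supports, here is a minimal rdflib sketch over a tiny in-memory graph; the ProvONE vocabulary is reduced to a single property and the resource URIs are invented for the example.

        # SPARQL over a tiny in-memory provenance graph with rdflib. The
        # ProvONE vocabulary is reduced to one property; URIs are invented.
        from rdflib import Graph, Namespace, URIRef

        PROVONE = Namespace(
            "http://purl.dataone.org/provone/2015/01/15/ontology#")
        g = Graph()

        wf = URIRef("http://example.org/workflows/wf1")
        g.add((wf, PROVONE.hasSubProgram,
               URIRef("http://example.org/steps/align")))
        g.add((wf, PROVONE.hasSubProgram,
               URIRef("http://example.org/steps/plot")))

        # List the sub-programs (steps) of every stored workflow.
        query = """
        PREFIX provone: <http://purl.dataone.org/provone/2015/01/15/ontology#>
        SELECT ?wf ?step WHERE { ?wf provone:hasSubProgram ?step }
        """
        for row in g.query(query):
            print(row.wf, "->", row.step)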

  12. Context-aware workflow management of mobile health applications.

    PubMed

    Salden, Alfons; Poortinga, Remco

    2006-01-01

    We propose a medical application management architecture that allows medical (IT) experts to readily design, develop and deploy context-aware mobile health (m-health) applications or services. In particular, we elaborate on how our application workflow management architecture enables chaining, coordinating, composing, and adapting context-sensitive medical application components such that critical Quality of Service (QoS) and Quality of Context (QoC) requirements typical for m-health applications or services can be met. This functional architectural support requires learning modules for distilling application-critical selection of attention and anticipation models. These models will help medical experts construct and adjust on-the-fly m-health application workflows and workflow strategies. We illustrate our context-aware workflow management paradigm for an m-health data delivery problem, in which optimal communication network configurations have to be determined.

  13. Experimental evaluation of a flexible I/O architecture for accelerating workflow engines in ultrascale environments

    DOE PAGES

    Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin; ...

    2016-10-06

    The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules—an in-memory data store—with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.

  14. Prototype of Kepler Processing Workflows For Microscopy And Neuroinformatics

    PubMed Central

    Astakhov, V.; Bandrowski, A.; Gupta, A.; Kulungowski, A.W.; Grethe, J.S.; Bouwer, J.; Molina, T.; Rowley, V.; Penticoff, S.; Terada, M.; Wong, W.; Hakozaki, H.; Kwon, O.; Martone, M.E.; Ellisman, M.

    2016-01-01

    We report on progress of employing the Kepler workflow engine to prototype “end-to-end” application integration workflows that concern data coming from microscopes deployed at the National Center for Microscopy Imaging Research (NCMIR). This system is built upon the mature code base of the Cell Centered Database (CCDB) and integrated rule-oriented data system (IRODS) for distributed storage. It provides integration with external projects such as the Whole Brain Catalog (WBC) and Neuroscience Information Framework (NIF), which benefit from NCMIR data. We also report on specific workflows which spawn from main workflows and perform data fusion and orchestration of Web services specific for the NIF project. This “Brain data flow” presents a user with categorized information about sources that have information on various brain regions. PMID:28479932

  15. Workflow technology: the new frontier. How to overcome the barriers and join the future.

    PubMed

    Shefter, Susan M

    2006-01-01

    Hospitals are catching up to the business world in the introduction of technology systems that support professional practice and workflow. The field of case management is highly complex and interrelates with diverse groups in diverse locations. The last few years have seen the introduction of Workflow Technology Tools, which can improve the quality and efficiency of discharge planning by the case manager. Despite the availability of these wonderful new programs, many case managers are hesitant to adopt the new technology and workflow. For a myriad of reasons, a computer-based workflow system can seem like a brick wall. This article discusses, from a practitioner's point of view, how professionals can gain confidence and skill to get around the brick wall and join the future.

  16. An Integrated Framework for Parameter-based Optimization of Scientific Workflows.

    PubMed

    Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel

    2009-01-01

    Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
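
    A toy sketch of that search view, assuming a made-up two-parameter workflow whose runtime and output quality trade off: grid-search the space and keep the fastest configuration whose quality clears a threshold.

        # Workflow tuning as search over a parameter space: grid-search two
        # made-up parameters and keep the fastest acceptable configuration.
        import itertools

        def run_workflow(chunk_size, resolution):
            # Stand-ins for one run's measured runtime and output quality.
            runtime = 100.0 / chunk_size + 2.0 * resolution
            quality = 0.5 + 0.05 * resolution - 0.001 * chunk_size
            return runtime, quality

        best = None
        for chunk, res in itertools.product([8, 16, 32, 64], range(1, 9)):
            runtime, quality = run_workflow(chunk, res)
            if quality >= 0.75 and (best is None or runtime < best[0]):
                best = (runtime, chunk, res)

        print("fastest acceptable configuration:", best)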

  17. Workflow and Electronic Health Records in Small Medical Practices

    PubMed Central

    Ramaiah, Mala; Subrahmanian, Eswaran; Sriram, Ram D; Lide, Bettijoyce B

    2012-01-01

    This paper analyzes the workflow and implementation of electronic health record (EHR) systems across different functions in small physician offices. We characterize the differences in the offices based on the levels of computerization in terms of workflow, sources of time delay, and barriers to using EHR systems to support the entire workflow. The study was based on a combination of questionnaires, interviews, in situ observations, and data collection efforts. This study was not intended to be a full-scale time-and-motion study with precise measurements but was intended to provide an overview of the potential sources of delays while performing office tasks. The study follows an interpretive model of case studies rather than a large-sample statistical survey of practices. To identify time-consuming tasks, workflow maps were created based on the aggregated data from the offices. The results from the study show that specialty physicians are more favorable toward adopting EHR systems than primary care physicians are. The barriers to adoption of EHR systems by primary care physicians can be attributed to the complex workflows that exist in primary care physician offices, leading to nonstandardized workflow structures and practices. Also, primary care physicians would benefit more from EHR systems if the systems could interact with external entities. PMID:22737096

  18. Managing and Communicating Operational Workflow: Designing and Implementing an Electronic Outpatient Whiteboard.

    PubMed

    Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M

    2016-01-01

    Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. This article describes and discusses the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.

  19. Opinion and Special Articles: Loan forgiveness options for young neurologists: Current landscape and practice implications.

    PubMed

    George, Benjamin P; Dorsey, E Ray; Grischkan, Justin A

    2017-04-11

    Increasing education debt has led to the availability of a variety of loan forgiveness options including the Department of Education's Public Service Loan Forgiveness (PSLF) program. This article discusses the current landscape of loan forgiveness options including trends in PSLF for rising neurology trainees, and implications for choices in specialization, employment, practice location, and the pursuit of an academic career. We further provide guidance on how to navigate the various loan forgiveness options that neurology residents and fellows may consider. © 2017 American Academy of Neurology.

  20. Asterism: an integrated, complete, and open-source approach for running seismologist continuous data-intensive analysis on heterogeneous systems

    NASA Astrophysics Data System (ADS)

    Ferreira da Silva, R.; Filgueira, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present Asterism, an open source data-intensive framework, which combines the Pegasus and dispel4py workflow systems. Asterism aims to simplify the effort required to develop data-intensive applications that run across multiple heterogeneous resources, without users having to: re-formulate their methods according to different enactment systems; manage the data distribution across systems; parallelize their methods; co-place and schedule their methods with computing resources; and store and transfer large/small volumes of data. Asterism's key element is to leverage the strengths of each workflow system: dispel4py allows developing scientific applications locally and then automatically parallelize and scale them on a wide range of HPC infrastructures with no changes to the application's code; Pegasus orchestrates the distributed execution of applications while providing portability, automated data management, recovery, debugging, and monitoring, without users needing to worry about the particulars of the target execution systems. Asterism leverages the level of abstractions provided by each workflow system to describe hybrid workflows where no information about the underlying infrastructure is required beforehand. The feasibility of Asterism has been evaluated using the seismic ambient noise cross-correlation application, a common data-intensive analysis pattern used by many seismologists. The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The Asterism workflow is implemented as a Pegasus workflow composed of two tasks (Phase1 and Phase2), where each phase represents a dispel4py workflow. Pegasus tasks describe the in/output data at a logical level, the data dependency between tasks, and the e-Infrastructures and the execution engine to run each dispel4py workflow. We have instantiated the workflow using data from 1000 stations from the IRIS services, and run it across two heterogeneous resources described as Docker containers: MPI (Container2) and Storm (Container3) clusters (Figure 1). Each dispel4py workflow is mapped to a particular execution engine, and data transfers between resources are automatically handled by Pegasus. Asterism is freely available online at http://github.com/dispel4py/pegasus_dispel4py.
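
    Since dispel4py is itself a Python library, the local-development side of Asterism can be hinted at with a small sketch; the class and method names below follow the dispel4py documentation but should be checked against the installed version, and the processing elements are trivial stand-ins for the preprocessing and cross-correlation phases.

        # Two trivial processing elements chained into a dispel4py graph.
        from dispel4py.workflow_graph import WorkflowGraph
        from dispel4py.base import ProducerPE, IterativePE

        class ReadTraces(ProducerPE):
            def _process(self, inputs):
                # Stand-in for streaming traces from seismic stations.
                for station in ["STA1", "STA2", "STA3"]:
                    self.write(ProducerPE.OUTPUT_NAME, station)

        class Preprocess(IterativePE):
            def _process(self, data):
                # Stand-in for detrend/filter/whiten on one trace.
                return data.lower()

        graph = WorkflowGraph()
        read, prep = ReadTraces(), Preprocess()
        graph.connect(read, ProducerPE.OUTPUT_NAME,
                      prep, IterativePE.INPUT_NAME)
        # The same graph can then be mapped to sequential, MPI, or Storm
        # execution engines without changing the code above.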

  1. Notifiable condition reporting practices: implications for public health agency participation in a health information exchange.

    PubMed

    Revere, Debra; Hills, Rebecca H; Dixon, Brian E; Gibson, P Joseph; Grannis, Shaun J

    2017-03-11

    The future of notifiable condition reporting in the United States is undergoing a transformation with the increasing development of Health Information Exchanges which support electronic data-sharing and -transfer networks and the wider adoption of electronic laboratory reporting. Communicable disease report forms originating in clinics are an important source of surveillance data for public health agencies. However, problems of poor data quality and delayed submission of reports to public health agencies are common. In addition, studies of barriers and facilitators to reporting have assumed that the primary reporter is the treating physician, although the extent to which a provider is involved in the reporting workflow is unclear. We sought to better understand the barriers to and burden of notifiable condition reporting from the perspectives of the three primary groups involved in reporting workflow: providers, clinic staff who bear the principal responsibility for reporting, and the public health workers who receive and process reports from clinics. In addition, we sought to situate these findings within the context of the future of notifiable disease reporting and the potential impacts of electronic lab and medical records on the surveillance system. Seven ambulatory care clinics and three public health agencies that are part of a Health Information Exchange in the state of Indiana, USA, participated in the study. Data were obtained from a survey of clinic physicians (N = 29), interviews with clinic reporters (N = 11), and interviews with public health workers (N = 9). Survey data were summarized descriptively and interview transcripts underwent qualitative analysis. In both clinics and public health agencies, the laboratory report initiates reporting workflow. Provider involvement with reporting primarily revolves around ordering medications to treat a condition confirmed by the lab result. In clinics, reporting is typically the responsibility of clinic reporters who vary in frequency of reporting. We found an association between frequency of reporting, reporting knowledge and perceptions of reporting burden. In both clinics and public health agencies, interruptions and delays in reporting workflow are encountered due to inaccurate or missing information and impact reporting timeliness, data quality and report completeness. Both providers and clinic reporters lack clarity regarding how data submitted by their reports are used by public health agencies. It is possible that the value of reporting may be diminished when those responsible do not perceive receiving benefit in return. This may account for the low awareness of or recollection of public health communications with clinics that we observed. Despite the high likelihood that public health advisories and guidance are based, in part, on data submitted by clinics, a direct concordance may not be recognized. Unlike most studies of notifiable condition reporting, this study included the clinic reporters who bear primary responsibility for completing and submitting reports to public health agencies. A primary barrier to this reporting is timely and easy access to data. It is possible that expanded adoption of electronic health record and laboratory reporting systems will improve access to this data and reduce the reporting burden. However, a complete reliance on automatic electronic extraction of data requires caution and necessitates continued interfacing with clinic reporters for the foreseeable future, particularly for notifiable conditions that are high-impact, uncommon, prone to false positive readings by labs, or are hard to verify. An important finding of this study is the association between frequency of reporting, reporting knowledge and perceptions of reporting burden. Increased automation could result in even lower reporting knowledge and familiarity with reporting requirements, which could actually increase reporters' perception of notifiable condition reporting as burdensome. Another finding was of uncertainty regarding how data sent to public health agencies is used or provides clinical benefit. A strong recommendation generated by these findings is that, given their central role in reporting, clinic reporters are a significant target audience for public health outreach and education that aims to alleviate perceived reporting burden and improve reporting knowledge. In particular, communicating the benefits of public health's use of the data may reduce a perceived lack of information reciprocity between clinical and public health organizations.

  2. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    NASA Astrophysics Data System (ADS)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter, commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2m posting elevation model of the Arctic and an 8m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible due to three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management challenges the team needed to solve in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  3. Quality Controlling CMIP datasets at GFDL

    NASA Astrophysics Data System (ADS)

    Horowitz, L. W.; Radhakrishnan, A.; Balaji, V.; Adcroft, A.; Krasting, J. P.; Nikonov, S.; Mason, E. E.; Schweitzer, R.; Nadeau, D.

    2017-12-01

    As GFDL makes the switch from model development to production in light of the Climate Model Intercomparison Project (CMIP), GFDL's efforts have shifted to testing and, more importantly, establishing guidelines and protocols for quality control and semi-automated data publishing. Every CMIP cycle introduces key challenges, and the upcoming CMIP6 is no exception. The new CMIP experimental design comprises multiple MIPs facilitating research in different focus areas. This paradigm has implications not only for the groups that develop the models and conduct the runs, but also for the groups that monitor, analyze and quality control the datasets before data publishing and before their results make their way into reports like the IPCC (Intergovernmental Panel on Climate Change) Assessment Reports. In this talk, we discuss some of the paths taken at GFDL to quality control the CMIP-ready datasets, including: Jupyter notebooks, PrePARE, and a LAMP (Linux, Apache, MySQL, PHP/Python/Perl) technology-driven tracker system to monitor the status of experiments qualitatively and quantitatively and to provide additional metadata and analysis services, along with some built-in controlled-vocabulary validations in the workflow. In addition to this, we also discuss the integration of community-based model evaluation software (ESMValTool, PCMDI Metrics Package, and ILAMB) as part of our CMIP6 workflow.

  4. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; Peng, J

    Purpose: To implement an entire workflow quality assurance (QA) process in the radiotherapy department and to reduce the error rates of radiotherapy based on entire-workflow management in a developing country. Methods: The entire workflow QA process starts from patient registration and extends to the end of the last treatment, including all steps of the radiotherapy process. The chart-check error rate is used to evaluate the entire workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data from a total of around 6000 patients before and after implementing the entire workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), treatment QA documents, and QA of the treatment history. The error rate derived from the chart check decreased from 1.7% to 0.9% after the entire workflow QA process. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and staff training was reinforced accordingly to prevent those errors. Conclusion: The entire workflow QA process improved the safety and quality of radiotherapy in our department, and we consider that our QA experience can be applicable to heavily loaded radiotherapy departments in developing countries.

  5. An automated analysis workflow for optimization of force-field parameters using neutron scattering data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Vickie E.; Borreguero, Jose M.; Bhowmik, Debsindhu

    Highlights: • An automated workflow to optimize force-field parameters. • Used the workflow to optimize force-field parameters for a system containing nanodiamond and tRNA. • The mechanism relies on molecular dynamics simulation and neutron scattering experimental data. • The workflow can be generalized to other experimental and simulation techniques. Abstract: Large-scale simulations and data analysis are often required to explain neutron scattering experiments to establish a connection between the fundamental physics at the nanoscale and data probed by neutrons. However, to perform simulations at experimental conditions it is critical to use correct force-field (FF) parameters, which are unfortunately not available for most complex experimental systems. In this work, we have developed a workflow optimization technique to provide optimized FF parameters by comparing molecular dynamics (MD) to neutron scattering data. We describe the workflow in detail by using an example system consisting of tRNA and hydrophilic nanodiamonds in a deuterated water (D2O) environment. Quasi-elastic neutron scattering (QENS) data show a faster motion of the tRNA in the presence of nanodiamond than without it. To compare the QENS and MD results quantitatively, a proper choice of FF parameters is necessary. We use an efficient workflow to optimize the FF parameters between the hydrophilic nanodiamond and water by comparing to the QENS data. Our results show that we can obtain accurate FF parameters by using this technique. The workflow can be generalized to other types of neutron data for FF optimization, such as vibrational spectroscopy and spin echo.
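
    A schematic version of that optimization loop, with a cheap analytic stand-in for the MD-plus-QENS evaluation (in the paper each cost evaluation is a full simulation): adjust one force-field parameter until the simulated observable matches the experimental curve. All numbers are mock data.

        # Fit one force-field parameter to a mock experimental curve.
        import numpy as np
        from scipy.optimize import minimize_scalar

        q = np.linspace(0.3, 1.9, 9)         # momentum transfer, 1/Angstrom
        experimental = np.exp(-0.12 * q**2)  # mock QENS-derived curve

        def simulated_observable(epsilon, q):
            # Stand-in for: run MD with this parameter, compute the observable.
            return np.exp(-0.1 * epsilon * q**2)

        def misfit(epsilon):
            residual = simulated_observable(epsilon, q) - experimental
            return float(np.sum(residual**2))  # least-squares cost

        result = minimize_scalar(misfit, bounds=(0.1, 5.0), method="bounded")
        print(f"optimized parameter = {result.x:.3f}")  # ~1.2 for this mock data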

  6. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreher, M.; Peterka, T.

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.
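
    Decaf does ship a Python API for describing the workflow graph, but the class and argument names in the sketch below are invented for illustration, not Decaf's actual interface; the point is that both tasks and the links between them are first-class and can own resources.

        # Invented class names illustrating a Decaf-style graph description.
        class Node:
            def __init__(self, name, nprocs, func):
                self.name, self.nprocs, self.func = name, nprocs, func

        class Dataflow:
            """A link that may own processes to transform data in flight."""
            def __init__(self, source, target, nprocs=0, transform=None):
                self.source, self.target = source, target
                self.nprocs, self.transform = nprocs, transform

        sim = Node("simulation", nprocs=64, func="md_run")
        viz = Node("visualization", nprocs=8, func="render")

        # Giving the link its own ranks lets it redistribute or filter data
        # in flight, rather than simply forwarding it.
        links = [Dataflow(sim, viz, nprocs=4, transform="redistribute")]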

  7. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; (4) setting up data access mechanisms; (5) setting up the data in data tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate and realize efficiencies in the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps toward creation of a common software framework.

  8. Structured recording of intraoperative surgical workflows

    NASA Astrophysics Data System (ADS)

    Neumuth, T.; Durstewitz, N.; Fischer, M.; Strauss, G.; Dietz, A.; Meixensberger, J.; Jannin, P.; Cleary, K.; Lemke, H. U.; Burgert, O.

    2006-03-01

    Surgical Workflows are used for the methodical and scientific analysis of surgical interventions. The approach described here is a step towards developing surgical assist systems based on Surgical Workflows and integrated control systems for the operating room of the future. This paper describes concepts and technologies for the acquisition of Surgical Workflows by monitoring surgical interventions and their presentation. Establishing systems which support the Surgical Workflow in operating rooms requires a multi-staged development process beginning with the description of these workflows. A formalized description of surgical interventions is needed to create a Surgical Workflow. This description can be used to analyze and evaluate surgical interventions in detail. We discuss the subdivision of surgical interventions into work steps at different levels of granularity and propose a recording scheme for the acquisition of manual surgical work steps from running interventions. To support the recording process during the intervention, we introduce a new software architecture. The core of the architecture is our Surgical Workflow editor, which is intended to deal with the manifold, complex, and concurrent relations during an intervention. Furthermore, we show a method for the automatic generation of graphs that display the recorded surgical work steps of the interventions. Finally, we conclude with considerations about extensions of our recording scheme to close the gap to S-PACS systems. The approach was used to record 83 surgical interventions of 6 intervention types across 3 surgical disciplines: ENT surgery, neurosurgery, and interventional radiology. The interventions were recorded at the University Hospital Leipzig, Germany, and at the Georgetown University Hospital, Washington, D.C., USA.
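
    The recording scheme itself is not published in the abstract; the sketch below is only a plausible minimal record for one manual surgical work step, with illustrative field names, showing how granularity levels and concurrent relations could be captured.

    # A plausible minimal record for one manual surgical work step; all
    # field names are illustrative, not the authors' published scheme.

    from dataclasses import dataclass

    @dataclass
    class WorkStep:
        actor: str         # e.g. "surgeon", "assistant"
        action: str        # verb describing the step
        instrument: str    # tool used for the step
        target: str        # anatomical structure acted upon
        start_s: float     # seconds from intervention start
        stop_s: float
        granularity: int   # 0 = phase, 1 = task, 2 = elementary action

    steps = [
        WorkStep("surgeon", "incise", "scalpel", "skin", 62.0, 75.5, 2),
        WorkStep("assistant", "suction", "suction tube", "wound", 70.0, 90.0, 2),
    ]

    # Concurrent relations fall out of overlapping [start_s, stop_s] intervals.
    overlaps = steps[0].stop_s > steps[1].start_s  # True: steps ran in parallel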

  9. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-12-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using “service casts” and “interest casts” (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH’s Mining Workflow Composer and the open-source Active BPEL engine, and JPL’s SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicating ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the “sociological” problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  10. Teacher Perceptions and Expectations of School Counselor Contributions: Implications for Program Planning and Training

    ERIC Educational Resources Information Center

    Clark, Mary Ann; Amatea, Ellen

    2004-01-01

    The researchers examined the perceptions of 23 teachers in elementary, middle, and high schools regarding necessary counseling and guidance services, how these services might best be delivered, and teachers' expectations about school counselor contributions and working relationships. The researchers also examined the resulting reflections of the…

  11. Implications of fire management on cultural resources [Chapter 9

    Treesearch

    Rebecca S. Timmons; Leonard deBano; Kevin C. Ryan

    2012-01-01

    Previous chapters in this synthesis have identified the important fuel, weather, and fire relationships associated with damage to cultural resources (CR). They have also identified the types of effects commonly encountered in various fire situations and provided some guidance on how to recognize damages and minimize their occurrence. This chapter describes planning...

  12. Legal and Ethical Implications of Working with Minors in Alabama: Consent and Confidentiality

    ERIC Educational Resources Information Center

    Keim, Michael A.; Cobia, Debra

    2010-01-01

    Until recently, there has been little guidance in the professional literature with respect to counseling minors outside of the school setting. Although most authors suggest referring to state statutes for legal limits of counseling practice, little research exists describing these requirements in Alabama. The purpose of this literature and…

  13. ACTIVITIES TO DEVELOP AN INTERIM GUIDANCE FOR MICROARRAY-BASED ASSAYS FOR REGULATORY AND RISK ASSESSMENT APPLICATIONS AT EPA

    EPA Science Inventory

    Abstract for presentation. Advances in genomics will have significant implications for risk assessment policies and regulatory decision making. In 2002, EPA issued its Interim Policy on Genomics which stated that such data may be considered in the decision making process, but tha...

  14. Traditional Occupations in a Modern World: Implications for Career Guidance and Livelihood Planning

    ERIC Educational Resources Information Center

    Ratnam, Anita

    2011-01-01

    This article is an attempt to examine the place and significance of traditional occupations as careers in today's world. The areas of tension and compatibility between ideas and values that signify modernity and the practice of traditional occupations are reviewed. The meaning of "traditional occupations" is unravelled, the potential that…

  15. Common Core State Standards and Implications for Special Populations

    ERIC Educational Resources Information Center

    Best, Jane; Cohen, Courtney

    2013-01-01

    The goal of the Common Core State Standards is to address the academic needs of all students and prepare them for college and the workforce. Implementation guidance and professional training for teachers, particularly for those working with special populations of students, requires thoughtful consideration. While the Common Core allows teachers to…

  16. Learning Styles in Secondary Schools: A Review of Instruments and Implications for Their Use.

    ERIC Educational Resources Information Center

    Curry, Lynn

    Practitioner use of learning style theory and measures can have an impact on curriculum design, instruction and assessment methods, and student guidance in the secondary school. Concern about the "operationalization" of learning style continues due to confusion concerning definitions, weakness in reliability and validity of measurements, and…

  17. Supervision and Mentoring for Early Career School Psychologists: Availability, Access, Structure, and Implications

    ERIC Educational Resources Information Center

    Silva, Arlene E.; Newman, Daniel S.; Guiney, Meaghan C.; Valley-Gray, Sarah; Barrett, Courtenay A.

    2016-01-01

    The authors thank Jeffrey Charvat, Director of Research, National Association of School Psychologists (NASP), for his guidance regarding survey development and administration, and Wendy Finn, former Director of Membership and Marketing, NASP, for her assistance with sampling and data collection. The authors thank Concetta Panuccio for her…

  18. Parental Role Behaviors in Young, Dual Parent Families: Future Policy Implications.

    ERIC Educational Resources Information Center

    Dail, Paula W.

    An investigation was undertaken in an effort to further understand the role of young, traditional dual-parent families. A total of 249 parents were asked to complete the Iowa Parent Behavior Inventory. Responses were categorized into parent involvement, limit setting, responsivity, reasoned guidance, free expression (mothers only), and intimacy.…

  19. THE COMPUTER AS AN AID TO INSTRUCTION AND GUIDANCE IN THE SCHOOL.

    ERIC Educational Resources Information Center

    IMPELLITTERI, JOSEPH T.

    COMPUTER APPLICATIONS IN EDUCATION ARE DISCUSSED IN TERMS OF--(1) A DESCRIPTION OF COMPUTER-ASSISTED INSTRUCTION (CAI) AND COUNSELING, (2) THE NUMBER AND TYPES OF COMPUTER-ASSISTED DEVELOPMENTS, (3) THE NATURE OF THE PENN STATE UNIVERSITY PROGRAM, (4) TENTATIVE RESULTS OF EXPERIMENTATION USING CAI, AND (5) IMPLICATIONS AND PROJECTIONS FOR THE…

  20. Identifying, Analyzing, and Communicating Rural: A Quantitative Perspective

    ERIC Educational Resources Information Center

    Koziol, Natalie A.; Arthur, Ann M.; Hawley, Leslie R.; Bovaird, James A.; Bash, Kirstie L.; McCormick, Carina; Welch, Greg W.

    2015-01-01

    Defining rural is a critical task for rural education researchers, as it has implications for all phases of a study. However, it is also a difficult task due to the many ways in which rural can be theoretically, conceptually, and empirically operationalized. This article provides researchers with specific guidance on important theoretical and…

  1. What Can We Learn from School-Based Emotional Disturbance Assessment Practices? Implications for Practice and Preparation In School Psychology

    ERIC Educational Resources Information Center

    Allen, Ryan A.; Hanchon, Timothy A.

    2013-01-01

    The federal definition of emotional disturbance (ED) provides limited guidance to educational professionals charged with making Individuals with Disabilities in Education Improvement Act eligibility determinations. Despite calls to revise the definition, the ED category remains largely unchanged nearly four decades after being codified into…

  2. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A Service Workflow is an aggregation of distributed services that fulfills specific functionalities. With ever-increasing numbers of available services, methodologies for selecting services against given requirements have become a main research subject in multiple disciplines. A few researchers have contributed formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of the workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec is demonstrated by examples addressing soundness, completeness, and consistency.
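
    SWSpec's syntax is not given in the abstract; as a rough illustration of the compliance-checking idea, the sketch below expresses requirements as named predicates over a workflow composition and reports which ones are violated. All names and attributes are illustrative.

    # Illustrative sketch of logic-based compliance checking for a service
    # workflow composition. Each requirement is a predicate standing in
    # for an SWSpec formula; composition attributes are invented.

    composition = [
        {"name": "authenticate", "encrypted": True},
        {"name": "fetch_records", "encrypted": True},
        {"name": "aggregate", "encrypted": False},
    ]

    requirements = {
        "starts_with_auth": lambda wf: wf[0]["name"] == "authenticate",
        "all_transfers_encrypted": lambda wf: all(
            s["encrypted"] for s in wf if s["name"] != "aggregate"
        ),
    }

    def check_compliance(wf, reqs):
        """Return the subset of requirements the composition violates."""
        return [name for name, pred in reqs.items() if not pred(wf)]

    violations = check_compliance(composition, requirements)
    print(violations or "composition is compliant")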

  3. Development of a novel imaging informatics-based system with an intelligent workflow engine (IWEIS) to support imaging-based clinical trials

    PubMed Central

    Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J

    2015-01-01

    Imaging-based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, software development of custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging-based clinical trials. The system enables a project coordinator to build a data collection and management system specifically related to the study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging-related studies. The system was initially evaluated by an imaging-based rehabilitation clinical trial. The evaluation shows that the cost of developing the system can be much reduced compared to a custom-built system. By providing a solution to customize a system and automate the workflow, the system will save on development time and reduce errors, especially for imaging clinical trials. PMID:25870169
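
    For readers unfamiliar with WADO, the sketch below shows a retrieval of a DICOM object via the standard WADO-URI query interface that such a module would expose; the endpoint URL and UIDs are placeholders, and the system's own API is not described in the abstract.

    # Sketch of retrieving a DICOM object via the standard WADO-URI
    # interface. The endpoint URL and UIDs below are placeholders.

    import urllib.parse
    import urllib.request

    base = "https://pacs.example.org/wado"  # hypothetical WADO endpoint
    params = {
        "requestType": "WADO",
        "studyUID": "1.2.840.113619.2.55.1",   # placeholder study instance UID
        "seriesUID": "1.2.840.113619.2.55.2",  # placeholder series instance UID
        "objectUID": "1.2.840.113619.2.55.3",  # placeholder SOP instance UID
        "contentType": "application/dicom",
    }

    url = base + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        dicom_bytes = resp.read()  # raw DICOM object, ready for parsing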

  4. Optimizing high performance computing workflow for protein functional annotation.

    PubMed

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.
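
    The classification step lends itself to embarrassingly parallel execution, since each protein is scored independently. The sketch below illustrates that structure only: the scorer is a placeholder for the PSI-BLAST-based algorithm, and the threshold is an illustrative stand-in for the published 80% specificity/sensitivity operating point.

    # Sketch of the parallel classification step: assign each protein to an
    # existing orthologous-group cluster only when the score clears a
    # threshold. score_against_cogs() stands in for the PSI-BLAST scorer.

    from concurrent.futures import ProcessPoolExecutor

    SCORE_THRESHOLD = 0.8  # illustrative cutoff, not the published parameter

    def score_against_cogs(sequence):
        """Placeholder for a PSI-BLAST-derived (cog_id, score) best hit."""
        return ("COG0001", 0.93)

    def classify(sequence):
        cog_id, score = score_against_cogs(sequence)
        return (sequence, cog_id if score >= SCORE_THRESHOLD else None)

    def annotate(sequences, workers=8):
        # Each protein is classified independently, so the work parallelizes
        # cleanly across processes (or, at scale, across HPC nodes).
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return dict(pool.map(classify, sequences))

    if __name__ == "__main__":
        print(annotate(["MKTAYIAKQR", "MSDNLTKVRE"]))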

  5. Optimizing high performance computing workflow for protein functional annotation

    PubMed Central

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-01-01

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data. PMID:25313296

  6. COSMOS: Python library for massively parallel workflows

    PubMed Central

    Gafni, Erik; Luquette, Lovelace J.; Lancaster, Alex K.; Hawkins, Jared B.; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P.; Tonellato, Peter J.

    2014-01-01

    Summary: Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Availability and implementation: Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. Contact: dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24982428
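
    COSMOS's programming interface is not shown in the abstract; the sketch below illustrates, with hypothetical class and method names, the kind of formal pipeline description and dependency-driven execution such a library enables.

    # Hypothetical sketch of a formally described pipeline with dependency-
    # driven execution; names are illustrative, not COSMOS's actual API.

    class Task:
        def __init__(self, name, cmd, parents=()):
            self.name, self.cmd, self.parents = name, cmd, list(parents)

    class Workflow:
        def __init__(self):
            self.tasks = []

        def add(self, name, cmd, parents=()):
            task = Task(name, cmd, parents)
            self.tasks.append(task)
            return task

        def run(self):
            # Topological execution: run a task once its parents are done.
            done = set()
            pending = list(self.tasks)
            while pending:
                ready = [t for t in pending if all(p in done for p in t.parents)]
                for t in ready:
                    print(f"submit: {t.cmd}")  # hand off to a cluster/cloud queue
                    done.add(t)
                    pending.remove(t)

    wf = Workflow()
    align = wf.add("align", "bwa mem ref.fa sample.fq > sample.sam")
    sort_ = wf.add("sort", "samtools sort sample.sam -o sample.bam", [align])
    wf.add("call", "bcftools mpileup sample.bam | bcftools call -mv", [sort_])
    wf.run()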

  7. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.

  8. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  9. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which requires complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments. This makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools including Bioconductor packages, AltAnalyze, a python-based open source tool, and an R-based comparison tool to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
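
    Kepler itself is a Java-based actor environment, so the Python sketch below only illustrates the underlying integration idea: wrap heterogeneous tools (R/Bioconductor scripts, Python tools) behind one uniform step interface so they can be chained. Tool paths and arguments are placeholders.

    # Illustration of the integration idea, not Kepler's actual mechanism:
    # each external tool becomes a uniform step that passes data by file.

    import subprocess

    def run_step(cmd, infile, outfile):
        """Run one external tool, passing data by file like workflow actors do."""
        subprocess.run(cmd + [infile, outfile], check=True)
        return outfile

    normalized = run_step(["Rscript", "normalize_bioconductor.R"],
                          "raw_arrays.csv", "normalized.csv")
    splice = run_step(["python", "AltAnalyze.py"],
                      normalized, "alt_expression.csv")
    run_step(["Rscript", "compare_groups.R"], splice, "meta_analysis.csv")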

  10. COSMOS: Python library for massively parallel workflows.

    PubMed

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Availability and implementation: Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. Contact: dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  11. The impact of computerized provider order entry systems on inpatient clinical workflow: a literature review.

    PubMed

    Niazkhani, Zahra; Pirnejad, Habibollah; Berg, Marc; Aarts, Jos

    2009-01-01

    Previous studies have shown the importance of workflow issues in the implementation of CPOE systems and patient safety practices. To understand the impact of CPOE on clinical workflow, we developed a conceptual framework and conducted a literature search for CPOE evaluations between 1990 and June 2007. Fifty-one publications were identified that disclosed mixed effects of CPOE systems. Among the frequently reported workflow advantages were legible orders, remote accessibility of the systems, and shorter order turnaround times. Among the frequently reported disadvantages were time-consuming and problematic user-system interactions, and the enforcement of a predefined relationship between clinical tasks and between providers. Given the diversity of findings in the literature, we conclude that more multi-method research is needed to explore CPOE's multidimensional and collective impact, especially on collaborative workflow.

  12. Load-sensitive dynamic workflow re-orchestration and optimisation for faster patient healthcare.

    PubMed

    Meli, Christopher L; Khalil, Ibrahim; Tari, Zahir

    2014-01-01

    Hospital waiting times are considerably long, with no sign of shortening anytime soon. A number of factors, including population growth, the ageing population, and a lack of new infrastructure, are expected to further exacerbate waiting times in the near future. In this work, we show how healthcare services can be modelled as queueing nodes, together with healthcare service workflows, such that these workflows can be optimised during execution in order to reduce patient waiting times. Services such as X-ray, computer tomography, and magnetic resonance imaging often form queues; thus, by taking into account the waiting times of each service, the workflow can be re-orchestrated and optimised. Experimental results indicate average waiting time reductions are achievable by optimising workflows using dynamic re-orchestration. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.
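
    A minimal sketch of the re-orchestration idea, assuming a patient's remaining services are order-independent: route the patient to whichever required service currently has the shortest expected wait. Service times and queue lengths below are illustrative numbers, not data from the paper.

    # Expected wait per service = queued patients * mean service time (minutes).

    services = {
        "xray": {"queue": 6, "mean_minutes": 10},
        "ct": {"queue": 2, "mean_minutes": 25},
        "mri": {"queue": 3, "mean_minutes": 40},
    }

    def expected_wait(name):
        s = services[name]
        return s["queue"] * s["mean_minutes"]

    def reorchestrate(remaining):
        """Order a patient's remaining order-independent services by wait."""
        return sorted(remaining, key=expected_wait)

    plan = reorchestrate(["mri", "xray", "ct"])
    print(plan)  # ['ct', 'xray', 'mri'] given the loads above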

  13. Scientific workflows as productivity tools for drug discovery.

    PubMed

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  14. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing as in tomographic reconstruction methods require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the evaluated resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
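
    The three-stage model above reduces resource selection to a simple comparison: for each candidate site, estimated workflow time is data transfer plus queue wait plus reconstruction compute, and the site with the minimum estimate wins. The sketch below shows that comparison with illustrative numbers and site names; the paper's actual model terms are more detailed.

    dataset_gb = 500.0
    work_units = 1e6  # abstract measure of reconstruction work

    sites = {
        "cluster_a": {"bandwidth_gbps": 10.0, "queue_s": 1800, "units_per_s": 900},
        "cluster_b": {"bandwidth_gbps": 2.5, "queue_s": 300, "units_per_s": 400},
        "cluster_c": {"bandwidth_gbps": 40.0, "queue_s": 7200, "units_per_s": 2500},
    }

    def estimate_seconds(site):
        transfer = dataset_gb * 8 / site["bandwidth_gbps"]  # GB -> gigabits
        compute = work_units / site["units_per_s"]
        return transfer + site["queue_s"] + compute

    best = min(sites, key=lambda name: estimate_seconds(sites[name]))
    for name in sites:
        print(f"{name}: {estimate_seconds(sites[name]):.0f} s")
    print("selected:", best)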

  15. Optimization of tomographic reconstruction workflows on geographically distributed resources

    PubMed Central

    Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing as in tomographic reconstruction methods require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the evaluated resources). Moreover, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks. PMID:27359149

  16. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing as in tomographic reconstruction methods require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the evaluated resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.

  17. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    NASA Technical Reports Server (NTRS)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by their respective domain experts.
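
    A minimal sketch of the service-chaining pattern described above: each wrapped legacy component presents a uniform HTTP interface, so the end-to-end chain is a sequence of calls where each reply feeds the next component. Endpoint URLs and payload fields below are hypothetical.

    import json
    import urllib.request

    CHAIN = [
        "https://osse.example.org/radiative_transfer",
        "https://osse.example.org/surface_albedo",
        "https://osse.example.org/detection",
        "https://osse.example.org/retrieval",
    ]

    def call_service(url, payload):
        """POST a JSON parameterization; the reply feeds the next component."""
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    state = {"reflectance_spectrum": [0.12, 0.15, 0.19]}  # illustrative input
    for endpoint in CHAIN:
        state = call_service(endpoint, state)  # output flows downstream
    print(state)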

  18. Flexible Workflow Software enables the Management of an Increased Volume and Heterogeneity of Sensors, and evolves with the Expansion of Complex Ocean Observatory Infrastructures.

    NASA Astrophysics Data System (ADS)

    Tomlin, M. C.; Jenkyns, R.

    2015-12-01

    Ocean Networks Canada (ONC) collects data from observatories in the northeast Pacific, Salish Sea, Arctic Ocean, Atlantic Ocean, and land-based sites in British Columbia. Data are streamed, collected autonomously, or transmitted via satellite from a variety of instruments. The Software Engineering group at ONC develops and maintains Oceans 2.0, an in-house software system that acquires and archives data from sensors, and makes data available to scientists, the public, government and non-government agencies. The Oceans 2.0 workflow tool was developed by ONC to manage a large volume of tasks and processes required for instrument installation, recovery and maintenance activities. Since 2013, the workflow tool has supported 70 expeditions and grown to include 30 different workflow processes for the increasing complexity of infrastructures at ONC. The workflow tool strives to keep pace with an increasing heterogeneity of sensors, connections and environments by supporting versioning of existing workflows, and allowing the creation of new processes and tasks. Despite challenges in training and gaining mutual support from multidisciplinary teams, the workflow tool has become invaluable in project management in an innovative setting. It provides a collective place to contribute to ONC's diverse projects and expeditions and encourages more repeatable processes, while promoting interactions between the multidisciplinary teams who manage various aspects of instrument development and the data they produce. The workflow tool inspires documentation of terminologies and procedures, and effectively links to other tools at ONC such as JIRA, Alfresco and Wiki. Motivated by growing sensor schemes, modes of collecting data, archiving, and data distribution at ONC, the workflow tool ensures that infrastructure is managed completely from instrument purchase to data distribution. It integrates all areas of expertise and helps fulfill ONC's mandate to offer quality data to users.

  19. Guidance and Control Software Project Data - Volume 1: Planning Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  20. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  1. Rethinking Clinical Workflow.

    PubMed

    Schlesinger, Joseph J; Burdick, Kendall; Baum, Sarah; Bellomy, Melissa; Mueller, Dorothee; MacDonald, Alistair; Chern, Alex; Chrouser, Kristin; Burger, Christie

    2018-03-01

    The concept of clinical workflow borrows from management and leadership principles outside of medicine. The only way to rethink clinical workflow is to understand the neuroscience principles that underlie attention and vigilance. With any implementation to improve practice, there are human factors that can promote or impede progress. Modulating the environment and working as a team to take care of patients is paramount. Clinicians must continually rethink clinical workflow, evaluate progress, and understand that other industries have something to offer. Then, novel approaches can be implemented to take the best care of patients. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Development of a 3D ultrasound-guided prostate biopsy system

    NASA Astrophysics Data System (ADS)

    Cool, Derek; Sherebrin, Shi; Izawa, Jonathan; Fenster, Aaron

    2007-03-01

    Biopsy of the prostate using ultrasound guidance is the clinical gold standard for diagnosis of prostate adenocarcinoma. However, because early stage tumors are rarely visible under US, the procedure carries high false-negative rates and patients often require multiple biopsies before cancer is detected. To improve cancer detection, it is imperative that throughout the biopsy procedure, physicians know where they are within the prostate and where they have sampled during prior biopsies. The current biopsy procedure is limited to using only 2D ultrasound images to find and record target biopsy core sample sites. This leaves ambiguity as the physician tries to interpret the 2D information and apply it to the 3D workspace. We have developed a 3D ultrasound-guided prostate biopsy system that provides 3D intra-biopsy information to physicians for needle guidance and biopsy location recording. The system is designed to conform to the workflow of the current prostate biopsy procedure, making it easier for clinical integration. In this paper, we describe the system design and validate its accuracy by performing an in vitro biopsy procedure on US/CT multi-modal patient-specific prostate phantoms. A clinical sextant biopsy was performed by a urologist on the phantoms and the 3D models of the prostates were generated with volume errors less than 4% and mean boundary errors of less than 1 mm. Using the 3D biopsy system, needles were guided to within 1.36 +/- 0.83 mm of 3D targets and the positions of the biopsy sites were accurately localized to 1.06 +/- 0.89 mm for the two prostates.

  3. CBCT-based 3D MRA and angiographic image fusion and MRA image navigation for neuro interventions.

    PubMed

    Zhang, Qiang; Zhang, Zhiqiang; Yang, Jiakang; Sun, Qi; Luo, Yongchun; Shan, Tonghui; Zhang, Hao; Han, Jingfeng; Liang, Chunyang; Pan, Wenlong; Gu, Chuanqi; Mao, Gengsheng; Xu, Ruxiang

    2016-08-01

    Digital subtraction angiography (DSA) remains the gold standard for diagnosis of cerebral vascular diseases and provides intraprocedural guidance. This practice involves extensive usage of x-ray and iodinated contrast medium, which can induce side effects. In this study, we examined the accuracy of 3-dimensional (3D) registration of magnetic resonance angiography (MRA) and DSA imaging for cerebral vessels, and tested the feasibility of using preprocedural MRA for real-time guidance during endovascular procedures. Twenty-three patients with suspected intracranial arterial lesions were enrolled. The contrast medium-enhanced 3D DSA of target vessels was acquired in 19 patients during endovascular procedures, and the images were registered with preprocedural MRA for fusion accuracy evaluation. Low-dose noncontrasted 3D angiography of the skull was performed in the other 4 patients, and registered with the MRA. The MRA was overlaid afterwards with 2D live fluoroscopy to guide endovascular procedures. The 3D registration of the MRA and angiography demonstrated a high accuracy for vessel lesion visualization in all 19 patients examined. Moreover, MRA of the intracranial vessels, registered to the noncontrasted 3D angiography in the 4 patients, provided a real-time 3D roadmap to successfully guide the endovascular procedures. Radiation dose to patients and contrast medium usage were shown to be significantly reduced. Three-dimensional MRA and angiography fusion can accurately generate cerebral vasculature images to guide endovascular procedures. The use of the fusion technology could enhance clinical workflow while minimizing contrast medium usage and radiation dose, and hence lowering procedure risks and increasing treatment safety.

  4. CBCT-based 3D MRA and angiographic image fusion and MRA image navigation for neuro interventions

    PubMed Central

    Zhang, Qiang; Zhang, Zhiqiang; Yang, Jiakang; Sun, Qi; Luo, Yongchun; Shan, Tonghui; Zhang, Hao; Han, Jingfeng; Liang, Chunyang; Pan, Wenlong; Gu, Chuanqi; Mao, Gengsheng; Xu, Ruxiang

    2016-01-01

    Digital subtraction angiography (DSA) remains the gold standard for diagnosis of cerebral vascular diseases and provides intraprocedural guidance. This practice involves extensive usage of x-ray and iodinated contrast medium, which can induce side effects. In this study, we examined the accuracy of 3-dimensional (3D) registration of magnetic resonance angiography (MRA) and DSA imaging for cerebral vessels, and tested the feasibility of using preprocedural MRA for real-time guidance during endovascular procedures. Twenty-three patients with suspected intracranial arterial lesions were enrolled. The contrast medium-enhanced 3D DSA of target vessels was acquired in 19 patients during endovascular procedures, and the images were registered with preprocedural MRA for fusion accuracy evaluation. Low-dose noncontrasted 3D angiography of the skull was performed in the other 4 patients, and registered with the MRA. The MRA was overlaid afterwards with 2D live fluoroscopy to guide endovascular procedures. The 3D registration of the MRA and angiography demonstrated a high accuracy for vessel lesion visualization in all 19 patients examined. Moreover, MRA of the intracranial vessels, registered to the noncontrasted 3D angiography in the 4 patients, provided a real-time 3D roadmap to successfully guide the endovascular procedures. Radiation dose to patients and contrast medium usage were shown to be significantly reduced. Three-dimensional MRA and angiography fusion can accurately generate cerebral vasculature images to guide endovascular procedures. The use of the fusion technology could enhance clinical workflow while minimizing contrast medium usage and radiation dose, and hence lowering procedure risks and increasing treatment safety. PMID:27512846

  5. High-performance intraoperative cone-beam CT on a mobile C-arm: an integrated system for guidance of head and neck surgery

    NASA Astrophysics Data System (ADS)

    Siewerdsen, J. H.; Daly, M. J.; Chan, H.; Nithiananthan, S.; Hamming, N.; Brock, K. K.; Irish, J. C.

    2009-02-01

    A system for intraoperative cone-beam CT (CBCT) surgical guidance is under development and translation to trials in head and neck surgery. The system provides 3D image updates on demand with sub-millimeter spatial resolution and soft-tissue visibility at low radiation dose, thus overcoming conventional limitations associated with preoperative imaging alone. A prototype mobile C-arm provides the imaging platform, which has been integrated with several novel subsystems for streamlined implementation in the OR, including: real-time tracking of surgical instruments and endoscopy (with automatic registration of image and world reference frames); fast 3D deformable image registration (a newly developed multi-scale Demons algorithm); 3D planning and definition of target and normal structures; and registration / visualization of intraoperative CBCT with the surgical plan, preoperative images, and endoscopic video. Quantitative evaluation of surgical performance demonstrates a significant advantage in achieving complete tumor excision in challenging sinus and skull base ablation tasks. The ability to visualize the surgical plan in the context of intraoperative image data delineating residual tumor and neighboring critical structures presents a significant advantage to surgical performance and evaluation of the surgical product. The system has been translated to a prospective trial involving 12 patients undergoing head and neck surgery - the first implementation of the research prototype in the clinical setting. The trial demonstrates the value of high-performance intraoperative 3D imaging and provides a valuable basis for human factors analysis and workflow studies that will greatly augment streamlined implementation of such systems in complex OR environments.
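
    The system above uses a newly developed multi-scale Demons algorithm whose implementation is not given in the abstract; as a rough sketch of the class of method, the snippet below runs a basic single-scale Demons deformable registration with SimpleITK and warps the intraoperative image into the preoperative frame. File names are placeholders.

    import SimpleITK as sitk

    fixed = sitk.ReadImage("preop_ct.nii.gz", sitk.sitkFloat32)
    moving = sitk.ReadImage("intraop_cbct.nii.gz", sitk.sitkFloat32)

    # Demons assumes similar intensity profiles; match histograms first.
    moving = sitk.HistogramMatching(moving, fixed)

    demons = sitk.DemonsRegistrationFilter()
    demons.SetNumberOfIterations(100)
    demons.SetStandardDeviations(1.5)  # Gaussian smoothing of the field

    displacement = demons.Execute(fixed, moving)  # dense displacement field
    transform = sitk.DisplacementFieldTransform(displacement)

    # Warp the intraoperative image into the preoperative frame.
    registered = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
    sitk.WriteImage(registered, "cbct_registered.nii.gz")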

  6. MO-D-BRB-02: SBRT Treatment Planning and Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Y.

    2016-06-15

    Increased use of SBRT and hypofractionation in radiation oncology practice has posed a number of challenges to medical physicists, ranging from planning, image-guided patient setup and on-treatment monitoring, to quality assurance (QA) and dose delivery. This symposium is designed to provide current knowledge necessary for the safe and efficient implementation of SBRT in various linac platforms, including the emerging digital linacs equipped with high dose rate FFF beams. Issues related to 4D CT, PET and MRI simulations, 3D/4D CBCT guided patient setup, real-time image guidance during SBRT dose delivery using gated/un-gated VMAT/IMRT, and technical advancements in QA of SBRT (in particular, strategies dealing with high dose rate FFF beams) will be addressed. The symposium will help the attendees to gain a comprehensive understanding of the SBRT workflow and facilitate their clinical implementation of the state-of-art imaging and planning techniques. Learning Objectives: Present background knowledge of SBRT, describe essential requirements for safe implementation of SBRT, and discuss issues specific to SBRT treatment planning and QA. Update on the use of multi-dimensional and multi-modality imaging for reliable guidance of SBRT. Discuss treatment planning and QA issues specific to SBRT. Provide a comprehensive overview of emerging digital linacs and summarize the key geometric and dosimetric features of the new generation of linacs for substantially improved SBRT. NIH/NCI; Varian Medical Systems; F. Yin, Duke University has a research agreement with Varian Medical Systems. In addition to the research grant, I had a technology license agreement with Varian Medical Systems.

  7. A cognitive task analysis of a visual analytic workflow: Exploring molecular interaction networks in systems biology.

    PubMed

    Mirel, Barbara; Eichinger, Felix; Keller, Benjamin J; Kretzler, Matthias

    2011-03-21

    Bioinformatics visualization tools are often not robust enough to support biomedical specialists’ complex exploratory analyses. Tools need to accommodate the workflows that scientists actually perform for specific translational research questions. To understand and model one of these workflows, we conducted a case-based cognitive task analysis of a biomedical specialist’s exploratory workflow for the question: what functional interactions among gene products of high-throughput expression data suggest previously unknown mechanisms of a disease? From our cognitive task analysis, four complementary representations of the targeted workflow were developed: usage scenarios, flow diagrams, a cognitive task taxonomy, and a mapping between cognitive tasks and user-centered visualization requirements. The representations capture the flows of cognitive tasks that led a biomedical specialist to inferences critical to hypothesizing. We created representations at levels of detail that could strategically guide visualization development, and we confirmed this by making a trial prototype based on user requirements for a small portion of the workflow. Our results imply that visualizations should make available to scientific users “bundles of features” consonant with the compositional cognitive tasks purposefully enacted at specific points in the workflow. We also highlight certain aspects of visualizations that: (a) need more built-in flexibility; (b) are critical for negotiating meaning; and (c) are necessary for essential metacognitive support.

  8. Managing and Communicating Operational Workflow

    PubMed Central

    Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.

    2016-01-01

    Background: Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective: To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods: The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results: The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions: The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407

  9. Next-generation sequencing meets genetic diagnostics: development of a comprehensive workflow for the analysis of BRCA1 and BRCA2 genes

    PubMed Central

    Feliubadaló, Lídia; Lopez-Doriga, Adriana; Castellsagué, Ester; del Valle, Jesús; Menéndez, Mireia; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Gómez, Carolina; Campos, Olga; Pineda, Marta; González, Sara; Moreno, Victor; Brunet, Joan; Blanco, Ignacio; Serra, Eduard; Capellá, Gabriel; Lázaro, Conxi

    2013-01-01

    Next-generation sequencing (NGS) is changing genetic diagnosis due to its huge sequencing capacity and cost-effectiveness. The aim of this study was to develop an NGS-based workflow for routine diagnostics for hereditary breast and ovarian cancer syndrome (HBOCS), to improve genetic testing for BRCA1 and BRCA2. An NGS-based workflow was designed using BRCA MASTR kit amplicon libraries followed by GS Junior pyrosequencing. Data analysis combined the freely available Variant Identification Pipeline software with ad hoc R scripts, including a cascade of filters to generate coverage and variant-calling reports. A BRCA homopolymer assay was performed in parallel. A research scheme was designed in two parts. A Training Set of 28 DNA samples containing 23 unique pathogenic mutations and 213 other variants (33 unique) was used. The workflow was then validated in a set of 14 samples from HBOCS families in parallel with the current diagnostic workflow (Validation Set). The NGS-based workflow permitted the identification of all pathogenic mutations and genetic variants, including those located in or close to homopolymers. The use of NGS for detecting copy-number alterations was also investigated. The workflow meets the sensitivity and specificity requirements for the genetic diagnosis of HBOCS and improves on the cost-effectiveness of current approaches. PMID:23249957
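
    A filter cascade of the kind described can be sketched as below. This is a minimal illustration only: the record fields, thresholds, and report categories are hypothetical assumptions, not the parameters of the BRCA MASTR/GS Junior pipeline.

```python
# Minimal sketch of a coverage/variant filter cascade for an amplicon NGS
# diagnostic workflow. All field names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Variant:
    gene: str            # e.g. "BRCA1" or "BRCA2"
    position: int        # hypothetical coordinate
    ref: str
    alt: str
    depth: int           # read depth at this position
    alt_fraction: float  # fraction of reads supporting the variant

MIN_DEPTH = 40          # hypothetical minimum coverage for a diagnostic call
MIN_ALT_FRACTION = 0.2  # hypothetical threshold for heterozygous calls

def filter_cascade(variants):
    """Apply sequential filters and report what each filter removed."""
    report = {"low_coverage": [], "low_support": [], "passed": []}
    for v in variants:
        if v.depth < MIN_DEPTH:
            report["low_coverage"].append(v)
        elif v.alt_fraction < MIN_ALT_FRACTION:
            report["low_support"].append(v)
        else:
            report["passed"].append(v)
    return report

if __name__ == "__main__":
    calls = [
        Variant("BRCA1", 43071077, "A", "G", depth=120, alt_fraction=0.48),
        Variant("BRCA2", 32338162, "C", "T", depth=15, alt_fraction=0.55),
    ]
    for category, items in filter_cascade(calls).items():
        print(category, [f"{v.gene}:{v.position}" for v in items])
```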

  10. Using implementation tools to design and conduct quality improvement projects for faster and more effective improvement.

    PubMed

    Ovretveit, John; Mittman, Brian; Rubenstein, Lisa; Ganz, David A

    2017-10-09

    Purpose: The purpose of this paper is to enable improvers to use recent knowledge from implementation science to carry out improvement changes more effectively. It also highlights the importance of converting research findings into practical tools and guidance for improvers so as to make research easier to apply in practice. Design/methodology/approach: This study provides an illustration of how a quality improvement (QI) team project can make use of recent findings from implementation research to make its improvement changes more effective and sustainable. The guidance is based on a review and synthesis of improvement and implementation methods. Findings: The paper illustrates how research can help a quality project team in the phases of problem definition and preparation, in design and planning, in implementation, and in sustaining and spreading a QI. Examples of the use of different ideas and methods are cited where they exist. Research limitations/implications: The example is illustrative, and there is limited experimental evidence of whether following all the steps and tools of the proposed approach enables a quality team to be more effective. Evidence supporting individual guidance proposals is cited where it exists. Practical implications: If the steps proposed and illustrated in the paper were followed, quality projects could avoid waste by ensuring the conditions they need for success are in place, and could sustain and spread improvement changes more effectively. Social implications: More patients could benefit more quickly from more effective implementation of proven interventions. Originality/value: The paper is the first to describe how improvement and implementation science can be combined in a tangible way that practical improvers can use in their projects. It shows how QI project teams can take advantage of recent advances in improvement and implementation science to make their work more effective and sustainable.

  11. Neogenin mediates the action of repulsive guidance molecule.

    PubMed

    Rajagopalan, Srikanth; Deitinghoff, Lutz; Davis, Denise; Conrad, Sabine; Skutella, Thomas; Chedotal, Alain; Mueller, Bernhard K; Strittmatter, Stephen M

    2004-08-01

    Repulsive guidance molecule (RGM) is a recently identified protein implicated in both axonal guidance and neural tube closure. The avoidance of chick RGM in the posterior optic tectum by growing temporal, but not nasal, retinal ganglion cell axons is thought to contribute to visual map formation. In contrast to ephrins, semaphorins, netrins and slits, no receptor mechanism for RGM action has been defined. Here, an expression cloning strategy identified neogenin as a binding site for RGM, with a sub-nanomolar affinity. Consistent with selective axonal responsiveness to RGM, neogenin is expressed in a gradient across the chick retina. Neogenin is known to be one of several netrin-binding proteins but only neogenin interacts with RGM. The avoidance of RGM by temporal retinal axons is blocked by the anti-neogenin antibody and the soluble neogenin ectodomain. Dorsal root ganglion axons are unresponsive to RGM but are converted to a responsive state by neogenin expression. Thus, neogenin functions as an RGM receptor.

  12. Alcohol and sexual health in young people: the role of PSHE.

    PubMed

    Rowlinson, Louise

    2014-12-01

    This paper explores the relationship between sexual health and alcohol in young people in contemporary society, and the role of personal, social, health and economic (PSHE) education. This research was prompted by the decision of the Department of Health (DH) not to publish National Institute for Health and Care Excellence (NICE) guidance on PSHE in January 2011. The guidance was requested following a Department for Education internal review into PSHE education. This paper reviews qualitative and quantitative research and data pertaining to sexual health behaviour and alcohol use among young people in the UK and the role of PSHE education. NICE guidance remains the 'gold standard' for evidence-based healthcare service provision, and its implications for sexually transmitted infection and teenage pregnancy rates remain a high priority. Equally, research supports the view that addressing alcohol use is an increasing priority in young people. This paper argues that the NICE PSHE review findings should be updated, published and implemented.

  13. Opportunistic Computing with Lobster: Lessons Learned from Scaling up to 25k Non-Dedicated Cores

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Yannakopoulos, Anna; Tovar, Benjamin; Donnelly, Patrick; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

    We previously described Lobster, a workflow management tool for exploiting volatile opportunistic computing resources for computation in HEP. We will discuss the various challenges that have been encountered while scaling up the simultaneous CPU core utilization and the software improvements required to overcome these challenges. Categories: Workflows can now be divided into categories based on their required system resources. This allows the batch queueing system to optimize assignment of tasks to nodes with the appropriate capabilities. Within each category, limits can be specified for the number of running jobs to regulate the utilization of communication bandwidth. System resource specifications for a task category can now be modified while a project is running, avoiding the need to restart the project if resource requirements differ from the initial estimates. Lobster now implements time limits on each task category to voluntarily terminate tasks. This allows partially completed work to be recovered. Workflow dependency specification: One workflow often requires data from other workflows as input. Rather than waiting for earlier workflows to be completed before beginning later ones, Lobster now allows dependent tasks to begin as soon as sufficient input data has accumulated. Resource monitoring: Lobster utilizes a new capability in Work Queue to monitor the system resources each task requires in order to identify bottlenecks and optimally assign tasks. The capability of the Lobster opportunistic workflow management system for HEP computation has been significantly increased. We have demonstrated efficient utilization of 25 000 non-dedicated cores and achieved a data input rate of 30 Gb/s and an output rate of 500 GB/h. This has required new capabilities in task categorization, workflow dependency specification, and resource monitoring.
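
    The task-category mechanism described here (per-category resource requests, running-job caps, and runtime modification of resource specifications) might be modelled roughly as follows. The class and field names are illustrative assumptions, not Lobster's actual configuration schema.

```python
# Rough sketch of per-category resource limits with runtime modification,
# in the spirit of the task categories described above. All names are
# illustrative, not Lobster's actual configuration schema.
from dataclasses import dataclass

@dataclass
class TaskCategory:
    name: str
    cores: int          # cores requested per task
    memory_mb: int      # memory requested per task
    wall_time_s: int    # voluntary time limit per task
    max_running: int    # cap on simultaneously running tasks

class Scheduler:
    def __init__(self):
        self.categories = {}
        self.running = {}   # category name -> number of running tasks

    def add_category(self, cat: TaskCategory):
        self.categories[cat.name] = cat
        self.running[cat.name] = 0

    def update_category(self, name, **changes):
        # Resource specs can be changed while the project is running,
        # so a bad initial estimate does not force a restart.
        cat = self.categories[name]
        for field, value in changes.items():
            setattr(cat, field, value)

    def can_start(self, name) -> bool:
        return self.running[name] < self.categories[name].max_running

sched = Scheduler()
sched.add_category(TaskCategory("simulation", cores=1, memory_mb=2000,
                                wall_time_s=6 * 3600, max_running=5000))
# Initial estimate was too low; raise memory without restarting the project.
sched.update_category("simulation", memory_mb=4000)
print(sched.categories["simulation"])
```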

  14. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    PubMed

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge, such as a surgical workflow model (SWM), to support intuitive cooperation with surgeons. Generating a robust and reliable SWM requires a large amount of training data. However, training data collected by physically recording surgical operations is often limited, and data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost, labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A video quality analysis method based on topic analysis and sentiment analysis techniques is used to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy. The generated workflow was evaluated against 4 web-retrieved videos and 4 operating-room-recorded videos. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
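
    Conceptually, the video selection stage combines a topic-relevance score with a comment-sentiment score and keeps only videos passing both thresholds. The sketch below uses trivial keyword-based stand-ins for the paper's topic and sentiment models; all terms and thresholds are invented for illustration.

```python
# Sketch of selecting high-quality web videos by combining topic relevance
# and comment sentiment, as the webSWM quality analysis does conceptually.
# The scoring functions are trivial keyword stand-ins, not the paper's models.
RELEVANT_TERMS = {"cholecystectomy", "laparoscopic", "robotic", "gallbladder"}

def topic_score(title_and_description: str) -> float:
    words = set(title_and_description.lower().split())
    return len(words & RELEVANT_TERMS) / len(RELEVANT_TERMS)

def sentiment_score(comments: list[str]) -> float:
    positive = {"clear", "helpful", "excellent", "educational"}
    negative = {"blurry", "wrong", "useless", "incomplete"}
    score = 0
    for c in comments:
        w = set(c.lower().split())
        score += len(w & positive) - len(w & negative)
    return score / max(len(comments), 1)

def select_videos(videos, topic_min=0.5, sentiment_min=0.0):
    return [v for v in videos
            if topic_score(v["text"]) >= topic_min
            and sentiment_score(v["comments"]) >= sentiment_min]

videos = [
    {"text": "Robotic laparoscopic cholecystectomy full procedure",
     "comments": ["very clear and educational"]},
    {"text": "My vacation vlog", "comments": ["nice trip"]},
]
print(len(select_videos(videos)))  # -> 1
```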

  15. Barriers to effective, safe communication and workflow between nurses and non-consultant hospital doctors during out-of-hours.

    PubMed

    Brady, Anne-Marie; Byrne, Gobnait; Quirke, Mary Brigid; Lynch, Aine; Ennis, Shauna; Bhangu, Jaspreet; Prendergast, Meabh

    2017-11-01

    This study aimed to evaluate the nature and type of communication and workflow arrangements between nurses and doctors out-of-hours (OOH). Effective communication and workflow arrangements between nurses and doctors are essential to minimize risk in hospital settings, particularly in the out-of-hours period. Timely patient flow is a priority for all healthcare organizations, and the quality of communication and workflow arrangements influences patient safety. Qualitative descriptive design; data collection methods included focus groups and individual interviews. A 500-bed tertiary referral acute hospital in Ireland. Junior and senior Non-Consultant Hospital Doctors, staff nurses and nurse managers. Both nurses and doctors acknowledged the importance of good interdisciplinary communication and collaborative working in sustaining effective workflow and enabling a supportive working environment and patient safety. Indeed, issues of safety and missed care OOH were found to be primarily due to difficulties of communication and workflow. Medical workflow OOH is often dependent on cues and communication to/from nursing. However, communication systems, and in particular the bleep system, considered central to the process of communication between doctors and nurses OOH, can contribute to workflow challenges and increased staff stress. It was reported as commonplace for routine work that should be completed during normal hours to fall into the OOH period when resources were most limited, further compounding risk to patient safety. Enhancement of communication strategies between nurses and doctors has the potential to remove barriers to effective decision-making and patient flow. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  16. Widening the adoption of workflows to include human and human-machine scientific processes

    NASA Astrophysics Data System (ADS)

    Salayandia, L.; Pinheiro da Silva, P.; Gates, A. Q.

    2010-12-01

    Scientific workflows capture knowledge in the form of technical recipes to access and manipulate data, helping scientists manage and reuse established expertise to conduct their work. Libraries of scientific workflows are being created in particular fields, e.g., Bioinformatics, where, combined with cyber-infrastructure environments that provide on-demand access to data and tools, they result in powerful workbenches for scientists of those communities. The focus in these fields, however, has been more on automating than on documenting scientific processes. As a result, technical barriers have impeded wider adoption of scientific workflows by scientific communities that do not rely as heavily on cyber-infrastructure and computing environments. Semantic Abstract Workflows (SAWs) are introduced to widen the applicability of workflows as a tool to document scientific recipes or processes. SAWs intend to capture a scientist's perspective on how he or she would collect, filter, curate, and manipulate data to create the artifacts relevant to his or her work. In contrast, scientific workflows describe the process from the point of view of how technical methods and tools are used to conduct the work. By focusing on a higher level of abstraction that is closer to a scientist's understanding, SAWs effectively capture the controlled vocabularies that reflect a particular scientific community, as well as the types of datasets and methods used in a particular domain. From there, SAWs provide the flexibility to adapt to different environments in which to carry out the recipes or processes, ranging from manual fieldwork to highly technical cyber-infrastructure environments, such as those already supported by scientific workflows. Two cases, one from Environmental Science and another from Geophysics, are presented as illustrative examples.

  17. Process improvement for the safe delivery of multidisciplinary-executed treatments - A case in Y-90 microspheres therapy.

    PubMed

    Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E

    To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events with the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, following two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting which requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or a similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  18. Knowledge Extraction and Semantic Annotation of Text from the Encyclopedia of Life

    PubMed Central

    Thessen, Anne E.; Parr, Cynthia Sims

    2014-01-01

    Numerous digitization and ontological initiatives have focused on translating biological knowledge from narrative text to machine-readable formats. In this paper, we describe two workflows for knowledge extraction and semantic annotation of text data objects featured in an online biodiversity aggregator, the Encyclopedia of Life. One workflow tags text with DBpedia URIs based on keywords. Another workflow finds taxon names in text using GNRD for the purpose of building a species association network. Both workflows work well: the annotation workflow has an F1 Score of 0.941 and the association algorithm has an F1 Score of 0.885. Existing text annotators such as Terminizer and DBpedia Spotlight performed well, but require some optimization to be useful in the ecology and evolution domain. Important future work includes scaling up and improving accuracy through the use of distributional semantics. PMID:24594988
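
    To make the reported scores concrete, F1 is the harmonic mean of precision and recall. A minimal computation follows, with hypothetical true/false positive and false negative counts chosen only to land near the reported values.

```python
# F1 score as used to evaluate the two workflows above: the harmonic mean
# of precision (correct annotations / all produced annotations) and recall
# (correct annotations / all expected annotations).
def f1(true_pos: int, false_pos: int, false_neg: int) -> float:
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts that would yield scores near those reported.
print(round(f1(true_pos=940, false_pos=60, false_neg=58), 3))   # ~0.941
print(round(f1(true_pos=885, false_pos=120, false_neg=110), 3)) # ~0.885
```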

  19. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses that uses multi-agent cooperation to address these issues. The social rationality of agents is introduced into the proposed framework: adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
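
    The use of agent social rationality as a decision-making factor can be sketched as a weighted blend of individual and group utility. The utility model and weighting below are assumptions for illustration, not the paper's formulation.

```python
# Sketch of a social-rationality-weighted choice for scheduling activity
# instances, in the spirit of the framework above. The utility model and
# weight are illustrative assumptions.
def choose_activity(agent_utilities, social_utilities, weight=0.6):
    """Pick the activity maximizing a blend of individual and social utility.

    agent_utilities / social_utilities: dicts mapping activity -> utility.
    weight: how much the agent favors its own utility over the group's.
    """
    def blended(activity):
        return (weight * agent_utilities[activity]
                + (1 - weight) * social_utilities[activity])
    return max(agent_utilities, key=blended)

agent = {"serve_vip_order": 0.9, "restock": 0.4}
social = {"serve_vip_order": 0.3, "restock": 0.8}  # group gains from restocking
print(choose_activity(agent, social))              # individual utility dominates
print(choose_activity(agent, social, weight=0.2))  # socially rational choice
```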

  20. Clinical Alarms in intensive care: implications of alarm fatigue for the safety of patients

    PubMed Central

    Bridi, Adriana Carla; Louro, Thiago Quinellato; da Silva, Roberto Carlos Lyra

    2014-01-01

    OBJECTIVES: to identify the number of pieces of electro-medical equipment in a coronary care unit, characterize their types, and analyze the implications for patient safety from the perspective of alarm fatigue. METHOD: this quantitative, observational, descriptive, non-participatory study was conducted in a coronary care unit of a cardiology hospital with 170 beds. RESULTS: a total of 426 alarms were recorded in 40 hours of observation: 227 were triggered by multi-parametric monitors and 199 were triggered by other equipment (infusion pumps, dialysis pumps, mechanical ventilators, and intra-aortic balloons); that is an average of 10.6 alarms per hour. CONCLUSION: the results reinforce the importance of properly configuring the physiological variables, volume and alarm parameters of multi-parametric monitors within the routine of intensive care units. The alarms of equipment intended to protect patients have increased noise within the unit, the level of distraction, and interruptions in the workflow, leading to a false sense of security. PMID:25591100

  1. Independent signaling by Drosophila insulin receptor for axon guidance and growth

    PubMed Central

    Li, Caroline R.; Guo, Dongyu; Pick, Leslie

    2014-01-01

    The Drosophila insulin receptor (DInR) regulates a diverse array of biological processes including growth, axon guidance, and sugar homeostasis. Growth regulation by DInR is mediated by Chico, the Drosophila homolog of vertebrate insulin receptor substrate proteins IRS1–4. In contrast, DInR regulation of photoreceptor axon guidance in the developing visual system is mediated by the SH2-SH3 domain adaptor protein Dreadlocks (Dock). In vitro studies by others identified five NPXY motifs, one in the juxtamembrane region and four in the signaling C-terminal tail (C-tail), important for interaction with Chico. Here we used yeast two-hybrid assays to identify regions in the DInR C-tail that interact with Dock. These Dock binding sites were in separate portions of the C-tail from the previously identified Chico binding sites. To test whether these sites are required for growth or axon guidance in whole animals, a panel of DInR proteins, in which the putative Chico and Dock interaction sites had been mutated individually or in combination, were tested for their ability to rescue viability, growth and axon guidance defects of dinr mutant flies. Sites required for viability were identified. Unexpectedly, mutation of both putative Dock binding sites, either individually or in combination, did not lead to defects in photoreceptor axon guidance. Thus, either sites also required for viability are necessary for DInR function in axon guidance and/or there is redundancy built into the DInR/Dock interaction such that Dock is able to interact with multiple regions of DInR. We also found that simultaneous mutation of all five NPXY motifs implicated in Chico interaction drastically decreased growth in both male and female adult flies. These animals resembled chico mutants, supporting the notion that DInR interacts directly with Chico in vivo to control body size. Mutation of these five NPXY motifs did not affect photoreceptor axon guidance, segregating the roles of DInR in the processes of growth and axon guidance. PMID:24478707

  2. A Six‐Stage Workflow for Robust Application of Systems Pharmacology

    PubMed Central

    Gadkar, K; Kirouac, DC; Mager, DE; van der Graaf, PH

    2016-01-01

    Quantitative and systems pharmacology (QSP) is increasingly being applied in pharmaceutical research and development. One factor critical to the ultimate success of QSP is the establishment of commonly accepted language, technical criteria, and workflows. We propose an integrated workflow that bridges conceptual objectives with underlying technical detail to support the execution, communication, and evaluation of QSP projects. PMID:27299936

  3. Using Workflow Diagrams to Address Hand Hygiene in Pediatric Long-Term Care Facilities

    PubMed Central

    Carter, Eileen J.; Cohen, Bevin; Murray, Meghan T.; Saiman, Lisa; Larson, Elaine L.

    2015-01-01

    Hand hygiene (HH) in pediatric long-term care settings has been found to be sub-optimal. Multidisciplinary teams at three pediatric long-term care facilities developed step-by-step workflow diagrams of commonly performed tasks highlighting HH opportunities. Diagrams were validated through observation of tasks and concurrent diagram assessment. Facility teams developed six workflow diagrams that underwent 22 validation observations. Four main themes emerged: 1) diagram specificity, 2) wording and layout, 3) timing of HH indications, and 4) environmental hygiene. The development of workflow diagrams is an opportunity to identify and address the complexity of HH in pediatric long-term care facilities. PMID:25773517

  4. High-volume workflow management in the ITN/FBI system

    NASA Astrophysics Data System (ADS)

    Paulson, Thomas L.

    1997-02-01

    The Federal Bureau of Investigation's Identification Tasking and Networking (ITN) system will manage the processing of more than 70,000 submissions per day. The workflow manager controls the routing of each submission through a combination of automated and manual processing steps whose exact sequence is dynamically determined by the results at each step. For most submissions, one or more of the steps involve the visual comparison of fingerprint images. The ITN workflow manager is implemented within a scalable client/server architecture. The paper describes the key aspects of the ITN workflow manager design which allow the high volume of daily processing to be successfully accomplished.
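
    Result-driven routing of this kind, where the outcome of each step selects the next one, can be sketched as a transition table. The step names, outcomes, and quality heuristic below are invented for illustration, not the ITN design.

```python
# Sketch of result-driven routing: the next processing step is chosen
# dynamically from the outcome of the current one. Steps and outcomes
# are hypothetical.
def auto_match(submission):
    # Pretend an automated matcher either resolves the print or defers.
    return "hit" if submission["quality"] > 0.8 else "uncertain"

def manual_compare(submission):
    return "hit"  # an examiner resolves the comparison visually

ROUTES = {
    ("auto_match", "hit"): None,                 # done
    ("auto_match", "uncertain"): "manual_compare",
    ("manual_compare", "hit"): None,
}
STEPS = {"auto_match": auto_match, "manual_compare": manual_compare}

def process(submission):
    step = "auto_match"
    while step is not None:
        outcome = STEPS[step](submission)
        print(f"{step} -> {outcome}")
        step = ROUTES[(step, outcome)]

process({"quality": 0.6})  # routes through the manual comparison step
```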

  5. Socio-technical systems and interaction design - 21st century relevance.

    PubMed

    Maguire, Martin

    2014-03-01

    This paper focuses on the relationship between the socio-technical system and the user-technology interface. It looks at specific aspects of the organisational context such as multiple user roles, job change, work processes and workflows, technical infrastructure, and the challenges they present for the interaction designer. The implications of trends such as more mobile and flexible working, the use of social media, and the growth of the virtual organisation, are also considered. The paper also reviews rapidly evolving technologies such as pervasive systems and artificial intelligence, and the skills that workers will need to engage with them. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  6. Radiography for intensive care: participatory process analysis in a PACS-equipped and film/screen environment

    NASA Astrophysics Data System (ADS)

    Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes

    2002-05-01

    If new technology is introduced into medical practice, it must prove that it makes a difference. However, traditional approaches to outcome analysis have failed to show a direct benefit of PACS on patient care, and the economic benefits are still in debate. A participatory process analysis was performed to compare workflow in a film-based hospital and a PACS environment. This included direct observation of work processes, interviews with involved staff, structural analysis, and discussion of observations with staff members. After definition of common structures, strong and weak workflow steps were evaluated. With a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting, with simultaneous image access for ICU physicians and radiologists, archiving of images, and image and report distribution. However, PACS alone is not able to cover the complete process of 'radiography for intensive care', from ordering an image to provision of the final product (image plus report). Interference of electronic workflow with analogue process steps, such as paper-based ordering, reduces the potential benefits of PACS. In this regard, workflow modeling proved to be very helpful for the evaluation of complex work processes linking radiology and the ICU.

  7. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    The coming deluge of genome data presents significant challenges: storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and supporting efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; biomedical researchers are therefore pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600
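
    The overall pipeline shape (transfer data in, run preconfigured analysis tools, verify the workflow, transfer results out) can be sketched as below. The functions are placeholders standing in for the Globus and Galaxy services named above, not their actual APIs.

```python
# Conceptual sketch of the platform's pipeline shape. All functions are
# placeholders for illustration, not the Globus or Galaxy interfaces.
def transfer(src, dst):
    print(f"transferring {src} -> {dst}")
    return dst

def run_tool(tool, data):
    print(f"running {tool} on {data}")
    return f"{data}.{tool}.out"

def verify(workflow_steps):
    # Stand-in for semantic verification of workflow correctness.
    assert workflow_steps, "empty workflow"

def sequencing_analysis(remote_input):
    steps = ["align", "call_variants"]
    verify(steps)                                    # check before running
    data = transfer(remote_input, "/cloud/scratch/sample.fastq")
    for tool in steps:
        data = run_tool(tool, data)                  # chained analysis tools
    return transfer(data, "remote:/results/")        # ship results back

sequencing_analysis("remote:/lab/sample.fastq")
```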

  8. Development of the workflow kine systems for support on KAIZEN.

    PubMed

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of a workflow line investigation, we considered the anticipated effects of, and the problems for, KAIZEN. Workflow line information included location information and action content information. These technologies suggest viewpoints to help improvement, for example, the exclusion of useless movement, the redesign of layout, and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time; concretely, a more efficient layout was suggested by this system. In the case of a hospital, similarly, it was pointed out that the workflow had problems of layout and setup operations, based on the effective movement patterns of experts. This system could adapt to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  9. [Integration of the radiotherapy irradiation planning in the digital workflow].

    PubMed

    Röhner, F; Schmucker, M; Henne, K; Momm, F; Bruggmoser, G; Grosu, A-L; Frommhold, H; Heinemann, F E

    2013-02-01

    At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflows are paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented in the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a major challenge: it requires interdisciplinary expertise, and the interfaces between the professions therefore also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and had already been realized during the OSS project. After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary working group, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests, they were integrated into the new modular department system. The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and was afterwards approved by the responsible authority.

  10. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The coming deluge of genome data presents significant challenges: storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and supporting efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; biomedical researchers are therefore pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. 75 FR 43922 - Interim Guidance for Determining Subject Matter Eligibility for Process Claims in View of Bilski...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ..., who do not routinely encounter claims that implicate the abstract idea exception. Under the principles... principles: Laws of nature, physical phenomena, and abstract ideas. See id. The Office has been using the so... marketing a product, comprising: Developing a shared marketing force, said shared marketing force including...

  12. Building Social Support for Adolescents with Suicidal Ideation: Implications for School Guidance and Counselling

    ERIC Educational Resources Information Center

    Sun, Rachel C. F.; Hui, Eadaoin K. P.

    2007-01-01

    This study involved interviews with 13 adolescents with high levels of suicidal ideation. It aimed to understand how these adolescents perceived their family, school and peer relationships, and how they perceived these systems as their support. Comparison between adolescents with severe and mild suicidal ideation showed that the family, school and…

  13. An Existentialist in Iqaluit: Existentialism and Reflexivity Informing Pedagogy in the Canadian North

    ERIC Educational Resources Information Center

    Yue, Anthony R.

    2011-01-01

    Reflecting on the personal experience of teaching human resource management in the Canadian Arctic, the author explores the utility of an existentialist approach to pedagogy. The author outlines select aspects of existentialism that are pertinent to the teaching and discusses the implications of using reflexive existential thought as guidance in a…

  14. Results from an Exploratory Study of Sun Protection Practice: Implications for the Design of Health Promotion Messages

    ERIC Educational Resources Information Center

    Eadie, Douglas; MacAskill, Susan

    2007-01-01

    Purpose: The primary aim of the research reported here is to provide strategic guidance for the development of a national communication strategy to improve sun protection practice amongst young people. Design/methodology/approach: The research adopted an exploratory approach, employing qualitative focus groups to represent three population groups,…

  15. Factors Affecting the Impact of Teacher Education Programmes on Teacher Preparedness: Implications for Accreditation Policy

    ERIC Educational Resources Information Center

    Ingvarson, Lawrence; Beavis, Adrian; Kleinhenz, Elizabeth

    2007-01-01

    The purpose of this study was to provide guidance to policy-makers about the standards that might be appropriate for accrediting teacher education programmes. The study was commissioned by the Victorian Institute of Teaching (VIT), a statutory body established in 2001 by the Victorian state government with responsibility for the registration…

  16. An end-to-end workflow for engineering of biological networks from high-level specifications.

    PubMed

    Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun

    2012-08-17

    We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells.
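
    The four-stage toolchain can be pictured as a chain of model transformations: specification to compilation to part assignment to assembly. The sketch below uses heavily simplified stand-ins for the intermediate models; the motif and part names are invented.

```python
# Sketch of the four-stage toolchain as a chain of model transformations.
# Data shapes are simplified stand-ins for the paper's intermediate models.
def compile_to_agrn(program: str) -> list[str]:
    # A Boolean program becomes an abstract genetic regulatory network:
    # here, just a list of abstract regulatory "motifs".
    return [f"motif({op})" for op in program.split()]

def assign_parts(agrn: list[str], part_db: dict[str, str]) -> list[str]:
    # Each abstract motif is mapped to a concrete DNA part drawn from a
    # database for the target cellular platform.
    return [part_db[motif] for motif in agrn]

def assembly_plan(parts: list[str]) -> str:
    # An (unoptimized) plan: ligate parts in order.
    return " + ".join(parts)

# Hypothetical part database and program; swapping the database would
# retarget the same program to a different cellular platform.
part_db = {"motif(NOT)": "pTet-repressor-cassette",
           "motif(sensor)": "pLux-sensor-cassette"}
agrn = compile_to_agrn("sensor NOT")
print(assembly_plan(assign_parts(agrn, part_db)))
```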

  17. Ab initio chemical safety assessment: A workflow based on exposure considerations and non-animal methods.

    PubMed

    Berggren, Elisabet; White, Andrew; Ouedraogo, Gladys; Paini, Alicia; Richarz, Andrea-Nicole; Bois, Frederic Y; Exner, Thomas; Leite, Sofia; Grunsven, Leo A van; Worth, Andrew; Mahony, Catherine

    2017-11-01

    We describe and illustrate a workflow for chemical safety assessment that completely avoids animal testing. The workflow, which was developed within the SEURAT-1 initiative, is designed to be applicable to cosmetic ingredients as well as to other types of chemicals, e.g. active ingredients in plant protection products, biocides or pharmaceuticals. The aim of this work was to develop a workflow to assess chemical safety without relying on any animal testing, instead constructing a hypothesis based on existing data, in silico modelling and biokinetic considerations, followed by targeted non-animal testing. For illustrative purposes, we consider a hypothetical new ingredient x as a new component in a body lotion formulation. The workflow is divided into tiers in which points of departure are established through in vitro testing and in silico prediction, as the basis for estimating a safe external dose in a repeated use scenario. The workflow includes a series of possible exit (decision) points, with increasing levels of confidence, based on the sequential application of the Threshold of Toxicological Concern (TTC) approach and read-across, followed by an "ab initio" assessment, in which chemical safety is determined entirely by new in vitro testing and in vitro to in vivo extrapolation by means of mathematical modelling. We believe that this workflow could be applied as a tool to inform targeted and toxicologically relevant in vitro testing, where necessary, and to gain confidence in safety decision making without the need for animal testing.
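
    The tiered exit-point logic might look roughly like the following: the assessment exits as soon as a tier provides sufficient confidence, escalating from TTC to read-across to the ab initio tier. The thresholds and example numbers are illustrative assumptions, not the SEURAT-1 criteria.

```python
# Sketch of exposure-driven, tiered safety assessment with exit points.
# Thresholds, margin, and inputs are illustrative assumptions only.
def safety_assessment(exposure_mg_kg_day, ttc_mg_kg_day,
                      analogue_pod=None, ab_initio_pod=None,
                      margin=100):
    # Tier 0: Threshold of Toxicological Concern.
    if exposure_mg_kg_day < ttc_mg_kg_day:
        return "safe (exit: TTC)"
    # Tier 1: read-across from a data-rich analogue's point of departure.
    if analogue_pod is not None and analogue_pod / exposure_mg_kg_day >= margin:
        return "safe (exit: read-across)"
    # Tier 2: ab initio point of departure from new in vitro testing plus
    # in vitro to in vivo extrapolation.
    if ab_initio_pod is not None and ab_initio_pod / exposure_mg_kg_day >= margin:
        return "safe (exit: ab initio)"
    return "cannot conclude safety on current data"

# Hypothetical ingredient x in a body-lotion exposure scenario:
print(safety_assessment(exposure_mg_kg_day=0.05, ttc_mg_kg_day=0.0025,
                        analogue_pod=10.0))  # exits at the read-across tier
```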

  18. Digital disruption 'syndromes'.

    PubMed

    Sullivan, Clair; Staib, Andrew

    2017-05-18

    The digital transformation of hospitals in Australia is occurring rapidly in order to facilitate innovation and improve efficiency. Rapid transformation can cause temporary disruption of hospital workflows and staff as processes are adapted to the new digital workflows. The aim of this paper is to outline various types of digital disruption and some strategies for effective management. A large tertiary university hospital recently underwent a rapid, successful roll-out of an integrated electronic medical record (EMR). We observed this transformation and propose several digital disruption "syndromes" to assist with understanding and management during digital transformation: digital deceleration, digital transparency, digital hypervigilance, data discordance, digital churn and post-digital 'depression'. These 'syndromes' are defined and discussed in detail. Successful management of this temporary digital disruption is important to ensure a successful transition to a digital platform. What is known about this topic? Digital disruption is defined as the change facilitated by digital technologies that occurs at a pace and magnitude that disrupts established ways of value creation, social interactions, doing business and, more generally, our thinking. Increasing numbers of Australian hospitals are implementing digital solutions to replace traditional paper-based systems for patient care in order to create opportunities for improved care and efficiencies. Such large-scale change has the potential to create transient disruption to workflows and staff. Managing this temporary disruption effectively is an important factor in the successful implementation of an EMR. What does this paper add? A large tertiary university hospital recently underwent a successful rapid roll-out of an integrated EMR to become Australia's largest digital hospital over a 3-week period. We observed and assisted with the management of several cultural, behavioural and operational forms of digital disruption, which led us to propose some digital disruption 'syndromes'. The definition and management of these 'syndromes' are discussed in detail. What are the implications for practitioners? Minimising the temporary effects of digital disruption in hospitals requires an understanding that these digital 'syndromes' are to be expected and should be actively managed during large-scale transformation.

  19. Innovative tidal notch detection using TLS and fuzzy logic: Implications for palaeo-shorelines from compressional (Crete) and extensional (Gulf of Corinth) tectonic settings

    NASA Astrophysics Data System (ADS)

    Schneiderwind, S.; Boulton, S. J.; Papanikolaou, I.; Reicherter, K.

    2017-04-01

    Tidal notches are a generally accepted sea-level marker and maintain particular interest for palaeoseismic studies since coastal seismic activity potentially displaces them from their genetic position. The result of subsequent seismic events is a notch sequence reflecting the cumulative coastal uplift. In order to evaluate preserved notch sequences, an innovative and interdisciplinary workflow is presented that accurately highlights evidence for palaeo-sea-level markers. The workflow uses data from terrestrial laser scanning and iteratively combines high-resolution curvature analysis, high performance edge detection, and feature extraction. Based on the assumptions that remnants, such as the roof of tidal notches, form convex patterns, edge detection is performed on principal curvature images. In addition, a standard algorithm is compared to edge detection results from a custom Fuzzy logic approach. The results pass through a Hough transform in order to extract continuous line features of an almost horizontal orientation. The workflow was initially developed on a single, distinct, and sheltered exposure in southern Crete and afterwards successfully tested on laser scans of different coastal cliffs from the Perachora Peninsula. This approach allows a detailed examination of otherwise inaccessible locations and the evaluation of lateral and 3D geometries, thus evidence for previously unrecognised sea-level markers can be identified even when poorly developed. High resolution laser scans of entire cliff exposures allow local variations to be quantified. Edge detection aims to reduce information on the surface curvature and Hough transform limits the results towards orientation and continuity. Thus, the presented objective methodology enhances the recognition of tidal notches and supports palaeoseismic studies by contributing spatial information and accurate measurements of horizontal movements, beyond that recognised during traditional surveys. This is especially useful for the identification of palaeo-shorelines in extensional tectonic environments where coseismic footwall uplift (only 1/2 to 1/4 of net slip per event) is unlikely to raise an entire notch above the tidal range.
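
    The edge-detection and line-extraction stages can be approximated with off-the-shelf tools. The sketch below uses scikit-image's Canny detector and Hough transform in place of the paper's custom fuzzy-logic detector, restricting the Hough search to near-horizontal orientations as the workflow does; the synthetic input stands in for a principal-curvature image of a scanned cliff face.

```python
# Sketch of the edge-detection + Hough stage on a curvature image:
# detect edges, then keep only near-horizontal line features.
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_line, hough_line_peaks

# Stand-in for a principal-curvature image: a synthetic band of high
# curvature (a "notch roof") on an otherwise flat surface.
curvature = np.zeros((200, 300))
curvature[80:83, :] = 1.0

edges = canny(curvature, sigma=2.0)

# A horizontal image line has a vertical normal, so restrict the Hough
# angle search to a narrow window just below +pi/2.
thetas = np.linspace(np.pi / 2 - 0.05, np.pi / 2, 45)
hspace, angles, dists = hough_line(edges, theta=thetas)

for _, angle, dist in zip(*hough_line_peaks(hspace, angles, dists)):
    print(f"near-horizontal feature: angle={np.degrees(angle):.1f} deg, "
          f"offset={dist:.1f} px")
```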

  20. WE-DE-207A-02: Advances in Cone Beam CT Anatomical and Functional Imaging in Angio-Suite to Enable One-Stop-Shop Stroke Imaging Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G.

    1. Parallels in the evolution of x-ray angiographic systems and devices used for minimally invasive endovascular therapy. Charles Strother - DSA, invented by Dr. Charles Mistretta at UW-Madison, was the technology which enabled the development of minimally invasive endovascular procedures. As DSA became widely available and the potential benefits of accessing the cerebral vasculature from an endovascular approach began to be apparent, industry began efforts to develop tools for use in these procedures. Along with the development of catheters, embolic materials, pushable coils and the GDC coils, there was simultaneous development and improvement of 2D DSA image quality and the introduction of 3D DSA. Together, these advances resulted in an enormous expansion in the scope and numbers of minimally invasive endovascular procedures. The introduction of flat detectors for c-arm angiographic systems in 2002 opened the possibility of the angiographic suite becoming not just a location for vascular imaging but one where physiological assessments might also be performed. Over the last decade, algorithmic and hardware advances have been sufficient to realize this potential in clinical practice. The selection of patients for endovascular treatments is enhanced by this dual capability. Along with these advances has been a steady reduction in the radiation exposure required, so that today vascular and soft tissue images may be obtained with equal or, in many cases, less radiation exposure than is the case for comparable images obtained with multi-detector CT. Learning Objectives: To understand the full capabilities of today's angiographic suite. To understand how c-arm cone beam CT soft tissue imaging can be used for assessments of devices, blood flow and perfusion. 2. Advances in real-time x-ray neuro-endovascular image guidance. Stephen Rudin - Reacting to the demands on real-time image guidance for ever finer neurovascular interventions, great improvements in imaging chains are being pursued. For the highest spatial and temporal resolution, x-ray guidance with fluoroscopy and angiography, although dominant, is still being vastly improved. New detectors such as the Micro-Angiographic Fluoroscope (MAF) and x-ray source designs that enable higher outputs while maintaining small focal spots will be highlighted, along with new methods for minimizing the radiation dose to patients. Additionally, new platforms for training and device testing that include patient-specific 3D printed vascular phantoms, and new metrics such as generalized relative object detectability for objectively inter-comparing systems, will be discussed. This will improve the opportunity for better evaluation of these technological advances, which should contribute to the safety and efficacy of image-guided minimally invasive neuro-endovascular procedures. Learning Objectives: To understand the operation of new x-ray imaging chain components such as detectors and sources. To be informed about the latest testing methods, with 3D printed vascular phantoms, and new evaluation metrics for advanced imaging in x-ray image-guided neurovascular interventions. 3. Advances in cone beam CT anatomical and functional imaging in the angio-suite to enable a one-stop-shop stroke imaging workflow. Guang-Hong Chen - The introduction of flat-panel detector based cone-beam CT in clinical angiographic imaging systems enabled treating physicians to obtain three-dimensional anatomic roadmaps of bony structure, soft brain tissue, and vasculature for treatment planning and efficacy checking after procedures. However, much improvement is needed to reduce image artifacts, reduce radiation dose, and add functional imaging capability to provide four-dimensional dynamic information on vasculature and brain perfusion. In this presentation, some of the new techniques developed to address radiation dose, image artifact reduction and brain perfusion using the C-arm cone-beam CT imaging system will be introduced. Learning Objectives: To understand the clinical need for a one-stop-shop stroke imaging workflow. To understand the technical challenges in cone beam CT perfusion. To understand the potential technical solutions to enable a one-stop-shop imaging workflow. 4. Recent advances in devices used in neuro-interventions. Matthew Gounis - Over the past two decades, there has been explosive development of medical devices that have revolutionized the endovascular treatment of cerebrovascular diseases. There is now Level 1, Class A evidence that intra-arterial, mechanical thrombectomy in acute ischemic stroke is superior to medical management, and similarly that minimally invasive, endovascular repair of ruptured brain aneurysms is superior to surgical treatment. Stent-retrievers are now standard of care for emergent large vessel occlusions causing a stroke, with a number needed to treat for good clinical outcomes as low as 4. Recent technologies such as flow diverters and disrupters, intracranial self-expanding stents, flexible large-bore catheters that can reach vessels beyond the circle of Willis, stent-retrievers, and super-compliant balloons are the result of successful miniaturization of design features and novel manufacturing technologies capable of building these devices. This is a rapidly evolving field, and the device technology enabling such advancements will be reviewed. Importantly, image-guidance technology has not kept pace in neurointervention, and the ability to adequately characterize these devices in vivo remains a significant opportunity. Learning Objectives: A survey of devices used in neurointerventions, their materials and essential design characteristics. Disclosures: funding support received from NIH and DOD, GE Healthcare, and Siemens AX; patent royalties received from GE Healthcare. G. Chen: funding received from NIH, DOD, GE Healthcare, and Siemens AX. M. Gounis: consultant for Codman Neurovascular and Stryker Neurovascular; holds stock in InNeuroCo Inc; research grants from NIH, Medtronic Neurovascular, Microvention/Terumo, Cerevasc LLC, Gentuity, Codman Neurovascular, Philips Healthcare, Stryker Neurovascular, the Tay Sachs Foundation, and InNeuroCo Inc. S. Rudin: supported in part by NIH Grant R01EB002873 and the Toshiba Medical System Corp.

  1. Guidance and Control Software Project Data - Volume 2: Development Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software

  2. Build and Execute Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Qiang

    At exascale, the challenge becomes to develop applications that run at scale and use exascale platforms reliably, efficiently, and flexibly. Workflows become much more complex because they must seamlessly integrate simulation and data analytics. They must include down-sampling, post-processing, feature extraction, and visualization. Power and data transfer limitations require these analysis tasks to be run in-situ or in-transit. We expect successful workflows will comprise multiple linked simulations along with tens of analysis routines. Users will have limited development time at scale and, therefore, must have rich tools to develop, debug, test, and deploy applications. At this scale, successful workflows will compose linked computations from an assortment of reliable, well-defined computation elements, ones that can come and go as required, based on the needs of the workflow over time. We propose a novel framework that utilizes both virtual machines (VMs) and software containers to create a workflow system that establishes a uniform build and execution environment (BEE) beyond the capabilities of current systems. In this environment, applications will run reliably and repeatably across heterogeneous hardware and software. Containers, both commercial (Docker and Rocket) and open-source (LXC and LXD), define a runtime that isolates all software dependencies from the machine operating system. Workflows may contain multiple containers that run different operating systems, different software, and even different versions of the same software. We will run containers in open-source virtual machines (KVM) and emulators (QEMU) so that workflows run on any machine entirely in user-space. On this platform of containers and virtual machines, we will deliver workflow software that provides services, including repeatable execution, provenance, checkpointing, and future proofing. We will capture provenance about how containers were launched and how they interact to annotate workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware to preserve native performance. Containers will link development to continuous integration: when application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.
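
    The core idea, running each workflow step in a container so it behaves identically across hosts, can be illustrated with a small launcher. This is a sketch of the concept using the Docker CLI, not the actual BEE implementation.

```python
# Minimal sketch of launching one workflow step inside a container so that
# it runs identically across machines. Illustrative only; not BEE itself.
import subprocess

def run_step(image: str, command: list[str], workdir: str) -> int:
    """Run a workflow step in an isolated container with the host's
    working directory mounted for input/output data."""
    docker_cmd = [
        "docker", "run", "--rm",
        "-v", f"{workdir}:/data",   # share data with the host
        "-w", "/data",
        image,
    ] + command
    return subprocess.call(docker_cmd)

# Each step pins its own software environment via the image tag, so the
# same workflow is repeatable on heterogeneous hosts.
run_step("python:3.11-slim", ["python", "-c", "print('step ok')"], "/tmp")
```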

  3. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    PubMed

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and the Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version in GitHub ( https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow ) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB) ( http://soykb.org/Pegasus/index.php ). Using PGen, we identified 10,218,140 SNPs and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage; 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were also identified in this analysis. SNPs identified using PGen from additional soybean resequencing projects, bringing the total to more than 500 soybean germplasm lines, have been integrated as well. These SNPs are being utilized for trait improvement using genotype-to-phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser ( http://soykb.org/NGS_Resequence/NGS_index.php ) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. The PGen workflow has been optimized for the most efficient analysis of soybean data through thorough testing and validation. This research serves as an example of best practices for the development of genomics data analysis workflows, integrating remote HPC resources and efficient data management with ease of use for biological users. The PGen workflow can also be easily customized for analysis of data in other species.
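
    A workflow like this chains standard resequencing steps for each germplasm line. The sketch below shows the generic alignment-to-variant-call chain with common open-source tools (bwa, samtools, bcftools); the file names and options are illustrative and not taken from the PGen source.

      # Generic NGS variant-calling chain for one sample; illustrative
      # paths and options, not the actual PGen implementation.
      import subprocess

      def sh(cmd):
          subprocess.run(cmd, shell=True, check=True)

      sh("bwa mem ref.fa line42_R1.fq line42_R2.fq > line42.sam")   # align reads
      sh("samtools sort -o line42.bam line42.sam")                  # sort alignments
      sh("samtools index line42.bam")
      sh("bcftools mpileup -f ref.fa line42.bam"
         " | bcftools call -mv -o line42.vcf")                      # call SNPs/indels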

  4. Distributed Data Integration Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, T; Ludaescher, B; Vouk, M

The Internet is becoming the preferred method for disseminating scientific data from a variety of disciplines. This can result in information overload on the part of the scientists, who are unable to query all of the relevant sources, even if they knew where to find them, what they contained, how to interact with them, and how to interpret the results. A related issue is that keeping up with current trends in information technology often taxes the end-user's expertise and time. Thus, instead of benefiting from this information-rich environment, scientists become experts on a small number of sources and technologies, use them almost exclusively, and develop a resistance to innovations that can enhance their productivity. Enabling information-based scientific advances, in domains such as functional genomics, requires fully utilizing all available information and the latest technologies. In order to address this problem we are developing an end-user-centric, domain-sensitive, workflow-based infrastructure, shown in Figure 1, that will allow scientists to design complex scientific workflows that reflect the data manipulation required to perform their research without an undue burden. We are taking a three-tiered approach to designing this infrastructure, utilizing (1) abstract workflow definition, construction, and automatic deployment; (2) complex agent-based workflow execution; and (3) automatic wrapper generation. In order to construct a workflow, the scientist defines an abstract workflow (AWF) in terminology (semantics and context) that is familiar to him/her. This AWF includes all of the data transformations, selections, and analyses required by the scientist, but does not necessarily specify particular data sources. This abstract workflow is then compiled into an executable workflow (EWF, in our case XPDL) that is then evaluated and executed by the workflow engine. This EWF contains references to specific data sources and interfaces capable of performing the desired actions. In order to provide access to the largest number of resources possible, our lowest level utilizes automatic wrapper generation techniques to create information and data wrappers capable of interacting with the complex interfaces typical in scientific analysis. The remainder of this document outlines our work in these three areas, the impact our work has made, and our plans for the future.
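
    The AWF-to-EWF compilation step described above can be pictured as binding abstract, domain-level steps to concrete sources and wrappers. A minimal sketch follows; the registry contents are hypothetical, not the project's actual bindings.

      # Bind an abstract workflow (AWF) to concrete resources, yielding
      # an executable workflow (EWF). Registry entries are hypothetical.
      AWF = ["fetch_sequences", "align", "annotate"]

      REGISTRY = {
          "fetch_sequences": {"source": "genbank", "wrapper": "genbank_wrapper"},
          "align":           {"source": "local",   "wrapper": "clustal_wrapper"},
          "annotate":        {"source": "pfam",    "wrapper": "pfam_wrapper"},
      }

      def compile_awf(awf):
          """Return the ordered EWF with every step fully bound."""
          return [{"step": step, **REGISTRY[step]} for step in awf]

      for task in compile_awf(AWF):
          print(f"execute {task['step']} via {task['wrapper']} on {task['source']}")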

  5. Tanzanian Adolescent Boys’ Transitions Through Puberty: The Importance of Context

    PubMed Central

    Likindikoki, Samuel; Kaaya, S.

    2014-01-01

    We explored the masculinity norms shaping transitions through puberty in rural and urban Tanzania and how these norms and their social-ecological context contribute to high-risk health behaviors. We conducted a qualitative case study of adolescent boys in and out of school in 2011 and 2012. Tanzania’s social and economic development is reshaping the transition into young manhood. Adolescent boys are losing traditional mechanisms of pubertal guidance, and new meanings of manhood are arising from globalization. Traditional masculinity norms, including pressures to demonstrate virility and fertility, remain strong. Adolescent boys in modernizing Tanzania receive inadequate guidance on their burgeoning sexuality. Contradictory masculinity norms from family and society are shaping their sexual expectations, with implications for their engagement in unsafe sexual behaviors. PMID:25320893

  6. Conducting systematic reviews of economic evaluations.

    PubMed

    Gomersall, Judith Streak; Jadotte, Yuri Tertilus; Xue, Yifan; Lockwood, Suzi; Riddle, Dru; Preda, Alin

    2015-09-01

In 2012, a working group was established to review and enhance the Joanna Briggs Institute (JBI) guidance for conducting systematic reviews of evidence from economic evaluations addressing questions about health intervention cost-effectiveness. The objective here is to present the outcomes of the working group. The group conducted three activities to inform the new guidance: a review of the literature on the utility/futility of systematic reviews of economic evaluations, with consideration of its implications for updating the existing methodology; an assessment of the critical appraisal tool in the existing guidance against criteria that promote validity in economic evaluation research and against two other commonly used tools; and a workshop. The debate in the literature on the limitations/value of systematic review of economic evidence cautions that systematic reviews of economic evaluation evidence are unlikely to generate one-size-fits-all answers to questions about the cost-effectiveness of interventions and their comparators. Informed by this finding, the working group adjusted the framing of the objective definition in the existing JBI methodology. The shift is away from defining the objective as determining a single cost-effectiveness measure and toward summarizing study estimates of cost-effectiveness and, informed by the included study characteristics (patient, setting, intervention components, etc.), identifying conditions conducive to lowering costs and maximizing health benefits. The existing critical appraisal tool was retained in the new guidance. The new guidance also recommends that a tool designed specifically for appraising model-based studies be used together with the generic appraisal tool when assessing model-based economic evaluations. The guidance produced by the group offers reviewers direction for each step of the systematic review process, the same steps followed in JBI reviews of other types of evidence. The updated JBI guidance will be useful for researchers wanting to synthesize evidence about economic questions, either as stand-alone reviews or as part of comprehensive or mixed-method evidence reviews. Although the updated methodology has improved the JBI guidance for systematic reviews of economic evaluations, there are areas where further work is required. These include adjusting the critical appraisal tool to separate out questions addressing intervention cost and effectiveness measurement; providing more explicit guidance for assessing the generalizability of findings; and offering a more robust method for evidence synthesis that facilitates achieving the more ambitious review objectives.

  7. High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away

    NASA Astrophysics Data System (ADS)

    Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.

    2012-09-01

By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. These collaborations investigated two exemplar large-scale science-driver workflow applications: 1) calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths by placing data from multiple surveys on a common plate scale and co-registering all the pixels; 2) calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons learned for continuing development. Applicability of Cloud Computing: Commercial Cloud providers generally charge for all operations, including processing, transfer of input and output data, and storage of data, and so the costs of running applications vary widely according to how they use resources. The cloud is well suited to processing CPU-bound (and memory-bound) workflows such as the periodogram code, given the relatively low cost of processing in comparison with I/O operations. I/O-bound applications such as Montage perform best on high-performance clusters with fast networks and parallel file systems. Science-driven Cyberinfrastructure: Montage has been widely used as a driver application to develop workflow management services, such as task scheduling in distributed environments, designing fault tolerance techniques for job schedulers, and developing workflow orchestration techniques. Running Parallel Applications Across Distributed Cloud Environments: Data processing will eventually take place in parallel distributed across cyberinfrastructure environments having different architectures. We have used the Pegasus Workflow Management System (WMS) to successfully run applications across three very different environments: TeraGrid, OSG (Open Science Grid), and FutureGrid. Provisioning resources across different grids and clouds (also referred to as Sky Computing) involves establishing a distributed environment, where issues of, e.g., remote job submission, data management, and security need to be addressed. This environment also requires building virtual machine images that can run in different environments. Usually, each cloud provides basic images that can be customized with additional software and services.
In most of our work, we provisioned compute resources using a custom application called Wrangler. Pegasus WMS abstracts the architectures of the compute environments away from the end-user and can be considered a first-generation tool suitable for scientists to run their applications on disparate environments.
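
    The pricing asymmetry noted above (compute is cheap relative to moving data) can be made concrete with a back-of-envelope cost model; the rates below are placeholder assumptions, not any provider's actual prices.

      # Toy cloud cost model: CPU-bound jobs (little output) fare better
      # than I/O-bound jobs (bulky output). Rates are assumptions.
      CPU_HOUR = 0.10    # $ per core-hour (assumed)
      EGRESS_GB = 0.09   # $ per GB transferred out (assumed)

      def job_cost(core_hours, gb_out):
          return core_hours * CPU_HOUR + gb_out * EGRESS_GB

      print(job_cost(core_hours=1000, gb_out=5))    # periodogram-like: ~$100.45
      print(job_cost(core_hours=100, gb_out=2000))  # mosaic-like: ~$190.00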

  8. Metadata Management on the SCEC PetaSHA Project: Helping Users Describe, Discover, Understand, and Use Simulation Data in a Large-Scale Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.

    2007-12-01

Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm for solving complex computations provides advantages of efficiency, reliability, repeatability, choice, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, the identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes, and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the PetaSHA collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks, with workflow and seismological metadata preserved; downstream scientific codes ingest the metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("CyberShake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Its metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of having domain science programmers code for metadata.
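
    The plain-text attribute-value pairing mentioned above is simple to produce and consume; a minimal parser is sketched below. The key names in the sample are invented for illustration, not taken from the SCEC workflows.

      # Parse attribute-value metadata of the plain-text style described
      # above. Sample keys are hypothetical.
      def parse_metadata(text):
          meta = {}
          for line in text.splitlines():
              if "=" in line:
                  key, value = line.split("=", 1)
                  meta[key.strip()] = value.strip()
          return meta

      sample = """source_model = CVM-S4
      rupture_id = 127
      grid_spacing_m = 200"""
      print(parse_metadata(sample)["grid_spacing_m"])   # -> '200'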

  9. The 10th Annual Bioassays and Bioanalytical Method Development Conference.

    PubMed

    Ma, Mark; Tudan, Christopher; Koltchev, Dolly

    2015-01-01

The 10th Annual Bioassays and Bioanalytical Method Development Conference was hosted in Boston, MA, USA on 20-22 October 2014. This meeting brought together scientists from the biopharmaceutical and life sciences industries, the regulatory agency and academia to share and discuss current trends in cell-based assays and bioanalysis, challenges, and ideas for the future of bioassays and bioanalytical method development. The experiences associated with new and innovative technologies were evaluated, as well as their impact on current bioassay methodologies and the bioanalysis workflow, including quality, feasibility, outsourcing strategies and challenges, productivity and compliance. Several presentations were also given by members of the US FDA, sharing both scientific and regulatory paradigms, including a recent update on the FDA's position on specific aspects of the draft Bioanalytical Method Validation guidance following its review of the industry's responses. The meeting coincided with the 15th Annual Immunogenicity for Biotherapeutics meeting, allowing attendees to also familiarize themselves with new and emerging approaches to overcoming the effects of immunogenicity, in addition to investigative strategies.

  10. Validation of Metagenomic Next-Generation Sequencing Tests for Universal Pathogen Detection.

    PubMed

    Schlaberg, Robert; Chiu, Charles Y; Miller, Steve; Procop, Gary W; Weinstock, George

    2017-06-01

Metagenomic sequencing can be used for detection of any pathogen using unbiased, shotgun next-generation sequencing (NGS), without the need for sequence-specific amplification. Proof-of-concept has been demonstrated in infectious disease outbreaks of unknown causes and in patients with suspected infections but negative results for conventional tests. Metagenomic NGS tests hold great promise to improve infectious disease diagnostics, especially in immunocompromised and critically ill patients. This review discusses challenges and provides example solutions for validating metagenomic pathogen detection tests in clinical laboratories. A summary of current regulatory requirements, largely based on prior guidance for NGS testing in constitutional genetics and oncology, is provided. Examples from 2 separate validation studies are given for steps from assay design, through validation of wet bench and bioinformatics protocols, to quality control and assurance. Although laboratory and data analysis workflows are still complex, metagenomic NGS tests for infectious diseases are increasingly being validated in clinical laboratories. Many parallels exist to NGS tests in other fields. Nevertheless, specimen preparation, rapidly evolving data analysis algorithms, and incomplete reference sequence databases are idiosyncratic to the field of microbiology and often overlooked.

  11. Distributed digital music archives and libraries

    NASA Astrophysics Data System (ADS)

    Fujinaga, Ichiro

    2005-09-01

The main goal of this research program is to develop and evaluate practices, frameworks, and tools for the design and construction of worldwide distributed digital music archives and libraries. Over the last few millennia, humans have amassed an enormous amount of musical information that is scattered around the world. It is becoming abundantly clear that the optimal path for acquisition is to distribute the task of digitizing the wealth of historical and cultural heritage material that exists in analogue formats, which may include books and manuscripts related to music, music scores, photographs, videos, audio tapes, and phonograph records. In order to achieve this goal, libraries, museums, and archives throughout the world, large or small, need well-researched policies, proper guidance, and efficient tools to digitize their collections and to make them available economically. The research conducted within the program addresses unique and imminent challenges posed by the digitization and dissemination of music media. There are four major research projects in progress: development and evaluation of digitization methods for preservation of analogue recordings; optical music recognition using microfilms; design of a workflow management system with automatic metadata extraction; and formulation of interlibrary communication strategies.

  12. Binocular Goggle Augmented Imaging and Navigation System provides real-time fluorescence image guidance for tumor resection and sentinel lymph node mapping

    NASA Astrophysics Data System (ADS)

Mondal, Suman B.; Gao, Shengkui; Zhu, Nan; Sudlow, Gail P.; Liang, Kexian; Som, Avik; Akers, Walter J.; Fields, Ryan C.; Margenthaler, Julie; Liang, Rongguang; Gruev, Viktor; Achilefu, Samuel

    2015-07-01

    The inability to identify microscopic tumors and assess surgical margins in real-time during oncologic surgery leads to incomplete tumor removal, increases the chances of tumor recurrence, and necessitates costly repeat surgery. To overcome these challenges, we have developed a wearable goggle augmented imaging and navigation system (GAINS) that can provide accurate intraoperative visualization of tumors and sentinel lymph nodes in real-time without disrupting normal surgical workflow. GAINS projects both near-infrared fluorescence from tumors and the natural color images of tissue onto a head-mounted display without latency. Aided by tumor-targeted contrast agents, the system detected tumors in subcutaneous and metastatic mouse models with high accuracy (sensitivity = 100%, specificity = 98% ± 5% standard deviation). Human pilot studies in breast cancer and melanoma patients using a near-infrared dye show that the GAINS detected sentinel lymph nodes with 100% sensitivity. Clinical use of the GAINS to guide tumor resection and sentinel lymph node mapping promises to improve surgical outcomes, reduce rates of repeat surgery, and improve the accuracy of cancer staging.

  13. Toward integrated image guided liver surgery

    NASA Astrophysics Data System (ADS)

    Jarnagin, W. R.; Simpson, Amber L.; Miga, M. I.

    2017-03-01

While clinical neurosurgery has benefited from the advent of frameless image guidance for over three decades, the translation of image-guided technologies to abdominal surgery, and more specifically liver resection, has been far more limited. Fundamentally, the workflow, complexity, and presentation have confounded development. With the first real efforts in translation beginning at the turn of the millennium, the work in developing novel augmented technologies to enhance screening, planning, and surgery has come to realization for the field. In this paper, we review several examples from our own work that demonstrate the impact of image-guided procedure methods in six clinical studies that speak to: (1) the accuracy in planning for liver resection, (2) enhanced surgical planning with portal vein embolization impact, (3) linking splenic volume changes to post-hepatectomy complications, (4) enhanced intraoperative localization in surgically occult lesions, (5) validation of deformation correction, and (6) a novel blinded study focused on the value of deformation correction. All six of these studies were achieved in human systems and show the potential impact image-guided methodologies could make on liver tissue resection procedures.

  14. The Role of Laser Speckle Imaging in Port-Wine Stain Research: Recent Advances and Opportunities

    PubMed Central

    Choi, Bernard; Tan, Wenbin; Jia, Wangcun; White, Sean M.; Moy, Wesley J.; Yang, Bruce Y.; Zhu, Jiang; Chen, Zhongping; Kelly, Kristen M.; Nelson, J. Stuart

    2016-01-01

Here, we review our current knowledge on the etiology and treatment of port-wine stain (PWS) birthmarks. Current treatment options have significant limitations in terms of efficacy. With the combination of 1) a suitable preclinical microvascular model, 2) laser speckle imaging (LSI) to evaluate blood-flow dynamics, and 3) a longitudinal experimental design, rapid preclinical assessment of new phototherapies can be translated from the lab to the clinic. The combination of photodynamic therapy (PDT) and pulsed-dye laser (PDL) irradiation achieves a synergistic effect that reduces the required radiant exposures of the individual phototherapies to achieve persistent vascular shutdown. PDL combined with anti-angiogenic agents is a promising strategy to achieve persistent vascular shutdown by preventing reformation and reperfusion of photocoagulated blood vessels. Integration of LSI into the clinical workflow may lead to surgical image guidance that maximizes acute photocoagulation, which is expected to improve PWS therapeutic outcomes. Continued integration of noninvasive optical imaging technologies and biochemical analysis is expected to lead to more robust treatment strategies. PMID:27013846

  15. Augmented reality in surgical procedures

    NASA Astrophysics Data System (ADS)

    Samset, E.; Schmalstieg, D.; Vander Sloten, J.; Freudenthal, A.; Declerck, J.; Casciaro, S.; Rideng, Ø.; Gersak, B.

    2008-02-01

Minimally invasive therapy (MIT) is one of the most important trends in modern medicine. It includes a wide range of therapies in videoscopic surgery and interventional radiology and is performed through small incisions. It reduces hospital stays by allowing faster recovery and offers substantially improved cost-effectiveness for the hospital and society. However, the introduction of MIT has also led to new problems. The manipulation of structures within the body through small incisions reduces dexterity and tactile feedback. It requires a different approach than conventional surgical procedures, since eye-hand coordination is not based on direct vision, but more predominantly on image guidance via endoscopes or radiological imaging modalities. ARIS*ER is a multidisciplinary consortium developing a new generation of decision support tools for MIT by augmenting visual and sensorial feedback. We will present tools based on novel concepts in visualization, robotics and haptics, providing tailored solutions for a range of clinical applications. Examples from radio-frequency ablation of liver tumors, laparoscopic liver surgery and minimally invasive cardiac surgery will be presented. Demonstrators were developed with the aim to provide a seamless workflow for the clinical user conducting image-guided therapy.

  16. The equivalency between logic Petri workflow nets and workflow nets.

    PubMed

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between data mining and business process modeling and analysis. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.
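
    The WF-net semantics at issue here rest on a simple firing rule: a transition is enabled when all of its input places hold tokens, and firing consumes those tokens and produces tokens on the output places. The toy net below illustrates the rule; it is not one of the paper's LPWN models.

      # Minimal Petri-net firing rule on a toy workflow net.
      marking = {"order_received": 1, "payment_ok": 1, "shipped": 0}

      TRANSITIONS = {
          "ship": {"inputs": ["order_received", "payment_ok"],
                   "outputs": ["shipped"]},
      }

      def fire(name):
          t = TRANSITIONS[name]
          if all(marking[p] > 0 for p in t["inputs"]):   # enabled?
              for p in t["inputs"]:
                  marking[p] -= 1
              for p in t["outputs"]:
                  marking[p] += 1
              return True
          return False

      fire("ship")
      print(marking)   # {'order_received': 0, 'payment_ok': 0, 'shipped': 1}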

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin

The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules, an in-memory data store, with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.

  18. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    PubMed Central

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between data mining and business process modeling and analysis. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented. PMID:25821845

  19. A Formal Investigation of the Organization of Guidance Behavior: Implications for Humans and Autonomous Guidance

    NASA Astrophysics Data System (ADS)

    Kong, Zhaodan

Guidance behavior generated either by artificial agents or humans has been actively studied in the fields of both robotics and cognitive science. The goals of these two fields are different. The former is the automatic generation of appropriate or even optimal behavior, while the latter is the understanding of the underlying mechanism. Their challenges, though, are closely related, the most important one being the lack of a unified, formal and grounded framework where guidance behavior can be modeled and studied. This dissertation presents such a framework. In this framework, guidance behavior is analyzed as the closed-loop dynamics of the whole agent-environment system. The resulting dynamics give rise to interaction patterns. The central points of this dissertation are that: first, these patterns, which can be explained in terms of symmetries that are inherent to the guidance behavior, provide building blocks for the organization of behavior; second, the existence of these patterns and humans' organization of their guidance behavior based on these patterns are the reasons that humans can generate successful behavior in spite of all the complexities involved in the planning and control. This dissertation first gives an overview of the challenges existing in both scientific endeavors, such as human and animal spatial behavior study, and engineering endeavors, such as autonomous guidance system design. It then lays out the foundation for our formal framework, which states that guidance behavior should be interpreted as the collection of the closed-loop dynamics resulting from the agent's interaction with the environment. It then shows, using examples of three different UAVs, that the study of closed-loop dynamics should not be conducted without considering vehicle dynamics, as is common practice in some studies in both autonomous guidance and human behavior analysis. The framework, the core concepts of which are symmetries and interaction patterns, is then elaborated on with the example of Dubins' vehicle's guidance behavior. The dissertation then describes the details of the agile human guidance experiments using miniature helicopters, the technique that is developed for the analysis of the experimental data, and the analysis results. The results confirm that human guidance behavior indeed exhibits invariance as defined by interaction patterns. Subsequently, the behavior in each interaction pattern is investigated using piecewise affine model identification. Combined, the results provide a natural and formal decomposition of the behavior that can be unified under a hierarchical hidden Markov model. By employing the languages of dynamical systems and control and by adopting algorithms from system identification and machine learning, the framework presented in this dissertation provides a fertile ground where these different disciplines can meet. It also promises multiple potential directions where future research can be headed.
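
    One computational step implied above, segmenting observed guidance trajectories into recurring interaction patterns, can be approximated with an off-the-shelf hidden Markov model. The sketch below uses a plain Gaussian HMM from the hmmlearn package on synthetic data; it stands in for, and is much simpler than, the hierarchical model developed in the dissertation.

      # Segment a synthetic 1-D "speed" trace into two latent regimes
      # with a Gaussian HMM (hmmlearn). Illustrative stand-in only.
      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(0)
      trace = np.concatenate([rng.normal(1.0, 0.1, 100),    # slow regime
                              rng.normal(3.0, 0.1, 100)]    # fast regime
                             ).reshape(-1, 1)

      model = GaussianHMM(n_components=2, n_iter=50).fit(trace)
      states = model.predict(trace)    # one pattern label per sample
      print(states[:5], states[-5:])   # e.g. [0 0 0 0 0] [1 1 1 1 1]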

  20. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. In particular, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, cataloging, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; for service-centric applications, the main focus of the workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best-suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black-box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Networking Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
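
    The workflow mapping-and-scheduling problem described above can be illustrated with a greedy earliest-finish heuristic for a linear pipeline. Task sizes and resource speeds below are made up; this is a toy version of the optimization, not the SWAMP algorithms.

      # Greedy earliest-finish mapping of a linear task pipeline onto
      # two resources. All numbers are illustrative.
      tasks = [("generate", 60), ("filter", 30), ("visualize", 20)]  # (name, work)
      speeds = {"cluster": 2.0, "cloud": 1.0}   # work units per second
      free_at = {r: 0.0 for r in speeds}        # when each resource is free

      ready = 0.0   # linear pipeline: each task waits for its predecessor
      for name, work in tasks:
          r = min(speeds,
                  key=lambda res: max(free_at[res], ready) + work / speeds[res])
          finish = max(free_at[r], ready) + work / speeds[r]
          free_at[r] = ready = finish
          print(f"{name} -> {r}, finishes at t={finish:.1f}s")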

  1. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    PubMed Central

    Kadima, Hubert; Granado, Bertrand

    2013-01-01

We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different voltage supply levels by sacrificing clock frequencies. Operating at multiple voltage levels involves a compromise between the quality of schedules and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361

  2. Enabling Real-time Water Decision Support Services Using Model as a Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.

    2014-12-01

Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, a river routing model developed at the University of Texas at Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application have been prototyped in the San Antonio and Guadalupe River Basins in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
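
    The "model as a service" pattern described above amounts to wrapping a model run behind an HTTP endpoint. A minimal sketch with Flask follows; the routing function is a placeholder, not the RAPID code, and the route name is invented.

      # Expose a (placeholder) routing-model run as a web service.
      from flask import Flask, jsonify, request

      app = Flask(__name__)

      def run_routing_model(inflow):
          """Stand-in for the actual river-routing computation."""
          return {"discharge": [0.8 * q for q in inflow]}

      @app.route("/run", methods=["POST"])
      def run():
          payload = request.get_json()          # e.g. {"inflow": [10, 12, 9]}
          return jsonify(run_routing_model(payload["inflow"]))

      if __name__ == "__main__":
          app.run(port=8080)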

  3. Process-Structure Linkages Using a Data Science Approach: Application to Simulated Additive Manufacturing Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popova, Evdokia; Rodgers, Theron M.; Gong, Xinyi

A novel data science workflow is developed and demonstrated to extract process-structure linkages (i.e., reduced-order model) for microstructure evolution problems when the final microstructure depends on (simulation or experimental) processing parameters. Our workflow consists of four main steps: data pre-processing, microstructure quantification, dimensionality reduction, and extraction/validation of process-structure linkages. The methods that can be employed within each step vary based on the type and amount of available data. In this paper, this data-driven workflow is applied to a set of synthetic additive manufacturing microstructures obtained using the Potts-kinetic Monte Carlo (kMC) approach. Additive manufacturing techniques inherently produce complex microstructures that can vary significantly with processing conditions. Using the developed workflow, a low-dimensional data-driven model was established to correlate process parameters with the predicted final microstructure. In addition, the modular workflows developed and presented in this work facilitate easy dissemination and curation by the broader community.
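
    The last two steps of the workflow (dimensionality reduction, then linkage extraction) can be sketched with standard tools; the data below are random stand-ins for quantified kMC microstructures, not the paper's dataset, and the named process parameters are assumptions.

      # PCA to reduce microstructure statistics, then a linear model
      # linking process parameters to the reduced representation.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      process = rng.uniform(size=(50, 2))       # e.g. scan speed, power (assumed)
      microstats = rng.normal(size=(50, 100))   # quantified microstructures

      pcs = PCA(n_components=3).fit_transform(microstats)   # reduce dimension
      linkage = LinearRegression().fit(process, pcs)        # process -> structure
      print(linkage.predict([[0.5, 0.5]]))      # predicted PC scores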

  4. Characterizing Strain Variation in Engineered E. coli Using a Multi-Omics-Based Workflow

    DOE PAGES

    Brunk, Elizabeth; George, Kevin W.; Alonso-Gutierrez, Jorge; ...

    2016-05-19

Understanding the complex interactions that occur between heterologous and native biochemical pathways represents a major challenge in metabolic engineering and synthetic biology. We present a workflow that integrates metabolomics, proteomics, and genome-scale models of Escherichia coli metabolism to study the effects of introducing a heterologous pathway into a microbial host. This workflow incorporates complementary approaches from computational systems biology, metabolic engineering, and synthetic biology; provides molecular insight into how the host organism microenvironment changes due to pathway engineering; and demonstrates how biological mechanisms underlying strain variation can be exploited as an engineering strategy to increase product yield. As a proof of concept, we present the analysis of eight engineered strains producing three biofuels: isopentenol, limonene, and bisabolene. Application of this workflow identified the roles of candidate genes, pathways, and biochemical reactions in observed experimental phenomena and facilitated the construction of a mutant strain with improved productivity. The contributed workflow is available as an open-source tool in the form of iPython notebooks.

  5. Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.

    PubMed

    Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand

    2013-01-01

We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different voltage supply levels by sacrificing clock frequencies. Operating at multiple voltage levels involves a compromise between the quality of schedules and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
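
    The DVFS compromise exploited here can be seen in one line of algebra: dynamic energy scales roughly as C·V²·f·t, and since runtime t is cycles/f, energy per job depends on V² while runtime depends on f. The voltage/frequency pairs below are illustrative assumptions.

      # DVFS trade-off: lower supply level -> less energy, longer runtime.
      LEVELS = [(1.2, 2.0), (1.0, 1.5), (0.8, 1.0)]   # (volts, GHz), assumed

      def energy_and_time(giga_cycles, volts, ghz, c=1.0):
          t = giga_cycles / ghz                 # seconds
          e = c * volts**2 * (ghz * 1e9) * t    # ~ C * V^2 * f * t
          return e, t

      for v, f in LEVELS:
          e, t = energy_and_time(100, v, f)
          print(f"V={v:.1f} f={f:.1f}GHz  energy={e:.2e}  time={t:.1f}s")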

  6. Workflow interruptions, cognitive failure and near-accidents in health care.

    PubMed

    Elfering, Achim; Grebner, Simone; Ebener, Corinne

    2015-01-01

Errors are frequent in health care. A specific model was tested in which failure in cognitive action regulation mediates the influence of nurses' workflow interruptions and safety conscientiousness on near-accidents in health care. One hundred and sixty-five nurses from seven Swiss hospitals participated in a questionnaire survey. Structural equation modelling confirmed the hypothesised mediation model. Cognitive failure in action regulation significantly mediated the influence of workflow interruptions on near-accidents (p < .05). An indirect path from conscientiousness to near-accidents via cognitive failure in action regulation was also significant (p < .05). Compliance with safety regulations was significantly related to cognitive failure and near-accidents; moreover, cognitive failure mediated the association between compliance and near-accidents (p < .05). Contrary to expectations, compliance with safety regulations was not related to workflow interruptions. Workflow interruptions caused by colleagues, patients and organisational constraints are likely to trigger errors in nursing. Work redesign is recommended to reduce cognitive failure and improve the safety of nurses and patients.
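
    The mediation structure tested here (interruptions -> cognitive failure -> near-accidents) can be approximated outside a full SEM with two ordinary regressions. The sketch below uses simulated data standing in for the survey variables, not the study's data.

      # Simple two-regression mediation sketch (Baron-Kenny style)
      # on simulated data.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      n = 165
      x = rng.normal(size=n)                 # workflow interruptions
      m = 0.5 * x + rng.normal(size=n)       # cognitive failure
      y = 0.6 * m + rng.normal(size=n)       # near-accidents
      df = pd.DataFrame({"x": x, "m": m, "y": y})

      a = smf.ols("m ~ x", df).fit().params["x"]        # path x -> m
      b = smf.ols("y ~ m + x", df).fit().params["m"]    # path m -> y given x
      print("indirect effect a*b =", round(a * b, 3))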

  7. Requirements for Workflow-Based EHR Systems - Results of a Qualitative Study.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2016-01-01

Today's high-quality healthcare delivery strongly relies on efficient electronic health records (EHR). These EHR systems, or healthcare IT systems in general, are usually developed in a static manner according to a given workflow. Hence, they are not flexible enough to enable access to EHR data and to execute individual actions within a consultation. This paper reports on requirements identified by experts in the domain of diabetes mellitus for designing a system that supports dynamic workflows to serve personalization within a medical activity. Requirements were collected by means of expert interviews. These interviews completed a triangulation approach aimed at gathering requirements for workflow-based EHR interactions. The data from the interviews was analyzed through a qualitative approach, resulting in a set of requirements enhancing EHR functionality from the user's perspective. Requirements were classified according to four categories: (1) process-related requirements, (2) information needs, (3) required functions, and (4) non-functional requirements. Workflow-related requirements were identified which should be considered when developing and deploying EHR systems.

  8. Camera-augmented mobile C-arm (CamC): A feasibility study of augmented reality imaging in the operating room.

    PubMed

    von der Heide, Anna Maria; Fallavollita, Pascal; Wang, Lejing; Sandner, Philipp; Navab, Nassir; Weidert, Simon; Euler, Ekkehard

    2018-04-01

In orthopaedic trauma surgery, image-guided procedures are mostly based on fluoroscopy. The reduction of radiation exposure is an important goal. The purpose of this work was to investigate the impact of a camera-augmented mobile C-arm (CamC) on radiation exposure and the surgical workflow during a first clinical trial. Applying a workflow-oriented approach, 10 general workflow steps were defined to compare the CamC to traditional C-arms. The surgeries included were arbitrarily identified and assigned to the study. The evaluation criteria were radiation exposure and operation time for each workflow step and for the entire surgery. The evaluation protocol was designed and conducted in a single-centre study. Radiation exposure was remarkably reduced, by 18 X-ray shots (46%), using the CamC while keeping similar surgery times. The intuitiveness of the system, its easy integration into the surgical workflow, and its great potential to reduce radiation have been demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Barriers to critical thinking: workflow interruptions and task switching among nurses.

    PubMed

    Cornell, Paul; Riordan, Monica; Townsend-Gervis, Mary; Mobley, Robin

    2011-10-01

    Nurses are increasingly called upon to engage in critical thinking. However, current workflow inhibits this goal with frequent task switching and unpredictable demands. To assess workflow's cognitive impact, nurses were observed at 2 hospitals with different patient loads and acuity levels. Workflow on a medical/surgical and pediatric oncology unit was observed, recording tasks, tools, collaborators, and locations. Nineteen nurses were observed for a total of 85.2 hours. Tasks were short with a mean duration of 62.4 and 81.6 seconds on the 2 units. More than 50% of the recorded tasks were less than 30 seconds in length. An analysis of task sequence revealed few patterns and little pairwise repetition. Performance on specific tasks differed between the 2 units, but the character of the workflow was highly similar. The nonrepetitive flow and high amount of switching indicate nurses experience a heavy cognitive load with little uninterrupted time. This implies that nurses rarely have the conditions necessary for critical thinking.

  10. Advantages and Disadvantages of 1-Incision, 2-Incision, 3-Incision, and 4-Incision Laparoscopic Cholecystectomy: A Workflow Comparison Study.

    PubMed

    Bartnicka, Joanna; Zietkiewicz, Agnieszka A; Kowalski, Grzegorz J

    2016-08-01

    A comparison of 1-port, 2-port, 3-port, and 4-port laparoscopic cholecystectomy techniques from the point of view of workflow criteria was made to both identify specific workflow components that can cause surgical disturbances and indicate good and bad practices. As a case study, laparoscopic cholecystectomies, including manual tasks and interactions within teamwork members, were video-recorded and analyzed on the basis of specially encoded workflow information. The parameters for comparison were defined as follows: surgery time, tool and hand activeness, operator's passive work, collisions, and operator interventions. It was found that 1-port cholecystectomy is the worst technique because of nonergonomic body position, technical complexity, organizational anomalies, and operational dynamism. The differences between laparoscopic techniques are closely linked to the costs of the medical procedures. Hence, knowledge about the surgical workflow can be used for both planning surgical procedures and balancing the expenses associated with surgery.

  11. A standard-enabled workflow for synthetic biology.

    PubMed

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce many types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  12. Process-Structure Linkages Using a Data Science Approach: Application to Simulated Additive Manufacturing Data

    DOE PAGES

    Popova, Evdokia; Rodgers, Theron M.; Gong, Xinyi; ...

    2017-03-13

A novel data science workflow is developed and demonstrated to extract process-structure linkages (i.e., reduced-order model) for microstructure evolution problems when the final microstructure depends on (simulation or experimental) processing parameters. Our workflow consists of four main steps: data pre-processing, microstructure quantification, dimensionality reduction, and extraction/validation of process-structure linkages. The methods that can be employed within each step vary based on the type and amount of available data. In this paper, this data-driven workflow is applied to a set of synthetic additive manufacturing microstructures obtained using the Potts-kinetic Monte Carlo (kMC) approach. Additive manufacturing techniques inherently produce complex microstructures that can vary significantly with processing conditions. Using the developed workflow, a low-dimensional data-driven model was established to correlate process parameters with the predicted final microstructure. In addition, the modular workflows developed and presented in this work facilitate easy dissemination and curation by the broader community.

  13. Workflow computing. Improving management and efficiency of pathology diagnostic services.

    PubMed

    Buffone, G J; Moreau, D; Beck, J R

    1996-04-01

Traditionally, information technology in health care has helped practitioners to collect, store, and present information and also to add a degree of automation to simple tasks (instrument interfaces supporting result entry, for example). Thus, commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health care, describes the inherent advantages of the technology, and defines workflow system requirements for use in health-care delivery, with special reference to diagnostic pathology.

  14. Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.

  15. Radiology Workflow Dynamics: How Workflow Patterns Impact Radiologist Perceptions of Workplace Satisfaction.

    PubMed

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum; Field, Aaron; Wiegmann, Douglas; Yu, John-Paul J

    2017-04-01

    The study aimed to assess perceptions of reading room workflow and the impact separating image-interpretive and nonimage-interpretive task workflows can have on radiologist perceptions of workplace disruptions, workload, and overall satisfaction. A 14-question survey instrument was developed to measure radiologist perceptions of workplace interruptions, satisfaction, and workload prior to and following implementation of separate image-interpretive and nonimage-interpretive reading room workflows. The results were collected over 2 weeks preceding the intervention and 2 weeks following the end of the intervention. The results were anonymized and analyzed using univariate analysis. A total of 18 people responded to the preintervention survey: 6 neuroradiology fellows and 12 attending neuroradiologists. Fifteen people who were then present for the 1-month intervention period responded to the postintervention survey. Perceptions of workplace disruptions, image interpretation, quality of trainee education, ability to perform nonimage-interpretive tasks, and quality of consultations (P < 0.0001) all improved following the intervention. Mental effort and workload also improved across all assessment domains, as did satisfaction with quality of image interpretation and consultative work. Implementation of parallel dedicated image-interpretive and nonimage-interpretive workflows may improve markers of radiologist perceptions of workplace satisfaction. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  16. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on the Python programming language. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analyses in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
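
    The Python-based approach above amounts to expressing a workflow as ordered, monitorable components whose intermediate outputs remain inspectable. The following minimal sketch illustrates that shape under stated assumptions: all class, step, and function names are invented for illustration and are not DESC code.

        from dataclasses import dataclass
        from typing import Any, Callable

        @dataclass
        class Step:
            name: str
            func: Callable[[Any], Any]
            status: str = "pending"
            output: Any = None

        class Workflow:
            def __init__(self, steps):
                self.steps = steps

            def run(self, data):
                # Execute steps in order, recording status and keeping each
                # intermediate output for later visual exploration.
                for step in self.steps:
                    step.status = "running"
                    data = step.func(data)
                    step.output = data
                    step.status = "done"
                return data

        # Toy components standing in for independently developed programs.
        wf = Workflow([
            Step("select_galaxies", lambda d: [z for z in d if z > 0.1]),
            Step("bin_redshifts", lambda d: sorted(d)),
        ])
        print(wf.run([0.3, 0.05, 1.2]))          # [0.3, 1.2]
        print([s.status for s in wf.steps])      # ['done', 'done']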

  17. Managing the life cycle of electronic clinical documents.

    PubMed

    Payne, Thomas H; Graham, Gail

    2006-01-01

    To develop a model of the life cycle of clinical documents from inception to use in a person's medical record, including workflow requirements from clinical practice, local policy, and regulation. We propose a model for the life cycle of clinical documents as a framework for research on documentation within electronic medical record (EMR) systems. Our proposed model includes three axes: the stages of the document, the roles of those involved with the document, and the actions those involved may take on the document at each stage. The model includes the rules to describe who (in what role) can perform what actions on the document, and at what stages they can perform them. Rules are derived from needs of clinicians, and requirements of hospital bylaws and regulators. Our model encompasses current practices for paper medical records and workflow in some EMR systems. Commercial EMR systems include methods for implementing document workflow rules. Workflow rules that are part of this model mirror functionality in the Department of Veterans Affairs (VA) EMR system where the Authorization/Subscription Utility permits document life cycle rules to be written in English-like fashion. Creating a model of the life cycle of clinical documents serves as a framework for discussion of document workflow, how rules governing workflow can be implemented in EMR systems, and future research of electronic documentation.
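
    The rule axis of this model lends itself to a small executable illustration. The sketch below encodes stage/role/action rules and a permission check of the kind the model describes; the specific stages, roles, and actions are assumptions chosen for illustration, not VA or hospital policy.

        # Minimal sketch of the three-axis model: a rule states which role
        # may take which action at which document stage. Stage, role, and
        # action names here are illustrative assumptions.
        RULES = {
            # (stage, role) -> set of allowed actions
            ("draft",    "author"):    {"edit", "sign"},
            ("signed",   "attending"): {"cosign", "addend"},
            ("complete", "clerk"):     {"view"},
        }

        def allowed(stage: str, role: str, action: str) -> bool:
            """Return True if someone in `role` may perform `action` at `stage`."""
            return action in RULES.get((stage, role), set())

        assert allowed("draft", "author", "sign")
        assert not allowed("complete", "clerk", "edit")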

  18. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    PubMed

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, and expression analysis, as well as data standardization and data publication. The workflow's methods for these tasks are state of the art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides multiple fast communication channels for access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de), users can easily benefit from this service and get support from experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
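
    Of the components named above, protein inference is the most algorithmic: identified peptides must be explained by a minimal set of proteins. The sketch below shows a common greedy parsimony heuristic for that task; it illustrates the problem itself, not the specific method BioInfra.Prot uses.

        # Greedy parsimony sketch of protein inference: explain all
        # identified peptides with as few proteins as possible.
        def infer_proteins(protein_to_peptides):
            remaining = set().union(*protein_to_peptides.values())
            inferred = []
            while remaining:
                # Pick the protein covering the most still-unexplained peptides.
                best = max(protein_to_peptides,
                           key=lambda p: len(protein_to_peptides[p] & remaining))
                covered = protein_to_peptides[best] & remaining
                if not covered:
                    break  # leftover peptides map to no candidate protein
                inferred.append(best)
                remaining -= covered
            return inferred

        peptides = {"P1": {"a", "b"}, "P2": {"b"}, "P3": {"c"}}
        print(infer_proteins(peptides))  # ['P1', 'P3']; P2 is subsumed by P1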

  19. Using lean methodology to improve productivity in a hospital oncology pharmacy.

    PubMed

    Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D

    2014-09-01

    Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  20. chemalot and chemalot_knime: Command line programs as workflow tools for drug discovery.

    PubMed

    Lee, Man-Ling; Aliagas, Ignacio; Feng, Jianwen A; Gabriel, Thomas; O'Donnell, T J; Sellers, Benjamin D; Wiswedel, Bernd; Gobbi, Alberto

    2017-06-12

    Analyzing files containing chemical information is at the core of cheminformatics. Each analysis may require a unique workflow. This paper describes the chemalot and chemalot_knime open source packages. Chemalot is a set of command line programs with a wide range of functionalities for cheminformatics. The chemalot_knime package allows command line programs that read SD files from stdin and write them to stdout to be wrapped into KNIME nodes. The combination of chemalot and chemalot_knime not only facilitates the compilation and maintenance of sequences of command line programs but also allows KNIME workflows to take advantage of the compute power of a LINUX cluster. Use of the command line programs is demonstrated in three different workflow examples: (1) A workflow to create a data file with project-relevant data for structure-activity or property analysis and other types of investigation, (2) The creation of a quantitative structure-property-relationship model using the command line programs via KNIME nodes, and (3) The analysis of strain energy in small molecule ligand conformations from the Protein Data Bank database. The chemalot and chemalot_knime packages provide lightweight and powerful tools for many tasks in cheminformatics. They are easily integrated with other open source and commercial command line tools and can be combined to build new and even more powerful tools. The chemalot_knime package facilitates the generation and maintenance of user-defined command line workflows, taking advantage of the graphical design capabilities in KNIME. Graphical abstract Example KNIME workflow with chemalot nodes and the corresponding command line pipe.
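
    The composition model described here is an ordinary stdin-to-stdout pipe. The sketch below chains two such programs from Python the way a shell pipe or a KNIME node wrapper would; the tool names sdf_filter and sdf_add_props are hypothetical placeholders, not actual chemalot program names.

        # Chain two stdin/stdout command line tools over SD files, as a
        # shell pipe would. Tool names and flags below are hypothetical.
        import subprocess

        with open("input.sdf", "rb") as src, open("out.sdf", "wb") as dst:
            p1 = subprocess.Popen(["sdf_filter", "--min-mw", "200"],
                                  stdin=src, stdout=subprocess.PIPE)
            p2 = subprocess.Popen(["sdf_add_props", "--logp"],
                                  stdin=p1.stdout, stdout=dst)
            p1.stdout.close()   # let p1 receive SIGPIPE if p2 exits early
            p2.wait()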

  1. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.
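
    Since the framework is organized around cases and workflows of reusable modules, its core structure can be suggested in a few lines. The sketch below is a loose illustration under assumed names; it is not the BIR framework's actual API.

        # Minimal sketch of the Case/Workflow/module structure described
        # above; all module and class names are illustrative assumptions.
        class Case:
            def __init__(self, patient_id, workflow):
                self.patient_id = patient_id
                self.workflow = workflow    # ordered list of reusable modules

            def run(self, image):
                for module in self.workflow:
                    image = module(image)   # each module is general-purpose
                return image

        def load_dicom(x): return f"loaded({x})"
        def segment(x):    return f"segmented({x})"
        def measure(x):    return f"measured({x})"

        # A clinical workflow assembled from re-usable modules.
        tumor_volumetry = Case("pt-001", [load_dicom, segment, measure])
        print(tumor_volumetry.run("study-42"))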

  2. Targeting Accuracy, Procedure Times and User Experience of 240 Experimental MRI Biopsies Guided by a Clinical Add-On Navigation System

    PubMed Central

    Busse, Harald; Riedel, Tim; Garnov, Nikita; Thörmer, Gregor; Kahn, Thomas; Moche, Michael

    2015-01-01

    Objectives MRI is of great clinical utility for the guidance of special diagnostic and therapeutic interventions. The majority of such procedures are performed iteratively ("in-and-out") in standard, closed-bore MRI systems with control imaging inside the bore and needle adjustments outside the bore. The fundamental limitations of such an approach have led to the development of various assistance techniques, from simple guidance tools to advanced navigation systems. The purpose of this work was to thoroughly assess the targeting accuracy, workflow and usability of a clinical add-on navigation solution on 240 simulated biopsies by different medical operators. Methods Navigation relied on a virtual 3D MRI scene with real-time overlay of the optically tracked biopsy needle. Smart reference markers on a freely adjustable arm ensured proper registration. Twenty-four operators – attending (AR) and resident radiologists (RR) as well as medical students (MS) – performed well-controlled biopsies of 10 embedded model targets (mean diameter: 8.5 mm, insertion depths: 17-76 mm). Targeting accuracy, procedure times and 13 Likert scores on system performance were determined (strong agreement: 5.0). Results Differences in diagnostic success rates (AR: 93%, RR: 88%, MS: 81%) were not significant. In contrast, biopsy times (AR: 4:15, RR: 4:40, MS: 5:06 min:sec) differed significantly between groups (p<0.01). Mean overall rating was 4.2. The average operator would use the system again (4.8) and stated that the outcome justifies the extra effort (4.4). Lowest agreement was reported for the robustness against external perturbations (2.8). Conclusions The described combination of optical tracking technology with an automatic MRI registration appears to be sufficiently accurate for instrument guidance in a standard (closed-bore) MRI environment. High targeting accuracy and usability were demonstrated on a relatively large number of procedures and operators. Between groups with different expertise there were significant differences in experimental procedure times but not in the number of successful biopsies. PMID:26222443

  3. Understanding the dispensary workflow at the Birmingham Free Clinic: a proposed framework for an informatics intervention.

    PubMed

    Fisher, Arielle M; Herbert, Mary I; Douglas, Gerald P

    2016-02-19

    The Birmingham Free Clinic (BFC) in Pittsburgh, Pennsylvania, USA is a free, walk-in clinic that serves medically uninsured populations through the use of volunteer health care providers and an on-site medication dispensary. The introduction of an electronic medical record (EMR) has improved several aspects of clinic workflow. However, pharmacists' tasks involving medication management and dispensing have become more challenging since EMR implementation due to its inability to support workflows between the medical and pharmaceutical services. To inform the design of a systematic intervention, we conducted a needs assessment study to identify workflow challenges and process inefficiencies in the dispensary. We used contextual inquiry to document the dispensary workflow and facilitate identification of critical aspects of intervention design specific to the user. Pharmacists were observed according to contextual inquiry guidelines. Graphical models were produced to aid data and process visualization. We created a list of themes describing workflow challenges and asked the pharmacists to rank them in order of significance to narrow the scope of intervention design. Three pharmacists were observed at the BFC. Observer notes were documented and analyzed to produce 13 themes outlining the primary challenges pharmacists encounter during dispensation at the BFC. The dispensary workflow is labor intensive, redundant, and inefficient when integrated with the clinical service. Observations identified inefficiencies that may benefit from the introduction of informatics interventions including: medication labeling, insufficient process notification, triple documentation, and inventory control. We propose a system for Prescription Management and General Inventory Control (RxMAGIC). RxMAGIC is a framework designed to mitigate workflow challenges and improve the processes of medication management and inventory control. While RxMAGIC is described in the context of the BFC dispensary, we believe it will be generalizable to pharmacies in other low-resource settings, both domestically and internationally.

  4. Differential gene expression in the siphonophore Nanomia bijuga (Cnidaria) assessed with multiple next-generation sequencing workflows.

    PubMed

    Siebert, Stefan; Robinson, Mark D; Tintori, Sophia C; Goetz, Freya; Helm, Rebecca R; Smith, Stephen A; Shaner, Nathan; Haddock, Steven H D; Dunn, Casey W

    2011-01-01

    We investigated differential gene expression between functionally specialized feeding polyps and swimming medusae in the siphonophore Nanomia bijuga (Cnidaria) with a hybrid long-read/short-read sequencing strategy. We assembled a set of partial gene reference sequences from long-read data (Roche 454), and generated short-read sequences from replicated tissue samples that were mapped to the references to quantify expression. We collected and compared expression data with three short-read expression workflows that differ in sample preparation, sequencing technology, and mapping tools. These workflows were Illumina mRNA-Seq, which generates sequence reads from random locations along each transcript, and two tag-based approaches, SOLiD SAGE and Helicos DGE, which generate reads from particular tag sites. Differences in expression results across workflows were mostly due to the differential impact of missing data in the partial reference sequences. When all 454-derived gene reference sequences were considered, Illumina mRNA-Seq detected more than twice as many differentially expressed (DE) reference sequences as the tag-based workflows. This discrepancy was largely due to missing tag sites in the partial reference that led to false negatives in the tag-based workflows. When only the subset of reference sequences that unambiguously have tag sites was considered, we found broad congruence across workflows, and they all identified a similar set of DE sequences. Our results are promising in several regards for gene expression studies in non-model organisms. First, we demonstrate that a hybrid long-read/short-read sequencing strategy is an effective way to collect gene expression data when an annotated genome sequence is not available. Second, our replicated sampling indicates that expression profiles are highly consistent across field-collected animals in this case. Third, the impacts of partial reference sequences on the ability to detect DE can be mitigated through workflow choice and deeper reference sequencing.

  5. Differential Gene Expression in the Siphonophore Nanomia bijuga (Cnidaria) Assessed with Multiple Next-Generation Sequencing Workflows

    PubMed Central

    Siebert, Stefan; Robinson, Mark D.; Tintori, Sophia C.; Goetz, Freya; Helm, Rebecca R.; Smith, Stephen A.; Shaner, Nathan; Haddock, Steven H. D.; Dunn, Casey W.

    2011-01-01

    We investigated differential gene expression between functionally specialized feeding polyps and swimming medusae in the siphonophore Nanomia bijuga (Cnidaria) with a hybrid long-read/short-read sequencing strategy. We assembled a set of partial gene reference sequences from long-read data (Roche 454), and generated short-read sequences from replicated tissue samples that were mapped to the references to quantify expression. We collected and compared expression data with three short-read expression workflows that differ in sample preparation, sequencing technology, and mapping tools. These workflows were Illumina mRNA-Seq, which generates sequence reads from random locations along each transcript, and two tag-based approaches, SOLiD SAGE and Helicos DGE, which generate reads from particular tag sites. Differences in expression results across workflows were mostly due to the differential impact of missing data in the partial reference sequences. When all 454-derived gene reference sequences were considered, Illumina mRNA-Seq detected more than twice as many differentially expressed (DE) reference sequences as the tag-based workflows. This discrepancy was largely due to missing tag sites in the partial reference that led to false negatives in the tag-based workflows. When only the subset of reference sequences that unambiguously have tag sites was considered, we found broad congruence across workflows, and they all identified a similar set of DE sequences. Our results are promising in several regards for gene expression studies in non-model organisms. First, we demonstrate that a hybrid long-read/short-read sequencing strategy is an effective way to collect gene expression data when an annotated genome sequence is not available. Second, our replicated sampling indicates that expression profiles are highly consistent across field-collected animals in this case. Third, the impacts of partial reference sequences on the ability to detect DE can be mitigated through workflow choice and deeper reference sequencing. PMID:21829563

  6. Innovations in Medication Preparation Safety and Wastage Reduction: Use of a Workflow Management System in a Pediatric Hospital.

    PubMed

    Davis, Stephen Jerome; Hurtado, Josephine; Nguyen, Rosemary; Huynh, Tran; Lindon, Ivan; Hudnall, Cedric; Bork, Sara

    2017-01-01

    Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. Of those 826,220 medication doses, there were 16 reported incorrect volume errors. The error rate after the implementation of the workflow management system averaged 8.4%, which was a 1.6% reduction. After the implementation of the workflow management system, the average number of reported oral liquid medication and injectable medication errors decreased to 0.4 and 0.2 times per week, respectively. Conclusion: The organization was able to achieve its purpose and goal of improving the provision of quality pharmacy care through optimal medication use and safety by reducing medication preparation errors. Error rates decreased and the workflow processes were streamlined, which has led to seamless operations within the pharmacy department. There has been significant cost avoidance and waste reduction and enhanced interdepartmental satisfaction due to the reduction of reported medication errors.

  7. Seamless online science workflow development and collaboration using IDL and the ENVI Services Engine

    NASA Astrophysics Data System (ADS)

    Harris, A. T.; Ramachandran, R.; Maskey, M.

    2013-12-01

    The Exelis-developed IDL and ENVI software are ubiquitous tools in Earth science research environments. The IDL Workbench is used by the Earth science community for programming custom data analysis and visualization modules. ENVI is a software solution for processing and analyzing geospatial imagery that combines support for multiple Earth observation scientific data types (optical, thermal, multi-spectral, hyperspectral, SAR, LiDAR) with advanced image processing and analysis algorithms. The ENVI & IDL Services Engine (ESE) is an Earth science data processing engine that allows researchers to use open standards to rapidly create, publish and deploy advanced Earth science data analytics within any existing enterprise infrastructure. Although powerful in many ways, the tools lack collaborative features out of the box. Thus, as part of the NASA-funded project Collaborative Workbench to Accelerate Science Algorithm Development, researchers at the University of Alabama in Huntsville and Exelis have developed plugins that allow seamless research collaboration from within the IDL Workbench. Such additional features within the IDL Workbench are possible because it is built using the Eclipse Rich Client Platform (RCP). RCP applications allow custom plugins to be dropped in for extended functionalities. Specific functionalities of the plugins include creating complex workflows based on IDL application source code, submitting workflows to be executed by ESE in the cloud, and sharing and cloning of workflows among collaborators. All these functionalities are available to scientists without leaving their IDL Workbench. Because ESE can interoperate with any middleware, scientific programmers can readily string together IDL processing tasks (or tasks written in other languages like C++, Java or Python) to create complex workflows for deployment within their current enterprise architecture (e.g. ArcGIS Server, GeoServer, Apache ODE or SciFlo from JPL). With the collaborative IDL Workbench coupled to ESE, asynchronous workflows can be executed in batch mode on large data in the cloud. We envision that a scientist will initially develop a scientific workflow locally on a small set of data. Once tested, the scientist will deploy the workflow to the cloud for execution. Depending on the results, the scientist may share the workflow and results, allowing them to be stored in a community catalog and instantly loaded into the IDL Workbench of other scientists. Thereupon, scientists can clone and modify or execute the workflow with different input parameters. The Collaborative Workbench will provide a platform for collaboration in the cloud, helping Earth scientists solve big-data problems in the Earth and planetary sciences.

  8. Scientific Workflows + Provenance = Better (Meta-)Data Management

    NASA Astrophysics Data System (ADS)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each executed workflow step, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata. DataONE is a federation of member nodes that store data and metadata for discovery and access. By enriching metadata with provenance information, search and reuse of data is enhanced, and the 'social life' of data (being the product of many workflow runs, different people, etc.) is revealed. We are currently prototyping a provenance repository (PBase) to demonstrate what can be achieved with advanced provenance queries. The ProvExplorer and ProPub tools support advanced ad-hoc querying and visualization of provenance as well as customized provenance publications (e.g., to address privacy issues, or to focus provenance on relevant details). In a parallel line of work, we are exploring ways to add provenance support to widely-used scripting platforms (e.g. R and Python) and then expose that information via D-PROV.
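
    Retrospective provenance of the kind described here (timestamps, inputs, and outputs per executed step) can be captured with very little machinery. The sketch below records such observables via a decorator; the record layout is an illustrative assumption and does not reproduce the D-PROV schema.

        # Capture retrospective provenance automatically: one record per
        # executed workflow step. The record fields are illustrative only.
        import functools, time

        TRACE = []  # retrospective provenance trace

        def provenance(step_name):
            def wrap(func):
                @functools.wraps(func)
                def inner(*args, **kwargs):
                    start = time.time()
                    result = func(*args, **kwargs)
                    TRACE.append({"step": step_name, "start": start,
                                  "end": time.time(), "inputs": args,
                                  "output": result})
                    return result
                return inner
            return wrap

        @provenance("normalize")
        def normalize(values):
            m = max(values)
            return [v / m for v in values]

        normalize([2.0, 4.0])
        print(TRACE[0]["step"], TRACE[0]["output"])  # normalize [0.5, 1.0]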

  9. Cuban Value Orientations. Cultural Monograph Number 1. Bilingual Multicultural Education Training Project for School Psychologists and Guidance Counselors.

    ERIC Educational Resources Information Center

    Florida Univ., Gainesville. Coll. of Education.

    Information is provided in this monograph to facilitate understanding of Cuban cultural values and their implications for counseling limited English proficient students. Also provided is a beginning conceptual model for increasing the understanding of cross-cultural theory and practice. The overview of Cuban culture is based on a perspective of…

  10. Guidance for ePortfolio Researchers: A Case Study with Implications for the ePortfolio Domain

    ERIC Educational Resources Information Center

    Kennelly, Emily; Osborn, Debra; Reardon, Robert; Shetty, Becka

    2016-01-01

    This study examined whether using a career ePortfolio, including a matrix for identifying and reflecting on transferable skills, enabled students to rate their skills more confidently and positively after a simulated (mock) job interview. Three groups were studied: those completing the skills matrix in the ePortfolio; those using the…

  11. Title IX: An Overview of the Law for Students. A Student Guide to Equal Rights: Part 2.

    ERIC Educational Resources Information Center

    Wiegers, Nancy; And Others

    Title IX is a Federal law prohibiting discrimination in education on the basis of sex. This booklet was written to introduce students to the Law and its implications. Topics covered include: (1) schools affected by the regulations; (2) admissions to schools; (3) entrance to courses; (4) counseling and guidance; (5) extracurricular activities; (6)…

  12. It's All About the Data: Workflow Systems and Weather

    NASA Astrophysics Data System (ADS)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data-intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business process workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real-time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data-driven science, for discovery, quality determination, science reproducibility, and long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible under high-volume conditions, or the searchability and manageability of the resulting data products is disappointingly low. The provenance of a data product is a record of its lineage, or trace of the execution history that resulted in the product. The provenance of a forecast model result, for example, captures information about the executable version of the model, configuration parameters, input data products, execution environment, and owner. Provenance enables data to be properly attributed and captures critical parameters about the model run so the quality of the result can be ascertained. Proper provenance is essential to providing reproducible scientific computing results. Workflow languages used in science discovery are complete programming languages, and in theory can support any logic expressible by a programming language. The execution environments supporting the workflow engines, on the other hand, are subject to constraints on physical resources, and hence in practice the workflow task graphs used in science utilize relatively few of the cataloged workflow patterns. It is important to note that these workflows are executed on demand, and are executed once. Into this context is introduced the need for science discovery that is responsive to real-time information. If we can use simple programming models and abstractions to make scientific discovery involving real-time data accessible to specialists who share and utilize data across scientific domains, we bring science one step closer to solving the largest of human problems.

  13. Dose limits to the lens of the eye: International Basic Safety Standards and related guidance.

    PubMed

    Boal, T J; Pinak, M

    2015-06-01

    The International Atomic Energy Agency (IAEA) safety requirements: 'General Safety Requirements Part 3--Radiation protection and safety of radiation sources: International Basic Safety Standards' (BSS) was approved by the IAEA Board of Governors at its meeting in September 2011, and was issued as General Safety Requirements Part 3 in July 2014. The equivalent dose limit for the lens of the eye for occupational exposure in planned exposure situations was reduced from 150 mSv year(-1) to 20 mSv year(-1), averaged over defined periods of 5 years, with no annual dose in a single year exceeding 50 mSv. This reduction in the dose limit for the lens of the eye followed the recommendation of the International Commission on Radiological Protection in its statement on tissue reactions of 21 April 2011. IAEA has developed guidance on the implications of the new dose limit for the lens of the eye. This paper summarises the process that led to the inclusion of the new dose limit for the lens of the eye in the BSS, and the implications of the new dose limit.
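
    Stated compactly, with H_i denoting the equivalent dose to the lens of the eye in year i of a defined 5-year period, the revised occupational limit described above can be written as (a restatement of the figures in the abstract, not additional guidance):

        \frac{1}{5}\sum_{i=1}^{5} H_i \le 20\ \mathrm{mSv}
        \quad\Longleftrightarrow\quad
        \sum_{i=1}^{5} H_i \le 100\ \mathrm{mSv},
        \qquad
        H_i \le 50\ \mathrm{mSv}\quad\text{for each single year } i.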

  14. A strategy for selecting sexual partners believed to pose little/no risks for HIV: serosorting and its implications for HIV transmission.

    PubMed

    Eaton, Lisa A; Kalichman, Seth C; O'Connell, Daniel A; Karchner, William D

    2009-10-01

    A common HIV/AIDS risk reduction strategy among men who have sex with men (MSM) is to limit their unprotected sex partners to those who are of the same HIV status, a practice referred to as serosorting. Decisions to serosort for HIV risk reduction are based on personal impressions and beliefs, and there is limited guidance offered on this community-derived strategy from public health services. This paper reviews research on serosorting for HIV risk reduction and offers an evidence-based approach to serosorting guidance. Following a comprehensive electronic and manual literature search, we reviewed 51 studies relating to the implications of serosorting. Studies showed that HIV-negative MSM who select partners based on HIV status are inadvertently placing themselves at risk for HIV. Infrequent HIV testing, lack of HIV status disclosure, co-occurring sexually transmitted infections, and acute HIV infection impede the potential protective benefits of serosorting. Public health messages should continue to encourage reductions in numbers of sexual partners and increases in condom use. Risk reduction messages should also highlight the limitations of relying on one's own and partner's HIV status in making sexual risk decisions.

  15. Observing Clonal Dynamics across Spatiotemporal Axes: A Prelude to Quantitative Fitness Models for Cancer.

    PubMed

    McPherson, Andrew W; Chan, Fong Chun; Shah, Sohrab P

    2018-02-01

    The ability to accurately model evolutionary dynamics in cancer would allow for prediction of progression and response to therapy. As a prelude to quantitative understanding of evolutionary dynamics, researchers must gather observations of in vivo tumor evolution. High-throughput genome sequencing now provides the means to profile the mutational content of evolving tumor clones from patient biopsies. Together with the development of models of tumor evolution, reconstructing evolutionary histories of individual tumors generates hypotheses about the dynamics of evolution that produced the observed clones. In this review, we provide a brief overview of the concepts involved in predicting evolutionary histories, and provide a workflow based on bulk and targeted-genome sequencing. We then describe the application of this workflow to time series data obtained for transformed and progressed follicular lymphomas (FL), and contrast the observed evolutionary dynamics between these two subtypes. We next describe results from a spatial sampling study of high-grade serous (HGS) ovarian cancer, propose mechanisms of disease spread based on the observed clonal mixtures, and provide examples of diversification through subclonal acquisition of driver mutations and convergent evolution. Finally, we state implications of the techniques discussed in this review as a necessary but insufficient step on the path to predictive modelling of disease dynamics. Copyright © 2018 Cold Spring Harbor Laboratory Press; all rights reserved.

  16. A Guide for Designing and Analyzing RNA-Seq Data.

    PubMed

    Chatterjee, Aniruddha; Ahn, Antonio; Rodger, Euan J; Stockwell, Peter A; Eccles, Michael R

    2018-01-01

    The identity of a cell or an organism is at least in part defined by its gene expression, and therefore analyzing gene expression remains one of the most frequently performed experimental techniques in molecular biology. The development of the RNA-Sequencing (RNA-Seq) method provides an unprecedented opportunity to analyze the expression of protein-coding and noncoding RNA, and also enables de novo transcript assembly for a new species or organism. However, the planning and design of RNA-Seq experiments has important implications for addressing the desired biological question and maximizing the value of the data obtained. In addition, RNA-Seq generates a huge volume of data, and accurate analysis of this data involves several different steps and choices of tools. This can be challenging and overwhelming, especially for bench scientists. In this chapter, we describe an entire workflow for performing RNA-Seq experiments. We describe critical aspects of wet lab experiments such as RNA isolation, library preparation and the initial design of an experiment. Further, we provide a step-by-step description of the bioinformatics workflow for different steps involved in RNA-Seq data analysis. This includes power calculations, setting up a computational environment, acquisition and processing of publicly available data if desired, quality control measures, preprocessing steps for the raw data, differential expression analysis, and data visualization. We particularly mention important considerations for each step to provide a guide for designing and analyzing RNA-Seq data.
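
    To make the differential expression step concrete, the toy sketch below normalizes a count matrix to counts per million, then computes per-gene log2 fold changes and t-test p-values. It only shows the shape of the computation; the data are made up, and real analyses rely on dedicated packages with count-appropriate statistical models.

        # Toy differential-expression sketch: CPM normalization, log2 fold
        # change, and a per-gene t-test on a tiny made-up count matrix.
        import numpy as np
        from scipy import stats

        counts = np.array([[100, 120,  10,  12],    # gene A
                           [ 50,  55,  60,  58]])   # gene B; cols = 2 ctrl, 2 case
        libsize = counts.sum(axis=0)
        cpm = counts / libsize * 1e6                # counts per million
        logcpm = np.log2(cpm + 1)                   # +1 avoids log of zero

        ctrl, case = logcpm[:, :2], logcpm[:, 2:]
        log2fc = case.mean(axis=1) - ctrl.mean(axis=1)
        pvals = stats.ttest_ind(case, ctrl, axis=1).pvalue
        for g, fc, p in zip("AB", log2fc, pvals):
            print(f"gene {g}: log2FC={fc:+.2f}, p={p:.3f}")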

  17. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    NASA Astrophysics Data System (ADS)

    Bowen, S. R.; Nyflot, M. J.; Herrmann, C.; Groh, C. M.; Meyer, J.; Wollenweber, S. D.; Stearns, C. W.; Kinahan, P. E.; Sandison, G. A.

    2015-05-01

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10-20%, treatment planning errors were 5-10%, and treatment delivery errors were 5-30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5-10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning, and RT delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow. For a limited set of cases, the magnitude of errors was comparable during PET/CT imaging and treatment delivery without motion compensation. Errors were moderately mitigated during PET/CT imaging and significantly mitigated during RT delivery with motion compensation. This dynamic motion phantom end-to-end workflow provides a method for quality assurance of 4D PET/CT-guided radiotherapy, including evaluation of respiratory motion compensation methods during imaging and treatment delivery.
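
    The SUV-threshold step above is easy to illustrate: voxels whose uptake exceeds a fraction of the maximum define the target, and volume follows from the voxel size. In the sketch below, the 40%-of-maximum cutoff, the voxel size, and the synthetic image are all illustrative assumptions rather than the study's calibrated values.

        # SUV-threshold target segmentation on a synthetic uptake volume.
        # Cutoff fraction and voxel size are illustrative assumptions.
        import numpy as np

        suv = np.random.default_rng(0).gamma(2.0, 0.25, size=(64, 64, 32))
        suv[30:34, 30:34, 14:18] += 12.0     # synthetic "lesion" (64 voxels)

        threshold = 0.40 * suv.max()         # e.g., 40%-of-max convention
        target = suv >= threshold
        voxel_volume_ml = 0.4 ** 3           # 4 mm isotropic voxels; 1 cm^3 = 1 mL

        # Prints about 4.1 mL: the 64 synthetic lesion voxels.
        print(f"target volume: {target.sum() * voxel_volume_ml:.1f} mL")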

  18. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study.

    PubMed

    Bowen, S R; Nyflot, M J; Herrmann, C; Groh, C M; Meyer, J; Wollenweber, S D; Stearns, C W; Kinahan, P E; Sandison, G A

    2015-05-07

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [(18)F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10-20%, treatment planning errors were 5-10%, and treatment delivery errors were 5-30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5-10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning, and RT delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow. For a limited set of cases, the magnitude of errors was comparable during PET/CT imaging and treatment delivery without motion compensation. Errors were moderately mitigated during PET/CT imaging and significantly mitigated during RT delivery with motion compensation. This dynamic motion phantom end-to-end workflow provides a method for quality assurance of 4D PET/CT-guided radiotherapy, including evaluation of respiratory motion compensation methods during imaging and treatment delivery.

  19. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    PubMed Central

    Bowen, S R; Nyflot, M J; Hermann, C; Groh, C; Meyer, J; Wollenweber, S D; Stearns, C W; Kinahan, P E; Sandison, G A

    2015-01-01

    Effective positron emission tomography/computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by 6 different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy (VMAT) were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses (EUD), and 2%-2mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10–20%, treatment planning errors were 5–10%, and treatment delivery errors were 5–30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5–10% in PET/CT imaging, < 5% in treatment planning, and < 2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning, and RT delivery under a dose painting paradigm is feasible within an integrated respiratory motion phantom workflow. For a limited set of cases, the magnitude of errors was comparable during PET/CT imaging and treatment delivery without motion compensation. Errors were moderately mitigated during PET/CT imaging and significantly mitigated during RT delivery with motion compensation. This dynamic motion phantom end-to-end workflow provides a method for quality assurance of 4D PET/CT-guided radiotherapy, including evaluation of respiratory motion compensation methods during imaging and treatment delivery. PMID:25884892

  20. Design of a Tablet Computer App for Facilitation of a Molecular Blood Culture Test in Clinical Microbiology and Preliminary Usability Evaluation.

    PubMed

    Samson, Lasse L; Pape-Haugaard, Louise; Meltzer, Michelle C; Fuchs, Martin; Schønheyder, Henrik C; Hejlesen, Ole

    2016-03-18

    User mobility is an important aspect of the development of clinical information systems for health care professionals. Mobile phones and tablet computers have gained widespread use among health care professionals, offering an opportunity for supporting the access to patient information through specialized applications (apps) while supporting the mobility of the users. The use of apps for mobile phones and tablet computers may support workflow of complex tasks, for example, molecular-based diagnostic tests in clinical microbiology. Multiplex Blood Culture Test (MuxBCT) is a molecular-based diagnostic test used for rapid identification of pathogens in positive blood cultures. To facilitate the workflow of the MuxBCT, a specialized tablet computer app was developed as an accessory to the diagnostic test. The app aims to reduce the complexity of the test by step-by-step guidance of microscopy and to assist users in reaching an exact bacterial or fungal diagnosis based on blood specimen observations and controls. Additionally, the app allows for entry of test results, and communication thereof to the laboratory information system (LIS). The objective of the study was to describe the design considerations of the MuxBCT app and the results of a preliminary usability evaluation. The MuxBCT tablet app was developed and set up for use in a clinical microbiology laboratory. A near-live simulation study was conducted in the clinical microbiology laboratory to evaluate the usability of the MuxBCT app. The study was designed to achieve a high degree of realism as participants carried out a scenario representing the context of use for the MuxBCT app. As the MuxBCT was under development, the scenario involved the use of molecular blood culture tests similar to the MuxBCT for identification of microorganisms from positive blood culture samples. The study participants were observed, and their interactions with the app were recorded. After the study, the participants were debriefed to clarify observations. Four medical laboratory technicians, representative of the app's end users, participated in the clinical simulation study. Using the MuxBCT app, the study participants successfully identified and reported all microorganisms from the positive blood cultures examined. Three of the four participants reported that they found the app useful, while one study participant reported that she would prefer to make notes on paper and later enter them into the LIS. The preliminary usability evaluation results indicate that use of the MuxBCT tablet app can facilitate the workflow of the MuxBCT diagnostic test.

  1. The effects of leadership competencies and quality of work on the perceived readiness for organizational change among nurse managers.

    PubMed

    Al-Hussami, Mahmoud; Hamad, Sawsan; Darawad, Muhammad; Maharmeh, Mahmoud

    2017-10-02

    Purpose This paper aims to establish a leadership guidance program that can promote nurses' knowledge of leadership and, at the same time, enhance their leadership competencies and quality of work to promote their readiness for change in healthcare organizations. Design/methodology/approach A pre-experimental, one-group pretest-posttest design was utilized. Of the 90 nurses invited to participate in this study, 61 agreed to participate. Findings The statistical analyses suggested several significant differences between nurse managers' pretest and posttest scores for leadership competencies, quality of work and readiness for change. Yet, findings for the background characteristics were not significant and had no effect on the perceived readiness for change. Research limitations/implications The present study highlights the importance of leadership competencies and quality of work that healthcare policymakers identify for the success of organizational change efforts. Practical implications Healthcare policymakers, including directors of nursing, should focus on applications that increase leadership competencies and overall satisfaction of the nurse managers to support the changes in hospitals and support a learning organization. Hence, they should establish policies that decrease the possible negative impact of planned change efforts. Originality/value Competent nurse managers enhance their readiness for change, which in turn helps nurses in constructive change processes. A leadership guidance program should be set for nurse managers. This study has important implications for hospital administrators and directors of nursing.

  2. AstroGrid: Taverna in the Virtual Observatory .

    NASA Astrophysics Data System (ADS)

    Benson, K. M.; Walton, N. A.

    This paper reports on AstroGrid's implementation of the Taverna workbench, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analyses over public and private data while requiring computational resources as minimal as a desktop computer. Some integration issues and future work are also discussed in this article.

  3. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
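
    The subject-level parallelism described in this review is typically the easy win, since subjects are independent of one another. The sketch below runs a placeholder pipeline over subjects with a worker pool; the pipeline body is an assumption standing in for real post-processing steps, not any specific tool's API.

        # Process independent subjects concurrently with a worker pool.
        from multiprocessing import Pool

        def postprocess(subject_id: str) -> str:
            # Placeholder for e.g. motion correction -> normalization -> metrics.
            return f"{subject_id}: done"

        if __name__ == "__main__":
            subjects = [f"sub-{i:03d}" for i in range(1, 9)]
            with Pool(processes=4) as pool:     # 4 subjects in flight at a time
                for line in pool.map(postprocess, subjects):
                    print(line)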

  4. Differentiated protection services with failure probability guarantee for workflow-based applications

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Jin, Yaohui; Sun, Weiqiang; Hu, Weisheng

    2010-12-01

    A cost-effective and service-differentiated provisioning strategy is very desirable to service providers so that they can offer users satisfactory services while optimizing network resource allocation. Providing differentiated protection services to connections for surviving link failures has been extensively studied in recent years. However, differentiated protection services for workflow-based applications, which consist of many interdependent tasks, have scarcely been studied. This paper investigates the problem of providing differentiated services for workflow-based applications in optical grids. We develop three differentiated protection-service provisioning strategies that provide failure-probability guarantees and network-resource optimization for workflow-based applications. The simulation demonstrates that these heuristic algorithms provide protection cost-effectively while satisfying the applications' failure probability requirements.
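
    The failure-probability bookkeeping behind such guarantees can be sketched briefly. Assuming independent link failures, an unprotected path fails with probability 1 - prod(1 - p_i); protection (here, a disjoint backup path) is added when that exceeds the task's requirement. All numbers below are illustrative, not results from the paper.

        # Failure-probability check for a path, assuming independent links.
        import math

        def path_failure_prob(link_probs):
            return 1.0 - math.prod(1.0 - p for p in link_probs)

        primary = [0.01, 0.02, 0.01]     # per-link failure probabilities
        requirement = 0.005              # task's failure-probability bound

        p_fail = path_failure_prob(primary)
        print(f"unprotected: {p_fail:.4f}")          # ~0.0395 > requirement

        # With a disjoint backup path, both paths must fail simultaneously.
        backup = [0.02, 0.02]
        p_protected = p_fail * path_failure_prob(backup)
        print(f"protected: {p_protected:.6f} "
              f"({'meets' if p_protected <= requirement else 'violates'} bound)")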

  5. Open access to Water Indicators for Climate Change Adaptation: proof-of-concept for the Copernicus Climate Change Service (C3S)

    NASA Astrophysics Data System (ADS)

    Lottle, Lorna; Arheimer, Berit; Gyllensvärd, Frida; Dejong, Fokke; Ludwig, Fulco; Hutjes, Ronald; Martinez, Bernat

    2017-04-01

    The Copernicus Climate Change Service (C3S) is still in the development phase and will combine observations of the climate system with the latest science to develop authoritative, quality-assured information about the past, current and future states of the climate and climate-dependent sectors in Europe and worldwide. C3S will provide key indicators on climate change drivers and selected sectorial impacts; the aim of these indicators is to support adaptation and mitigation. This presentation will show one service that is already operational as a proof-of-concept of this future climate service. The project "Service for Water Indicators in Climate Change Adaptation" (SWICCA) has developed a sectorial information service for water management. It offers readily available climate-impact data for open access from the website http://swicca.climate.copernicus.eu/. The development is user-driven, with the overall goal to speed up the workflow in climate-change adaptation of water management across Europe. The service is co-designed by consulting engineers and agencies in 15 case studies spread out over the continent. SWICCA has an interactive user interface, which shows maps and graphs and facilitates data download in user-friendly formats. In total, more than 900 open datasets are given for various hydrometeorological (and a few socioeconomic) variables, model ensembles, resolutions, time periods and RCPs. The service offers more than 40 precomputed climate impact indicators (CIIs) and transient time series of 4 essential climate variables (ECVs) with high spatial and temporal resolution. To facilitate both near-future and far-future assessments, SWICCA provides the indicators for different time ranges; normally, absolute values are given for a reference period (e.g. 1971-2000) and the expected future changes for different 30-year periods, such as early century (2011-2040), mid-century (2041-2070) and end-century (2071-2100). An ensemble of model results is always given to indicate confidence in the estimates. The SWICCA demonstrator also includes user guidance, information sheets, tutorials, and links to other relevant websites. The aim of this service is to provide research data and guidance for climate impact assessments in the water sector. The main target group is consulting engineers (so-called purveyors) working with climate change adaptation in the water sector. By using indicators, climate impact assessments can be done without having to run a full production chain from raw climate model results; instead, the indicators can be included in the local workflow with local methods applied, to facilitate decision-making and strategies to meet the future. Working with real users will ensure that useful data are inserted into the C3S Climate Data Store (CDS).
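
    As a small illustration of how such a precomputed indicator is typically formed (the values are invented; this is not SWICCA code), the sketch below computes the relative change of a 30-year future period against the 1971-2000 reference for each ensemble member and reports the ensemble spread:

      import statistics

      def indicator_change(ref_values, future_values):
          # Relative change (%) of the future-period mean vs the reference mean.
          ref_mean = statistics.mean(ref_values)
          return 100.0 * (statistics.mean(future_values) - ref_mean) / ref_mean

      # Illustrative annual-mean discharge (m3/s) per ensemble member and period.
      ensemble = [
          {"1971-2000": [100, 105, 98], "2041-2070": [92, 95, 90]},
          {"1971-2000": [102, 99, 101], "2041-2070": [97, 94, 96]},
      ]
      changes = [indicator_change(m["1971-2000"], m["2041-2070"]) for m in ensemble]
      print(f"mid-century change: {statistics.mean(changes):.1f}% "
            f"(range {min(changes):.1f}% to {max(changes):.1f}%)")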

  6. Multi-method laboratory user evaluation of an actionable clinical performance information system: Implications for usability and patient safety.

    PubMed

    Brown, Benjamin; Balatsoukas, Panos; Williams, Richard; Sperrin, Matthew; Buchan, Iain

    2018-01-01

    Electronic audit and feedback (e-A&F) systems are used worldwide for care quality improvement. They measure health professionals' performance against clinical guidelines, and some systems suggest improvement actions. However, little is known about optimal interface designs for e-A&F, in particular how to present suggested actions for improvement. We developed a novel theory-informed system for primary care (the Performance Improvement plaN GeneratoR; PINGR) that covers the four principal interface components: clinical performance summaries; patient lists; detailed patient-level information; and suggested actions. As far as we are aware, this is the first report of an e-A&F system with all four interface components. (1) Use a combination of quantitative and qualitative methods to evaluate the usability of PINGR with target end-users; (2) refine existing design recommendations for e-A&F systems; (3) determine the implications of these recommendations for patient safety. We recruited seven primary care physicians to perform seven tasks with PINGR, during which we measured on-screen behaviour and eye movements. Participants subsequently completed usability questionnaires, and were interviewed in-depth. Data were integrated to: gain a more complete understanding of usability issues; enhance and explain each other's findings; and triangulate results to increase validity. Participants committed a median of 10 errors (range 8-21) when using PINGR's interface, and completed a median of five out of seven tasks (range 4-7). Errors violated six usability heuristics: clear response options; perceptual grouping and data relationships; representational formats; unambiguous description; visually distinct screens for confusable items; and workflow integration. Eye movement analysis revealed the integration of components largely supported effective user workflow, although the modular design of clinical performance summaries unnecessarily increased cognitive load. Interviews and questionnaires revealed PINGR is user-friendly, and that improved information prioritisation could further promote useful user action. Comparing our results with the wider usability literature we refine a previously published set of interface design recommendations for e-A&F. The implications for patient safety are significant regarding: user engagement; actionability; and information prioritisation. Our results also support adopting multi-method approaches in usability studies to maximise issue discovery and the credibility of findings. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Intra-operative fiducial-based CT/fluoroscope image registration framework for image-guided robot-assisted joint fracture surgery.

    PubMed

    Dagnino, Giulio; Georgilas, Ioannis; Morad, Samir; Gibbons, Peter; Tarassoli, Payam; Atkins, Roger; Dogramadzi, Sanja

    2017-08-01

    Joint fractures must be accurately reduced while minimising soft tissue damage to avoid negative surgical outcomes. In this regard, we have developed the RAFS surgical system, which allows the percutaneous reduction of intra-articular fractures and provides intra-operative real-time 3D image guidance to the surgeon. Earlier experiments showed the effectiveness of the RAFS system on phantoms, but also revealed key issues that precluded its use in a clinical application. This work proposes a redesign of the RAFS navigation system that overcomes the earlier version's issues, aiming to move the RAFS system into a surgical environment. The navigation system is improved through an image registration framework allowing the intra-operative registration between pre-operative CT images and intra-operative fluoroscopic images of a fractured bone using a custom-made fiducial marker. The objective of the registration is to estimate the relative pose between a bone fragment and an orthopaedic manipulation pin inserted into it intra-operatively. The actual pose of the bone fragment can then be updated in real time using an optical tracker, enabling the image guidance. Experiments on phantoms and cadavers demonstrated the accuracy and reliability of the registration framework, showing a reduction accuracy (sTRE) of about [Formula: see text] (phantom) and [Formula: see text] (cadavers). Four distal femur fractures were successfully reduced in cadaveric specimens using the improved navigation system and the RAFS system following the new clinical workflow (reduction error [Formula: see text], [Formula: see text]). Experiments showed the feasibility of the image registration framework. It was successfully integrated into the navigation system, allowing the use of the RAFS system in a realistic surgical application.
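
    The real-time pose update reduces to composing rigid transforms: the registration yields the (fixed) pose of the bone fragment relative to the pin, and the optical tracker streams the pin pose, so the fragment pose follows by matrix multiplication. A minimal numpy sketch under these assumptions (the values are placeholders, not the RAFS implementation):

      import numpy as np

      def update_bone_pose(T_pin_world, T_bone_in_pin):
          # 4x4 homogeneous transforms: world<-pin composed with pin<-bone.
          return T_pin_world @ T_bone_in_pin

      T_bone_in_pin = np.eye(4)
      T_bone_in_pin[:3, 3] = [10.0, 0.0, 5.0]     # fragment offset from pin (mm)
      T_pin_world = np.eye(4)
      T_pin_world[:3, 3] = [200.0, 150.0, 80.0]   # tracked pin pose (mm)
      print(update_bone_pose(T_pin_world, T_bone_in_pin)[:3, 3])  # [210. 150. 85.]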

  8. Comparative ergonomic workflow and user experience analysis of MRI versus fluoroscopy-guided vascular interventions: an iliac angioplasty exemplar case study.

    PubMed

    Fernández-Gutiérrez, Fabiola; Martínez, Santiago; Rube, Martin A; Cox, Benjamin F; Fatahi, Mahsa; Scott-Brown, Kenneth C; Houston, J Graeme; McLeod, Helen; White, Richard D; French, Karen; Gueorguieva, Mariana; Immel, Erwin; Melzer, Andreas

    2015-10-01

    A methodological framework is introduced to assess and compare a conventional fluoroscopy protocol for peripheral angioplasty with a new magnetic resonance imaging (MRI)-guided protocol. Different scenarios were considered during interventions on a perfused arterial phantom with regard to time-based and cognitive task analysis, user experience and ergonomics. Three clinicians with different expertise performed a total of 43 simulated common iliac angioplasties (9 fluoroscopic, 34 MRI-guided) in two blocks of sessions. Six different configurations for MRI guidance were tested in the first block. Four of them were evaluated in the second block and compared to the fluoroscopy protocol. The durations of the relevant stages were collected, and interventions were audio-visually recorded from different perspectives. A cued retrospective protocol analysis (CRPA) was undertaken, including personal interviews. In addition, ergonomic constraints in the MRI suite were evaluated. Significant differences were found when comparing the performance of the MRI configurations versus fluoroscopy. Two configurations [with times of 8.56 (0.64) and 9.48 (1.13) min] reduced the procedure time for MRI guidance to a level comparable to fluoroscopy [8.49 (0.75) min]. The CRPA pointed out the main factors influencing clinical procedure performance. The ergonomic analysis quantified musculoskeletal risks for interventional radiologists when utilising MRI. Several alternatives were suggested to prevent potential low-back injuries. This work presents a step towards the implementation of efficient operational protocols for MRI-guided procedures based on an integral and multidisciplinary framework, applicable to the assessment of current vascular protocols. The use of the first-user perspective raises the possibility of establishing new forms of clinical training and education.

  9. Retractor-induced brain shift compensation in image-guided neurosurgery

    NASA Astrophysics Data System (ADS)

    Fan, Xiaoyao; Ji, Songbai; Hartov, Alex; Roberts, David; Paulsen, Keith

    2013-03-01

    In image-guided neurosurgery, intraoperative brain shift significantly degrades the accuracy of neuronavigation that is solely based on preoperative magnetic resonance images (pMR). To compensate for brain deformation and to maintain the accuracy in image guidance achieved at the start of surgery, biomechanical models have been developed to simulate brain deformation and to produce model-updated MR images (uMR) that compensate for brain shift. To date, most studies have focused on shift compensation at early stages of surgery (i.e., updated images are only produced after craniotomy and durotomy). Simulating surgical events at later stages, such as retraction and tissue resection, is perhaps clinically more relevant because of the typically much larger magnitudes of brain deformation. However, these surgical events are substantially more complex in nature, thereby posing significant challenges to model-based brain shift compensation strategies. In this study, we present results from an initial investigation to simulate retractor-induced brain deformation through a biomechanical finite element (FE) model in which whole-brain deformation assimilated from intraoperative data was used to produce uMR for improved accuracy in image guidance. Specifically, intensity-encoded 3D surface profiles of the exposed cortical area were reconstructed from intraoperative stereovision (iSV) images before and after tissue retraction. Retractor-induced surface displacements were then derived by coregistering the surfaces, and they served as sparse displacement data to drive the FE model. With one patient case, we show that our technique is able to produce uMR that agrees well with the reconstructed iSV surface after retraction. The computational cost to simulate retractor-induced brain deformation was approximately 10 min. In addition, our approach introduces minimal interruption to the surgical workflow, suggesting the potential for its clinical application.
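
    The sparse displacement data can be sketched simply: pair each pre-retraction cortical surface point with its closest post-retraction point and take the difference vectors. A hedged numpy/scipy illustration (nearest-neighbor pairing is a simplification of the paper's coregistration):

      import numpy as np
      from scipy.spatial import cKDTree

      def sparse_displacements(surface_before, surface_after):
          # Pair each pre-retraction point with its nearest post-retraction
          # point; the difference vectors drive the FE model as sparse data.
          _, idx = cKDTree(surface_after).query(surface_before)
          return surface_after[idx] - surface_before

      rng = np.random.default_rng(0)
      before = rng.uniform(0.0, 50.0, size=(500, 3))     # cortical points (mm)
      after = before + np.array([0.0, 0.0, -3.0])        # 3 mm retraction shift
      print(sparse_displacements(before, after).mean(axis=0))  # roughly [0, 0, -3]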

  10. The European Society of Therapeutic Radiology and Oncology-European Institute of Radiotherapy (ESTRO-EIR) report on 3D CT-based in-room image guidance systems: a practical and technical review and guide.

    PubMed

    Korreman, Stine; Rasch, Coen; McNair, Helen; Verellen, Dirk; Oelfke, Uwe; Maingon, Philippe; Mijnheer, Ben; Khoo, Vincent

    2010-02-01

    The past decade has provided many technological advances in radiotherapy. The European Institute of Radiotherapy (EIR) was established by the European Society of Therapeutic Radiology and Oncology (ESTRO) to provide current consensus statements with evidence-based and pragmatic guidelines on topics of practical relevance for radiation oncology. This report focuses primarily on 3D CT-based in-room image guidance (3DCT-IGRT) systems. It provides an overview and the current standing of 3DCT-IGRT systems, addressing the rationale, objectives, principles, applications, and process pathways, both clinical and technical, for treatment delivery and quality assurance. These are reviewed for four categories of solutions: kV CT and kV CBCT (cone-beam CT) as well as MV CT and MV CBCT. It also provides a framework and checklist to consider the capability and functionality of these systems as well as the resources needed for implementation. Two different but typical clinical cases (tonsillar and prostate cancer) using 3DCT-IGRT are illustrated with workflow processes via feedback questionnaires from several large clinical centres currently utilizing these systems. The feedback from these clinical centres demonstrates a wide variability based on local practices. This report, whilst comprehensive, is not exhaustive, as this area of development remains a very active field for research and development. However, it should serve as a practical guide and framework for all professional groups within the field, focussed on clinicians, physicists and radiation therapy technologists interested in IGRT. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  11. Development of an autonomous treatment planning strategy for radiation therapy with effective use of population-based prior data.

    PubMed

    Wang, Huan; Dong, Peng; Liu, Hongcheng; Xing, Lei

    2017-02-01

    Current treatment planning remains a costly and labor-intensive procedure and requires multiple trial-and-error adjustments of system parameters such as the weighting factors and prescriptions. The purpose of this work is to develop an autonomous treatment planning strategy that makes effective use of prior knowledge within a clinically realistic treatment planning platform to facilitate the radiation therapy workflow. Our technique consists of three major components: (i) a clinical treatment planning system (TPS); (ii) a decision function constructed from an ensemble of prior treatment plans; and (iii) a plan evaluator and an outer-loop optimization, independent of the clinical TPS, that assess the TPS-generated plan and drive the search toward a solution optimizing the decision function. Microsoft (MS) Visual Studio Coded UI is applied to record common planner-TPS interactions as subroutines for querying and interacting with the TPS. These subroutines are invoked by the outer-loop optimization program to navigate the plan selection process through the solution space iteratively. The utility of the approach is demonstrated using clinical prostate and head-and-neck cases. An autonomous treatment planning technique making effective use of an ensemble of prior treatment plans is thus developed to automatically maneuver the clinical treatment planning process on the platform of a commercial TPS. The process mimics the decision-making process of a human planner and provides a clinically sensible treatment plan automatically, thus reducing or eliminating the tedious manual trial and error of treatment planning. The prostate and head-and-neck treatment plans generated with this approach compare favorably with those used for the patients' actual treatments. The clinical inverse treatment planning process can be automated effectively with the guidance of an ensemble of prior treatment plans. The approach has the potential to significantly improve the radiation therapy workflow. © 2016 American Association of Physicists in Medicine.
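
    The outer loop can be pictured as a derivative-free search over TPS parameters, scored by a decision function built from prior plans. The following Python sketch is a schematic stand-in only (the TPS call and the scoring are stubs, not the paper's Coded UI subroutines or decision function):

      import random

      def decision_function(plan, prior_plans):
          # Score a candidate by its squared distance to the closest prior plan
          # in a space of dose metrics (illustrative knowledge-based score).
          return min(sum((plan[k] - p[k]) ** 2 for k in plan) for p in prior_plans)

      def run_tps(weights):
          # Stub for the clinical TPS run driven via recorded UI subroutines.
          return {"ptv_d95": 78.0 + weights["ptv"], "oar_mean": 30.0 - weights["oar"]}

      prior_plans = [{"ptv_d95": 79.2, "oar_mean": 26.5}]
      weights, best_score = {"ptv": 1.0, "oar": 1.0}, float("inf")
      random.seed(0)
      for _ in range(50):  # outer-loop search over weighting factors
          trial = {k: max(0.0, v + random.uniform(-0.5, 0.5)) for k, v in weights.items()}
          score = decision_function(run_tps(trial), prior_plans)
          if score < best_score:
              weights, best_score = trial, score
      print(weights, best_score)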

  12. MO-E-BRD-01: Is Non-Invasive Image-Guided Breast Brachytherapy Good?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiatt, J.

    2015-06-15

    Is Non-invasive Image-Guided Breast Brachytherapy Good? – Jess Hiatt, MS: Non-invasive Image-Guided Breast Brachytherapy (NIBB) is an emerging therapy for breast boost treatments as well as Accelerated Partial Breast Irradiation (APBI) using HDR surface breast brachytherapy. NIBB allows for smaller treatment volumes while maintaining optimal target coverage. Considering the real-time image-guidance and immobilization provided by the NIBB modality, minimal margins around the target tissue are necessary. Accelerated Partial Breast Irradiation in brachytherapy: is shorter better? – Dorin Todor, PhD, VCU: A review of balloon and strut devices will be provided together with the origins of APBI: the interstitial multi-catheter implant. A dosimetric and radiobiological perspective will help point out the evolution in breast brachytherapy, both in terms of devices and the protocols/clinical trials under which these devices are used. Improvements in imaging, delivery modalities and convenience are among the factors driving the ultrashort fractionation schedules, but our understanding of both local control and toxicities associated with various treatments is lagging. A comparison between various schedules, from a radiobiological perspective, will be given together with a critical analysis of the issues: to review and understand the evolution and development of APBI using brachytherapy methods; to understand the basis and limitations of radio-biological 'equivalence' between fractionation schedules; and to review commonly used and proposed fractionation schedules. Intra-operative breast brachytherapy: is one stop shopping best? – Bruce Libby, PhD, University of Virginia: A review of intraoperative breast brachytherapy will be presented, including the Targit-A and other trials that have used electronic brachytherapy. More modern approaches, in which the lumpectomy procedure is integrated into an APBI workflow, will also be discussed. Learning Objectives: To review past and current clinical trials for IORT; to discuss the lumpectomy-scan-plan-treat workflow for IORT.

  13. Technical Note: PLASTIMATCH MABS, an open source tool for automatic image segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaffino, Paolo; Spadea, Maria Francesca

    Purpose: Multiatlas based segmentation is widely used in many clinical and research applications. Due to its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, a software package without restrictions on the anatomical district and image modality has still been missing. In this paper we introduce PLASTIMATCH MABS, an open source software that can be used with any image modality for automatic segmentation. Methods: The PLASTIMATCH MABS workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch by using the same parameters as identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy applications, and magnetic resonance image brain labeling for neuroscience investigation. Results: For the neurological study, the minimum Dice coefficient was equal to 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, the minimum Dice coefficient was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the other ones (mandible, brainstem, and parotid glands). The time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min). Conclusions: The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both commercial and research) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparing against other segmentation algorithms.
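
    For the online phase, the simplest voting criterion is a per-voxel majority vote over the atlas labels after each atlas has been registered and warped to the new patient. A minimal numpy sketch of that step (warping is assumed already done; this is not PLASTIMATCH code):

      import numpy as np

      def majority_vote(warped_labels):
          # warped_labels: integer label volumes already deformed into the
          # new patient's space by the offline-tuned registrations.
          stack = np.stack(warped_labels)          # (n_atlases, ...) array
          n_labels = int(stack.max()) + 1
          counts = np.apply_along_axis(
              lambda votes: np.bincount(votes, minlength=n_labels), 0, stack)
          return counts.argmax(axis=0)             # most-voted label per voxel

      a1 = np.array([[0, 1], [2, 2]])
      a2 = np.array([[0, 1], [2, 0]])
      a3 = np.array([[0, 2], [2, 2]])
      print(majority_vote([a1, a2, a3]))           # [[0 1] [2 2]]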

  14. MO-E-BRD-03: Intra-Operative Breast Brachytherapy: Is One Stop Shopping Best? [Non-invasive Image-Guided Breast Brachytherapy]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Libby, B.

    2015-06-15

    Is Non-invasive Image-Guided Breast Brachytherapy Good? – Jess Hiatt, MS: Non-invasive Image-Guided Breast Brachytherapy (NIBB) is an emerging therapy for breast boost treatments as well as Accelerated Partial Breast Irradiation (APBI) using HDR surface breast brachytherapy. NIBB allows for smaller treatment volumes while maintaining optimal target coverage. Considering the real-time image-guidance and immobilization provided by the NIBB modality, minimal margins around the target tissue are necessary. Accelerated Partial Breast Irradiation in brachytherapy: is shorter better? – Dorin Todor, PhD, VCU: A review of balloon and strut devices will be provided together with the origins of APBI: the interstitial multi-catheter implant. A dosimetric and radiobiological perspective will help point out the evolution in breast brachytherapy, both in terms of devices and the protocols/clinical trials under which these devices are used. Improvements in imaging, delivery modalities and convenience are among the factors driving the ultrashort fractionation schedules, but our understanding of both local control and toxicities associated with various treatments is lagging. A comparison between various schedules, from a radiobiological perspective, will be given together with a critical analysis of the issues: to review and understand the evolution and development of APBI using brachytherapy methods; to understand the basis and limitations of radio-biological 'equivalence' between fractionation schedules; and to review commonly used and proposed fractionation schedules. Intra-operative breast brachytherapy: is one stop shopping best? – Bruce Libby, PhD, University of Virginia: A review of intraoperative breast brachytherapy will be presented, including the Targit-A and other trials that have used electronic brachytherapy. More modern approaches, in which the lumpectomy procedure is integrated into an APBI workflow, will also be discussed. Learning Objectives: To review past and current clinical trials for IORT; to discuss the lumpectomy-scan-plan-treat workflow for IORT.

  15. MO-E-BRD-02: Accelerated Partial Breast Irradiation in Brachytherapy: Is Shorter Better?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todor, D.

    2015-06-15

    Is Non-invasive Image-Guided Breast Brachytherapy Good? – Jess Hiatt, MS: Non-invasive Image-Guided Breast Brachytherapy (NIBB) is an emerging therapy for breast boost treatments as well as Accelerated Partial Breast Irradiation (APBI) using HDR surface breast brachytherapy. NIBB allows for smaller treatment volumes while maintaining optimal target coverage. Considering the real-time image-guidance and immobilization provided by the NIBB modality, minimal margins around the target tissue are necessary. Accelerated Partial Breast Irradiation in brachytherapy: is shorter better? – Dorin Todor, PhD, VCU: A review of balloon and strut devices will be provided together with the origins of APBI: the interstitial multi-catheter implant. A dosimetric and radiobiological perspective will help point out the evolution in breast brachytherapy, both in terms of devices and the protocols/clinical trials under which these devices are used. Improvements in imaging, delivery modalities and convenience are among the factors driving the ultrashort fractionation schedules, but our understanding of both local control and toxicities associated with various treatments is lagging. A comparison between various schedules, from a radiobiological perspective, will be given together with a critical analysis of the issues: to review and understand the evolution and development of APBI using brachytherapy methods; to understand the basis and limitations of radio-biological 'equivalence' between fractionation schedules; and to review commonly used and proposed fractionation schedules. Intra-operative breast brachytherapy: is one stop shopping best? – Bruce Libby, PhD, University of Virginia: A review of intraoperative breast brachytherapy will be presented, including the Targit-A and other trials that have used electronic brachytherapy. More modern approaches, in which the lumpectomy procedure is integrated into an APBI workflow, will also be discussed. Learning Objectives: To review past and current clinical trials for IORT; to discuss the lumpectomy-scan-plan-treat workflow for IORT.

  16. Implementation of workflow engine technology to deliver basic clinical decision support functionality.

    PubMed

    Huser, Vojtech; Rasmussen, Luke V; Oberg, Ryan; Starren, Justin B

    2011-04-10

    Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present an application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment, and the challenge of user-friendly representation of clinical logic. We present our implementation of a workflow engine technology that addresses these two challenges in delivering clinical decision support. Our system is based on a cross-industry XML (extensible markup language) standard, the XML Process Definition Language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for executing those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode, where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms, due to the lack of standardization of EHR systems in this area. We present the results of our evaluation of the flowchart-based graphical notation as well as an architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform that evaluates well against an established clinical decision support architecture evaluation framework. Given the cross-industry usage of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform.
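
    The flowchart-based idea can be conveyed with a toy step-based engine: each node holds a test and the branch to take, so the same definition can be replayed over retrospective records or driven by live events. This is a schematic sketch only, not the XPDL suite's actual representation:

      # Toy flowchart: screen retrospective records against one decision rule.
      RULE = {
          "start": {"test": lambda pt: pt["ldl"] > 130, "yes": "alert", "no": None},
          "alert": {"action": lambda pt: f"review lipid therapy for {pt['id']}"},
      }

      def run(rule, patient):
          node = rule["start"]
          nxt = rule[node["yes"]] if node["test"](patient) else None
          return nxt["action"](patient) if nxt else None

      retrospective = [{"id": "p1", "ldl": 160}, {"id": "p2", "ldl": 110}]
      print([run(RULE, p) for p in retrospective])
      # -> ['review lipid therapy for p1', None]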

  17. A knowledge-based decision support system in bioinformatics: an application to protein complex extraction

    PubMed Central

    2013-01-01

    Background We introduce a Knowledge-based Decision Support System (KDSS) to address the protein complex extraction problem. Using a Knowledge Base (KB) coding the expertise about the proposed scenario, our KDSS is able to suggest both strategies and tools according to the features of the input dataset. Our system provides a navigable workflow for the current experiment, and furthermore it offers support in the configuration and running of every processing component of that workflow. This last feature makes our system a crossover between classical DSSs and workflow management systems. Results We briefly present the KDSS architecture and the basic concepts used in the design of the knowledge base and the reasoning component. The system is then tested using a subset of the Saccharomyces cerevisiae protein-protein interaction dataset. We used this subset because it has been well studied in the literature by several research groups in the field of complex extraction: in this way we could easily compare the results obtained through our KDSS with theirs. Our system suggests both a preprocessing and a clustering strategy, and for each of them it proposes, and eventually runs, suitable algorithms. Our system's final results are then composed of a workflow of tasks, which can be reused for other experiments, and the specific numerical results for that particular trial. Conclusions The proposed approach, using the KDSS knowledge base, provides a novel workflow that gives the best results with regard to the other workflows produced by the system. This workflow and its numeric results have been compared with other approaches to PPI network analysis found in the literature, offering similar results. PMID:23368995

  18. Workflow Dynamics and the Imaging Value Chain: Quantifying the Effect of Designating a Nonimage-Interpretive Task Workflow.

    PubMed

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J

    To assess the impact of separate non-image-interpretive task (NIT) and image-interpretive task (IIT) workflows in an academic neuroradiology practice. A prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording the frequency and duration of tasks performed. Tasks were categorized into separate IIT and NIT workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant (CA) responsible for NITs. Pre- and post-intervention data were compared. Following separation of the IIT and NIT workflows, time spent on IITs by the primary fellow increased from 53.8% to 73.2%, while time on NITs decreased from 20.4% to 4.4%. The mean duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific NITs, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The consult assistant experienced 29.4 task-switching events (TSEs) per hour. Rates of specific NITs for the CA were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into NIT and IIT workflows substantially increased image interpretation time and decreased TSEs for the primary fellow. Consolidation of NITs into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    Last year at RSNA we presented an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote-control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to perform DICOM Query and C-Move requests by a physician from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for offsite distribution. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented in a clinical environment and evaluated. The PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in healthcare delivery both within the radiology department and offsite by improving their clinical workflow.
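
    In today's terms, the PACS side of such a request can be sketched with the pynetdicom library (a modern stand-in, not the original PDA application; the AE titles, address, and UID below are placeholders):

      from pydicom.dataset import Dataset
      from pynetdicom import AE
      from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelMove

      # Ask the PACS archive to push one study to a CD-burning device's AE title.
      ae = AE(ae_title="PDA_APP")
      ae.add_requested_context(StudyRootQueryRetrieveInformationModelMove)

      ds = Dataset()
      ds.QueryRetrieveLevel = "STUDY"
      ds.StudyInstanceUID = "1.2.840.113619.2.55.3"  # placeholder UID

      assoc = ae.associate("pacs.example.org", 104)
      if assoc.is_established:
          responses = assoc.send_c_move(
              ds, "CD_BURNER", StudyRootQueryRetrieveInformationModelMove)
          for status, _identifier in responses:
              print(status)
          assoc.release()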

  20. Systematic Review of Medical Informatics-Supported Medication Decision Making.

    PubMed

    Melton, Brittany L

    2017-01-01

    This systematic review sought to assess the applications and implications of current medical informatics-based decision support systems related to medication prescribing and use. Studies published between January 2006 and July 2016 which were indexed in PubMed and written in English were reviewed, and 39 studies were ultimately included. Most of the studies looked at computerized provider order entry or clinical decision support systems. Most studies examined decision support systems as a means of reducing errors or risk, particularly associated with medication prescribing, whereas a few studies evaluated the impact medical informatics-based decision support systems have on workflow or operations efficiency. Most studies identified benefits associated with decision support systems, but some indicate there is room for improvement.

  1. INA-Rxiv: The Missing Puzzle in Indonesia’s Scientific Publishing Workflow

    NASA Astrophysics Data System (ADS)

    Rahim, R.; Irawan, D. E.; Zulfikar, A.; Hardi, R.; Arliman S, L.; Gultom, E. R.; Ginting, G.; Wahyuni, S. S.; Mesran, M.; Mahjudin, M.; Saputra, I.; Waruwu, F. T.; Suginam, S.; Buulolo, E.; Abraham, J.

    2018-04-01

    INA-Rxiv is the first Indonesian preprint server, marking a new development initiated by the open science community. This study aimed at describing the development of INA-Rxiv and the conversations around it. It used the analyzer of Inarxiv.id, WhatsApp Group Analyzer, and Twitter Analytics as the tools for data analysis, complemented with observation. The results showed that INA-Rxiv users are growing because of the numerous discussions in social media, e.g. WhatsApp, as well as other positive responses of writers who have been using INA-Rxiv. The perspective of a growth mindset and the implication of the INA-Rxiv movement for filling the gap in accelerating the scientific dissemination process are presented at the end of this article.

  2. Job-sharing in nuclear medicine: an 8-year experience (1998-2006).

    PubMed

    Als, Claudine; Brautigam, Peter

    2006-01-01

    Job-sharing is generally defined as a situation in which a single professional position is held in common by two separate individuals, who deal alternately, on a scheduled basis, with the workload and the responsibilities. The aim of the present paper is to discuss the prerequisites and characteristics of job-sharing by medical doctors and its implications in a department of nuclear medicine. Job-sharing facilitates the combination of family life with professional occupation and prevents burnout. The time schedule applied by job-sharers is relevant: will both partners work for half-days, half-weeks, or alternately during one to two consecutive weeks? This crucial choice, depending on personal as well as professional circumstances, certainly influences the workflow of the department.

  3. A Parental Health Education Model of Children's Food Consumption: Influence on Children's Attitudes, Intention, and Consumption of Healthy and Unhealthy Foods.

    PubMed

    Lwin, May O; Shin, Wonsun; Yee, Andrew Z H; Wardoyo, Reidinar Juliane

    2017-05-01

    This study proposes that parental mediation of television advertising and parental guidance of food consumption differentially influence children's attitude, intention, and behavior toward the consumption of healthy and unhealthy foods. Structural equation modeling based on a survey of 1,119 children aged 9-12 supported our model, revealing that parental education strategies influence children's food consumption in a complex manner that is highly context-dependent. Parental guidance of food consumption enhanced children's healthy food attitude and intention to consume, while reducing the intention to consume unhealthy food. However, parental mediation of television advertising influenced unhealthy food attitude to a greater extent than healthy food attitude. Implications for health promotion and education, as well as parents and policy makers are discussed.

  4. Destination bedside: using research findings to visualize optimal unit layouts and health information technology in support of bedside care.

    PubMed

    Watkins, Nicholas; Kennedy, Mary; Lee, Nelson; O'Neill, Michael; Peavey, Erin; Ducharme, Maria; Padula, Cynthia

    2012-05-01

    This study explored the impact of unit design and healthcare information technology (HIT) on nursing workflow and patient-centered care (PCC). Healthcare information technology and unit layout-related predictors of nursing workflow and PCC were measured during a 3-phase study involving questionnaires and work sampling methods. Stepwise multiple linear regressions demonstrated several HIT and unit layout-related factors that impact nursing workflow and PCC.

  5. Using location tracking data to assess efficiency in established clinical workflows.

    PubMed

    Meyer, Mark; Fairbrother, Pamela; Egan, Marie; Chueh, Henry; Sandberg, Warren S

    2006-01-01

    Location tracking systems are becoming more prevalent in clinical settings, yet applications are still not common. We have designed a system to aid in the assessment of clinical workflow efficiency. Location data are captured from active RFID tags and processed into usable data. These data are stored and presented visually with trending capability over time. The system allows quick assessments of the impact of process changes on workflow, and isolates areas for improvement.

  6. Explore Care Pathways of Colorectal Cancer Patients with Social Network Analysis.

    PubMed

    Huo, Tianyao; George, Thomas J; Guo, Yi; He, Zhe; Prosperi, Mattia; Modave, François; Bian, Jiang

    2017-01-01

    Patients with colorectal cancer (CRC) often face treatment delays, and the exact reasons have not been well studied. This study explores clinical workflow patterns for CRC patients using electronic health records (EHRs). In particular, we modeled the clinical workflow (provider-provider interactions) of a CRC patient's workup period as a social network, and identified clusters of workflow patterns based on network characteristics. Understanding these patterns will help guide healthcare policy-making and practice.
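
    A hedged sketch of the network construction (synthetic encounters and networkx for the graph; this is not the study's EHR pipeline): providers become nodes, and an edge links two providers who cared for the same patient during the workup period.

      import networkx as nx

      # Synthetic (patient, provider) encounter records.
      encounters = [("pt1", "surgeon_A"), ("pt1", "oncologist_B"),
                    ("pt2", "gp_C"), ("pt2", "oncologist_B")]

      G = nx.Graph()
      seen = {}
      for patient, provider in encounters:
          for other in seen.setdefault(patient, []):
              G.add_edge(provider, other)   # shared patient -> interaction edge
          seen[patient].append(provider)

      # Network characteristics of the kind used to cluster workflow patterns.
      print(nx.density(G), sorted(nx.degree_centrality(G).items()))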

  7. Aligning HST Images to Gaia: A Faster Mosaicking Workflow

    NASA Astrophysics Data System (ADS)

    Bajaj, V.

    2017-11-01

    We present a fully programmatic workflow for aligning HST images using the high-quality astrometry provided by Gaia Data Release 1. Code provided in a Jupyter Notebook works through this procedure, including parsing the data to determine the query area parameters, querying Gaia for the coordinate catalog, and using the catalog as the reference catalog for TweakReg. This workflow greatly simplifies the normally time-consuming process of aligning HST images, especially those taken as part of mosaics.
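
    A condensed sketch of the notebook's two key steps, assuming the astroquery and drizzlepac packages (the sky coordinates and file pattern are placeholders):

      from astroquery.gaia import Gaia

      # Query Gaia DR1 sources within a circle covering the mosaic footprint;
      # in the notebook the center and radius are parsed from the image headers.
      query = """SELECT ra, dec FROM gaiadr1.gaia_source
                 WHERE CONTAINS(POINT('ICRS', ra, dec),
                                CIRCLE('ICRS', 150.1, 2.2, 0.1)) = 1"""
      table = Gaia.launch_job_async(query).get_results()
      table["ra", "dec"].write("gaia_refcat.txt", format="ascii.commented_header")

      # Align the HST exposures to the Gaia positions with TweakReg.
      from drizzlepac import tweakreg
      tweakreg.TweakReg("*_flc.fits", refcat="gaia_refcat.txt",
                        updatehdr=True, interactive=False)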

  8. msCompare: A Framework for Quantitative Analysis of Label-free LC-MS Data for Comparative Candidate Biomarker Studies*

    PubMed Central

    Hoekman, Berend; Breitling, Rainer; Suits, Frank; Bischoff, Rainer; Horvatovich, Peter

    2012-01-01

    Data processing forms an integral part of biomarker discovery and contributes significantly to the ultimate result. To compare and evaluate various publicly available open source label-free data processing workflows, we developed msCompare, a modular framework that allows the arbitrary combination of different feature detection/quantification and alignment/matching algorithms in conjunction with a novel scoring method to evaluate their overall performance. We used msCompare to assess the performance of workflows built from modules of publicly available data processing packages such as SuperHirn, OpenMS, and MZmine and our in-house developed modules on peptide-spiked urine and trypsin-digested cerebrospinal fluid (CSF) samples. We found that the quality of results varied greatly among workflows, and interestingly, heterogeneous combinations of algorithms often performed better than the homogeneous workflows. Our scoring method showed that the union of feature matrices of different workflows outperformed the original homogeneous workflows in some cases. msCompare is open source software (https://trac.nbic.nl/mscompare), and we provide a web-based data processing service for our framework by integration into the Galaxy server of the Netherlands Bioinformatics Center (http://galaxy.nbic.nl/galaxy) to allow scientists to determine which combination of modules provides the most accurate processing for their particular LC-MS data sets. PMID:22318370

  9. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure

    PubMed Central

    2016-01-01

    Background Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. Objective The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. Methods We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. Results We identified 5 high-level macrocognitive processes affecting medication management—sensemaking, planning, coordination, monitoring, and decision making—and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Conclusions Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation. PMID:27733331

  10. Towards an intelligent hospital environment: OR of the future.

    PubMed

    Sutherland, Jeffrey V; van den Heuvel, Willem-Jan; Ganous, Tim; Burton, Matthew M; Kumar, Animesh

    2005-01-01

    Patients, providers, payers, and government demand more effective and efficient healthcare services, and the healthcare industry needs innovative ways to re-invent core processes. Business process reengineering (BPR) showed that adopting new hospital information systems can leverage this transformation and that workflow management technologies can automate process management. Our research indicates that workflow technologies in healthcare require real-time patient monitoring, detection of adverse events, and adaptive responses to breakdowns in normal processes. Adaptive workflow systems are rarely implemented, making current workflow implementations inappropriate for healthcare. The advent of evidence-based medicine, guideline-based practice, and a better understanding of cognitive workflow, combined with novel technologies including Radio Frequency Identification (RFID), mobile/wireless technologies, internet workflow, intelligent agents, and Service Oriented Architectures (SOA), opens up new and exciting ways of automating business processes. Total situational awareness of the events, timing, and location of healthcare activities can generate self-organizing change in the behaviors of humans and machines. A test bed of a novel approach towards continuous process management was designed for the new Weinburg Surgery Building at the University of Maryland Medical Center. Early results based on clinical process mapping and analysis of patient flow bottlenecks demonstrated 100% improvement in the delivery of supplies and instruments at surgery start time. This work has been directly applied to the design of the DARPA Trauma Pod research program, where robotic surgery will be performed on wounded soldiers on the battlefield.

  11. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    DOE PAGES

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; ...

    2015-12-23

    The advance of the scientific discovery process is accomplished by the integration of independently developed programs, run on disparate computing facilities, into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In our paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  12. Impact of digital radiography on clinical workflow.

    PubMed

    May, G A; Deer, D D; Dackiewicz, D

    2000-05-01

    It is commonly accepted that digital radiography (DR) improves workflow and patient throughput compared with traditional film radiography or computed radiography (CR). DR eliminates the film development step and the time to acquire the image from a CR reader. In addition, the wide dynamic range of DR is such that the technologist can perform the quality-control (QC) step directly at the modality in a few seconds, rather than having to transport the newly acquired image to a centralized QC station for review. Furthermore, additional workflow efficiencies can be achieved with DR by employing tight radiology information system (RIS) integration. In the DR imaging environment, this provides for patient demographic information to be automatically downloaded from the RIS to populate the DR Digital Imaging and Communications in Medicine (DICOM) image header. To learn more about this workflow efficiency improvement, we performed a comparative study of workflow steps under three different conditions: traditional film/screen x-ray, DR without RIS integration (ie, manual entry of patient demographics), and DR with RIS integration. This study was performed at the Cleveland Clinic Foundation (Cleveland, OH) using a newly acquired amorphous silicon flat-panel DR system from Canon Medical Systems (Irvine, CA). Our data show that DR without RIS results in substantial workflow savings over traditional film/screen practice. There is an additional 30% reduction in total examination time using DR with RIS integration.
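
    The RIS-to-DICOM step can be pictured with the pydicom library (a hedged, modern illustration; the field values are placeholders and the image acquisition itself is omitted):

      from pydicom.dataset import Dataset

      # Demographics as they might arrive from the RIS for a scheduled exam.
      ris_record = {"name": "DOE^JANE", "patient_id": "123456",
                    "accession": "ACC0001", "study": "CHEST PA"}

      ds = Dataset()                       # image header to be populated
      ds.PatientName = ris_record["name"]  # no manual re-typing at the modality
      ds.PatientID = ris_record["patient_id"]
      ds.AccessionNumber = ris_record["accession"]
      ds.StudyDescription = ris_record["study"]
      print(ds)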

  13. Task-technology fit of video telehealth for nurses in an outpatient clinic setting.

    PubMed

    Cady, Rhonda G; Finkelstein, Stanley M

    2014-07-01

    Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.

  14. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure.

    PubMed

    Mickelson, Robin S; Unertl, Kim M; Holden, Richard J

    2016-10-12

    Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. We identified 5 high-level macrocognitive processes affecting medication management (sensemaking, planning, coordination, monitoring, and decision making) and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation.

  15. Evaluation of an image-based tracking workflow with Kalman filtering for automatic image plane alignment in interventional MRI.

    PubMed

    Neumann, M; Cuvillon, L; Breton, E; de Matheli, M

    2013-01-01

    Recently, a workflow for magnetic resonance (MR) image plane alignment based on tracking in real-time MR images was introduced. The workflow is based on a tracking device composed of 2 resonant micro-coils and a passive marker, and allows for tracking of the passive marker in clinical real-time images and automatic (re-)initialization using the micro-coils. As the Kalman filter has proven its benefit as an estimator and predictor, it is well suited for use in tracking applications. In this paper, a Kalman filter is integrated into the previously developed workflow in order to predict the position and orientation of the tracking device. The measurement noise covariances of the Kalman filter are changed dynamically in order to take into account that, depending on the image plane orientation, only a subset of the 3D pose components is available. The improved tracking performance of the Kalman-extended workflow was quantified in simulation results. A first experiment in the MRI scanner was also performed, though without quantitative results yet.
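
    The adaptive ingredient can be shown in a few lines of numpy: a standard predict/correct cycle in which the measurement noise covariance R is inflated for pose components that the current image plane cannot observe. This is a generic sketch, not the authors' filter:

      import numpy as np

      def kalman_step(x, P, z, F, Q, H, R):
          # Predict with the motion model, then correct with the measurement.
          x, P = F @ x, F @ P @ F.T + Q
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
          return x + K @ (z - H @ x), (np.eye(len(x)) - K @ H) @ P

      x, P = np.zeros(3), np.eye(3)            # toy 3-component "pose" state
      F, Q, H = np.eye(3), 0.01 * np.eye(3), np.eye(3)
      # Third component unobservable in this plane: give it huge noise.
      R = np.diag([0.1, 0.1, 1e6])
      x, P = kalman_step(x, P, np.array([1.0, 2.0, 0.0]), F, Q, H, R)
      print(x)   # the filter trusts z for the first two components only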

  16. Digital transformation in home care. A case study.

    PubMed

    Bennis, Sandy; Costanzo, Diane; Flynn, Ann Marie; Reidy, Agatha; Tronni, Catherine

    2007-01-01

    Simply implementing software and technology does not assure that an organization's targeted clinical and financial goals will be realized. No longer is it possible to roll out a new system--by solely providing end-user training and overlaying it on top of already inefficient workflows and outdated roles--and know with certainty that targets will be met. At Virtua Health's Home Care, based in southern New Jersey, implementation of their electronic system initially followed this more traditional approach. Unable to completely attain their earlier identified return on investment, they enlisted the help of a new role within their health system, that of the nurse informaticist. Knowledgeable in complex clinical processes and not bound by the technology at hand, the informaticist analyzed physical workflow, digital workflow, roles and physical layout. Leveraging specific tools such as change acceleration, workouts and LEAN, the informaticist was able to redesign workflow and support new levels of functionality. This article provides a view from the "finish line", recounting how this role worked with home care to assimilate information delivery into more efficient processes and align resources to support the new workflow, ultimately achieving real, tangible returns.

  17. A data-independent acquisition workflow for qualitative screening of new psychoactive substances in biological samples.

    PubMed

    Kinyua, Juliet; Negreira, Noelia; Ibáñez, María; Bijlsma, Lubertus; Hernández, Félix; Covaci, Adrian; van Nuijs, Alexander L N

    2015-11-01

    Identification of new psychoactive substances (NPS) is challenging. Developing targeted methods for their analysis can be difficult and costly because of their impermanence on the drug scene. Accurate-mass mass spectrometry (AMMS) using a quadrupole time-of-flight (QTOF) analyzer can be useful for wide-scope screening, since it provides sensitive, full-spectrum MS data. This article presents a qualitative screening workflow based on data-independent acquisition (all-ions MS/MS) with liquid chromatography (LC) coupled to QTOFMS for the detection and identification of NPS in biological matrices. The workflow combines the fundamentals of target and suspect screening data processing in a structured algorithm, allowing the detection and tentative identification of NPS and their metabolites. We applied the workflow to two real case studies involving drug intoxications, in which we detected and confirmed the parent compounds ketamine, 25B-NBOMe, and 25C-NBOMe, together with several predicted phase I and II metabolites not previously reported in urine and serum samples. The workflow demonstrates clear added value for the detection and identification of NPS in biological matrices.
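
    As a rough illustration of the suspect-screening step alone (the published workflow is considerably richer, exploiting all-ions MS/MS fragment data), the Python sketch below matches a measured accurate m/z against a small suspect list within a ppm tolerance; the masses are approximate monoisotopic values and the 5 ppm tolerance is an assumption.

    PROTON = 1.007276  # proton mass (Da), for [M+H]+ adducts

    # name -> approximate monoisotopic neutral mass (Da)
    suspects = {
        "ketamine":  237.0920,
        "25B-NBOMe": 379.0783,
        "25C-NBOMe": 335.1288,
    }

    def screen(measured_mz, tol_ppm=5.0):
        """Return suspects whose [M+H]+ m/z lies within tol_ppm of the measurement."""
        hits = []
        for name, neutral_mass in suspects.items():
            expected_mz = neutral_mass + PROTON
            error_ppm = (measured_mz - expected_mz) / expected_mz * 1e6
            if abs(error_ppm) <= tol_ppm:
                hits.append((name, round(error_ppm, 2)))
        return hits

    print(screen(238.0995))  # close to the ketamine [M+H]+ m/z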

  18. The View from a Few Hundred Feet: A New Transparent and Integrated Workflow for UAV-collected Data

    NASA Astrophysics Data System (ADS)

    Peterson, F. S.; Barbieri, L.; Wyngaard, J.

    2015-12-01

    Unmanned Aerial Vehicles (UAVs) allow scientists and civilians to monitor earth and atmospheric conditions in remote locations. To keep up with the rapid evolution of UAV technology, data workflows must also be flexible, integrated, and introspective. Here, we present our data workflow for a project assessing the feasibility of detecting threshold levels of methane, carbon dioxide, and other atmospheric constituents by mounting consumer-grade gas analysis sensors on UAVs. In particular, we highlight our use of Project Jupyter, a set of open-source software tools and documentation designed for developing "collaborative narratives" around scientific workflows. By embracing the GitHub-backed, multi-language systems available in Project Jupyter, we enable interaction and exploratory computation while simultaneously embracing distributed version control. Additionally, the transparency of this method builds trust with civilians and decision-makers and leverages collaboration and communication to resolve problems. The goal of this presentation is to provide a generic data workflow for scientific inquiries involving UAVs and to invite the participation of the AGU community in its improvement and curation.
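
    As a toy illustration of the threshold-detection idea (not the presenters' actual pipeline), a snippet of the kind one might run in a Jupyter notebook is shown below; it flags exceedances in UAV-logged methane readings, with hypothetical column names and a hypothetical 2.5 ppm threshold.

    import pandas as pd

    # Hypothetical UAV sensor log: time, altitude, methane concentration
    log = pd.DataFrame({
        "time_s":  [0, 1, 2, 3, 4],
        "alt_m":   [10, 55, 100, 150, 200],
        "ch4_ppm": [1.9, 2.1, 2.7, 3.0, 2.2],
    })

    THRESHOLD_PPM = 2.5  # assumed alert level
    exceedances = log[log["ch4_ppm"] > THRESHOLD_PPM]
    print(exceedances)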

  19. Nexus: A modular workflow management system for quantum simulation codes

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automating the tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and on resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
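
    To give a flavor of dependency-ordered workflow composition (the class and function names below are illustrative and are not Nexus's actual API; consult the Nexus documentation for that), here is a minimal Python sketch in which tasks declare their dependencies and are submitted in order:

    from dataclasses import dataclass, field

    @dataclass
    class Task:
        name: str
        command: str
        deps: list = field(default_factory=list)

    def run_in_order(tasks):
        """Submit each task only after all of its dependencies have run."""
        done = set()
        def run(t):
            if t.name in done:
                return
            for d in t.deps:
                run(d)
            print(f"submitting: {t.command}")  # stand-in for real job submission
            done.add(t.name)
        for t in tasks:
            run(t)

    relax = Task("relax", "pw.x -in relax.in")               # DFT relaxation
    scf   = Task("scf",   "pw.x -in scf.in", deps=[relax])   # DFT ground state
    qmc   = Task("qmc",   "qmcpack qmc.xml", deps=[scf])     # QMC on DFT result
    run_in_order([qmc])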

  20. Building an efficient curation workflow for the Arabidopsis literature corpus

    PubMed Central

    Li, Donghui; Berardini, Tanya Z.; Muller, Robert J.; Huala, Eva

    2012-01-01

    TAIR (The Arabidopsis Information Resource) is the model organism database (MOD) for Arabidopsis thaliana, a model plant with a literature corpus of about 39,000 articles in PubMed and over 4,300 new articles added in 2011. We have developed a literature curation workflow incorporating both automated and manual elements to cope with this flood of new research articles. The current workflow can be divided into two phases: article selection and curation. Structured controlled vocabularies, such as the Gene Ontology and the Plant Ontology, are used to capture free-text information from the literature as succinct, ontology-based annotations suitable for computational analysis. We also describe our curation platform and the use of text mining tools in our workflow. Database URL: www.arabidopsis.org PMID:23221298
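
    For illustration only (this is not TAIR's actual pipeline), the article-selection phase can be imagined as a simple keyword triage, as in the Python sketch below, where new abstracts are ranked by how many terms from a curated vocabulary they mention; all terms, PMIDs, and abstracts are hypothetical.

    VOCAB = {"flowering time", "root development", "FLC", "auxin"}

    def triage(abstracts):
        """Rank abstracts by the number of vocabulary terms they mention."""
        scored = []
        for pmid, text in abstracts.items():
            hits = {t for t in VOCAB if t.lower() in text.lower()}
            if hits:
                scored.append((pmid, sorted(hits)))
        return sorted(scored, key=lambda s: -len(s[1]))

    new_papers = {
        "00000001": "FLC represses flowering time in Arabidopsis ...",
        "00000002": "A survey of soil microbiomes ...",
    }
    for pmid, terms in triage(new_papers):
        print(pmid, "->", terms)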
