A three-level atomicity model for decentralized workflow management systems
NASA Astrophysics Data System (ADS)
Ben-Shaul, Israel Z.; Heineman, George T.
1996-12-01
A workflow management system (WFMS) employs a workflow manager (WM) to execute and automate the various activities within a workflow. To protect the consistency of data, the WM encapsulates each activity with a transaction; a transaction manager (TM) then guarantees the atomicity of activities. Since workflows often group several activities together, the TM is responsible for guaranteeing the atomicity of these units. There are scalability issues, however, with centralized WFMSs. Decentralized WFMSs provide an architecture for multiple autonomous WFMSs to interoperate, thus accommodating multiple workflows and geographically dispersed teams. When atomic units are composed of activities spread across multiple WFMSs, however, there is a conflict between the global atomicity of the unit and the local autonomy of each WFMS. This paper describes a decentralized atomicity model that enables workflow administrators to specify the scope of multi-site atomicity based upon the desired semantics of multi-site tasks in the decentralized WFMS. We describe an architecture that realizes our model and execution paradigm.
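The tension the abstract describes can be made concrete with a small sketch: an administrator declares which sites fall inside an atomic scope, in-scope activities commit all-or-nothing, and out-of-scope activities commit under local autonomy. This is an illustrative reading, not the paper's actual model; the site and activity names are hypothetical.

```python
# Hypothetical sketch of administrator-scoped multi-site atomicity:
# activities at sites inside the declared scope commit atomically
# (all or nothing), while activities elsewhere commit locally.

class Activity:
    def __init__(self, site, name, ok=True):
        self.site, self.name, self.ok = site, name, ok

def run_scope(activities, scope_sites):
    """Return the names of committed activities."""
    in_scope = [a for a in activities if a.site in scope_sites]
    out_scope = [a for a in activities if a.site not in scope_sites]
    committed = [a.name for a in out_scope if a.ok]   # local autonomy
    if all(a.ok for a in in_scope):                   # global atomicity
        committed += [a.name for a in in_scope]
    return sorted(committed)

acts = [Activity("A", "edit"), Activity("B", "build"),
        Activity("C", "log"), Activity("B", "test", ok=False)]
print(run_scope(acts, scope_sites={"A", "B"}))  # ['log']
```

Because "test" fails at site B, every in-scope activity aborts with it, while site C's activity commits on its own.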
Workflow management systems in radiology
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim
1998-07-01
In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is widely agreed to have the potential to contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, and emphasize the separation of the workflow management system from application systems, together with the consequences that arise for the architecture of workflow-oriented information systems. This includes an appropriate workflow terminology, and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
Agile parallel bioinformatics workflow management using Pwrake.
Mishima, Hiroyuki; Sasaki, Kensaku; Tanaka, Masahiro; Tatebe, Osamu; Yoshiura, Koh-Ichiro
2011-09-08
In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of the scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. We implemented Pwrake workflows to process next generation sequencing data using the Genome Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases: the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, the readability and maintainability of rakefiles benefit the development of scientific workflows.
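Pwrake workflows are written as Ruby rakefiles; the make-like core (targets, prerequisites, actions) can be sketched in a few lines of Python. The task names and the PARAMS table are invented to illustrate the two-phase split the authors describe, with parameters kept apart from the workflow definition:

```python
# Illustrative make-like dependency resolution (Pwrake's real DSL is
# Ruby's Rake; task names and parameters here are hypothetical).

PARAMS = {"threads": 4}      # parameter-adjustment phase: edit only here

TASKS = {                    # workflow-definition phase: target -> (deps, action)
    "align": ([],        lambda: f"align -t {PARAMS['threads']}"),
    "dedup": (["align"], lambda: "dedup"),
    "call":  (["dedup"], lambda: "call_variants"),
}

def build(target, done=None):
    """Run prerequisites depth-first, then the target's action, once each."""
    done = [] if done is None else done
    deps, action = TASKS[target]
    for dep in deps:
        build(dep, done)
    if target not in done:
        done.append(target)
        action()             # a real engine would spawn the command here
    return done

print(build("call"))  # ['align', 'dedup', 'call']
```

Keeping PARAMS separate means the parameter-adjustment phase never touches the dependency graph, which is the separation of concerns the abstract attributes to its split workflow definitions.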
Kwf-Grid workflow management system for Earth science applications
NASA Astrophysics Data System (ADS)
Tran, V.; Hluchy, L.
2009-04-01
In this paper, we present a workflow management tool for Earth science applications in EGEE. The tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within the "Knowledge-based Workflow System for Grid Applications" project under the 6th Framework Programme. The workflow management system is intended to: semi-automatically compose a workflow of Grid services; execute the composed workflow application in a Grid computing environment; monitor the performance of the Grid infrastructure and the Grid applications; analyze the resulting monitoring information; capture the knowledge contained in that information by means of intelligent agents; and finally reuse the joint knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g. GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, allowing it to manage and execute gLite jobs on the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential-management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite will allow EGEE users to use the system and benefit from its advanced features. The system is primarily tested and evaluated with applications from the ES clusters.
Responsibility Center Management: Lessons from 25 Years of Decentralized Management.
ERIC Educational Resources Information Center
Strauss, Jon C.; Curry, John R.
Decentralization of authority is a natural act in universities, but decentralization of responsibility is not. A problem faced by universities is the decoupling of academic authority from financial responsibility. The solution proposed in this book for the coupling is Responsibility Center Management (RCM), also called Revenue Responsibility…
Scientific Workflow Management in Proteomics
de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus
2012-01-01
Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703
Evolutionary Concepts for Decentralized Air Traffic Flow Management
NASA Technical Reports Server (NTRS)
Adams, Milton; Kolitz, Stephan; Milner, Joseph; Odoni, Amedeo
1997-01-01
Alternative concepts for modifying the policies and procedures under which the air traffic flow management system operates are described, and an approach to the evaluation of those concepts is discussed. Here, air traffic flow management includes all activities related to the management of the flow of aircraft and related system resources from 'block to block.' The alternative concepts represent stages in the evolution from the current system, in which air traffic management decision making is largely centralized within the FAA, to a more decentralized approach wherein the airlines and other airspace users collaborate in air traffic management decision making with the FAA. The emphasis in the discussion is on a viable medium-term partially decentralized scenario representing a phase of this evolution that is consistent with the decision-making approaches embodied in proposed Free Flight concepts for air traffic management. System-level metrics for analyzing and evaluating the various alternatives are defined, and a simulation testbed developed to generate values for those metrics is described. The fundamental issue of modeling airline behavior in decentralized environments is also raised, and an example of such a model, which deals with the preservation of flight bank integrity in hub airports, is presented.
Decentralized and Tactical Air Traffic Flow Management
NASA Technical Reports Server (NTRS)
Odoni, Amedeo R.; Bertsimas, Dimitris
1997-01-01
This project dealt with the following topics:
1. Review and description of the existing air traffic flow management (ATFM) system and identification of aspects with potential for improvement.
2. Identification and review of existing models and simulations dealing with all system segments (enroute, terminal area, ground).
3. Formulation of concepts for overall decentralization of the ATFM system, ranging from moderate decentralization to full decentralization.
4. Specification of the modifications to the ATFM system required to accommodate each of the alternative concepts.
5. Identification of issues that need to be addressed with regard to: determination of the way the ATFM system would be operating; types of flow management strategies that would be used; and estimation of the effectiveness of ATFM with regard to reducing delay and re-routing costs.
6. Concept evaluation through identification of criteria and methodologies for accommodating the interests of stakeholders and of approaches to optimization of operational procedures for all segments of the ATFM system.
A characterization of workflow management systems for extreme-scale applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia
2017-02-16
The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
Decentralizing the Team Station: Simulation before Reality as a Best-Practice Approach.
Charko, Jackie; Geertsen, Alice; O'Brien, Patrick; Rouse, Wendy; Shahid, Ammarah; Hardenne, Denise
2016-01-01
The purpose of this article is to share the logistical planning requirements and simulation experience of one Canadian hospital as it prepared its staff for the change from a centralized inpatient unit model to the decentralized design planned for its new community hospital. With the commitment and support of senior leadership, project management resources and clinical leads worked collaboratively to design a decentralized prototype in the form of a pod-style environment in the hospital's current setting. Critical success factors included engaging the right stakeholders, providing an opportunity to test new workflows and technology, creating a strong communication plan and building on lessons learned as subsequent pod prototypes are launched.
Context-aware workflow management of mobile health applications.
Salden, Alfons; Poortinga, Remco
2006-01-01
We propose a medical application management architecture that allows medical (IT) experts to readily design, develop and deploy context-aware mobile health (m-health) applications or services. In particular, we elaborate on how our application workflow management architecture enables chaining, coordinating, composing, and adapting context-sensitive medical application components such that critical Quality of Service (QoS) and Quality of Context (QoC) requirements typical for m-health applications or services can be met. This functional architectural support requires learning modules for distilling application-critical selections of attention and anticipation models. These models help medical experts construct and adjust m-health application workflows and workflow strategies on the fly. We illustrate our context-aware workflow management paradigm for an m-health data delivery problem, in which optimal communication network configurations have to be determined.
Ecology Based Decentralized Agent Management System
NASA Technical Reports Server (NTRS)
Peysakhov, Maxim D.; Cicirello, Vincent A.; Regli, William C.
2004-01-01
The problem of maintaining a desired number of mobile agents on a network is not trivial, especially if we want a completely decentralized solution. Decentralized control makes a system more robust and less susceptible to partial failures. The problem is exacerbated on wireless ad hoc networks, where host mobility can result in significant changes in the network size and topology. In this paper we propose an ecology-inspired approach to the management of the number of agents. The approach associates agents with living organisms and tasks with food. Agents procreate or die based on the abundance of uncompleted tasks (food). We performed a series of experiments investigating properties of such systems and analyzed their stability under various conditions. We concluded that the ecology-based metaphor can be successfully applied to the management of agent populations on wireless ad hoc networks.
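A toy version of the ecology metaphor can be sketched in a few lines: agents consume pending tasks each generation, procreate when food per agent is plentiful, and die off when it is scarce. The thresholds and the task arrival rate below are invented for illustration, not taken from the paper.

```python
# Agents eat tasks ("food"); the population tracks the task backlog.
# spawn_at / starve_at thresholds are illustrative, not from the paper.

def step(agents, tasks, spawn_at=3.0, starve_at=0.5):
    """One generation: consume tasks, then adjust the population."""
    done = min(agents, tasks)
    tasks -= done
    food_per_agent = tasks / agents if agents else float("inf")
    if food_per_agent > spawn_at:                     # abundance: procreate
        agents += 1
    elif food_per_agent < starve_at and agents > 1:   # scarcity: die
        agents -= 1
    return agents, tasks

agents, tasks = 5, 40
for _ in range(10):                # 8 new tasks arrive every generation
    agents, tasks = step(agents, tasks + 8)
print(agents, tasks)  # 12 28
```

Starting under-provisioned, the population grows while the backlog is large and then levels off as consumption catches up with arrivals; no agent needs global knowledge, only the locally visible food supply.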
Managing and Communicating Operational Workflow
Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.
2016-01-01
Background: Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective: To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods: The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results: The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since the initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions: The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407
Schedule-Aware Workflow Management Systems
NASA Astrophysics Data System (ADS)
Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.
Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.
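The flow/schedule split described above can be illustrated with a minimal work-list filter (task names and times are hypothetical): unscheduled flow items are always offered, while schedule items appear only once their appointment time arrives.

```python
# Work-items are (name, due) pairs; due=None marks an unscheduled flow task.
from datetime import datetime

def offered(items, now):
    """Flow items are always offered; schedule items only once due."""
    return [name for name, due in items if due is None or due <= now]

items = [("triage", None),                         # flow task
         ("surgery", datetime(2024, 5, 1, 9, 0)),  # tied to an appointment
         ("follow-up", datetime(2024, 5, 2, 9, 0))]
print(offered(items, datetime(2024, 5, 1, 10, 0)))  # ['triage', 'surgery']
```

A real schedule-aware system would of course also reserve the resources behind each appointment (the operating theater, the patient); this sketch only shows how calendar time gates what the work-list offers.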
Engaging Social Capital for Decentralized Urban Stormwater Management
Decentralized approaches to urban stormwater management, whereby installations of green infrastructure (e.g., rain gardens, bioswales, and constructed wetlands) are dispersed throughout a management area, are cost-effective solutions with co-benefits beyond water abatement. Inste...
High-volume workflow management in the ITN/FBI system
NASA Astrophysics Data System (ADS)
Paulson, Thomas L.
1997-02-01
The Identification Tasking and Networking (ITN) Federal Bureau of Investigation system will manage the processing of more than 70,000 submissions per day. The workflow manager controls the routing of each submission through a combination of automated and manual processing steps whose exact sequence is dynamically determined by the results at each step. For most submissions, one or more of the steps involve the visual comparison of fingerprint images. The ITN workflow manager is implemented within a scalable client/server architecture. The paper describes the key aspects of the ITN workflow manager design which allow the high volume of daily processing to be successfully accomplished.
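Result-driven routing of this kind can be sketched as a transition table mapping (step, outcome) pairs to the next step; the step and outcome names below are invented, not taken from the ITN design.

```python
# (step, outcome) -> next step; a submission walks the table until no
# route applies. All names are hypothetical.
ROUTES = {
    ("search", "match"):     "verify",   # candidate found -> human check
    ("search", "no_match"):  "enroll",   # new record
    ("verify", "confirmed"): "respond",
    ("verify", "rejected"):  "enroll",
}

def route(outcomes):
    """Return the sequence of steps a submission passes through."""
    step, path = "search", ["search"]
    while (step, outcomes.get(step)) in ROUTES:
        step = ROUTES[(step, outcomes[step])]
        path.append(step)
    return path

print(route({"search": "match", "verify": "confirmed"}))
# ['search', 'verify', 'respond']
```

Because the sequence is computed from outcomes rather than fixed in advance, automated and manual steps (such as visual fingerprint comparison) can be interleaved per submission.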
A Model of Workflow Composition for Emergency Management
NASA Astrophysics Data System (ADS)
Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu
Commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. This paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. A software system for constructing and composing business process resources has been implemented and integrated into the Emergency Plan Management Application System.
An Auto-management Thesis Program WebMIS Based on Workflow
NASA Astrophysics Data System (ADS)
Chang, Li; Jie, Shi; Weibo, Zhong
This paper presents a workflow-based auto-management WebMIS for the bachelor thesis program. A module used for workflow dispatching is designed and realized using MySQL and J2EE according to the working principle of a workflow engine. The module can automatically dispatch the workflow according to the system date, login information and the work status of the user. The WebMIS changes the management from manual work to computer-based work, which not only standardizes the thesis program but also keeps the data and documents clean and consistent.
NeuroManager: a workflow analysis based simulation management engine for computational neuroscience
Stockton, David B.; Santamaria, Fidel
2015-01-01
We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175
Kolehmainen-Aitken, Riitta-Liisa
2004-01-01
Designers and implementers of decentralization and other reform measures have focused much attention on financial and structural reforms, but ignored their human resource implications. Concern is mounting about the impact that the reallocation of roles and responsibilities has had on the health workforce and its management, but the experiences and lessons of different countries have not been widely shared. This paper examines evidence from published literature on decentralization's impact on the demand side of the human resource equation, as well as the factors that have contributed to the impact. The elements that make such an impact analysis exceptionally complex are identified. They include the mode of decentralization that a country is implementing, the level of responsibility for the salary budget and pay determination, and the civil service status of transferred health workers. The main body of the paper is devoted to examining decentralization's impact on human resource issues from three different perspectives: those of local health managers, health workers themselves, and national health leaders. These three groups have different concerns in the human resource realm and, consequently, have been differently affected by decentralization processes. The paper concludes with recommendations regarding three key concerns that national authorities and international agencies should give prompt attention to. They are: (1) defining the essential human resource policy, planning and management skills for national human resource managers who work in decentralized countries, and developing training programs to equip them with such skills; (2) supporting research that focuses on improving the knowledge base of how different modes of decentralization impact on staffing equity; and (3) identifying the factors that most critically influence health worker motivation and performance under decentralization, and documenting the most cost-effective best practices to improve them.
Workflow Management for Complex HEP Analyses
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.
2017-10-01
We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large-scale workflow systems, e.g. Apache’s Airavata[1]. The approach presents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step may define running a certain executable for multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes containing error codes and output data or file references. Required files, e.g. those created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from the steps' perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view of the analysis. The workflow system is developed and tested alongside a ttbb cross-section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interface and dependencies between steps enable a make-like execution of the whole analysis.
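The per-item bookkeeping described above can be sketched as follows; the step names, report fields, and string-transforming actions are placeholders, not AWM's actual interface.

```python
# Each step runs an action per input item and appends one report with an
# error code; a later step consumes the successful outputs of an earlier one.

def run_step(name, inputs, action, reports):
    """Execute one step item per input and book-keep a report for each."""
    for item in inputs:
        try:
            out = action(item)
            reports.append({"step": name, "input": item, "code": 0, "out": out})
        except Exception as exc:
            reports.append({"step": name, "input": item, "code": 1, "err": str(exc)})
    return [r["out"] for r in reports if r["step"] == name and r["code"] == 0]

reports = []
selected = run_step("select", ["a.root", "b.root"], lambda f: f + ".sel", reports)
fitted = run_step("fit", selected, lambda f: f + ".fit", reports)
print(fitted)  # ['a.root.sel.fit', 'b.root.sel.fit']
```

Because each report records its step, input, and error code, the chain can be re-run make-style: only items whose reports are missing or failed need to be executed again.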
From chart tracking to workflow management.
Srinivasan, P.; Vignes, G.; Venable, C.; Hazelwood, A.; Cade, T.
1994-01-01
The current interest in system-wide integration appears to be based on the assumption that an organization, by digitizing information and accepting a common standard for the exchange of such information, will improve the accessibility of this information and automatically experience benefits resulting from its more productive use. We do not dispute this reasoning, but assert that an organization's capacity for effective change is proportional to the understanding of the current structure among its personnel. Our workflow manager is based on the use of a Parameterized Petri Net (PPN) model which can be configured to represent an arbitrarily detailed picture of an organization. The PPN model can be animated to observe the model organization in action, and the results of the animation analyzed. This simulation is a dynamic, ongoing process which changes with the system and allows members of the organization to pose "what if" questions as a means of exploring opportunities for change. We present the "workflow management system" as the natural successor to the tracking program, incorporating modeling, scheduling, reactive planning, performance evaluation, and simulation. This workflow management system is more than adequate for meeting the needs of a paper chart tracking system and, as the patient record is computerized, will serve as a planning and evaluation tool in converting the paper-based health information system into a computer-based system. PMID:7950051
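The Petri-net animation the abstract refers to can be illustrated with a minimal token-firing loop. This is a sketch in the spirit of the PPN model, not the authors' implementation; the place and transition names are invented for the chart-tracking example.

```python
# Minimal Petri-net animation sketch (not the paper's PPN system): a
# transition fires when every input place holds a token, consuming one
# token per input place and producing one per output place.
def fire(marking, transitions):
    """Fire the first enabled transition; return the new marking or None."""
    for inputs, outputs in transitions:
        if all(marking.get(p, 0) >= 1 for p in inputs):
            new = dict(marking)
            for p in inputs:
                new[p] -= 1
            for p in outputs:
                new[p] = new.get(p, 0) + 1
            return new
    return None  # no transition enabled: the net is quiescent

def animate(marking, transitions, max_steps=100):
    """Repeatedly fire until quiescence, recording every marking."""
    trace = [marking]
    for _ in range(max_steps):
        nxt = fire(trace[-1], transitions)
        if nxt is None:
            break
        trace.append(nxt)
    return trace
```

For a toy chart-tracking net, `animate({"ward": 1}, [(("ward",), ("transit",)), (("transit",), ("records",))])` moves a chart token from the ward through transit to the records room, and the recorded trace is what an analyst would inspect after the animation.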
Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure.
Mickelson, Robin S; Unertl, Kim M; Holden, Richard J
2016-10-12
Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. We identified 5 high-level macrocognitive processes affecting medication management-sensemaking, planning, coordination, monitoring, and decision making-and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation.
Financial management systems under decentralization and their effect on malaria control in Uganda.
Kivumbi, George W; Nangendo, Florence; Ndyabahika, Boniface Rutagira
2004-01-01
A descriptive case study with multiple sites and a single level of analysis was carried out in four purposefully selected administrative districts of Uganda to investigate the effect of financial management systems under decentralization on malaria control. Data were primarily collected from 36 interviews with district managers, staff at health units and local leaders. A review of records and documents related to decentralization at the central and district level was also used to generate data for the study. We found that a long, tedious, and bureaucratic process combined with lack of knowledge in working with new financial systems by several actors characterized financial flow under decentralization. This affected the timely use of financial resources for malaria control in that there were funds in the system that could not be accessed for use. We were also told that sometimes these funds were returned to the central government because of non-use due to difficulties in accessing them and/or stringent conditions not to divert them to other uses. Our data showed that a cocktail of bureaucratic control systems, corruption and incompetence make the financial management system under decentralization counter-productive for malaria control. The main conclusion is that good governance through appropriate and efficient financial management systems is very important for effective malaria control under decentralization.
Massoud, May A; Tarhini, Akram; Nasr, Joumana A
2009-01-01
Providing reliable and affordable wastewater treatment in rural areas is a challenge in many parts of the world, particularly in developing countries. The problems and limitations of the centralized approaches for wastewater treatment are progressively surfacing. Centralized wastewater collection and treatment systems are costly to build and operate, especially in areas with low population densities and dispersed households. Developing countries lack both the funding to construct centralized facilities and the technical expertise to manage and operate them. Alternatively, the decentralized approach for wastewater treatment which employs a combination of onsite and/or cluster systems is gaining more attention. Such an approach allows for flexibility in management, and simple as well as complex technologies are available. The decentralized system is not only a long-term solution for small communities but is more reliable and cost effective. This paper presents a review of the various decentralized approaches to wastewater treatment and management. A discussion as to their applicability in developing countries, primarily in rural areas, and challenges faced is emphasized all through the paper. While there are many impediments and challenges towards wastewater management in developing countries, these can be overcome by suitable planning and policy implementation. Understanding the receiving environment is crucial for technology selection and should be accomplished by conducting a comprehensive site evaluation process. Centralized management of the decentralized wastewater treatment systems is essential to ensure they are inspected and maintained regularly. Management strategies should be site specific accounting for social, cultural, environmental and economic conditions in the target area.
Decentralized asset management for collaborative sensing
NASA Astrophysics Data System (ADS)
Malhotra, Raj P.; Pribilski, Michael J.; Toole, Patrick A.; Agate, Craig
2017-05-01
There has been increased impetus to leverage Small Unmanned Aerial Systems (SUAS) for collaborative sensing applications in which many platforms work together to provide critical situation awareness in dynamic environments. Such applications require critical sensor observations to be made at the right place and time to facilitate the detection, tracking, and classification of ground-based objects. This further requires rapid response to real-world events and the balancing of multiple, competing mission objectives. In this context, human operators become overwhelmed with the management of many platforms. Further, current automated planning paradigms tend to be centralized and do not scale up well to many collaborating platforms. We introduce a decentralized approach, based upon information theory and distributed fusion, which enables us to scale up to large numbers of collaborating SUAS platforms. This is exercised against a military application involving the autonomous detection, tracking, and classification of critical mobile targets. We further show, based upon Monte Carlo simulation results, that our decentralized approach outperforms more static management strategies employed by human operators and achieves similar results to a centralized approach while being scalable and robust to degradation of communication. Finally, we describe the limitations of our approach and future directions for our research.
Nexus: A modular workflow management system for quantum simulation codes
NASA Astrophysics Data System (ADS)
Krogel, Jaron T.
2016-01-01
The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
EVALUATION OF ECONOMIC INCENTIVES FOR DECENTRALIZED STORMWATER RUNOFF MANAGEMENT
Impervious surfaces in urban and suburban areas can lead to excess stormwater runoff throughout a watershed, typically resulting in widespread hydrologic and ecological alteration of receiving streams. Decentralized stormwater management may improve stream ecosystems by reducing ...
Workflow computing. Improving management and efficiency of pathology diagnostic services.
Buffone, G J; Moreau, D; Beck, J R
1996-04-01
Traditionally, information technology in health care has helped practitioners to collect, store, and present information and also to add a degree of automation to simple tasks (instrument interfaces supporting result entry, for example). Thus commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health-care and the inherent advantages of the technology, and it defines workflow system requirements for use in health-care delivery with special reference to diagnostic pathology.
A Tool Supporting Collaborative Data Analytics Workflow Design and Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Bao, Q.; Lee, T. J.
2016-12-01
Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable to support collaborative design of such a workflow: among other gaps, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists who collaboratively design and compose data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.
Content and Workflow Management for Library Websites: Case Studies
ERIC Educational Resources Information Center
Yu, Holly, Ed.
2005-01-01
Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…
NASA Astrophysics Data System (ADS)
Inguane, Ronaldo; Gallego-Ayala, Jordi; Juízo, Dinis
In the context of integrated water resources management implementation, the decentralization of water resources management (DWRM) at the river basin level is a crucial aspect for its success. However, decentralization requires the creation of new institutions on the ground, to stimulate an environment enabling stakeholder participation and integration into the water management decision-making process. In 1991, Mozambique began restructuring its water sector toward operational decentralized water resources management. Within this context of decentralization, new legal and institutional frameworks have been created, e.g., Regional Water Administrations (RWAs) and River Basin Committees. This paper identifies and analyzes the key institutional challenges and opportunities of DWRM implementation in Mozambique. The paper uses a critical social science research methodology for in-depth analysis of the roots of the constraining factors for the implementation of DWRM. The results obtained suggest that RWAs should be designed considering the specific geographic and infrastructural conditions of their jurisdictional areas and that priorities should be selected in their institutional capacity building strategies that match local realities. Furthermore, the results also indicate that RWAs have enjoyed limited support from basin stakeholders, mainly in basins with less hydraulic infrastructure, in securing water availability for their users and minimizing the effect of climate variability.
Responsibility Center Management: A Financial Paradigm and Alternative to Decentralized Budgeting.
ERIC Educational Resources Information Center
Hensley, Phyllis A.; Bava, D. J.; Brennan, Denise C.
This study examined the implementation of Responsibility Center Management (RCM) systems in two institutions of higher education: the Graduate School of Business at Institution A and the Center of Collaborative Education and Professional Studies at Institution B. RCM is a management and budgeting process for universities that decentralizes authority…
Scientific Workflows + Provenance = Better (Meta-)Data Management
NASA Astrophysics Data System (ADS)
Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.
2013-12-01
The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and, more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata.
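The prospective/retrospective distinction can be made concrete with a small runner. This is an illustrative sketch of the idea only; the record structure is invented and does not follow the D-PROV or PROV schemas.

```python
# Sketch of capturing both kinds of provenance (illustrative structures,
# not the D-PROV model): the prospective record is the declared pipeline;
# the retrospective record is what actually ran, with timestamps and the
# inputs/outputs observed at each step.
import time

def run_with_provenance(pipeline, data):
    """pipeline: list of (step_name, function). Returns (output, provenance)."""
    prospective = [name for name, _ in pipeline]   # the planned steps
    retrospective = []                             # the observed executions
    for name, func in pipeline:
        started = time.time()
        out = func(data)
        retrospective.append({
            "step": name, "input": data, "output": out,
            "started": started, "ended": time.time(),
        })
        data = out                                 # chain output to next step
    return data, {"prospective": prospective,
                  "retrospective": retrospective}
```

Queries such as "which input produced this result, and when?" then read directly off the retrospective trace, while the prospective list documents the pipeline even for steps that never executed.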
Common Workflow Service: Standards Based Solution for Managing Operational Processes
NASA Astrophysics Data System (ADS)
Tinio, A. W.; Hollins, G. A.
2017-06-01
The Common Workflow Service is a collaborative and standards-based solution for managing mission operations processes using techniques from the Business Process Management (BPM) discipline. This presentation describes the CWS and its benefits.
A patient workflow management system built on guidelines.
Dazzi, L.; Fassino, C.; Saracco, R.; Quaglini, S.; Stefanelli, M.
1997-01-01
To provide high quality, shared, and distributed medical care, clinical and organizational issues need to be integrated. This work describes a methodology for developing a Patient Workflow Management System, based on a detailed model of both the medical work process and the organizational structure. We assume that the medical work process is represented through clinical practice guidelines, and that an ontological description of the organization is available. Thus, we developed tools 1) to acquire the medical knowledge contained in a guideline, 2) to translate the derived formalized guideline into a computational formalism, namely a Petri Net, and 3) to maintain different representation levels. The high-level representation guarantees that the Patient Workflow follows the guideline prescriptions, while the low level takes into account the specific characteristics of the organization and allows allocating resources for managing a specific patient in daily practice. PMID:9357606
A Two-Stage Probabilistic Approach to Manage Personal Worklist in Workflow Management Systems
NASA Astrophysics Data System (ADS)
Han, Rui; Liu, Yingbo; Wen, Lijie; Wang, Jianmin
The application of workflow scheduling to the management of an individual actor's personal worklist is one area that can bring great improvement to business processes. However, current deterministic work cannot adapt to the dynamics and uncertainties in the management of personal worklists. To address this issue, this paper proposes a two-stage probabilistic approach which aims at assisting actors to flexibly manage their personal worklists. To be specific, the approach analyzes every activity instance's continuous probability of satisfying its deadline at the first stage. Based on this stochastic analysis result, at the second stage, an innovative scheduling strategy is proposed to minimize the overall deadline violation cost for an actor's personal worklist. Simultaneously, the strategy recommends to the actor a feasible worklist of activity instances which meet the required bottom line of successful execution. The effectiveness of our approach is evaluated in a real-world workflow management system and with large-scale simulation experiments.
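The two-stage idea can be sketched as follows. This is an illustrative reading of the abstract, not the paper's exact algorithm: stage 1 estimates each instance's probability of meeting its deadline (here under an assumed normal duration model), and stage 2 orders the worklist by expected violation cost, dropping instances below a feasibility bottom line.

```python
# Hedged sketch of a two-stage probabilistic worklist strategy (the
# normal-duration model and the threshold rule are assumptions, not the
# paper's method).
import math

def p_meets_deadline(mean_dur, std_dur, time_left):
    """Stage 1: P(duration <= time_left) for a normal duration model."""
    if std_dur <= 0:
        return 1.0 if mean_dur <= time_left else 0.0
    z = (time_left - mean_dur) / std_dur
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def schedule(worklist, bottom_line=0.1):
    """Stage 2: keep feasible instances, highest expected violation cost first.

    worklist items: (name, mean_dur, std_dur, time_left, violation_cost)
    """
    scored = []
    for name, mean_dur, std_dur, time_left, cost in worklist:
        p = p_meets_deadline(mean_dur, std_dur, time_left)
        if p >= bottom_line:                 # meets the success bottom line
            scored.append((name, p, (1.0 - p) * cost))
    scored.sort(key=lambda item: -item[2])   # riskiest, costliest first
    return [name for name, _, _ in scored]
```

An instance with a tight deadline but a real chance of success is ranked ahead of a comfortable one, while hopeless instances are excluded from the recommended worklist altogether.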
Lessons from implementing a combined workflow-informatics system for diabetes management.
Zai, Adrian H; Grant, Richard W; Estey, Greg; Lester, William T; Andrews, Carl T; Yee, Ronnie; Mort, Elizabeth; Chueh, Henry C
2008-01-01
Shortcomings surrounding the care of patients with diabetes have been attributed largely to a fragmented, disorganized, and duplicative health care system that focuses more on acute conditions and complications than on managing chronic disease. To address these shortcomings, we developed a diabetes registry population management application to change the way our staff manages patients with diabetes. Use of this new application has helped us coordinate the responsibilities for intervening and monitoring patients in the registry among different users. Our experiences using this combined workflow-informatics intervention system suggest that integrating a chronic disease registry into clinical workflow for the treatment of chronic conditions creates a useful and efficient tool for managing disease.
NASA Astrophysics Data System (ADS)
McCarthy, Ann
2006-01-01
The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.
Engaging Social Capital for Decentralized Urban Stormwater Management (Paper in Non-EPA Proceedings)
Decentralized approaches to urban stormwater management, whereby installations of green infrastructure (e.g., rain gardens, bioswales, constructed wetlands) are dispersed throughout a management area, are cost-effective solutions with co-benefits beyond just water abatement. Inst...
Leadership in Decentralized Schools.
ERIC Educational Resources Information Center
Madsen, Jean
1997-01-01
Summarizes a study that examined principals' leadership in three private schools and its implications for decentralized public schools. With the increase of charter and privatized managed schools, principals will need to redefine their leadership styles. Private schools, as decentralized entities, offer useful perspectives on developing school…
Collective and decentralized management model in public hospitals: perspective of the nursing team.
Bernardes, Andrea; Cecilio, Luiz Carlos de Oliveira; Evora, Yolanda Dora Martinez; Gabriel, Carmen Silvia; Carvalho, Mariana Bernardes de
2011-01-01
This research aims to present the implementation of the collective and decentralized management model in functional units of a public hospital in the city of Ribeirão Preto, state of São Paulo, according to the view of the nursing staff and the health technical assistant. This historical and organizational case study used the qualitative thematic content analysis proposed by Bardin for data analysis. The institution started the decentralization of its administrative structure in 1999, through collective management, which permitted several internal improvements, with positive repercussions for the care delivered to users. The top-down implementation of the process seems to have jeopardized workers' adherence, although collective management has intensified communication and the sharing of power and decision making. The study shows that there is still much work to be done to concretize this innovative management proposal, despite the advances regarding the quality of care.
NASA Astrophysics Data System (ADS)
Leibovici, D. G.; Pourabdollah, A.; Jackson, M.
2011-12-01
Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Management tools dealing with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing, for example, one to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through providing a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the
Pegasus Workflow Management System: Helping Applications From Earth and Space
NASA Astrophysics Data System (ADS)
Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.
2010-12-01
Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local, and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface, and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow, from the planning stage to the execution of the workflow and the data it generates, helping scientists to accurately measure performance metrics of their workflow as well as address data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains, including earthquake simulation, bacterial RNA studies, helioseismology, and ocean modeling. Earthquake simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure to generate a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced, totaling about 165 TB, out of which 11 TB of data was saved.
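The abstract-to-concrete planning step described above can be sketched minimally. The dictionary shapes and catalog layout here are illustrative assumptions, not Pegasus's actual DAX format or catalog APIs: an abstract job names a logical transformation and logical files, and planning binds them to physical locations for a chosen site.

```python
# Hedged sketch of abstract-to-concrete workflow planning (invented
# structures, not the Pegasus interfaces): a replica catalog maps logical
# file names to physical URLs, and a transformation catalog maps
# (transformation, site) pairs to executables.
def plan(abstract_jobs, replica_catalog, transformation_catalog, site):
    """Compile resource-agnostic jobs into concrete, site-bound invocations."""
    concrete = []
    for job in abstract_jobs:
        concrete.append({
            "executable": transformation_catalog[(job["transform"], site)],
            "inputs": [replica_catalog[f] for f in job["uses"]],
            "site": site,
        })
    return concrete
```

The same abstract job list can thus be re-planned against a different site simply by consulting the catalogs again, which is the property that lets a workflow move between local, Grid, and cloud resources without being rewritten.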
SynTrack: DNA Assembly Workflow Management (SynTrack) v2.0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
MENG, XIANWEI; SIMIRENKO, LISA
2016-12-01
SynTrack is a dynamic, workflow-driven data management system that tracks the DNA build process: management of the hierarchical relationships of DNA fragments; monitoring of process tasks for the assembly of multiple DNA fragments into final constructs; creation of vendor order forms with selectable building blocks; organization of plate layout barcodes for vendor/pcr/fusion/chewback/bioassay/glycerol/master plate maps (default/condensed); creation or updating of pre-assembly/assembly process workflows with selected building blocks; generation of Echo pooling instructions based on plate maps; tracking of building-block orders, receipts, and final assemblies for delivery; bulk updating of colony or PCR amplification information, fusion PCR, and chewback results; updating of QA/QC outcomes with .csv and .xlsx template files; re-working of assembly workflows before and after sequencing validation; and tracking of plate/well data changes and status updates, with reporting of master plate status and QC outcomes.
Radiology information system: a workflow-based approach.
Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P
2009-09-01
Introducing workflow management technology in healthcare seems promising for addressing the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system, using a radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be treated as special components, which also corresponded to tasks and were integrated by adapting their non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of a changing process and enhanced process management in the department. It also provides a more workflow-aware integration method compared with other approaches such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more process-management functionality and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology in healthcare.
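The component-per-task design with legacy systems wrapped behind a standard interface can be sketched as follows. All class names and the `execute(case)` interface are hypothetical, invented for illustration; the paper does not specify its actual interfaces.

```python
# Sketch of the component/adapter idea described in the abstract (all
# interfaces hypothetical): each process task is a component with a standard
# interface; a legacy system is wrapped in an adapter so the workflow engine
# can assemble it like any other component.
class TaskComponent:
    def execute(self, case):            # the assumed standard interface
        raise NotImplementedError

class RegisterPatient(TaskComponent):
    def execute(self, case):
        case["registered"] = True
        return case

class LegacyRIS:                        # legacy, non-workflow-aware API
    def do_exam(self, patient_id):
        return f"exam:{patient_id}"

class LegacyExamAdapter(TaskComponent):
    """Adapts the legacy call to the standard component interface."""
    def __init__(self, legacy):
        self.legacy = legacy
    def execute(self, case):
        case["exam"] = self.legacy.do_exam(case["patient_id"])
        return case

# The workflow engine assembles components in process order.
pipeline = [RegisterPatient(), LegacyExamAdapter(LegacyRIS())]
case = {"patient_id": "P001"}
for component in pipeline:
    case = component.execute(case)
print(case)
```

The adapter is the only code that knows about the legacy API, so swapping the legacy system changes one class, not the process model.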
Workflow management in large distributed systems
NASA Astrophysics Data System (ADS)
Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.
2011-12-01
The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near real time. All the monitoring information gathered for these subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services, including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resources to running jobs, and automated management of remote services among a large set of grid facilities.
Big data analytics workflow management for eScience
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni
2015-04-01
In many domains, such as climate science and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats; and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims to address most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing); (ii) data aggregation; (iii) array-based primitives (the same operator applies to all the implemented UDF extensions); (iv) data cube duplication; (v) data cube pivoting; and (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (e.g. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens, hundreds of data analytics operators is the
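The "slicing and dicing" and aggregation primitives mentioned above can be illustrated on a tiny in-memory data cube. This is a toy sketch in plain Python, not the Ophidia operators (which run in parallel over partitioned storage); the cube contents and dimension order are invented.

```python
# Illustrative sketch (NOT the Ophidia API): sub-setting and aggregation over
# a small 3-D data cube stored as nested lists.
# Hypothetical dimension order: time x lat x lon.
cube = [[[1.0, 2.0], [3.0, 4.0]],
        [[5.0, 6.0], [7.0, 8.0]]]

def subset(cube, t0, t1):
    """Slice along the time dimension ('dicing' over one axis)."""
    return cube[t0:t1]

def aggregate(cube, fn):
    """Reduce the whole (sub-)cube with fn, e.g. max, min, or a mean."""
    flat = [v for t in cube for row in t for v in row]
    return fn(flat)

# Max over the first time step only:
print(aggregate(subset(cube, 0, 1), max))  # -> 4.0
```

A production system applies the same logical operators, but as parallel jobs over chunked array storage rather than Python loops.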
Jflow: a workflow management system for web applications.
Mariette, Jérôme; Escudié, Frédéric; Bardou, Philippe; Nabihoudine, Ibouniyamine; Noirot, Céline; Trotard, Marie-Stéphane; Gaspin, Christine; Klopp, Christophe
2016-02-01
Biologists produce large data sets and demand rich yet simple web portals in which they can upload and analyze their files. Providing such tools requires masking the complexity induced by the underlying High Performance Computing (HPC) environment. The connection between interface and computing infrastructure is usually specific to each portal. With Jflow, we introduce a Workflow Management System (WMS) composed of jQuery plug-ins, which can easily be embedded in any web application, and a Python library providing all the features required to set up, run and monitor workflows. Jflow is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/jflow. The package comes with full documentation, a quick-start guide and a running test portal. Jerome.Mariette@toulouse.inra.fr. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
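The pattern Jflow implements on the Python side, a workflow object that a web front end can set up, run, and monitor, can be sketched minimally. The class and method names below are hypothetical, not the actual Jflow library API.

```python
# Minimal sketch of the setup/run/monitor pattern described in the abstract
# (hypothetical API, NOT the actual Jflow library).
class Workflow:
    def __init__(self, name):
        self.name = name
        self.steps = []           # (label, callable) pairs, run in order
        self.status = "pending"   # what a monitoring widget would poll

    def add_component(self, label, func):
        self.steps.append((label, func))

    def run(self):
        self.status = "running"
        results = {}
        for label, func in self.steps:
            results[label] = func()  # a real WMS submits these to the HPC cluster
        self.status = "completed"
        return results

wf = Workflow("demo")
wf.add_component("count_reads", lambda: 42)
results = wf.run()
print(results, wf.status)
```

In a real portal the `status` field is what the embedded jQuery plug-ins poll to render progress, while execution is delegated to the cluster scheduler rather than run inline.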
Decentralized Decision Making Toward Educational Goals.
ERIC Educational Resources Information Center
Monahan, William W.; Johnson, Homer M.
This monograph provides guidelines to help those school districts considering a more decentralized form of management. The authors discuss the levels at which different types of decisions should be made, describe the changing nature of the educational environment, identify different centralization-decentralization models, and suggest a flexible…
Generic worklist handler for workflow-enabled products
NASA Astrophysics Data System (ADS)
Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas
1999-07-01
Workflow management (WfM) is an emerging field of medical information technology. It appears to be a promising key technology for modeling, optimizing and automating processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic worklist handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded worklist handlers will be called 'workflow-enabled application systems'. In this paper we discuss the functional requirements of worklist handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards as defined by the Workflow Management Coalition are briefly reviewed.
Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M
2016-01-01
Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.
Davis, Stephen Jerome; Hurtado, Josephine; Nguyen, Rosemary; Huynh, Tran; Lindon, Ivan; Hudnall, Cedric; Bork, Sara
2017-01-01
Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. Of those 826,220 medication doses, there were 16 reported incorrect volume errors
Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network
NASA Astrophysics Data System (ADS)
Jones, A. S.; Horsburgh, J. S.
2014-12-01
Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban
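One automated quality-control step of the kind the workflow above describes, range-checking raw sensor values before they enter the database, can be sketched briefly. The variable name and plausibility bounds are invented for illustration; they are not the iUTAH toolchain or its thresholds.

```python
# Sketch of an automated QC post-processing step (hypothetical variable and
# bounds, NOT the iUTAH tools): flag raw in situ sensor readings that fall
# outside an assumed plausible range.
SENSOR_RANGE = {"water_temp_C": (-5.0, 40.0)}  # assumed plausible bounds

def qc_flag(variable, readings):
    """Pair each reading with an 'ok' or 'out_of_range' QC flag."""
    lo, hi = SENSOR_RANGE[variable]
    return [(v, "ok" if lo <= v <= hi else "out_of_range") for v in readings]

flags = qc_flag("water_temp_C", [12.3, 98.6, 15.0])
print(flags)
```

Flagging rather than deleting suspect values preserves the raw record, which matters when calibration events logged against the equipment later explain an apparent anomaly.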
Symmetric Link Key Management for Secure Neighbor Discovery in a Decentralized Wireless Sensor Network (Master's thesis)
Chew, Kelvin T.
2017-09-01
Effectiveness of a decentralized stormwater management program in the reduction of runoff volume
A decentralized, retrofit approach to storm water management was implemented in a small suburban drainage on the basis of a voluntary reverse auction. This effort led to the installation of 83 rain gardens and 176 rain barrels on approximately 20 percent of 350 residential proper...
Providing leadership to a decentralized total quality process.
Diederich, J J; Eisenberg, M
1993-01-01
Integrating total quality management into the culture of an organization and the daily work of employees requires a decentralized leadership structure that encourages all employees to become involved. This article, based upon the experience of the University of Michigan Hospitals Professional Services Divisional Lead Team, outlines a process for decentralizing the total quality management process.
Decentralized Budgeting in Education: Model Variations and Practitioner Perspectives.
ERIC Educational Resources Information Center
Hall, George; Metsinger, Jackie; McGinnis, Patricia
In educational settings, decentralized budgeting refers to various fiscal practices that disperse budgeting responsibility away from central administration to the line education units. This distributed decision-making is common to several financial management models. Among the many financial management models that employ decentralized budgeting…
Walden, Anita; Nahm, Meredith; Barnett, M Edwina; Conde, Jose G; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E; Eisenstein, Eric L
2011-01-01
New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. The primary outcome was total data management costs. Secondary outcomes included: data management costs for sites, local data centers, and central coordinating centers. Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs.
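The cost comparison in the study reduces to summing site, local-data-center, and central-coordinating-center costs under each model. The sketch below uses entirely hypothetical numbers (the paper's actual cost inputs are not reproduced here) just to show the structure of the comparison.

```python
# Back-of-envelope sketch of the comparison described (all cost figures
# hypothetical, NOT the study's data): total data management cost as the sum
# of site, local data center, and central coordinating center costs.
def total_cost(site_cost, n_sites, local_dc_cost, n_local_dcs, central_cost):
    return site_cost * n_sites + local_dc_cost * n_local_dcs + central_cost

# Centralized: no local data centers, larger central coordinating center.
centralized = total_cost(10_000, 20, 0, 0, 150_000)
# Decentralized: each local data center adds fixed overhead.
decentralized = total_cost(10_000, 20, 30_000, 5, 100_000)
print(decentralized > centralized)  # -> True: local-DC overhead dominates
```

Reducing the number of local data centers shrinks the second term, which is why the study found that doing so narrowed the cost differential between models.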
Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan
2015-01-01
Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
Saide, M A; Stewart, D E
2001-01-01
Despite political, cultural and geographical diversity, health care reforms implemented in many developing countries share a number of common features regarding management and structural issues. Decentralization of decision-making from the central authority to local and provincial levels is generally regarded in the literature to be an important way of achieving a more equitable distribution of health care and better management practices, aligned with local priorities and needs. However, in the absence of clear guidelines, continuous monitoring and an adequate supply of financial and human resources, decentralization processes are more likely to have a low impact on the process of health care reform and can, to a certain extent, provoke inequalities between regions in the same country. This qualitative study in Nampula province, Mozambique, was conducted to assess the impact of decentralization, through an analysis of the viewpoints of provincial health managers regarding their perceptions of the process, particularly with regard to the management of basic and elementary nurses. Secondary data from Nampula provincial reports and documents from the Mozambican Health Ministry were also reviewed and comparisons made with the experiences of other developing countries.
What supervisors want to know about decentralization.
Boissoneau, R; Belton, P
1991-06-01
Many organizations in various industries have tended to move away from strict centralization, yet some centralization is still vital to top management. With 19 of the 22 executives interviewed favoring or implementing some form of decentralization, it is probable that traditionally centralized organizations will follow the trend and begin to decentralize their organizational structures. The incentives and advantages of decentralization are too attractive to ignore. Decentralization provides responsibility, clear objectives, accountability for results, and more efficient and effective decision making. However, one must remember that decentralization can be overextended and that centralization is still viable in certain functions. Finding the correct balance between control and autonomy is a key to decentralization. Too much control and too much autonomy are the primary reasons for decentralization failures. In today's changing, competitive environment, structures must be continuously redefined, with the goal of finding an optimal balance between centralization and decentralization. Organizations are cautioned not to seek out and install a single philosopher-king to impose unified direction, but to unify leadership goals, participation, style, and control to develop improved methods of making all responsible leaders of one mind about the organization's needs and goals.
Schlesinger, Joseph J; Burdick, Kendall; Baum, Sarah; Bellomy, Melissa; Mueller, Dorothee; MacDonald, Alistair; Chern, Alex; Chrouser, Kristin; Burger, Christie
2018-03-01
The concept of clinical workflow borrows from management and leadership principles outside of medicine. The only way to rethink clinical workflow is to understand the neuroscience principles that underlie attention and vigilance. With any implementation to improve practice, there are human factors that can promote or impede progress. Modulating the environment and working as a team to take care of patients is paramount. Clinicians must continually rethink clinical workflow, evaluate progress, and understand that other industries have something to offer. Then, novel approaches can be implemented to take the best care of patients. Copyright © 2017 Elsevier Inc. All rights reserved.
Climate Data Analytics Workflow Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.
2016-12-01
In this project we aim to pave a novel path toward a sustainable building block for Earth science big data analytics and knowledge sharing. Closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities and a technology to automatically generate workflows for scientists from that provenance. On top of these, we have built a prototype data-centric provenance repository and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged Apache OODT technology. The community-approved, metrics-based performance evaluation web service will allow a user to select a metric from a list of several community-approved metrics and to evaluate model performance using that metric and a reference dataset. This service will facilitate the use of reference datasets generated in support of model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.
Fujiwara, T
2012-01-01
Unlike in urban areas where intensive water reclamation systems are available, development of decentralized technologies and systems is required for water use to be sustainable in agricultural areas. To overcome various water quality issues in those areas, a research project entitled 'Development of an innovative water management system with decentralized water reclamation and cascading material-cycle for agricultural areas under the consideration of climate change' was launched in 2009. This paper introduces the concept of this research and provides detailed information on each of its research areas: (1) development of a diffuse agricultural pollution control technology using catch crops; (2) development of a decentralized differentiable treatment system for livestock and human excreta; and (3) development of a cascading material-cycle system for water pollution control and value-added production. The author also emphasizes that the innovative water management system for agricultural areas should incorporate a strategy for the voluntary collection of bio-resources.
NASA Astrophysics Data System (ADS)
Anghileri, D.; Giuliani, M.; Castelletti, A.
2012-04-01
There is general agreement that one of the most challenging issues in water system management is the presence of many, often conflicting interests, as well as of several independent decision makers. The traditional approach to multi-objective water system management is centralized management, in which an ideal central regulator coordinates the operation of the whole system, exploiting all the available information and balancing all the operating objectives. Although this approach yields Pareto-optimal solutions representing the maximum achievable benefit, it is based on assumptions that strongly limit its application in real-world contexts: (1) top-down management; (2) the existence of a central regulation institution; (3) complete information exchange within the system; (4) perfect economic efficiency. A bottom-up, decentralized approach therefore seems more suitable for real-world applications, since the different reservoir operators can maintain their independence. In this work we tested the consequences of moving the water management approach from a centralized toward a decentralized one. In particular, we compared three cases: the centralized management approach; the independent management approach, where each reservoir operator takes the daily release decision maximizing (or minimizing) its own operating objective independently of the others; and an intermediate approach, leading to the Nash equilibrium of the associated game, where the different reservoir operators try to model the behaviours of the other operators. The three approaches are demonstrated on a test case study composed of two reservoirs regulated to minimize flooding in different locations. The operating policies are computed by solving one single multi-objective optimal control problem, in the centralized management approach; multiple single-objective optimization problems, i.e. one for each operator, in the independent case
Atkinson, Sarah; Haran, Dave
2004-01-01
OBJECTIVE: To examine whether decentralization has improved health system performance in the State of Ceara, north-east Brazil. METHODS: Ceara is strongly committed to decentralization. A survey across 45 local (municipio) health systems collected data on performance and formal organization, including decentralization, informal management and local political culture. The indicators for informal management and local political culture were based on prior ethnographic research. Data were analysed using analysis of variance, Duncan's post-hoc test and multiple regression. FINDINGS: Decentralization was associated with improved performance, but only for 5 of our 22 performance indicators. Moreover, in the multiple regression, decentralization explained the variance in only one performance indicator; indicators for informal management and political culture appeared to be more important influences. However, some indicators for informal management were themselves associated with decentralization but not any of the political culture indicators. CONCLUSION: Good management practices in the study led to decentralized local health systems rather than vice versa. Any apparent association between decentralization and performance seems to be an artefact of the informal management, and the wider political culture in which a local health system is embedded strongly influences the performance of local health systems. PMID:15640917
Worklist handling in workflow-enabled radiological application systems
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens
2000-05-01
For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely determine work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
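The process-oriented alternative argued for here can be sketched in a few lines: worklists fall out of an explicit process model rather than out of filtered database views. The radiology tasks, roles, and precedence relation below are hypothetical examples, not part of any standard:

```python
# Minimal sketch of process-driven worklist generation. An explicit process
# model assigns each task to a role; a work item appears on a role's worklist
# as soon as all of the task's predecessors have completed.

PROCESS = {  # task: (role, predecessor tasks)
    "register_patient": ("clerk", []),
    "acquire_images":   ("technologist", ["register_patient"]),
    "read_study":       ("radiologist", ["acquire_images"]),
    "sign_report":      ("radiologist", ["read_study"]),
}

def worklists(completed):
    """Return role -> pending work items, given the set of completed tasks."""
    lists = {}
    for task, (role, preds) in PROCESS.items():
        if task not in completed and all(p in completed for p in preds):
            lists.setdefault(role, []).append(task)
    return lists
```

The data-driven approach the paper contrasts this with would instead query application tables (e.g. "all unread studies") and could not see the overall process state.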
Game-Based Virtual Worlds as Decentralized Virtual Activity Systems
NASA Astrophysics Data System (ADS)
Scacchi, Walt
There is widespread interest in the development and use of decentralized systems and virtual world environments as possible new places for engaging in collaborative work activities. Similarly, there is widespread interest in stimulating new technological innovations that enable people to come together through social networking, file/media sharing, and networked multi-player computer game play. A decentralized virtual activity system (DVAS) is a networked computer supported work/play system whose elements and social activities can be both virtual and decentralized (Scacchi et al. 2008b). Massively multi-player online games (MMOGs) such as World of Warcraft and online virtual worlds such as Second Life are each popular examples of a DVAS. Furthermore, these systems are beginning to be used for research, development, and education activities in different science, technology, and engineering domains (Bainbridge 2007, Bohannon et al. 2009; Rieber 2005; Scacchi and Adams 2007; Shaffer 2006), which are also of interest here. This chapter explores two case studies of DVASs developed at the University of California at Irvine that employ game-based virtual worlds to support collaborative work/play activities in different settings. The settings include those that model and simulate practical or imaginative physical worlds in different domains of science, technology, or engineering through alternative virtual worlds where players/workers engage in different kinds of quests or quest-like workflows (Jakobsson 2006).
Strategies of Educational Decentralization: Key Questions and Core Issues.
ERIC Educational Resources Information Center
Hanson, E. Mark
1998-01-01
Explains key issues and forces that shape organization and management strategies of educational decentralization, using examples from Colombia, Venezuela, Argentina, Nicaragua, and Spain. Core decentralization issues include national and regional goals, planning, political stress, resource distribution, infrastructure development, and job…
Autonomic Management of Application Workflows on Hybrid Computing Infrastructure
Kim, Hyunjoo; el-Khamra, Yaakoub; Rodero, Ivan; ...
2011-01-01
In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high-performance computing infrastructure.
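The deadline- and budget-constrained provisioning decision can be sketched as a simple enumeration over candidate resource mixes. All task counts, throughput rates, and prices below are invented for illustration; the actual framework's models and objectives are far richer:

```python
# Toy sketch of provisioning a resource mix under deadline and budget
# constraints: enumerate combinations of HPC nodes and cloud instances,
# keep feasible mixes, and return the cheapest. Rates/prices are invented;
# the HPC allocation is treated as pre-paid (zero marginal cost).

TASKS = 1200                           # independent work units in the workflow
HPC_RATE, CLOUD_RATE = 10.0, 4.0       # tasks/hour per node / per instance
HPC_COST, CLOUD_COST = 0.0, 0.09       # $/hour

def cheapest_mix(deadline_h, budget, max_hpc=16, max_cloud=64):
    best = None
    for h in range(max_hpc + 1):
        for c in range(max_cloud + 1):
            rate = h * HPC_RATE + c * CLOUD_RATE
            if rate == 0:
                continue
            hours = TASKS / rate
            cost = hours * (h * HPC_COST + c * CLOUD_COST)
            if hours <= deadline_h and cost <= budget:
                if best is None or cost < best[2]:
                    best = (h, c, cost)
    return best  # (hpc_nodes, cloud_instances, cost) or None if infeasible
```

Tightening the deadline here plays the role of the "acceleration" objective (clouds complement the fixed HPC capacity), while a loose deadline lets "conservation" fall back to the pre-paid HPC nodes alone.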
Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows
NASA Astrophysics Data System (ADS)
Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.
2014-12-01
The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.
How to Take HRMS Process Management to the Next Level with Workflow Business Event System
NASA Technical Reports Server (NTRS)
Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna
2006-01-01
Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes in other applications that originated from the HRMS module without changing applications code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.
The impact of missing sensor information on surgical workflow management.
Liebmann, Philipp; Meixensberger, Jürgen; Wiedemann, Peter; Neumuth, Thomas
2013-09-01
Sensor systems in the operating room may encounter intermittent data losses that reduce the performance of surgical workflow management systems (SWFMS). Sensor data loss could impact SWFMS-based decision support, device parameterization, and information presentation. The purpose of this study was to understand the robustness of surgical process models when sensor information is partially missing. We tested SWFMS changes caused by incorrect or missing data from the sensor system that tracks the progress of a surgical intervention. The individual surgical process models (iSPMs) from 100 different cataract procedures of 3 ophthalmologic surgeons were used to select a randomized subset and create a generalized surgical process model (gSPM). A disjoint subset was selected from the iSPMs and used to simulate the surgical process against the gSPM. The loss of sensor data was simulated by removing some information from one task in the iSPM. The effect of missing sensor data was measured using several metrics: (a) successful relocation of the path in the gSPM, (b) the number of steps to find the converging point, and (c) the perspective with the highest occurrence of unsuccessful path findings. A gSPM built using 30% of the iSPMs successfully found the correct path in 90% of the cases. The most critical sensor data were the information regarding the instrument used by the surgeon. We found that use of a gSPM to provide input data for a SWFMS is robust and can be accurate despite missing sensor data. A surgical workflow management system can provide the surgeon with workflow guidance in the OR for most cases. Sensor systems for surgical process tracking can be evaluated based on the stability and accuracy of functional and spatial operative results.
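The relocation idea (finding the converging point after a data loss) can be illustrated with a toy alignment sketch. The reference paths and task names below are invented stand-ins, not the study's iSPMs or gSPM, and the matching rule is deliberately simplistic:

```python
# Illustrative sketch (not the study's algorithm): drop one observation from
# a surgical task sequence, then count how many subsequent observed tasks are
# needed before the sequence matches exactly one reference path again.

REFERENCE_PATHS = [                      # toy "gSPM": alternative task orders
    ["incision", "phaco", "irrigation", "lens", "closing"],
    ["incision", "irrigation", "phaco", "lens", "closing"],
]

def steps_to_relocate(observed, drop_index):
    remaining = observed[drop_index + 1:]          # tasks seen after the loss
    for k in range(1, len(remaining) + 1):
        window = remaining[:k]
        # reference paths containing this window as a contiguous run
        hits = [p for p in REFERENCE_PATHS
                if any(p[i:i + k] == window for i in range(len(p)))]
        if len(hits) == 1:
            return k                               # unambiguous after k tasks
    return None                                    # path never relocated
```

The returned count corresponds to metric (b) above: the number of steps needed to find the converging point.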
The future of scientific workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Peterka, Tom; Altintas, Ilkay
Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE’s science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, workflow needs and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.
Inferring Clinical Workflow Efficiency via Electronic Medical Record Utilization
Chen, You; Xie, Wei; Gunter, Carl A; Liebovitz, David; Mehrotra, Sanjay; Zhang, He; Malin, Bradley
2015-01-01
Complexity in clinical workflows can lead to inefficiency in making diagnoses, ineffectiveness of treatment plans and uninformed management of healthcare organizations (HCOs). Traditional strategies to manage workflow complexity are based on measuring the gaps between workflows defined by HCO administrators and the actual processes followed by staff in the clinic. However, existing methods tend to neglect the influence of EMR systems on the utilization of workflows, which could be leveraged to optimize workflows facilitated through the EMR. In this paper, we introduce a framework to infer clinical workflows through the utilization of an EMR and show how such workflows roughly partition into four types according to their efficiency. Our framework infers workflows at several levels of granularity through data mining technologies. We study four months of EMR event logs from a large medical center, including 16,569 inpatient stays, and illustrate that approximately 95% of workflows are efficient and that 80% of patients are on such workflows. At the same time, we show that the remaining 5% of workflows may be inefficient due to a variety of factors, such as complex patients. PMID:26958173
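The variant-grouping step of such log-based inference can be sketched as follows. The event-log shape and the frequency threshold are assumptions for illustration, not the paper's mining pipeline:

```python
# Toy sketch of log-based workflow inference: group patient stays by their
# EMR event sequence and flag low-frequency workflow variants as candidates
# for an inefficiency review. Thresholding by frequency is an assumption.

from collections import Counter

def workflow_variants(event_log, rare_threshold=0.05):
    """event_log: {stay_id: [event, ...]} -> (common, rare) variant counts."""
    variants = Counter(tuple(seq) for seq in event_log.values())
    total = sum(variants.values())
    common = {v: n for v, n in variants.items() if n / total >= rare_threshold}
    rare = {v: n for v, n in variants.items() if n / total < rare_threshold}
    return common, rare
```

A real pipeline would work at several levels of granularity (event types, departments, time bins) rather than on raw event tuples.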
Cotes-Ruiz, Iván Tomás; Prado, Rocío P.; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás
2017-01-01
Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique. PMID:28085932
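The intuition behind DVFS-based energy saving can be shown with a back-of-envelope dynamic-power calculation. The effective capacitance and the two operating points below are illustrative constants, not WorkflowSim's power model:

```python
# Back-of-envelope sketch of why DVFS saves energy: dynamic power scales
# roughly as P = C * V^2 * f, while work completed scales with f, so
# lowering voltage and frequency together trades execution time for energy.

def energy_joules(cycles, freq_hz, volt, capacitance=1e-9):
    """Energy to execute `cycles` at one fixed DVFS operating point."""
    power = capacitance * volt**2 * freq_hz      # dynamic power, watts
    seconds = cycles / freq_hz                   # execution time
    return power * seconds                       # = C * V^2 * cycles

# Two hypothetical operating points for the same 2e9-cycle task:
high = energy_joules(2e9, 2.0e9, 1.2)   # "performance"-style governor
low  = energy_joules(2e9, 1.0e9, 0.9)   # "powersave"-style governor
```

Note that frequency cancels out of the energy term: the saving comes entirely from the lower voltage that the lower frequency permits, which is why governors that scale V and f together are effective.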
Biowep: a workflow enactment portal for bioinformatics applications.
Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano
2007-03-08
The huge amount of biological information, its distribution over the Internet and the heterogeneity of the available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling them to profit from new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of the workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of
Expansion of a residency program through provision of second-shift decentralized services.
Host, Brian D; Anderson, Michael J; Lucas, Paul D
2014-12-15
The rationale for and logistics of the expansion of a postgraduate year 1 (PGY1) residency program in a community hospital are described. Baptist Health Lexington, a nonprofit community hospital in Lexington, Kentucky, sought to expand the PGY1 program by having residents perform second-shift decentralized pharmacist functions. Program expansion was predicated on aligning resident staffing functions with current hospitalwide initiatives involving medication reconciliation and patient education. The focus was to integrate residents into the workflow while allowing them more time to practice as pharmacists and contribute to departmental objectives. The staffing function would increase residents' overall knowledge of departmental operations and foster their sense of independence and ownership. The decentralized functions would include initiation of clinical pharmacokinetic consultations, admission medication reconciliation, discharge teaching for patients with heart failure, and order-entry support from decentralized locations. The program grew from three to five residents and established a staffing rotation for second-shift decentralized coverage. The increased time spent staffing did not detract from the time allotted to previously established learning experiences and enhanced overall continuity of the staffing experience. The change also emphasized to the residents the importance of integration of distributive and clinical functions within the department. Pharmacist participation in admission and discharge medication reconciliation activities has also increased patient satisfaction, evidenced by follow-up surveys conducted by the hospital. A PGY1 residency program was expanded through the provision of second-shift decentralized clinical services, which helped provide residents with increased patient exposure and enhanced staffing experience. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Mutemwa, Richard I
2006-01-01
At the onset of health system decentralization as a primary health care strategy, which constituted a key feature of health sector reforms across the developing world, efficient and effective health management information systems (HMIS) were widely acknowledged and adopted as a critical element of district health management strengthening programmes. The focal concern was about the performance and long-term sustainability of decentralized district health systems. The underlying logic was that effective and efficient HMIS would provide district health managers with the information required to make effective strategic decisions that are the vehicle for district performance and sustainability in these decentralized health systems. However, this argument is rooted in normative management and decision theory without significant unequivocal empirical corroboration. Indeed, extensive empirical evidence continues to indicate that managers' decision-making behaviour and the existence of other forms of information outside the HMIS, within the organizational environment, suggest a far more tenuous relationship between the presence of organizational management information systems (such as HMIS) and effective strategic decision-making. This qualitative comparative case-study conducted in two districts of Zambia focused on investigating the presence and behaviour of five formally identified, different information forms, including that from HMIS, in the strategic decision-making process. The aim was to determine the validity of current arguments for HMIS, and establish implications for current HMIS policies. Evidence from the eight strategic decision-making processes traced in the study confirmed the existence of different forms of information in the organizational environment, including that provided by the conventional HMIS. These information forms attach themselves to various organizational management processes and key aspects of organizational routine. The study results point
Bioinformatics workflows and web services in systems biology made easy for experimentalists.
Jimenez, Rafael C; Corpas, Manuel
2013-01-01
Workflows are useful for performing data analysis and integration in systems biology. Workflow management systems can help users create workflows without any previous knowledge of programming and web services. However, the computational skills required to build such workflows are usually above the level most biological experimentalists are comfortable with. In this chapter we introduce workflow management systems that reuse existing workflows instead of creating them, making it easier for experimentalists to perform computational tasks.
The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi
The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as the central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop ecosystem to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
NASA Astrophysics Data System (ADS)
Solomon, Clement
According to the United States Environmental Protection Agency (USEPA), nearly one in four households in the United States depends on an individual septic system (commonly referred to as an onsite system or a decentralized wastewater system) to treat and disperse wastewater. More than half of these systems are over 30 years old, and surveys indicate at least 10 to 20% might not be functioning properly. The USEPA concluded in its 1997 report to Congress that adequately managed decentralized wastewater systems (DWS) are a cost-effective and long-term option for meeting public health and water quality goals, particularly in less densely populated areas. The major challenge, however, is the absence of a guiding national regulatory framework based on consistent performance-based standards and the lack of proper management of DWS. These inconsistencies pose a significant threat to our water resources, local economies, and public health. This dissertation addresses key policy and regulatory strategies needed in response to the new realities confronting decentralized wastewater management. The two core objectives of this research are to demonstrate the centralized management of DWS paradigm and to present a scientific methodology to develop performance-based standards (a regulatory shift from prescriptive methods) using remote monitoring. The underlying remote monitoring architecture for centralized DWS management and the value of science-based policy making are presented. Traditionally, prescriptive standards using conventional grab-sampling data are the norm by which most standards are set. Three case studies that support the potential of remote monitoring as a tool for standards development and system management are presented. The results revealed a vital role for remote monitoring in the development of standardized protocols, policies and procedures that are greatly lacking in this field. This centralized management and remote monitoring paradigm fits well and complements
Decentralized Real-Time Scheduling
1990-08-01
must provide several alternative resource management policies, including FIFO and deadline queueing for shared resources that are not available. ... When demand exceeds the supply of shared resources (even within a single switch), some calls cannot be completed. In that case, a call’s priority ... associated chiefly with the need to manage resources in a timely and decentralized fashion. The Alpha programming model permits the convenient expression of
NASA Astrophysics Data System (ADS)
Pan, Tianheng
2018-01-01
In recent years, the combination of workflow management systems and multi-agent technology has become an active research field. The lack of flexibility in workflow management systems can be mitigated by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which avoids the fragility of the traditional centralized workflow structure. In this paper, the agents of a distributed workflow management system are divided according to their functions, the execution process of each type of agent is analyzed, and key technologies such as process execution and resource management are discussed.
[Analysis of the healthcare service decentralization process in Côte d'Ivoire].
Soura, B D; Coulibaly, S S
2014-01-01
The decentralization of healthcare services is becoming increasingly important in strategies of public sector management. This concept is analyzed from various points of view, including legal, economic, political, and sociological. Several typologies have been proposed in the literature to analyze this decentralization process, which can take different forms ranging from simple deconcentration to more elaborate devolution. In some instances, decentralization can be analyzed by the degree of autonomy given to local authorities. This article applies these typologies to analyze the healthcare system decentralization process in Cote d'Ivoire. Special attention is paid to the new forms of community healthcare organizations. These decentralized structures enjoy a kind of autonomy, with characteristics closer to those of devolution. The model might serve as an example for population involvement in defining and managing healthcare problems in Cote d'Ivoire. We end with proposals for the improvement of the process.
Papers by the Decentralized Wastewater Management MOU Partnership
Four position papers for state, local, and tribal government officials and interested stakeholders. These papers include information on the uses and benefits of decentralized wastewater treatment and examples of its effective use.
A microseismic workflow for managing induced seismicity risk at CO2 storage projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzel, E.; Morency, C.; Pyle, M.
2015-10-27
It is well established that fluid injection has the potential to induce earthquakes—from microseismicity to large, damaging events—by altering state-of-stress conditions in the subsurface. While induced seismicity has not been a major operational issue for carbon storage projects to date, a seismicity hazard exists and must be carefully addressed. Two essential components of effective seismic risk management are (1) sensitive microseismic monitoring and (2) robust data interpretation tools. This report describes a novel workflow, based on advanced processing algorithms applied to microseismic data, to help improve management of seismic risk. This workflow has three main goals: (1) to improve the resolution and reliability of passive seismic monitoring, (2) to extract additional, valuable information from continuous waveform data that is often ignored in standard processing, and (3) to minimize the turn-around time between data collection, interpretation, and decision-making. These three objectives can allow for a better-informed and rapid response to changing subsurface conditions.
Modelling and analysis of workflow for lean supply chains
NASA Astrophysics Data System (ADS)
Ma, Jinping; Wang, Kanliang; Xu, Lida
2011-11-01
Cross-organisational workflow systems are a component of enterprise information systems that support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the LSC context. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, the concept of labelled time workflow nets (LTWNs) is defined based on LTPNs, and cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, the article illustrates the proposed method with a simple example. The purpose of this research is to establish a formal method for modelling and analysing workflow systems for the LSC. This study initiates a new perspective on cross-organisational workflow management research and promotes operations management of the LSC in real-world settings.
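The workflow-net underpinnings can be illustrated with a minimal untimed, unlabelled Petri-net sketch; the `fire`/`reachable` helpers are assumptions for illustration, not the paper's LTPN formalism:

```python
# Hypothetical sketch: a Petri net as (pre, post) token maps per transition;
# reachability is explored exhaustively from the initial marking.
def fire(marking, transition):
    """Return the new marking after firing, or None if not enabled."""
    pre, post = transition
    if all(marking.get(p, 0) >= n for p, n in pre.items()):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return {p: n for p, n in m.items() if n}   # drop empty places
    return None


def reachable(initial, transitions):
    key = lambda m: tuple(sorted(m.items()))
    seen, frontier = {key(initial)}, [initial]
    while frontier:
        m = frontier.pop()
        yield m
        for t in transitions:
            m2 = fire(m, t)
            if m2 is not None and key(m2) not in seen:
                seen.add(key(m2))
                frontier.append(m2)


# A two-activity sequential workflow net: start -> A -> mid -> B -> end
transitions = [
    ({"start": 1}, {"mid": 1}),   # activity A
    ({"mid": 1}, {"end": 1}),     # activity B
]
markings = list(reachable({"start": 1}, transitions))
assert {"end": 1} in markings     # the final marking is reachable
```

A soundness check in the paper's sense would additionally verify that the final marking remains reachable from every reachable marking; the exhaustive exploration above is the building block for that.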
Contreras, Iván; Kiefer, Stephan; Vehi, Josep
2017-01-01
Diabetes self-management is a crucial element for all people with diabetes and those at risk of developing the disease. Diabetic patients should be empowered to increase their self-management skills in order to prevent or delay the complications of diabetes. This work presents the proposal and first development stages of a smartphone application focused on empowering patients with diabetes. The concept of this interventional tool is based on personalizing the user experience from an adaptive and dynamic perspective. Segmenting the population and dynamically managing user profiles across the different experience levels is the main challenge of the implementation. The self-management assistant and remote treatment for diabetes aims to provide a platform that integrates a series of innovative models and tools, rigorously tested and supported by the diabetes research literature, with a proven workflow engine for healthcare.
Akdemir, Nesibe; Lombarts, Kiki M J M H; Paternotte, Emma; Schreuder, Bas; Scheele, Fedde
2017-06-02
Evaluating the quality of postgraduate medical education (PGME) programs through accreditation is common practice worldwide. Accreditation is shaped by educational quality and quality management. An appropriate accreditation design is important, as it may drive improvements in training. Moreover, accreditors determine whether a PGME program passes the assessment, which may have major consequences, such as starting, continuing or discontinuing PGME. However, there is limited evidence for the benefits of different choices in accreditation design. Therefore, this study aims to explain how changing views on educational quality and quality management have impacted the design of the PGME accreditation system in the Netherlands. To determine the historical development of the Dutch PGME accreditation system, we conducted a document analysis of accreditation documents spanning the past 50 years and a vision document outlining the future system. A template analysis technique was used to identify the main elements of the system. Four themes in the Dutch PGME accreditation system were identified: (1) objectives of accreditation, (2) PGME quality domains, (3) quality management approaches and (4) actors' responsibilities. Major shifts have taken place regarding decentralization, residency performance and physician practice outcomes, and quality improvement. Decentralization of the responsibilities of the accreditor was absent in 1966, but this has been slowly changing since 1999. In the future system, there will be nearly a maximum degree of decentralization. A focus on outcomes and quality improvement has been introduced in the current system. The number of formal documents striving for quality assurance has increased enormously over the past 50 years, which has led to increased bureaucracy. The future system needs to decrease the number of standards to focus on measurable outcomes and to strive for quality improvement. The challenge for accreditors is to find the right
Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.
Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A
2005-04-07
Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
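The idea of properly defined, re-usable workflow operators can be sketched as higher-order functions; `seq`, `cond` and `iterate` are illustrative names, not GPIPE's actual components:

```python
# Hypothetical sketch: generic workflow operators (sequence, conditional,
# iteration) as composable, parameterizable building blocks.
def seq(*steps):
    """Run steps in order, feeding each output to the next step."""
    def run(x):
        for s in steps:
            x = s(x)
        return x
    return run


def cond(pred, then_step, else_step):
    """Branch on a predicate over the intermediate result."""
    return lambda x: then_step(x) if pred(x) else else_step(x)


def iterate(step, until):
    """Repeat a step until a termination condition holds."""
    def run(x):
        while not until(x):
            x = step(x)
        return x
    return run


pipeline = seq(
    lambda x: x + 1,
    cond(lambda x: x % 2 == 0, lambda x: x * 10, lambda x: x),
    iterate(lambda x: x - 3, until=lambda x: x < 10),
)
assert pipeline(1) == 8   # (1+1)=2 -> even -> 20 -> 17 -> 14 -> 11 -> 8
```

Because each operator returns an ordinary function, whole sub-pipelines can be stored, parameterized and re-used, which mirrors the paper's goal of turning ad hoc control structures into generic components.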
Although urban stormwater is typically conveyed to centralized infrastructure, there is great potential for reducing stormwater runoff quantity through decentralization. In this case we hypothesize that smaller-scale retrofit best management practices (BMPs) such as rain gardens ...
Wiggers, Anne-Marieke; Vosbergen, Sandra; Kraaijenhagen, Roderik; Jaspers, Monique; Peek, Niels
2013-01-01
E-health interventions are of a growing importance for self-management of chronic conditions. This study aimed to describe the process adaptions that are needed in cardiac rehabilitation (CR) to implement a self-management system, called MyCARDSS. We created a generic workflow model based on interviews and observations at three CR clinics. Subsequently, a workflow model of the ideal situation after implementation of MyCARDSS was created. We found that the implementation will increase the complexity of existing working procedures because 1) not all patients will use MyCARDSS, 2) there is a transfer of tasks and responsibilities from professionals to patients, and 3) information in MyCARDSS needs to be synchronized with the EPR system for professionals.
Design and implementation of workflow engine for service-oriented architecture
NASA Astrophysics Data System (ADS)
Peng, Shuqing; Duan, Huining; Chen, Deyun
2009-04-01
As computer networks develop rapidly and enterprise applications become increasingly distributed, traditional workflow engines exhibit deficiencies such as complex structure, poor stability, poor portability, little reusability and difficult maintenance. In this paper, in order to improve the stability, scalability and flexibility of workflow management systems, a four-layer workflow-engine architecture based on SOA is put forward following the XPDL standard of the Workflow Management Coalition; the route-control mechanism of the control model is realized; scheduling strategies for cyclic and acyclic routing are designed; and the workflow engine is implemented using technologies such as XML, JSP and EJB.
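The acyclic-routing scheduling strategy can be sketched as dependency-ordered dispatch; this is a generic topological-sort sketch under assumed activity names, not the XPDL-based engine's actual route-control code:

```python
# Hypothetical sketch: dispatch activities in dependency order and
# detect cyclic routes that the acyclic scheduler cannot handle.
from collections import deque

def schedule(deps):
    """deps: activity -> set of prerequisite activities."""
    indeg = {a: len(d) for a, d in deps.items()}
    ready = deque(a for a, n in indeg.items() if n == 0)
    order = []
    while ready:
        a = ready.popleft()
        order.append(a)
        for b, d in deps.items():     # unlock activities waiting on a
            if a in d:
                indeg[b] -= 1
                if indeg[b] == 0:
                    ready.append(b)
    if len(order) != len(deps):
        raise ValueError("cyclic routing detected")
    return order


order = schedule({"fetch": set(), "validate": {"fetch"},
                  "approve": {"validate"}, "archive": {"approve", "validate"}})
assert order.index("fetch") < order.index("validate") < order.index("archive")
```

A cyclic-routing strategy would instead bound or unroll the loop; the `ValueError` marks where such handling would plug in.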
The recent process of decentralization and democratic management of education in Brazil
NASA Astrophysics Data System (ADS)
Santos Filho, José Camilo Dos
1993-09-01
Brazilian society is beginning a new historical period in which the principle of decentralization is beginning to predominate over centralization, which held sway during the last 25 years. In contrast to recent Brazilian history, there is now a search for political, democratic and participatory decentralization more consonant with grass-roots aspirations. The first section of this article presents a brief analysis of some decentralization policies implemented by the military regime of 1964, and discusses relevant facts related to the resistance of civil society to state authoritarianism, and to the struggle for the democratization and organization of civil society up to the end of the 1970s. The second section analyzes some new experiences of democratic public school administration initiated in the 1970s and 1980s. The final section discusses the move toward decentralization and democratization of public school administration in the new Federal and State Constitutions, and in the draft of the new Law of National Education.
ERIC Educational Resources Information Center
Fuentes, Steven
2017-01-01
Usability heuristics have been established for different uses and applications as general guidelines for user interfaces. These can affect the implementation of industry solutions and play a significant role regarding cost reduction and process efficiency. The area of electronic workflow document management (EWDM) solutions, also known as…
Jung, Youngmee Tiffany; Narayanan, N C; Cheng, Yu-Ling
2018-05-01
There is a growing interest in decentralized wastewater management (DWWM) as a potential alternative to centralized wastewater management (CWWM) in developing countries. However, the comparative cost of CWWM and DWWM is not well understood. In this study, the cost of cluster-type DWWM is simulated and compared to the cost of CWWM in Alibag, India. A three-step model is built to simulate a broad range of potential DWWM configurations with varying numbers and layouts of cluster subsystems. The considered DWWM scheme consists of cluster subsystems, each of which uses simplified sewers and DEWATS (Decentralized Wastewater Treatment Systems); the considered CWWM uses conventional sewers and an activated sludge plant. The results show that the cost of DWWM can vary significantly with the number and layout of the comprising cluster subsystems. The cost of DWWM increased nonlinearly with the number of clusters, mainly due to the loss of economies of scale for DEWATS. For configurations with the same number of cluster subsystems, the cost of DWWM varied by ±5% around the mean, depending on the layout of the cluster subsystems. Compared to CWWM, DWWM was of lower cost when configured with fewer than 16 clusters in Alibag, with significantly lower operation and maintenance requirements but higher capital and land requirements for construction. The study demonstrates that cluster-type DWWM using simplified sewers and DEWATS may be a cost-competitive alternative to CWWM when carefully configured to lower the cost. Copyright © 2018 Elsevier Ltd. All rights reserved.
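The loss of economies of scale that drives the nonlinear cost growth can be illustrated with a toy cost model; the sublinear exponent and coefficients are assumed for illustration, not the study's calibrated values:

```python
# Hypothetical sketch: treatment cost scales sublinearly with capacity
# (economies of scale), so splitting a fixed total load across more
# clusters raises the total cost nonlinearly.
def treatment_cost(capacity, a=1.0, b=0.7):
    return a * capacity ** b          # b < 1 -> economies of scale


def dwwm_cost(total_load, n_clusters):
    per_cluster = total_load / n_clusters
    return n_clusters * treatment_cost(per_cluster)


costs = [dwwm_cost(1000.0, n) for n in (1, 4, 16)]
assert costs[0] < costs[1] < costs[2]   # cost grows with cluster count
```

A full comparison would add sewer-network and land costs per configuration; this sketch only captures the scale effect highlighted in the abstract.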
Parker, Pete; Thapa, Brijesh; Jacob, Aerin
2015-12-01
To alleviate poverty and enhance conservation in resource dependent communities, managers must identify existing livelihood strategies and the associated factors that impede household access to livelihood assets. Researchers increasingly advocate reallocating management power from exclusionary central institutions to a decentralized system of management based on local and inclusive participation. However, it is yet to be shown if decentralizing conservation leads to diversified livelihoods within a protected area. The purpose of this study was to identify and assess factors affecting household livelihood diversification within Nepal's Kanchenjunga Conservation Area Project, the first protected area in Asia to decentralize conservation. We randomly surveyed 25% of Kanchenjunga households to assess household socioeconomic and demographic characteristics and access to livelihood assets. We used a cluster analysis with the ten most common income generating activities (both on- and off-farm) to group the strategies households use to diversify livelihoods, and a multinomial logistic regression to identify predictors of livelihood diversification. We found four distinct groups of household livelihood strategies with a range of diversification that directly corresponded to household income. The predictors of livelihood diversification were more related to pre-existing socioeconomic and demographic factors (e.g., more landholdings and livestock, fewer dependents, receiving remittances) than activities sponsored by decentralizing conservation (e.g., microcredit, training, education, interaction with project staff). Taken together, our findings indicate that without direct policies to target marginalized groups, decentralized conservation in Kanchenjunga will continue to exclude marginalized groups, limiting a household's ability to diversify their livelihood and perpetuating their dependence on natural resources. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Tomlin, M. C.; Jenkyns, R.
2015-12-01
Ocean Networks Canada (ONC) collects data from observatories in the northeast Pacific, Salish Sea, Arctic Ocean, Atlantic Ocean, and land-based sites in British Columbia. Data are streamed, collected autonomously, or transmitted via satellite from a variety of instruments. The Software Engineering group at ONC develops and maintains Oceans 2.0, an in-house software system that acquires and archives data from sensors, and makes data available to scientists, the public, government and non-government agencies. The Oceans 2.0 workflow tool was developed by ONC to manage a large volume of tasks and processes required for instrument installation, recovery and maintenance activities. Since 2013, the workflow tool has supported 70 expeditions and grown to include 30 different workflow processes for the increasing complexity of infrastructures at ONC. The workflow tool strives to keep pace with an increasing heterogeneity of sensors, connections and environments by supporting versioning of existing workflows, and allowing the creation of new processes and tasks. Despite challenges in training and gaining mutual support from multidisciplinary teams, the workflow tool has become invaluable in project management in an innovative setting. It provides a collective place to contribute to ONC's diverse projects and expeditions and encourages more repeatable processes, while promoting interactions between the multidisciplinary teams who manage various aspects of instrument development and the data they produce. The workflow tool inspires documentation of terminologies and procedures, and effectively links to other tools at ONC such as JIRA, Alfresco and Wiki. Motivated by growing sensor schemes, modes of collecting data, archiving, and data distribution at ONC, the workflow tool ensures that infrastructure is managed completely from instrument purchase to data distribution. It integrates all areas of expertise and helps fulfill ONC's mandate to offer quality data to users.
ERIC Educational Resources Information Center
Arunatilake, Nisha; Jayawardena, Priyanka
2010-01-01
Using the experience of the Educational Quality Inputs (EQI) Scheme in Sri Lanka the paper examines the distributional aspects of formula-based funding and efficiency of decentralized management of education funds in a developing country setting. The study finds that the EQI fund distribution is largely pro-poor. However, results show that to…
wft4galaxy: a workflow testing tool for galaxy.
Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi
2017-12-01
Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container-the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.
Multitarget-multisensor management for decentralized sensor networks
NASA Astrophysics Data System (ADS)
Tharmarasa, R.; Kirubarajan, T.; Sinha, A.; Hernandez, M. L.
2006-05-01
In this paper, we consider the problem of sensor resource management in decentralized tracking systems. Due to the availability of cheap sensors, it is possible to use a large number of sensors and a few fusion centers (FCs) to monitor a large surveillance region. Even though a large number of sensors are available, due to frequency, power and other physical limitations, only a few of them can be active at any one time. The problem is then to select sensor subsets that should be used by each FC at each sampling time in order to optimize the tracking performance subject to their operational constraints. In a recent paper, we proposed an algorithm to handle the above issues for joint detection and tracking, without using simplistic clustering techniques that are standard in the literature. However, in that paper, a hierarchical architecture with feedback at every sampling time was considered, and the sensor management was performed only at a central fusion center (CFC). However, in general, it is not possible to communicate with the CFC at every sampling time, and in many cases there may not even be a CFC. Sometimes, communication between CFC and local fusion centers might fail as well. Therefore performing sensor management only at the CFC is not viable in most networks. In this paper, we consider an architecture in which there is no CFC, each FC communicates only with the neighboring FCs, and communications are restricted. In this case, each FC has to decide which sensors are to be used by itself at each measurement time step. We propose an efficient algorithm to handle the above problem in real time. Simulation results illustrating the performance of the proposed algorithm are also presented.
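Sensor-subset selection at a fusion center can be sketched with a simple greedy heuristic; this coverage-based stand-in is an assumption for illustration, not the paper's tracking-performance optimization:

```python
# Hypothetical sketch: each FC greedily activates the sensor that covers
# the most still-uncovered targets, subject to an active-sensor budget.
def select_sensors(coverage, budget):
    """coverage: sensor -> set of target ids it can observe."""
    coverage = dict(coverage)         # work on a copy
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(coverage, key=lambda s: len(coverage[s] - covered),
                   default=None)
        if best is None or not coverage[best] - covered:
            break                     # nothing useful left to activate
        chosen.append(best)
        covered |= coverage.pop(best)
    return chosen, covered


coverage = {"s1": {1, 2}, "s2": {2, 3, 4}, "s3": {4, 5}, "s4": {1}}
chosen, covered = select_sensors(coverage, budget=2)
assert chosen == ["s2", "s1"]
assert covered == {1, 2, 3, 4}
```

In the decentralized setting of the paper, each FC would run such a selection over its local sensors only, exchanging results with neighboring FCs rather than a central fusion center.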
Workflow Automation: A Collective Case Study
ERIC Educational Resources Information Center
Harlan, Jennifer
2013-01-01
Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…
This research tests a novel method that focuses limited community resources on a decentralized approach to storm water management. A reverse auction was used to relieve legal constraints on management implementation on private land. Residents voluntarily bid on rain gardens and r...
VisTrails SAHM: visualization and workflow management for species habitat modeling
Morisette, Jeffrey T.; Jarnevich, Catherine S.; Holcombe, Tracy R.; Talbert, Colin B.; Ignizio, Drew A.; Talbert, Marian; Silva, Claudio; Koop, David; Swanson, Alan; Young, Nicholas E.
2013-01-01
The Software for Assisted Habitat Modeling (SAHM) has been created to both expedite habitat modeling and help maintain a record of the various input data, pre- and post-processing steps and modeling options incorporated in the construction of a species distribution model through the established workflow management and visualization VisTrails software. This paper provides an overview of the VisTrails:SAHM software including a link to the open source code, a table detailing the current SAHM modules, and a simple example modeling an invasive weed species in Rocky Mountain National Park, USA.
Human resources for health and decentralization policy in the Brazilian health system
2011-01-01
Background: The Brazilian health reform process, following the establishment of the Unified Health System (SUS), has had a strong emphasis on decentralization, with a special focus on financing, management and inter-managerial agreements. Brazil is a federal country and the Ministry of Health (MoH), through the Secretary of Labour Management and Health Education, is responsible for establishing national policy guidelines for health labour management, and also for implementing strategies for the decentralization of the management of labour and education in the federal states. This paper assesses whether the process of decentralizing human resources for health (HRH) management and organization to the level of the state and municipal health departments has involved investments in technical, political and financial resources at the national level. Methods: The research methods comprise a survey of HRH managers of states and major municipalities (including capitals) and focus groups with these HRH managers, all by geographic region. The results were obtained by combining survey and focus group data, and through triangulation with the results of previous research. Results: The evaluation showed the evolution of the policy, previously restricted to the field of 'personnel administration' and now expanded to a conceptual model for health labour management and education, identifying progress, setbacks, critical issues and challenges for the consolidation of the decentralized model for HRH management. The results showed that 76.3% of the health departments have an HRH unit, and 63.2% have an HRH information system. However, in most health departments, the HRH unit uses only the payroll and administrative records as data sources. Concerning education in health, 67.6% of the HRH managers mentioned existing cooperation with educational and teaching institutions for the training and/or specialization of health workers. Among them, specialization courses
Human resources for health and decentralization policy in the Brazilian health system.
Pierantoni, Celia Regina; Garcia, Ana Claudia P
2011-05-17
Amoussouhoui, Arnaud Setondji; Sopoh, Ghislain Emmanuel; Wadagni, Anita Carolle; Johnson, Roch Christian; Aoulou, Paulin; Agbo, Inès Elvire; Houezo, Jean-Gabin; Boyer, Micah; Nichter, Mark
2018-03-01
Mycobacterium ulcerans infection, commonly known as Buruli ulcer (BU), is a debilitating neglected tropical disease. Its management remains complex and has three main components: antibiotic treatment combining rifampicin and streptomycin for 56 days, wound dressings and skin grafts for large ulcerations, and physical therapy to prevent functional limitations after care. In Benin, BU patient care is being integrated into the government health system. In this paper, we report on an innovative pilot program designed to introduce BU decentralization in Ouinhi district, one of Benin's most endemic districts previously served by centralized hospital-based care. We conducted intervention-oriented research implemented in four steps: baseline study, training of health district clinical staff, outreach education, outcome and impact assessments. Study results demonstrated that early BU lesions (71% of all detected cases) could be treated in the community following outreach education, and that most of the afflicted were willing to accept decentralized treatment. Ninety-three percent were successfully treated with antibiotics alone. The impact evaluation found that community confidence in decentralized BU care was greatly enhanced by clinic staff who came to be seen as having expertise in the care of most chronic wounds. This study documents a successful BU outreach and decentralized care program reaching early BU cases not previously treated by a proactive centralized BU program. The pilot program further demonstrates the added value of integrated wound management for NTD control.
Flexible Early Warning Systems with Workflows and Decision Tables
NASA Astrophysics Data System (ADS)
Riedel, F.; Chaves, F.; Zeiner, H.
2012-04-01
An essential part of early warning systems and systems for crisis management are decision support systems that facilitate communication and collaboration. Often official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automate parts of the process. We have experienced however that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a certain organization and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to the flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity. This implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise and previous work suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows
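The decision-table mechanism can be sketched as an ordered rule list in which the first rule whose conditions all hold fires; the rule contents and field names here are hypothetical:

```python
# Hypothetical sketch: a decision table as (conditions, action) rules.
# Conditions are predicates over event fields; rules are tried in order,
# so a catch-all default rule can go last. End-users could edit such a
# table without touching the workflow engine itself.
RULES = [
    ({"severity": lambda v: v >= 8, "confirmed": lambda v: bool(v)},
     "alert_population"),
    ({"severity": lambda v: v >= 5}, "notify_agencies"),
    ({}, "log_only"),                 # default rule: no conditions
]


def decide(event, rules=RULES):
    for conditions, action in rules:
        if all(pred(event.get(key)) for key, pred in conditions.items()):
            return action
    return None


assert decide({"severity": 9, "confirmed": True}) == "alert_population"
assert decide({"severity": 6, "confirmed": False}) == "notify_agencies"
assert decide({"severity": 2}) == "log_only"
```

Keeping the table as data rather than code is what makes local policy changes possible without refactoring the surrounding system, which is the flexibility requirement the abstract raises.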
Decentralized Energy Management System for Networked Microgrids in Grid-connected and Islanded Modes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Zhaoyu; Chen, Bokan; Wang, Jianhui
This paper proposes a decentralized energy management system (EMS) for the coordinated operation of networked microgrids (MGs) in a distribution system. In the grid-connected mode, the distribution network operator (DNO) and each MG are considered distinct entities with individual objectives to minimize their own operation costs. It is assumed that both dispatchable and renewable energy source (RES)-based distributed generators (DGs) exist in the distribution network and the networked MGs. In order to coordinate the operation of all entities, we apply a decentralized bi-level algorithm to solve the problem, with the first level conducting negotiations among all entities and the second level updating the non-converging penalties. In the islanded mode, the objective of each MG is to maintain a reliable power supply to its customers. In order to take into account the uncertainties of DG outputs and load consumption, we formulate the problems as two-stage stochastic programs. The first stage determines base generation setpoints based on the forecasts, and the second stage adjusts the generation outputs based on the realized scenarios. Case studies of a distribution system with networked MGs demonstrate the effectiveness of the proposed methodology in both grid-connected and islanded modes.
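The first-level negotiation with penalty updates can be illustrated with a toy dual-ascent-style exchange between the DNO and a single MG; the quadratic costs, step size and update rule are assumptions for illustration, not the paper's bi-level formulation:

```python
# Hypothetical sketch: DNO and one MG each minimize a local quadratic
# cost plus a penalty (price) term on the exchanged power p; the penalty
# lambda is raised while the two proposed schedules disagree.
def negotiate(steps=200, rho=0.2):
    lam = 0.0
    for _ in range(steps):
        # DNO: min (p - 5)^2 + lam * p  ->  p_dno = 5 - lam / 2
        p_dno = 5.0 - lam / 2.0
        # MG:  min (p - 3)^2 - lam * p  ->  p_mg = 3 + lam / 2
        p_mg = 3.0 + lam / 2.0
        lam += rho * (p_dno - p_mg)   # raise the price while demand > supply
    return p_dno, p_mg, lam


p_dno, p_mg, lam = negotiate()
assert abs(p_dno - p_mg) < 1e-6      # schedules have converged
assert abs(p_dno - 4.0) < 1e-6       # agreed exchange splits the difference
```

Each entity only needs its own cost function and the current penalty, so no party reveals its internal model, which is the appeal of the decentralized formulation.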
RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service
NASA Astrophysics Data System (ADS)
Yang, Chao; Chen, Nengcheng; Di, Liping
2012-10-01
Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
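The Atom-style resource management of workflows maps naturally onto the usual HTTP verb semantics. The sketch below is a minimal in-memory stand-in (not the paper's actual service interface) showing how publish, discover, access and retire operations would address workflow descriptions as resources; the resource path and example BPEL snippet are invented.

```python
# Toy Atom/REST-style store: workflow descriptions (XPDL or BPEL documents)
# are managed as addressable resources via POST/GET/LIST/DELETE semantics.
class WorkflowResourceStore:
    def __init__(self):
        self._entries = {}   # resource id -> workflow description document

    def post(self, wf_id, description):      # publish a workflow entry
        self._entries[wf_id] = description
        return f"/workflows/{wf_id}"

    def get(self, wf_id):                    # access one workflow
        return self._entries.get(wf_id)

    def list(self):                          # discover published workflows
        return sorted(self._entries)

    def delete(self, wf_id):                 # retire a workflow entry
        self._entries.pop(wf_id, None)

store = WorkflowResourceStore()
uri = store.post("no2-plume", "<bpel:process name='NO2Analysis'/>")
```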
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
COSMOS: Python library for massively parallel workflows
Gafni, Erik; Luquette, Lovelace J.; Lancaster, Alex K.; Hawkins, Jared B.; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P.; Tonellato, Peter J.
2014-01-01
Summary: Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Availability and implementation: Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. Contact: dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24982428
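The formal pipeline description COSMOS provides can be pictured as a small DAG of named stages executed in dependency order. The sketch below is a generic illustration using the standard library's `graphlib`, not the actual COSMOS API; the stage names are invented examples of a sequencing pipeline.

```python
# A next-generation sequencing pipeline as a DAG of stages, mapping each
# stage to the set of stages it depends on; a workflow engine would run
# them in a topological order like the one computed here.
from graphlib import TopologicalSorter

stages = {
    "align":    set(),        # map reads to a reference genome
    "dedup":    {"align"},    # remove duplicate reads
    "call":     {"dedup"},    # call variants
    "annotate": {"call"},     # annotate the variant calls
}
order = list(TopologicalSorter(stages).static_order())
```

Partitioning of jobs, as the abstract describes, would then hand each stage's independent tasks to the queuing system once its predecessors complete.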
Amoussouhoui, Arnaud Setondji; Wadagni, Anita Carolle; Johnson, Roch Christian; Aoulou, Paulin; Agbo, Inès Elvire; Houezo, Jean-Gabin; Boyer, Micah; Nichter, Mark
2018-01-01
Background: Mycobacterium ulcerans infection, commonly known as Buruli ulcer (BU), is a debilitating neglected tropical disease. Its management remains complex and has three main components: antibiotic treatment combining rifampicin and streptomycin for 56 days, wound dressings and skin grafts for large ulcerations, and physical therapy to prevent functional limitations after care. In Benin, BU patient care is being integrated into the government health system. In this paper, we report on an innovative pilot program designed to introduce BU decentralization in Ouinhi district, one of Benin's most endemic districts, previously served by centralized hospital-based care.
Methodology/Principal findings: We conducted intervention-oriented research implemented in four steps: baseline study, training of health district clinical staff, outreach education, and outcome and impact assessments. Study results demonstrated that early BU lesions (71% of all detected cases) could be treated in the community following outreach education, and that most of the afflicted were willing to accept decentralized treatment. Ninety-three percent were successfully treated with antibiotics alone. The impact evaluation found that community confidence in decentralized BU care was greatly enhanced by clinic staff who came to be seen as having expertise in the care of most chronic wounds.
Conclusions/Significance: This study documents a successful BU outreach and decentralized care program reaching early BU cases not previously treated by a proactive centralized BU program. The pilot program further demonstrates the added value of integrated wound management for NTD control. PMID:29529087
Design and implementation of a secure workflow system based on PKI/PMI
NASA Astrophysics Data System (ADS)
Yan, Kai; Jiang, Chao-hui
2013-03-01
The traditional workflow system has several weaknesses in privilege management: low privilege-management efficiency, an overburdened administrator, and the lack of a trust authority. After an in-depth study of the security requirements of workflow systems, a secure workflow model based on PKI/PMI is proposed. This model achieves static and dynamic authorization by verifying a user's identity through a public-key certificate (PKC) and validating the user's privilege information through an attribute certificate (AC) in the workflow system. Practice shows that this system can meet the security requirements of a WfMS: it not only improves system security but also ensures the integrity, confidentiality, availability and non-repudiation of the data in the system.
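The verify-then-authorize flow can be sketched as two checks: a static check that the attribute certificate belongs to the identity already verified against the PKC, and a dynamic check that the AC grants a role the requested task requires. Everything below (task names, roles, certificate fields) is an invented toy, not a real PKI/PMI implementation; actual ACs would carry signed X.509 attribute data.

```python
# Toy authorization check for a PKI/PMI-style workflow system. The user's
# identity is assumed already verified against their public-key certificate;
# the attribute certificate (AC) carries the roles used per task.

TASK_ROLES = {"draft": {"clerk", "manager"}, "approve": {"manager"}}

def authorize(verified_identity, attribute_cert, task):
    # Static check: the AC must be bound to the verified identity.
    if attribute_cert["holder"] != verified_identity:
        return False
    # Dynamic check: the AC must grant a role the task requires.
    return bool(attribute_cert["roles"] & TASK_ROLES.get(task, set()))

ac = {"holder": "alice", "roles": {"clerk"}}
```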
Decentralized Quasi-Newton Methods
NASA Astrophysics Data System (ADS)
Eisen, Mark; Mokhtari, Aryan; Ribeiro, Alejandro
2017-05-01
We introduce the decentralized Broyden-Fletcher-Goldfarb-Shanno (D-BFGS) method as a variation of the BFGS quasi-Newton method for solving decentralized optimization problems. The D-BFGS method is of interest in problems that are not well conditioned, making first order decentralized methods ineffective, and in which second order information is not readily available, making second order decentralized methods impossible. D-BFGS is a fully distributed algorithm in which nodes approximate curvature information of themselves and their neighbors through the satisfaction of a secant condition. We additionally provide a formulation of the algorithm in asynchronous settings. Convergence of D-BFGS is established formally in both the synchronous and asynchronous settings and strong performance advantages relative to first order methods are shown numerically.
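The secant condition that D-BFGS nodes satisfy is easiest to see in one dimension, where the curvature estimate must map the step onto the gradient change and the quasi-Newton iteration reduces to the secant method on the gradient. The sketch below is that one-dimensional illustration only; D-BFGS itself maintains matrix curvature approximations per node and coordinates them with neighbors.

```python
# 1-D quasi-Newton method: the curvature estimate b satisfies the secant
# condition b * s = y, where s = x_k - x_{k-1} and y = g(x_k) - g(x_{k-1}).

def quasi_newton_1d(grad, x0, x1, iters=30):
    g0, g1 = grad(x0), grad(x1)
    for _ in range(iters):
        b = (g1 - g0) / (x1 - x0)        # secant condition: b * s = y
        x0, g0 = x1, g1
        x1 = x1 - g1 / b                 # quasi-Newton step using estimate b
        g1 = grad(x1)
        if abs(g1) < 1e-12:              # gradient (nearly) zero: done
            break
    return x1

# Minimize f(x) = x**4/4 + x**2/2, whose gradient x**3 + x vanishes at 0.
x_star = quasi_newton_1d(lambda x: x**3 + x, x0=2.0, x1=1.5)
```

No second derivative is ever evaluated, which is exactly the appeal the abstract describes when second-order information is unavailable.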
Guest Editor's Introduction
NASA Astrophysics Data System (ADS)
Chrysanthis, Panos K.
1996-12-01
of the critical organizational/business processes. In particular, this paper examines the issues of execution atomicity and failure atomicity, differentiating between the correctness requirements of system failures and logical failures, and surveys techniques that can be used to ensure data consistency in workflow management systems.
While the first paper is concerned with correctness assuming transactional workflows, in which selective transactional properties are associated with individual tasks or the entire workflow, the second paper, `Scheduling workflows by enforcing intertask dependencies' by Attie et al., assumes that the tasks can be either transactions or other activities involving legacy systems. This second paper describes the modelling and specification of conditions involving events and dependencies among tasks within a workflow using temporal logic and finite state automata. It also presents a scheduling algorithm that enforces all stated dependencies by executing at any given time only those events that are allowed by all the dependency automata, and in an order as specified by the dependencies.
In any system with decentralized control, there is a need to effectively cope with the tension that exists between autonomy and consistency requirements. In `A three-level atomicity model for decentralized workflow management systems', Ben-Shaul and Heineman focus on the specific requirement of enforcing failure atomicity in decentralized, autonomous and interacting workflow management systems. Their paper describes a model in which each workflow manager must be able to specify the sequence of tasks that comprise an atomic unit for the purposes of correctness, and the degrees of local and global atomicity for the purpose of cooperation with other workflow managers. The paper also discusses a realization of this model in which treaties and summits provide an agreement mechanism, while underlying transaction managers are responsible for maintaining failure atomicity
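The scheduling idea surveyed above — execute only events permitted by every dependency automaton — can be sketched concretely. The automaton and event names below are invented examples, not the formalism of the cited paper; a real scheduler would also handle blocking and rejection, not just reordering.

```python
# Each intertask dependency is a small finite automaton over task events.
# The scheduler may fire an event only when every automaton whose alphabet
# contains it has a transition for it from its current state.

class DependencyAutomaton:
    def __init__(self, transitions, alphabet):
        self.state, self.delta, self.alphabet = 0, transitions, alphabet

    def allows(self, event):
        if event not in self.alphabet:
            return True                  # event unconstrained by this automaton
        return (self.state, event) in self.delta

    def advance(self, event):
        if event in self.alphabet:
            self.state = self.delta[(self.state, event)]

def schedule(events, automata):
    executed, pending = [], list(events)
    while pending:
        ev = next((e for e in pending if all(a.allows(e) for a in automata)), None)
        if ev is None:
            break                        # no pending event is currently permitted
        for a in automata:
            a.advance(ev)
        executed.append(ev)
        pending.remove(ev)
    return executed

# Invented dependency: "commit of t2 may only happen after commit of t1".
dep = DependencyAutomaton(
    transitions={(0, "commit_t1"): 1, (1, "commit_t2"): 1},
    alphabet={"commit_t1", "commit_t2"},
)
order = schedule(["commit_t2", "commit_t1"], [dep])
```

Even though `commit_t2` arrives first, the automaton forces the commit order `t1` then `t2`, which is the enforcement behavior the survey describes.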
Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms
Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel
2017-01-01
With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237
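One plausible heuristic in the spirit of the schedulers described above trades cost against deadline: for each incoming request, pick the cheapest VM type that can still finish by the deadline. The VM catalogue, prices and speeds below are invented for illustration and are not the paper's actual algorithms.

```python
# Greedy cost-vs-deadline scheduling sketch for a WFaaS-style front end.

VM_TYPES = [            # (name, cost per hour, relative speed)
    ("small",  0.05, 1.0),
    ("medium", 0.10, 2.0),
    ("large",  0.20, 4.0),
]

def schedule_request(work_hours, deadline_hours):
    """Return the cheapest VM type meeting the deadline, or None."""
    feasible = [(cost, name) for name, cost, speed in VM_TYPES
                if work_hours / speed <= deadline_hours]
    if not feasible:
        return None                  # no single VM can meet the deadline
    return min(feasible)[1]          # cheapest feasible type wins
```

Different targets (pure performance, pure cost, price/performance) would swap the `min` key, which is exactly the kind of variation the four proposed heuristics explore.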
A data management and publication workflow for a large-scale, heterogeneous sensor network.
Jones, Amber Spackman; Horsburgh, Jeffery S; Reeder, Stephanie L; Ramírez, Maurier; Caraballo, Juan
2015-06-01
It is common for hydrology researchers to collect data using in situ sensors at high frequencies, for extended durations, and with spatial distributions that produce data volumes requiring infrastructure for data storage, management, and sharing. The availability and utility of these data in addressing scientific questions related to water availability, water quality, and natural disasters relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into usable data products. It also depends on the ability of researchers to share and access the data in usable formats. In this paper, we describe a data management and publication workflow and software tools for research groups and sites conducting long-term monitoring using in situ sensors. Functionality includes the ability to track monitoring equipment inventory and events related to field maintenance. Linking this information to the observational data is imperative in ensuring the quality of sensor-based data products. We present these tools in the context of a case study for the innovative Urban Transitions and Aridregion Hydrosustainability (iUTAH) sensor network. The iUTAH monitoring network includes sensors at aquatic and terrestrial sites for continuous monitoring of common meteorological variables, snow accumulation and melt, soil moisture, surface water flow, and surface water quality. We present the overall workflow we have developed for effectively transferring data from field monitoring sites to ultimate end-users and describe the software tools we have deployed for storing, managing, and sharing the sensor data. These tools are all open source and available for others to use.
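The raw-to-product step described above — linking field maintenance events to observations so data products are quality-controlled — can be sketched simply. The bounds, timestamps and flag names below are illustrative assumptions, not the iUTAH tools' actual scheme.

```python
# Flag raw sensor values that are out of plausible range or were recorded
# during a logged field maintenance visit, so downstream users receive
# quality-controlled data products rather than raw readings.

def qc_flags(observations, valid_range, maintenance_windows):
    lo, hi = valid_range
    flags = []
    for t, value in observations:
        if any(start <= t <= end for start, end in maintenance_windows):
            flags.append("maintenance")      # sensor was being serviced
        elif not (lo <= value <= hi):
            flags.append("out_of_range")     # physically implausible value
        else:
            flags.append("ok")
    return flags

# Three timestamped readings; the sensor was serviced during t in [3, 4].
obs = [(1, 7.2), (2, -40.0), (3, 7.5)]
flags = qc_flags(obs, valid_range=(0.0, 14.0), maintenance_windows=[(3, 4)])
```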
Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.
2014-12-01
The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent Cyber
Urban stormwater is typically conveyed to centralized infrastructure, and there is great potential for reducing stormwater runoff quantity through decentralization. In this case we hypothesize that smaller-scale retrofit best management practices (BMPs) such as rain gardens and r...
SHIWA Services for Workflow Creation and Sharing in Hydrometeorology
NASA Astrophysics Data System (ADS)
Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely
2014-05-01
pre-deployed workflow engines or submits workflow engines with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within the project (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond it (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows, and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.
Highways and Urban Decentralization
DOT National Transportation Integrated Search
1998-01-01
This report documents a retrospective study of the relationship between highways and urban decentralization. We see decentralization as caused largely by the increased consumption of land by residents and businesses which occurs mainly because of hig...
Indirect decentralized repetitive control
NASA Technical Reports Server (NTRS)
Lee, Soo Cheol; Longman, Richard W.
1993-01-01
Learning control refers to controllers that learn to improve their performance at executing a given task, based on experience performing this specific task. In a previous work, the authors presented a theory of indirect decentralized learning control based on use of indirect adaptive control concepts employing simultaneous identification and control. This paper extends these results to apply to the indirect repetitive control problem in which a periodic (i.e., repetitive) command is given to a control system. Decentralized indirect repetitive control algorithms are presented that have guaranteed convergence to zero tracking error under very general conditions. The original motivation of the repetitive control and learning control fields was learning in robots doing repetitive tasks such as on an assembly line. This paper starts with decentralized discrete time systems, and progresses to the robot application, modeling the robot as a time varying linear system in the neighborhood of the desired trajectory. Decentralized repetitive control is natural for this application because the feedback control for link rotations is normally implemented in a decentralized manner, treating each link as if it is independent of the other links.
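The core of repetitive/learning control in its simplest form: after each repetition of a periodic command, the input is corrected by a fraction of the last tracking error, u_{j+1} = u_j + gamma * e_j. The static plant, gain and learning rate below are invented for illustration (the paper's setting is decentralized and time-varying); for a static plant y = b*u the iteration contracts whenever |1 - b*gamma| < 1.

```python
# Learning-control sketch: repeat a periodic command, measure the tracking
# error over one period, and correct the stored input profile each pass.

def learn(desired, b=2.0, gamma=0.3, repetitions=40):
    """Return the max tracking error after `repetitions` learning passes."""
    u = [0.0] * len(desired)                  # input profile over one period
    for _ in range(repetitions):
        e = [yd - b * ui for yd, ui in zip(desired, u)]   # tracking error
        u = [ui + gamma * ei for ui, ei in zip(u, e)]     # learning update
    return max(abs(yd - b * ui) for yd, ui in zip(desired, u))

# One period of a desired trajectory; error contracts by |1 - 0.6| per pass.
final_error = learn([0.0, 1.0, 0.5, -1.0])
```

The same contraction argument, applied per subsystem, is what makes the decentralized version natural when each robot link runs its own feedback loop.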
Organizational decentralization in radiology.
Aas, I H Monrad
2006-01-01
At present, most hospitals have a department of radiology where images are captured and interpreted. Decentralization is the opposite of centralization and means 'away from the centre'. With a Picture Archiving and Communication System (PACS) and broadband communications, transmitting radiology images between sites will be far easier than before. Qualitative interviews of 26 resource persons were performed in Norway. There was a response rate of 90%. Decentralization of radiology interpretations seems less relevant than centralization, but several forms of decentralization have a role to play. The respondents mentioned several advantages, including exploitation of capacity and competence. They also mentioned several disadvantages, including splitting professional communities and reduced contact between radiologists and clinicians. With the new technology decentralization and centralization of image interpretation are important possibilities in organizational change. This will be important for the future of teleradiology.
Decentralized Online Social Networks
NASA Astrophysics Data System (ADS)
Datta, Anwitaman; Buchegger, Sonja; Vu, Le-Hung; Strufe, Thorsten; Rzadca, Krzysztof
Current online social networks (OSNs) are web services run on logically centralized infrastructure. Large OSN sites use content distribution networks and thus distribute some of the load by caching for performance reasons; nevertheless, there is a central repository for user and application data. This centralized nature of OSNs has several drawbacks, including scalability, privacy, dependence on a provider, the need to be online for every transaction, and a lack of locality. There have thus been several efforts toward decentralizing OSNs while retaining the functionalities offered by centralized OSNs. A decentralized online social network (DOSN) is a distributed system for social networking with no or limited dependency on any dedicated central infrastructure. In this chapter we explore the various motivations of a decentralized approach to online social networking, discuss several concrete proposals and types of DOSN, as well as challenges and opportunities associated with decentralization.
ERIC Educational Resources Information Center
Winardi
2017-01-01
Decentralization is acknowledged as the handover of government from central government to local government, including giving broader authority to local governments to manage education. This study aims to discover the education development gap between regions in Indonesia as a result of decentralization. This research method uses descriptive…
Leadership and the Decentralized Control of Schools
ERIC Educational Resources Information Center
Steinberg, Matthew P.
2013-01-01
This review examines the literature related to leadership and the decentralized control of schools. It first considers the distinctive goals of public and private agencies, the specific constraints that shape the autonomy of leaders in different sectors, and the ways in which new models of public management are infusing public agencies with…
Workflow technology: the new frontier. How to overcome the barriers and join the future.
Shefter, Susan M
2006-01-01
Hospitals are catching up to the business world in the introduction of technology systems that support professional practice and workflow. The field of case management is highly complex and interrelates with diverse groups in diverse locations. The last few years have seen the introduction of Workflow Technology Tools, which can improve the quality and efficiency of discharge planning by the case manager. Despite the availability of these wonderful new programs, many case managers are hesitant to adopt the new technology and workflow. For a myriad of reasons, a computer-based workflow system can seem like a brick wall. This article discusses, from a practitioner's point of view, how professionals can gain confidence and skill to get around the brick wall and join the future.
Contextual cloud-based service oriented architecture for clinical workflow.
Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos
2015-01-01
Multiple papers have highlighted that, for systems to gain acceptance within the healthcare domain, tools must be integrated with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment will be able to integrate with the specifications promoted by the eHealth European Interoperability Framework. Throughout this paper, a cloud-based service-oriented architecture is proposed. This architecture implements a context management system aligned with the HL7 standard known as CCOW.
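The CCOW-style context management referenced above keeps participating applications synchronized on a shared clinical context, such as the currently selected patient. The sketch below is a toy illustration of that publish/notify pattern only; class names, subjects and identifiers are invented, and real CCOW defines a much richer transaction protocol.

```python
# Toy clinical context manager: applications join the common context and
# are notified whenever a shared subject (e.g. the patient) changes, so
# they stay in step with the clinician's workflow.

class ContextManager:
    def __init__(self):
        self._context = {}           # subject name -> value, e.g. patient id
        self._participants = []      # change callbacks of joined applications

    def join(self, on_change):
        self._participants.append(on_change)

    def set_context(self, subject, value):
        self._context[subject] = value
        for notify in self._participants:
            notify(subject, value)   # push the change to every participant

manager = ContextManager()
seen = []
manager.join(lambda subject, value: seen.append((subject, value)))
manager.set_context("patient", "MRN-0042")
```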
Centralization and Decentralization of Schools' Physical Facilities Management in Nigeria
ERIC Educational Resources Information Center
Ikoya, Peter O.
2008-01-01
Purpose: This research aims to examine the difference in the availability, adequacy and functionality of physical facilities in centralized and decentralized schools districts, with a view to making appropriate recommendations to stakeholders on the reform programmes in the Nigerian education sector. Design/methodology/approach: Principals,…
[Significant changes in the health system decentralization process in Brazil].
Viana, Ana Luiza d'Avila; Heimann, Luiza S; de Lima, Luciana Dias; de Oliveira, Roberta Gondim; Rodrigues, Sergio da Hora
2002-01-01
This article discusses the trends and limits of the Brazilian health system decentralization process, identifying the three elements that constitute the strategic induction performed by the national system administrator in accordance with the guidelines contained in the Operational Norms of the Unified National Health System: systemic rationality, intergovernmental and service provider financing, and health care model. The effects of the Federal regulations are analyzed based on the results of the evaluation study focused on the implementation of the full management scheme at the Municipal level. The decentralization strategy induced by Basic Operational Norm 96 has succeeded in improving institutional conditions, management autonomy, and supply, as measured by the Federal resources transferred, installed capacity, production, and coverage of outpatient and hospital services, with the Municipalities authorized to conduct fully autonomous management, without altering the existing patterns of inequity in the distribution of funds to poorer Municipalities.
Mobile task management tool that improves workflow of an acute general surgical service.
Foo, Elizabeth; McDonald, Rod; Savage, Earle; Floyd, Richard; Butler, Anthony; Rumball-Smith, Alistair; Connor, Saxon
2015-10-01
Understanding and being able to measure constraints within a health system is crucial if outcomes are to be improved. Current systems lack the ability to capture decision making with regard to tasks performed within a patient journey. The aim of this study was to assess the impact of a mobile task management tool on clinical workflow within an acute general surgical service by analysing data capture and usability of the application tool. The Cortex iOS application was developed to digitize patient flow and provide real-time visibility over clinical decision making and task performance. Study outcomes measured were workflow data capture for patient and staff events. Usability was assessed using an electronic survey. There were 449 unique patient journeys tracked with a total of 3072 patient events recorded. The results repository was accessed 7792 times. The participants reported that the application sped up decision making, reduced redundancy of work and improved team communication. The mode of the estimated time the application saved participants was 5-9 min/h of work. Of the 14 respondents, nine discarded their analogue methods of tracking tasks by the end of the study period. The introduction of a mobile task management system improved the working efficiency of junior clinical staff. The application allowed capture of data not previously available to hospital systems. In the future, such data will contribute to the accurate mapping of patient journeys through the health system. © 2015 Royal Australasian College of Surgeons.
A decentralized, retrofit approach to storm water management was implemented in a small suburban drainage on the basis of a voluntary reverse auction. This campaign led to the installation of 83 rain gardens and 176 rain barrels on approximately 20 percent of 350 residential prop...
Experimenting with Decentralization: The Politics of Change.
ERIC Educational Resources Information Center
Wohlstetter, Priscilla
The relationship between the political context of school districts and their choices of decentralization policy is explored in this paper. It was expected that district politics would affect decentralization policies in two ways: the form of decentralization adopted and the degree of change. The decision to decentralize in three large urban school…
Rethinking Partnerships on a Decentralized Campus
ERIC Educational Resources Information Center
Dufault, Katie H.
2017-01-01
Decentralization is an effective approach for structuring campus learning and success centers. McShane & Von Glinow (2007) describe decentralization as "an organizational model where decision authority and power are dispersed among units rather than held by a single small group of administrators" (p. 237). A decentralized structure…
Optimization of tomographic reconstruction workflows on geographically distributed resources
Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...
2016-01-01
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in
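The three-stage execution-time model described above can be sketched as a simple sum — transfer, queue wait, and compute — with the best site being the one minimizing the total. The site names, bandwidths, wait times and processing rates below are invented illustrations, not the paper's measured parameters.

```python
# Predicted workflow time = data transfer + queue wait + reconstruction
# compute; resource selection picks the site minimizing the prediction.

SITES = {
    "cluster_a": {"bandwidth_gbps": 10.0, "expected_wait_s": 600, "units_per_s": 50.0},
    "cluster_b": {"bandwidth_gbps": 1.0,  "expected_wait_s": 60,  "units_per_s": 20.0},
}

def predicted_time(data_gb, work_units, site):
    t_transfer = data_gb * 8 / site["bandwidth_gbps"]   # seconds on the wire
    t_queue = site["expected_wait_s"]                   # scheduler wait
    t_compute = work_units / site["units_per_s"]        # reconstruction work
    return t_transfer + t_queue + t_compute

def best_site(data_gb, work_units):
    return min(SITES, key=lambda s: predicted_time(data_gb, work_units, SITES[s]))
```

The model captures the trade-off the paper exploits: a fast but busy cluster wins for large reconstructions, while a lightly loaded site wins for small, latency-sensitive jobs.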
Optimization of tomographic reconstruction workflows on geographically distributed resources
Bicer, Tekin; Gürsoy, Doğa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.
2016-01-01
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications based on iterative processing, such as tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, we focus on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can…
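As a rough illustration of the three-stage model this abstract describes (transfer, queue wait, compute), the sketch below estimates end-to-end time per site and picks the cheapest site. All function and parameter names are hypothetical, not taken from the paper:

```python
# Hypothetical sketch of the three-stage execution-time model described
# above: total time = data transfer + queue wait + reconstruction compute.
# Parameter names and values are illustrative, not from the paper.

def estimate_execution_time(data_size_gb, bandwidth_gbps,
                            expected_queue_s, voxel_ops, flops_per_s):
    """Estimate end-to-end workflow time in seconds for one site."""
    transfer_s = data_size_gb * 8 / bandwidth_gbps   # storage -> compute
    compute_s = voxel_ops / flops_per_s              # reconstruction work
    return transfer_s + expected_queue_s + compute_s

def pick_best_site(sites):
    """Choose the site minimizing the modeled execution time."""
    return min(sites, key=lambda s: estimate_execution_time(**s["params"]))
```

In this toy form, a site with a long queue can still win if its network and compute stages are fast enough, which is the kind of trade-off the paper's models are meant to expose.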
NASA Astrophysics Data System (ADS)
Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.
Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.
Efficient decentralized consensus protocols
NASA Technical Reports Server (NTRS)
Lakshman, T. V.; Agrawala, A. K.
1986-01-01
Decentralized consensus protocols are characterized by successive rounds of message interchanges. Protocols which achieve a consensus in one round of message interchange require O(N²) messages, where N is the number of participants. In this paper, a communication scheme, based on finite projective planes, which requires only O(N√N) messages for each round is presented. Using this communication scheme, decentralized consensus protocols which achieve a consensus within two rounds of message interchange are developed. The protocols are symmetric, and the communication scheme does not impose any hierarchical structure. The scheme is illustrated using blocking and nonblocking commit protocols, decentralized extrema finding, and computation of the sum function.
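The projective-plane idea above can be made concrete with a small construction: for a prime q, PG(2, q) has N = q² + q + 1 points, each line holds q + 1 ≈ √N points, and any two points share exactly one line, which is the property that lets information meet within two rounds. The sketch below (not the paper's code) builds the incidence structure by orthogonality over GF(q):

```python
# Illustrative construction of a finite projective plane PG(2, q) for
# prime q: N = q^2 + q + 1 participants, each communicating only with
# the q + 1 members of its "line" -- O(N*sqrt(N)) messages per round
# rather than O(N^2). Sketch only; names are not from the paper.

def pg2_points(q):
    """Canonical representatives of the points of PG(2, q), q prime."""
    pts = [(1, y, z) for y in range(q) for z in range(q)]
    pts += [(0, 1, z) for z in range(q)]
    pts += [(0, 0, 1)]
    return pts

def lines(q):
    """Each line = set of point indices whose dot product with the
    line's homogeneous coordinates vanishes mod q (point-line duality)."""
    pts = pg2_points(q)
    return [{i for i, p in enumerate(pts)
             if sum(a * b for a, b in zip(l, p)) % q == 0}
            for l in pts]
```

For q = 2 this yields the seven-point Fano plane: 7 lines of 3 points each, with every pair of points on exactly one common line.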
Web-Accessible Scientific Workflow System for Performance Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roelof Versteeg; Trevor Rowe
2006-03-01
We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.
Detecting distant homologies on protozoans metabolic pathways using scientific workflows.
da Cruz, Sérgio Manuel Serra; Batista, Vanessa; Silva, Edno; Tosta, Frederico; Vilela, Clarissa; Cuadrat, Rafael; Tschoeke, Diogo; Dávila, Alberto M R; Campos, Maria Luiza Machado; Mattoso, Marta
2010-01-01
Bioinformatics experiments are typically composed of programs in pipelines manipulating an enormous quantity of data. An interesting approach for managing those experiments is through workflow management systems (WfMS). In this work we discuss WfMS features to support genome homology workflows and present some relevant issues for typical genomic experiments. Our evaluation used the Kepler WfMS to manage a real genomic pipeline, named OrthoSearch, originally defined as a Perl script. We show a case study detecting distant homologies on trypanosomatid metabolic pathways. Our results reinforce the benefits of WfMS over script languages and point out challenges to WfMS in distributed environments.
ERIC Educational Resources Information Center
An, Ho
2012-01-01
In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing performance of multiple workflows in SBS. Current SBSs pose serious limitations to protecting users' data confidentiality. Since users' sensitive data is sent in…
Strategic Alignment: Recruiting Students in a Highly Decentralized Environment
ERIC Educational Resources Information Center
Levin, Richard
2016-01-01
All enrollment managers face some level of challenge related to decentralized decision making and operations. Policies and practices can vary considerably by academic area, creating administrative complexity, restricting the scope and speed of institutional initiatives, and limiting potential efficiencies. Central attempts to standardize or…
NASA Technical Reports Server (NTRS)
Steffen, Chris
1990-01-01
An overview is presented of the time-delay problem and the reliability problem which arise in trying to perform robotic construction operations at a remote space location. The effects of the time delay upon the control system design are itemized. A high-level overview is given of a decentralized method of control which is expected to perform better than the centralized approach in solving the time-delay problem. The lower-level, decentralized, autonomous Troter Move-Bar algorithm is also presented (Troters are coordinated independent robots). The solution of the reliability problem is connected to adding redundancy to the system. One method of adding redundancy is given.
NASA Astrophysics Data System (ADS)
Suftin, I.; Read, J. S.; Walker, J.
2013-12-01
Scientists prefer not having to be tied down to a specific machine or operating system in order to analyze local and remote data sets or publish work. Increasingly, analysis has been migrating to decentralized web services and data sets, using web clients to provide the analysis interface. While simplifying workflow access, analysis, and publishing of data, the move does bring with it its own unique set of issues. Web clients used for analysis typically offer workflows geared towards a single user, with steps and results that are often difficult to recreate and share with others. Furthermore, workflow results often may not be easily used as input for further analysis. Older browsers further complicate things by having no way to maintain larger chunks of information, often offloading the job of storage to the back-end server or trying to squeeze it into a cookie. It has been difficult to provide a concept of "session storage" or "workflow sharing" without a complex orchestration of the back-end for storage depending on either a centralized file system or database. With the advent of HTML5, browsers gained the ability to store more information through the use of the Web Storage API (a browser-cookie holds a maximum of 4 kilobytes). Web Storage gives us the ability to store megabytes of arbitrary data in-browser either with an expiration date or just for a session. This allows scientists to create, update, persist and share their workflow without depending on the backend to store session information, providing the flexibility for new web-based workflows to emerge. In the DSASWeb portal ( http://cida.usgs.gov/DSASweb/ ), using these techniques, the representation of every step in the analyst's workflow is stored as plain-text serialized JSON, which we can generate as a text file and provide to the analyst as an upload. This file may then be shared with others and loaded back into the application, restoring the application to the state it was in when the session file
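The DSASweb approach above hinges on representing every workflow step as plain-text serialized JSON that can live in browser Web Storage or travel as a downloadable session file. A minimal sketch of that serialization round trip (step names and the envelope format are hypothetical, not DSASweb's actual schema):

```python
import json

# Sketch of persisting an analyst's workflow as plain-text JSON, in the
# spirit of the session files described above. The "version"/"steps"
# envelope is an illustrative assumption, not the DSASweb format.

def save_session(steps):
    """Serialize workflow steps to a JSON string that a browser could
    keep in Web Storage or offer to the analyst as a session file."""
    return json.dumps({"version": 1, "steps": steps}, indent=2)

def load_session(text):
    """Restore the step list from a previously saved session string,
    returning the application to the saved workflow state."""
    return json.loads(text)["steps"]
```

Because the payload is ordinary text, the same file can be re-uploaded by a colleague to reproduce the workflow without any server-side session store.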
An ontology-based framework for bioinformatics workflows.
Digiampietri, Luciano A; Perez-Alcazar, Jose de J; Medeiros, Claudia Bauzer
2007-01-01
The proliferation of bioinformatics activities brings new challenges - how to understand and organise these resources, how to exchange and reuse successful experimental procedures, and to provide interoperability among data and tools. This paper describes an effort toward these directions. It is based on combining research on ontology management, AI and scientific workflows to design, reuse and annotate bioinformatics experiments. The resulting framework supports automatic or interactive composition of tasks based on AI planning techniques and takes advantage of ontologies to support the specification and annotation of bioinformatics workflows. We validate our proposal with a prototype running on real data.
A Standardized Approach to Topographic Data Processing and Workflow Management
NASA Astrophysics Data System (ADS)
Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.
2013-12-01
An ever-increasing list of options exist for collecting high resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust, framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and in what sequence they want to combine them. This information is then stored for future reuse (and optionally sharing with others) before the user then downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and
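The common input/output protocol described above can be sketched as a tiny tool-chaining convention: each tool consumes and produces the same data structure, and the parameters of every step are archived so the workflow can be replayed. Tool names, the dict-based point-cloud format, and the log layout are all illustrative assumptions:

```python
# Toy version of chaining topographic processing tools through a shared
# input/output convention, with each step's parameters archived for
# replication. All tool names and formats are hypothetical.

def decimate(cloud, keep_every=2):
    """Keep every Nth point of the cloud."""
    return {"points": cloud["points"][::keep_every]}

def filter_low(cloud, z_min=0.0):
    """Drop points below an elevation threshold."""
    return {"points": [p for p in cloud["points"] if p[2] >= z_min]}

def run_workflow(cloud, steps):
    """Run (tool, params) pairs in sequence; each tool's output feeds
    the next, and the parameter record is kept for later replay."""
    log = []
    for tool, params in steps:
        cloud = tool(cloud, **params)
        log.append({"tool": tool.__name__, "params": params})
    return cloud, log
```

The archived `log` is the piece that is usually missing from ad hoc scripting: it makes the exact sequence and parameters shareable alongside the result.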
Application of decentralized cooperative problem solving in dynamic flexible scheduling
NASA Astrophysics Data System (ADS)
Guan, Zai-Lin; Lei, Ming; Wu, Bo; Wu, Ya; Yang, Shuzi
1995-08-01
The objective of this study is to present an intelligent solution to the problem of task allocation in shop-floor scheduling. For this purpose, the technique of distributed artificial intelligence (DAI) is applied. Intelligent agents (IAs) are used to realize decentralized cooperation, and negotiation is realized using message passing based on the contract net model. Multiple agents, such as manager agents, workcell agents, and workstation agents, make game-like decisions based on multiple-criteria evaluations. This procedure of decentralized cooperative problem solving makes local scheduling possible. By integrating these multiple local schedules, dynamic flexible scheduling for the whole shop-floor production can be realized.
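The contract-net negotiation mentioned above follows an announce-bid-award cycle: a manager broadcasts a task, capable agents bid, and the contract goes to the best bid. A toy sketch (agent fields and the time-based bid score are illustrative assumptions, not the paper's multi-criteria evaluation):

```python
# Toy contract-net round for shop-floor task allocation: a manager
# announces a task, workcell agents with the needed skill bid their
# estimated completion time, and the lowest bid wins. Hypothetical
# field names and scoring; the paper uses richer multi-criteria bids.

def announce(task, agents):
    """Collect bids (here: current queue length plus task duration)
    from every agent whose skills cover the task."""
    return {a["name"]: a["queue"] + task["duration"]
            for a in agents if task["skill"] in a["skills"]}

def award(bids):
    """Award the contract to the agent with the lowest bid."""
    return min(bids, key=bids.get)
```

Repeating this round per task yields the decentralized local schedules that the paper then integrates into a shop-floor-wide schedule.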
NASA Astrophysics Data System (ADS)
Wang, Ximing; Martinez, Clarisa; Wang, Jing; Liu, Ye; Liu, Brent
2014-03-01
Clinical trials usually have a demand to collect, track, and analyze multimedia data according to the workflow. Currently, clinical trial data management requirements are normally addressed with custom-built systems. Challenges occur in the workflow design within different trials. A traditional pre-defined custom-built system is usually limited to a specific clinical trial and normally requires time-consuming and resource-intensive software development. To provide a solution, we present a user-customizable, imaging-informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials. The intelligent workflow engine provides flexibility in building and tailoring the workflow in various stages of clinical trials. By providing a solution to tailor and automate the workflow, the system will save time and reduce errors for clinical trials. Although our system is designed for rehabilitation clinical trials, it may be extended to other imaging-based clinical trials as well.
Household Schooling Behaviors and Decentralization.
ERIC Educational Resources Information Center
Behrman, Jere R.; King, Elizabeth M.
2001-01-01
Presents a simple framework for (1) demonstrating how households determine schooling investments through choice and voice; and (2) considering effects of decentralization on household behaviors, given information problems. Some aspects of decentralization may increase efficiency; others may be neutral or decrease efficiency. Further research is…
Six hospitals describe decentralization, cost containment, and downsizing.
Lineweaver, L A; Battle, C E; Schilling, R M; Nall, C M
1999-01-01
Decentralization, cost containment, and downsizing continue in full force as healthcare organizations continue to adapt to constant economic change. Hospitals are forced to take a second and third look at how health care is managed in order to survive. Six Northwest Florida hospitals were surveyed in an effort to explore current changes within the healthcare delivery system. This article provides both managers and staff with an overview of recent healthcare changes in an area of the country with implications for staff development.
The social control of energy: A case for the promise of decentralized solar technologies
NASA Astrophysics Data System (ADS)
Gilmer, R. W.
1980-05-01
Decentralized solar technology and centralized electric utilities were contrasted in the ways they assign property rights in capital and energy output; in the assignment of operational control; and in the means of monitoring, policing, and enforcing property rights. An analogy was drawn between the decision of an energy consumer to use decentralized solar and the decision of a firm to vertically integrate, that is, to extend the boundary of the firm by making inputs or further processing output. Decentralized solar energy production offers the small energy consumer the chance to cut ties to outside suppliers--to vertically integrate energy production into the home or business. The development of this analogy provides insight into important noneconomic aspects of solar energy, and it points clearly to the lighter burdens of social management offered by decentralized solar technology.
Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean
2004-04-01
Last year we presented at RSNA an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to perform DICOM Query and C-Move requests by a physician from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for distribution offsite. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented in a clinical environment and evaluated. A PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in health care delivery both within the radiology department and offsite by improving their clinical workflow.
Managing Written Directives: A Software Solution to Streamline Workflow.
Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide
2017-06-01
A written directive is required by the U.S. Nuclear Regulatory Commission for any use of ¹³¹I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases…
Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization
Malawski, Maciej; Figiela, Kamil; Bubak, Marian; ...
2015-01-01
This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and the benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from the synthetic workflows and from general-purpose cloud benchmarks, as well as from the data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
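The level-based, hourly-billed cost model above can be illustrated by a brute-force miniature: for one level of identical tasks, try each VM type and instance count, keep only plans that meet the deadline, and pick the cheapest. This is a hypothetical toy, not the paper's AMPL/CMPL formulation:

```python
import math

# Hypothetical brute-force analogue of the deadline-constrained cost
# model described above, for a single level of identical tasks with
# hourly billing. VM names, speeds, and prices are illustrative.

def cheapest_plan(n_tasks, task_hours, deadline_h, vm_types, max_vms=32):
    """Return (cost, vm_name, count) for the cheapest feasible plan."""
    best = None
    for vm in vm_types:
        for count in range(1, max_vms + 1):
            # tasks are spread evenly over `count` identical instances
            makespan = math.ceil(n_tasks / count) * task_hours / vm["speed"]
            if makespan > deadline_h:
                continue                      # misses the deadline
            cost = count * math.ceil(makespan) * vm["price"]  # hourly billing
            if best is None or cost < best[0]:
                best = (cost, vm["name"], count)
    return best
```

Even this toy shows why hourly billing matters: a faster, pricier instance type can lose to more copies of a slow one once partial hours are rounded up.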
Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Ramachandran, R.; Lynnes, C.
2009-05-01
A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be
Online time and resource management based on surgical workflow time series analysis.
Maktabi, M; Neumuth, T
2017-02-01
Hospitals' effectiveness and efficiency can be enhanced by automating the resource and time management of the most cost-intensive unit in the hospital: the operating room (OR). The key elements required for the ideal organization of hospital staff and technical resources (such as instruments in the OR) are an exact online forecast of both the surgeon's resource usage and the remaining intervention time. This paper presents a novel online approach relying on time series analysis and the application of a linear time-variant system. We calculated the power spectral density and the spectrogram of surgical perspectives (e.g., used instrument) of interest to compare several surgical workflows. Considering only the use of the surgeon's right hand during an intervention, we were able to predict the remaining intervention time online with an error of 21 min 45 s ±9 min 59 s for lumbar discectomy. Furthermore, the performance of forecasting of technical resource usage in the next 20 min was calculated for a combination of spectral analysis and the application of a linear time-variant system (sensitivity: 74 %; specificity: 75 %) focusing on just the use of surgeon's instrument in question. The outstanding benefit of these methods is that the automated recording of surgical workflows has minimal impact during interventions since the whole set of surgical perspectives need not be recorded. The resulting predictions can help various stakeholders such as OR staff and hospital technicians. Moreover, reducing resource conflicts could well improve patient care.
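The spectral view of a surgical workflow described above can be illustrated on a binary "instrument in use" series: a plain DFT periodogram exposes the dominant cycle in the activity signal. This is only a sketch of the general technique (a discrete periodogram), far simpler than the paper's time-variant models:

```python
import cmath

# Sketch of a DFT periodogram over a binary activity series (e.g.,
# "surgeon's right hand using instrument X" per time step). Purely
# illustrative; not the models used in the paper.

def periodogram(signal):
    """Power at each DFT frequency of a real-valued series."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(signal))) ** 2 / n
            for k in range(n)]

def dominant_period(signal):
    """Length in samples of the strongest non-DC cycle, folding the
    mirrored upper half of the spectrum onto the lower half."""
    p = periodogram(signal)
    k = max(range(1, len(p)), key=p.__getitem__)
    return len(signal) / min(k, len(signal) - k)
```

Comparing such spectra across recorded interventions is one simple way to quantify how similar two surgical workflows are.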
Indirect decentralized learning control
NASA Technical Reports Server (NTRS)
Longman, Richard W.; Lee, Soo C.; Phan, M.
1992-01-01
The new field of learning control develops controllers that learn to improve their performance at executing a given task, based on experience performing this specific task. In a previous work, the authors presented a theory of indirect learning control based on use of indirect adaptive control concepts employing simultaneous identification and control. This paper develops improved indirect learning control algorithms, and studies the use of such controllers in decentralized systems. The original motivation of the learning control field was learning in robots doing repetitive tasks such as on an assembly line. This paper starts with decentralized discrete-time systems, and progresses to the robot application, modeling the robot as a time-varying linear system in the neighborhood of the nominal trajectory, and using the usual robot controllers that are decentralized, treating each link as if it is independent of any coupling with other links. The basic result of the paper is to show that stability of the indirect learning controllers for all subsystems when the coupling between subsystems is turned off, assures convergence to zero tracking error of the decentralized indirect learning control of the coupled system, provided that the sample time in the digital learning controller is sufficiently short.
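The learning-control idea summarized above can be shown with a toy iterative learning update on a memoryless scalar plant: repeat a finite-horizon task and, after each trial, correct the input with the observed tracking error, u_{k+1}(t) = u_k(t) + γ·e_k(t). The plant, gain values, and convergence check below are illustrative assumptions, far simpler than the paper's decentralized time-varying setting:

```python
# Toy iterative learning control: after each repetition of the task,
# the input is corrected by the tracking error. Converges for this
# scalar plant when |1 - gamma * plant_gain| < 1. Hypothetical values.

def run_trial(u, plant_gain=2.0):
    """Output of a memoryless scalar plant over one trial."""
    return [plant_gain * ut for ut in u]

def ilc(reference, trials=25, gamma=0.4, plant_gain=2.0):
    """Run repeated trials with the learning update; return the final
    worst-case tracking error over the horizon."""
    u = [0.0] * len(reference)
    for _ in range(trials):
        e = [r - y for r, y in zip(reference, run_trial(u, plant_gain))]
        u = [ut + gamma * et for ut, et in zip(u, e)]  # learning update
    return max(abs(r - y) for r, y in zip(reference, run_trial(u, plant_gain)))
```

With γ = 0.4 and plant gain 2.0 the error contracts by a factor of 0.2 per trial, so the tracking error is driven essentially to zero, mirroring the zero-tracking-error convergence result the abstract states.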
de Carvalho, Elias Cesar Araujo; Batilana, Adelia Portero; Claudino, Wederson; Reis, Luiz Fernando Lima; Schmerling, Rafael A; Shah, Jatin; Pietrobon, Ricardo
2012-01-01
With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping, and computer simulation studies to identify/address areas of threat to near-miss events for data quality in two cancer trial sites in Brazil. Two sites, in São Paulo and Rio de Janeiro, were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near-miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories, and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures, and manage professionals to reduce near-miss events and save time/cost.
Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo
2012-01-01
Background With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify and address threats to near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings Two sites, in São Paulo and Rio de Janeiro, were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrated a reduction of the rework problem. Conclusions/Significance Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage professionals to reduce near miss events and save time and cost.
Problem Management Module: An Innovative System to Improve Problem List Workflow
Hodge, Chad M.; Kuttler, Kathryn G.; Bowes, Watson A.; Narus, Scott P.
2014-01-01
Electronic problem lists are essential to modern health record systems, with a primary goal to serve as the repository of a patient’s current health issues. Additionally, coded problems can be used to drive downstream activities such as decision support, evidence-based medicine, billing, and cohort generation for research. Meaningful Use also requires use of a coded problem list. Over the course of three years, Intermountain Healthcare developed a problem management module (PMM) that provided innovative functionality to improve clinical workflow and boost problem list adoption, e.g. smart search, user customizable views, problem evolution, and problem timelines. In 23 months of clinical use, clinicians entered over 70,000 health issues, the percentage of free-text items dropped to 1.2%, completeness of problem list items increased by 14%, and more collaborative habits were initiated. PMID:25954372
Burgarella, Sarah; Cattaneo, Dario; Pinciroli, Francesco; Masseroli, Marco
2005-12-01
Improvements in bio-nano-technologies and biomolecular techniques have led to increasing production of high-throughput experimental data. Spotted cDNA microarray is one of the most widespread technologies, used in single research laboratories and in biotechnology service facilities. Although routinely performed, spotted microarray experiments are complex procedures entailing several experimental steps and actors with different technical skills and roles. During an experiment, the actors involved, who may also be geographically dispersed, need to access and share specific experiment information according to their roles. Furthermore, complete information describing all experimental steps must be collected in an orderly fashion to allow subsequent correct interpretation of experimental results. We developed MicroGen, a web system for managing information and workflow in the production pipeline of spotted microarray experiments. It consists of a core multi-database system able to store all data completely characterizing different spotted microarray experiments according to the Minimum Information About Microarray Experiments (MIAME) standard, and of an intuitive and user-friendly web interface able to support the collaborative work required among the multidisciplinary actors and roles involved in spotted microarray experiment production. MicroGen supports six types of user roles: the researcher who designs and requests the experiment, the spotting operator, the hybridisation operator, the image processing operator, the system administrator, and the generic public user who can access the unrestricted part of the system to get information about MicroGen services. MicroGen represents a MIAME-compliant information system that enables managing workflow and supporting collaborative work in spotted microarray experiment production.
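As a sketch of the role-based access that the MicroGen abstract describes, the six user roles could map to permitted pipeline steps as follows. The role names follow the abstract; the permission table and function are hypothetical illustrations, not MicroGen's actual implementation.

```python
# Hypothetical role-to-step permission table for a MIAME-style spotted
# microarray pipeline. Role names follow the abstract; the mapping is invented.
ROLE_PERMISSIONS = {
    "researcher":                {"design", "request"},
    "spotting_operator":         {"spotting"},
    "hybridisation_operator":    {"hybridisation"},
    "image_processing_operator": {"image_processing"},
    "administrator":             {"design", "request", "spotting",
                                  "hybridisation", "image_processing", "admin"},
    "public":                    set(),  # unrestricted info pages only
}

def may_edit(role: str, step: str) -> bool:
    """Return True if the given role may record data for the given step."""
    return step in ROLE_PERMISSIONS.get(role, set())
```

A real multi-actor system would enforce such checks server-side for every data entry point in the pipeline.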
Decentralization, democratization, and health: the Philippine experiment.
Langran, Irene V
2011-01-01
In 1991, the Philippines joined a growing list of countries that reformed health planning through decentralization. Reformers viewed decentralization as a tool that would solve multiple problems, leading to more meaningful democracy and more effective health planning. Today, nearly two decades after the passage of decentralization legislation, questions about the effectiveness of the reforms persist. Inadequate financing, inequity, and a lack of meaningful participation remain challenges, in many ways mirroring broader weaknesses of Philippine democracy. These concerns pose questions regarding the nature of contemporary decentralization, democratization, and health planning and whether these three strategies are indeed mutually reinforcing.
Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2016-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971
Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz
2017-01-01
As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.
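At its core, any workflow engine of the kind described above must run each step only after its input dependencies have completed. The following is a generic, minimal illustration of that idea in Python (a toy scheduler, not the Rabix Executor's or CWL's actual API); the three-step "align, sort, call" example is invented.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def run_workflow(steps, deps):
    """steps: {name: callable(results)}; deps: {name: set of prerequisite names}.

    Executes steps in dependency order; each step can read prior outputs.
    """
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = steps[name](results)
    return order, results

# Hypothetical three-step analysis chain: align -> sort -> call.
order, results = run_workflow(
    {"align": lambda r: "aligned",
     "sort":  lambda r: r["align"] + "+sorted",
     "call":  lambda r: r["sort"] + "+called"},
    {"align": set(), "sort": {"align"}, "call": {"sort"}},
)
```

A real executor adds the features the abstract lists on top of this skeleton: error logging, file staging, and job scheduling across machines.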
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, J; Wang, J; Peng, J
Purpose: To implement an entire workflow quality assurance (QA) process in the radiotherapy department and to reduce radiotherapy error rates through entire workflow management in a developing country. Methods: The entire workflow QA process starts at patient registration and ends at the last treatment, covering all steps of the radiotherapy process. The chart-check error rate is used to evaluate the entire workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data from around 6000 patients before and after implementing the entire workflow QA process were compared, from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), documents of treatment QA, and QA of the treatment history. The error rate derived from the chart check decreased from 1.7% to 0.9% after the entire workflow QA process was introduced. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed to prevent those errors. Conclusion: The entire workflow QA process improved the safety and quality of radiotherapy in our department, and we consider our QA experience applicable to heavily loaded radiotherapy departments in developing countries.
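The reported improvement can be reproduced arithmetically. The per-period chart counts below are assumptions for illustration; the abstract reports only a total of around 6000 patients and the two error rates.

```python
# Chart-check error-rate arithmetic. A hypothetical even split of 3000 charts
# per period is assumed; only the 1.7% and 0.9% rates come from the abstract.
def error_rate(errors: int, charts_checked: int) -> float:
    return errors / charts_checked

before = error_rate(51, 3000)   # 51/3000 = 1.7%
after  = error_rate(27, 3000)   # 27/3000 = 0.9%
relative_reduction = 1 - after / before  # ~47% fewer errors per chart
```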
Theory and applications survey of decentralized control methods
NASA Technical Reports Server (NTRS)
Athans, M.
1975-01-01
A nonmathematical overview is presented of trends in the general area of decentralized control strategies which are suitable for hierarchical systems. Advances in decentralized system theory are closely related to advances in the so-called stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools pertaining to the classical stochastic control problem are outlined. Particular attention is devoted to pitfalls in the mathematical problem formulation for decentralized control. Major conclusions are that any purely deterministic approach to multilevel hierarchical dynamic systems is unlikely to lead to realistic theories or designs, that the flow of measurements and decisions in a decentralized system should not be instantaneous and error-free, and that delays in information exchange in a decentralized system lead to reasonable approaches to decentralized control. A mathematically precise notion of aggregating information is not yet available.
Decentralized or onsite wastewater treatment (OWT) systems have long been implicated in being a major source of N inputs to surface and ground waters and numerous regulatory bodies have promulgated strict total N (TN) effluent standards in N-sensitive areas. These standards, howe...
Lu, Xinyan
2016-01-01
There is a clear requirement for enhancing laboratory information management during early absorption, distribution, metabolism and excretion (ADME) screening. The application of a commercial laboratory information management system (LIMS) is limited by complexity, insufficient flexibility, high costs and extended timelines. An improved custom in-house LIMS for ADME screening was developed using Excel. All Excel templates were generated through macros and formulae, and information flow was streamlined as much as possible. This system has been successfully applied in task generation, process control and data management, with a reduction in both labor time and human error rates. An Excel-based LIMS can provide a simple, flexible and cost/time-saving solution for improving workflow efficiencies in early ADME screening.
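As an illustration of the task-generation function the abstract describes, a minimal sketch using only the Python standard library follows. The column names and assay list are invented, and the actual in-house system is built on Excel macros and formulae rather than Python.

```python
# Sketch of LIMS-style task generation for an ADME screen: one tracking row
# per compound/assay pair. Column and assay names are hypothetical.
import csv
import io

def make_task_sheet(compounds, assays):
    """Return CSV text with one 'pending' task row per compound/assay pair."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["task_id", "compound", "assay", "status"])
    task_id = 1
    for compound in compounds:
        for assay in assays:
            writer.writerow([task_id, compound, assay, "pending"])
            task_id += 1
    return buf.getvalue()

sheet = make_task_sheet(["CPD-1", "CPD-2"], ["stability", "permeability"])
```

Generating every row programmatically, instead of copying templates by hand, is what removes the transcription errors the abstract mentions.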
DataUp: Helping manage and archive data within the researcher's workflow
NASA Astrophysics Data System (ADS)
Strasser, C.
2012-12-01
There are many barriers to data management and sharing among earth and environmental scientists; among the most significant is a lack of knowledge about best practices for data management, metadata standards, and appropriate data repositories for archiving and sharing data. We have developed an open-source add-in for Excel and an open-source web application intended to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. The researcher does not need a prior relationship with a data repository to use DataUp; the newly implemented ONEShare repository, a DataONE member node, is available for any researcher to archive and share their data. By meeting researchers where they already work, in spreadsheets, DataUp becomes part of the researcher's workflow, and data management and sharing become easier. Future enhancement of DataUp will rely on members of the community adopting and adapting the DataUp tools to meet their unique needs, including connecting to analytical tools, adding new metadata schemas, and expanding the list of connected data repositories. DataUp is a collaborative project between Microsoft Research Connections, the University of California's California Digital Library, the Gordon and Betty Moore Foundation, and DataONE.
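Two of the four DataUp functions, CSV compatibility checking and metadata generation, can be sketched as follows. This is an illustration under assumed field names, not DataUp's actual code or metadata schema.

```python
# Sketch: (1) check that a file parses as a rectangular CSV table, and
# (2) build a minimal metadata record. The metadata fields are hypothetical.
import csv
import io

def check_csv_compatible(text: str) -> bool:
    """True if the text parses as CSV with >1 row and equal column counts."""
    rows = list(csv.reader(io.StringIO(text)))
    return len(rows) > 1 and len({len(r) for r in rows}) == 1

def minimal_metadata(title: str, creator: str, n_rows: int) -> dict:
    return {"title": title, "creator": creator,
            "rows": n_rows, "format": "text/csv"}
```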
Disruption of Radiologist Workflow.
Kansagra, Akash P; Liu, Kevin; Yu, John-Paul J
2016-01-01
The effect of disruptions has been studied extensively in surgery and emergency medicine, and a number of solutions-such as preoperative checklists-have been implemented to enforce the integrity of critical safety-related workflows. Disruptions of the highly complex and cognitively demanding workflow of modern clinical radiology have only recently attracted attention as a potential safety hazard. In this article, we describe the variety of disruptions that arise in the reading room environment, review approaches that other specialties have taken to mitigate workflow disruption, and suggest possible solutions for workflow improvement in radiology. Copyright © 2015 Mosby, Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.
2011-07-04
A drawback of existing scientific workflow systems is the lack of support to domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.
Control and stabilization of decentralized systems
NASA Technical Reports Server (NTRS)
Byrnes, Christopher I.; Gilliam, David; Martin, Clyde F.
1989-01-01
Proceeding from the problem posed by the need to stabilize the motion of two helicopters maneuvering a single load, a methodology is developed for the stabilization of classes of decentralized systems based on a more algebraic approach, which involves the external symmetries of decentralized systems. Stabilizing local-feedback laws are derived for any class of decentralized systems having a semisimple algebra of symmetries; the helicopter twin-lift problem, as well as certain problems involving the stabilization of discretizations of distributed parameter problems, have just such algebras of symmetries.
On l1 optimal decentralized performance
NASA Technical Reports Server (NTRS)
Sourlas, Dennis; Manousiouthakis, Vasilios
1993-01-01
In this paper, the Manousiouthakis parametrization of all decentralized stabilizing controllers is employed to mathematically formulate the l1 optimal decentralized controller synthesis problem. The resulting optimization problem is infinite-dimensional and therefore not directly amenable to computation. It is shown that finite-dimensional optimization problems whose values are arbitrarily close to that of the infinite-dimensional one can be constructed. Based on this result, an algorithm that solves the l1 decentralized performance problem is presented. A global optimization approach to the solution of the finite-dimensional approximating problems is also discussed.
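In standard notation, a sketch of this problem class (not necessarily the paper's exact formulation) reads: minimize the l1 norm of the closed-loop map over the set of decentralized stabilizing controllers, and approximate by truncating impulse responses.

```latex
% l1-optimal decentralized synthesis over the set K_dec of decentralized
% stabilizing controllers (standard form; a sketch, not the paper's notation).
\mu \;=\; \inf_{K \in \mathcal{K}_{\mathrm{dec}}} \,\| T(K) \|_{1},
\qquad
\| H \|_{1} \;=\; \max_{i} \sum_{j} \sum_{k=0}^{\infty} \lvert h_{ij}(k) \rvert .
% Truncating the impulse responses at length N yields a finite-dimensional
% problem whose value approaches \mu:
\mu_{N} \;=\; \inf_{K \in \mathcal{K}_{\mathrm{dec}}}
\, \max_{i} \sum_{j} \sum_{k=0}^{N} \lvert h_{ij}(k) \rvert ,
\qquad \mu_{N} \to \mu \ \text{as} \ N \to \infty .
```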
Dynamic Centralized and Decentralized Control Systems
DOT National Transportation Integrated Search
1977-09-01
This report develops a systematic method for designing suboptimal decentralized control systems. The method is then applied to the design of a decentralized controller for a freeway-corridor system. A freeway corridor is considered to be a system of ...
NASA Astrophysics Data System (ADS)
Filgueira, R.; Ferreira da Silva, R.; Deelman, E.; Atkinson, M.
2016-12-01
We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. The backbone of the DIaaS model is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of the dispel4py and Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while the data movement between different e-Infrastructures and the coordination of the application execution are automatically managed by Pegasus. DIaaS combines the Asterism framework with Docker containers to provide an integrated, complete, easy-to-use, portable approach to running data-intensive workflows on distributed platforms. Three containers integrate the DIaaS model: a Pegasus node, an MPI cluster, and an Apache Storm cluster. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py), linked to Docker Hub for continuous integration (automated image builds) and for image storing and sharing. In this model, all required software (workflow systems and execution engines) for running scientific applications is packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using the data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase 1) and cross-correlates (Phase 2) traces from several seismic stations. The application is submitted via Pegasus (Container 1), and Phase 1 and Phase 2 are executed in the MPI (Container 2) and Storm (Container 3) clusters respectively. Although both phases could be executed
Decentralized stochastic control
NASA Technical Reports Server (NTRS)
Speyer, J. L.
1980-01-01
Decentralized stochastic control is characterized by being decentralized in that the information to one controller is not the same as information to another controller. The system including the information has a stochastic or uncertain component. This complicates the development of decision rules which one determines under the assumption that the system is deterministic. The system is dynamic which means the present decisions affect future system responses and the information in the system. This circumstance presents a complex problem where tools like dynamic programming are no longer applicable. These difficulties are discussed from an intuitive viewpoint. Particular assumptions are introduced which allow a limited theory which produces mechanizable affine decision rules.
Data and Workflow Management Challenges in Global Adjoint Tomography
NASA Astrophysics Data System (ADS)
Lei, W.; Ruan, Y.; Smith, J. A.; Modrak, R. T.; Orsvuran, R.; Krischer, L.; Chen, Y.; Balasubramanian, V.; Hill, J.; Turilli, M.; Bozdag, E.; Lefebvre, M. P.; Jha, S.; Tromp, J.
2017-12-01
It is crucial to take the complete physics of wave propagation into account in seismic tomography to further improve the resolution of tomographic images. The adjoint method is an efficient way of incorporating 3D wave simulations in seismic tomography. However, global adjoint tomography is computationally expensive, requiring thousands of wavefield simulations and massive data processing. Through our collaboration with the Oak Ridge National Laboratory (ORNL) computing group and an allocation on Titan, ORNL's GPU-accelerated supercomputer, we are now performing our global inversions by assimilating waveform data from over 1,000 earthquakes. The first challenge we encountered is dealing with the sheer amount of seismic data. Data processing based on conventional data formats and processing tools (such as SAC), which are not designed for parallel systems, became our major bottleneck. To facilitate the data processing procedures, we designed the Adaptive Seismic Data Format (ASDF) and developed a set of Python-based processing tools to replace legacy FORTRAN-based software. These tools greatly enhance reproducibility and accountability while taking full advantage of highly parallel systems and showing superior scaling on modern computational platforms. The second challenge is that the data processing workflow contains more than 10 sub-procedures, making it delicate to handle and prone to human mistakes. To reduce human intervention as much as possible, we are developing a framework specifically designed for seismic inversion based on state-of-the-art workflow management research, specifically the Ensemble Toolkit (EnTK), in collaboration with the RADICAL team from Rutgers University. Using the initial developments of the EnTK, we are able to utilize the full computing power of the data processing cluster RHEA at ORNL while keeping human interaction to a minimum and greatly reducing the data processing time. Thanks to all these improvements, we are now able to
Historical and Cultural Perspectives on Centralization/Decentralization in Continuing Education.
ERIC Educational Resources Information Center
Edelson, Paul J.
1995-01-01
Views centralization/decentralization from four perspectives: historical, as an outgrowth of professionalism, in the culture of higher education, and management theory. Suggests that some form of centralized control will always be necessary if continuing education is to function in a larger organization, but smaller units may be the wave of the…
Support for Taverna workflows in the VPH-Share cloud platform.
Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F
2017-07-01
To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical & bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. 1) Seamless integration of VPH-Share with other components and systems. 2) Extended range of different tools for workflows. 3) Successful integration of scientific workflows from other VPH projects. 4) Execution speed improvement for medical applications. The presented workflow integration provides VPH-Share users with a wide range of different possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, we acknowledge that further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.
Workflow-Based Software Development Environment
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
2013-01-01
The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).
Partially Decentralized Control Architectures for Satellite Formations
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Bauer, Frank H.
2002-01-01
In a partially decentralized control architecture, more than one but fewer than all nodes have supervisory capability. This paper describes an approach to choosing the number of supervisors in such an architecture, based on a reliability vs. cost trade. It also considers the implications of these results for the design of navigation systems for satellite formations that could be controlled with a partially decentralized architecture. Using an assumed cost model, analytic and simulation-based results indicate that it may be cheaper to achieve a given overall system reliability with a partially decentralized architecture containing only a few supervisors than with either fully decentralized or purely centralized architectures. Nominally, the subset of supervisors may act as centralized estimation and control nodes for corresponding subsets of the remaining subordinate nodes, and act as decentralized estimation and control peers with respect to each other. However, in the context of partially decentralized satellite formation control, the absolute positions and velocities of each spacecraft are unique, so that correlations which make estimates using only local information suboptimal only occur through common biases and process noise. Covariance and Monte Carlo analysis of a simplified system show that this lack of correlation may allow simplification of the local estimators while preserving the global optimality of the maneuvers commanded by the supervisors.
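The reliability-vs-cost trade described above can be illustrated with a deliberately simple model: independent node failures, the system functional if at least one supervisor survives, and a cost linear in the number of supervisors. The model and all numbers are assumptions for illustration, not the paper's cost model.

```python
# Toy reliability/cost trade for choosing the number of supervisors m
# among n_total formation nodes. All parameters are hypothetical.
def system_reliability(m: int, p_node: float) -> float:
    """P(at least one of m supervisors survives), failures independent."""
    return 1.0 - (1.0 - p_node) ** m

def architecture_cost(m: int, n_total: int,
                      c_supervisor: float = 3.0,
                      c_subordinate: float = 1.0) -> float:
    """Supervisor nodes assumed 3x the cost of subordinate nodes."""
    return m * c_supervisor + (n_total - m) * c_subordinate

def cheapest_architecture(n_total: int, p_node: float, target: float):
    """Smallest supervisor count meeting the reliability target, with cost."""
    for m in range(1, n_total + 1):
        if system_reliability(m, p_node) >= target:
            return m, architecture_cost(m, n_total)
    return None
```

Under these assumptions a handful of supervisors already meets a demanding reliability target at far below the cost of making every node a supervisor, which mirrors the paper's qualitative conclusion.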
Bossert, Thomas John; Mitchell, Andrew David
2011-01-01
Health sector decentralization has been widely adopted to improve delivery of health services. While many argue that institutional capacities and mechanisms of accountability required to transform decentralized decision-making into improvements in local health systems are lacking, few empirical studies exist which measure or relate together these concepts. Based on research instruments administered to a sample of 91 health sector decision-makers in 17 districts of Pakistan, this study analyzes relationships between three dimensions of decentralization: decentralized authority (referred to as "decision space"), institutional capacities, and accountability to local officials. Composite quantitative indicators of these three dimensions were constructed within four broad health functions (strategic and operational planning, budgeting, human resources management, and service organization/delivery) and on an overall/cross-function basis. Three main findings emerged. First, district-level respondents report varying degrees of each dimension despite being under a single decentralization regime and facing similar rules across provinces. Second, within dimensions of decentralization-particularly decision space and capacities-synergies exist between levels reported by respondents in one function and those reported in other functions (statistically significant coefficients of correlation ranging from ρ=0.22 to ρ=0.43). Third, synergies exist across dimensions of decentralization, particularly in terms of an overall indicator of institutional capacities (significantly correlated with both overall decision space (ρ=0.39) and accountability (ρ=0.23)). This study demonstrates that decentralization is a varied experience-with some district-level officials making greater use of decision space than others and that those who do so also tend to have more capacity to make decisions and are held more accountable to elected local officials for such choices. These findings suggest that
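The ρ values quoted above are rank correlations between composite indicators. A self-contained Spearman's ρ, using average ranks for ties, can be sketched as follows; the sample vectors are invented, not the study's data.

```python
# Spearman's rank correlation: Pearson correlation applied to (average) ranks.
def _ranks(xs):
    """1-based ranks, with tied values receiving their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```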
Digitization workflows for flat sheets and packets of plants, algae, and fungi
Nelson, Gil; Sweeney, Patrick; Wallace, Lisa E.; Rabeler, Richard K.; Allard, Dorothy; Brown, Herrick; Carter, J. Richard; Denslow, Michael W.; Ellwood, Elizabeth R.; Germain-Aubrey, Charlotte C.; Gilbert, Ed; Gillespie, Emily; Goertzen, Leslie R.; Legler, Ben; Marchant, D. Blaine; Marsico, Travis D.; Morris, Ashley B.; Murrell, Zack; Nazaire, Mare; Neefus, Chris; Oberreiter, Shanna; Paul, Deborah; Ruhfel, Brad R.; Sasek, Thomas; Shaw, Joey; Soltis, Pamela S.; Watson, Kimberly; Weeks, Andrea; Mast, Austin R.
2015-01-01
Effective workflows are essential components in the digitization of biodiversity specimen collections. To date, no comprehensive, community-vetted workflows have been published for digitizing flat sheets and packets of plants, algae, and fungi, even though latest estimates suggest that only 33% of herbarium specimens have been digitally transcribed, 54% of herbaria use a specimen database, and 24% are imaging specimens. In 2012, iDigBio, the U.S. National Science Foundation’s (NSF) coordinating center and national resource for the digitization of public, nonfederal U.S. collections, launched several working groups to address this deficiency. Here, we report the development of 14 workflow modules with 7–36 tasks each. These workflows represent the combined work of approximately 35 curators, directors, and collections managers representing more than 30 herbaria, including 15 NSF-supported plant-related Thematic Collections Networks and collaboratives. The workflows are provided for download as Portable Document Format (PDF) and Microsoft Word files. Customization of these workflows for specific institutional implementation is encouraged. PMID:26421256
Development of the workflow kine systems for support on KAIZEN.
Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro
2012-01-01
In this paper, we introduce a new workflow line system that combines location recording and image recording, enabling the acquisition of workflow information and the display of its analysis. From the results of a workflow line investigation, we considered the anticipated effects on KAIZEN and the remaining problems. Workflow line information comprises location information and action-content information. These technologies suggest viewpoints for improvement, for example the elimination of useless movement, the redesign of layout, and the review of work procedures. In a manufacturing factory, the system made clear that there was considerable movement away from the standard operation place and accumulated residence time; concretely, a more efficient layout was suggested on this basis. In a hospital, similarly, the system pointed out problems of layout and setup operations by comparison with the effective movement patterns of experts. The system can adapt to routine as well as non-routine work. Through further development of this system to fit diverse industries, more effective "visual management" (visualization of work) is expected in the future.
Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Ramachandran, R.; Lynnes, C.
2009-12-01
A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of
Reproducible Large-Scale Neuroimaging Studies with the OpenMOLE Workflow Management System.
Passerat-Palmbach, Jonathan; Reuillon, Romain; Leclaire, Mathieu; Makropoulos, Antonios; Robinson, Emma C; Parisot, Sarah; Rueckert, Daniel
2017-01-01
OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. OpenMOLE hides the complexity of designing complex experiments thanks to its DSL. Users can embed their own applications and scale their pipelines from a small prototype running on their desktop computer to a large-scale study harnessing distributed computing infrastructures, simply by changing a single line in the pipeline definition. The construction of the pipeline itself is decoupled from the execution context. The high-level DSL abstracts the underlying execution environment, contrary to classic shell-script based pipelines. These two aspects allow pipelines to be shared and studies to be replicated across different computing environments. Workflows can be run as traditional batch pipelines or coupled with OpenMOLE's advanced exploration methods in order to study the behavior of an application, or perform automatic parameter tuning. In this work, we briefly present the strong assets of OpenMOLE and detail recent improvements targeting re-executability of workflows across various Linux platforms. We have tightly coupled OpenMOLE with CARE, a standalone containerization solution that allows re-executing on a Linux host any application that has been packaged on another Linux host previously. The solution is evaluated against a Python-based pipeline involving packages such as scikit-learn as well as binary dependencies. All were packaged and re-executed successfully on various HPC environments, with identical numerical results (here prediction scores) obtained on each environment. Our results show that the pair formed by OpenMOLE and CARE is a reliable solution to generate reproducible results and re-executable pipelines. A
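The key claim above is that the pipeline definition is decoupled from the execution context, so scaling from a desktop prototype to a cluster is a one-line change. OpenMOLE's actual DSL is Scala; the sketch below only mimics the idea in Python, with invented class names:

```python
from concurrent.futures import ThreadPoolExecutor

class LocalEnvironment:
    """Runs tasks sequentially in-process (the 'desktop prototype' case)."""
    def run(self, tasks, inputs):
        results = inputs
        for task in tasks:
            results = [task(x) for x in results]
        return results

class PoolEnvironment:
    """Stand-in for a distributed backend: same interface, parallel execution."""
    def __init__(self, workers=4):
        self.workers = workers
    def run(self, tasks, inputs):
        results = inputs
        with ThreadPoolExecutor(self.workers) as pool:
            for task in tasks:
                results = list(pool.map(task, results))
        return results

# The pipeline definition never mentions where it runs...
pipeline = [lambda x: x * 2, lambda x: x + 1]

# ...so scaling up is a one-line change of environment, as in OpenMOLE's model.
env = LocalEnvironment()  # swap for PoolEnvironment(workers=16) to scale out
print(env.run(pipeline, [1, 2, 3]))  # [3, 5, 7]
```

Because both environments honour the same `run` interface, sharing a pipeline across computing environments requires no change to the pipeline itself.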
Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin
2003-09-01
Few information systems today offer a flexible means to define and manage the automated parts of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer provide a coherent architecture that can easily cope with heterogeneity and the inevitable local adaptation of applications, and that can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. The system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge of radiology operations. Our design adopts the "4+1" architectural-view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components: they can be substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system provides powerful query and statistical functions for managing resources and improving productivity. This work could open a new direction in image information management. We illustrate the design with examples taken from an implemented system.
Decision space for health workforce management in decentralized settings: a case study in Uganda.
Alonso-Garbayo, Alvaro; Raven, Joanna; Theobald, Sally; Ssengooba, Freddie; Nattimba, Milly; Martineau, Tim
2017-11-01
The aim of this paper is to improve understanding of how district health managers perceive and use their decision space for human resource management (HRM) and how this compares with the national policies and regulatory frameworks governing HRM. The study builds upon work undertaken by the PERFORM Research Consortium in Uganda using action research to strengthen human resources management in the health sector. To assess the decision space that managers have in six areas of HRM (policy, planning, remuneration and incentives, performance management, education, and information), the study compares the roles allocated by Uganda's policy and regulatory frameworks with the actual room for decision-making that district health managers perceive they have. Results show that in some areas District Health Management Team (DHMT) members make decisions beyond their conferred authority, while in others they do not use all the space allocated by policy. DHMT members operate close to the boundaries defined by public policy in planning, remuneration and incentives, policy, and performance management. However, they make decisions beyond their conferred authority in the area of information and do not use all the space allocated by policy in the area of education. DHMTs' decision-making capacity to manage their workforce is influenced by their own perceived authority and is sometimes constrained by decisions made at higher levels. We conclude that decentralization, to improve workforce performance, needs to devolve power further down from district authorities to district health managers. DHMTs need not only more power and authority to make decisions about their workforce but also more control over resources to be able to implement those decisions.
Towards seamless workflows in agile data science
NASA Astrophysics Data System (ADS)
Klump, J. F.; Robertson, J.
2017-12-01
Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and more recently cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code and allowing rapid switching between strands. In contrast, the current focus of versioning in research data management is geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data. At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the
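The requirement that every concurrent "valid version" of data and code carry its own stable identifier can be sketched with a content-derived id, analogous to a git commit hash (the function and inputs below are illustrative, not CSIRO's tooling):

```python
import hashlib
import json

def snapshot_id(data, code_version):
    """Content-derived identifier: any change to the data or the code yields
    a different id, so several concurrent versions can coexist, each with a
    stable reference to switch between -- much as git-flow branches do."""
    blob = json.dumps({"data": data, "code": code_version}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()[:12]

v1 = snapshot_id([1, 2, 3], "model.py@a1b2")
v2 = snapshot_id([1, 2, 3, 4], "model.py@a1b2")  # a second valid version
print(v1 != v2)  # True
```

Identifying snapshots by content rather than by a single "current" label is what lets parallel strands of analysis proceed without insisting on one point of truth.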
Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H; Preston, Kenzie L
2009-01-01
A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and to accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with the provision of seamless methods for exporting, mining and querying the data. We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialised applications: the Automated Contingency Management (ACM) system for the delivery of behavioural interventions, the Transactional Electronic Diary (TED) system for the management of behavioural assessments and the Protocol Workflow System (PWS) for computerised workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorised staff. ACM and the TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity, having an annual average of 18,000 patient visits and 7,300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarise participant safety data for research oversight. When developed in consultation with end users, automation in treatment research clinics can enable more efficient operations, better communication among staff and expansions in research methods.
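The PWS idea of guiding each participant's daily clinic activities can be sketched as a prerequisite-gated step list. Step names and dependencies below are invented for illustration, not the actual PWS design:

```python
class VisitWorkflow:
    """Sketch of PWS-style guidance: a clinic activity becomes available
    only once all of its prerequisite activities are complete."""
    def __init__(self, steps):
        self.steps = steps  # {step name: [prerequisite step names]}
        self.done = set()

    def available(self):
        return sorted(s for s, pre in self.steps.items()
                      if s not in self.done and all(p in self.done for p in pre))

    def complete(self, step):
        assert step in self.available(), f"{step} not yet available"
        self.done.add(step)

# Hypothetical daily visit for one participant:
visit = VisitWorkflow({
    "check_in": [],
    "urine_collection": ["check_in"],
    "diary_review": ["check_in"],
    "dosing": ["urine_collection", "diary_review"],
})
visit.complete("check_in")
print(visit.available())  # ['diary_review', 'urine_collection']
```

Gating steps this way is one means of enforcing protocol adherence (dosing cannot precede its required assessments) while still letting staff order the unlocked steps flexibly.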
Decentralization or centralization: striking a balance.
Dirschel, K M
1994-09-01
An Executive Vice President for Nursing can provide the necessary link to meet diverse clinical demands when facing centralization-decentralization decisions. Centralized communication links hospital departments, giving nurses a unified voice; decentralization acknowledges the need for diversity. The right balance between uniformity and diversity is achieved through a responsive communications network.
A framework for service enterprise workflow simulation with multi-agents cooperation
NASA Astrophysics Data System (ADS)
Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun
2013-11-01
Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for simulating service business processes using multi-agent cooperation to address these issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as a social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
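One way to picture "social rationality as a factor in scheduling activity instances" is a bidding scheme in which each agent's bid blends its own finishing time with a system-level benefit, weighted by a rationality coefficient. The scoring rule and numbers below are invented, not the paper's model:

```python
class Agent:
    """Illustrative service agent: rationality in [0, 1] weighs the system's
    benefit against the agent's own workload when bidding for an activity."""
    def __init__(self, name, speed, rationality):
        self.name, self.speed, self.rationality = name, speed, rationality
        self.queue = 0.0  # accumulated processing time already committed

    def bid(self, duration):
        own_cost = self.queue + duration / self.speed  # selfish view: finish time
        system_gain = self.speed                       # social view: fast agents volunteer
        return own_cost - self.rationality * system_gain

def schedule(activities, agents):
    """Assign each activity instance to the lowest-bidding agent."""
    plan = []
    for act, duration in activities:
        winner = min(agents, key=lambda a: a.bid(duration))
        winner.queue += duration / winner.speed
        plan.append((act, winner.name))
    return plan

agents = [Agent("A", speed=2.0, rationality=0.9),
          Agent("B", speed=1.0, rationality=0.1)]
print(schedule([("review", 4), ("approve", 2), ("notify", 2)], agents))
```

A highly rational fast agent absorbs work until its queue grows, at which point activities flow to the other agent; the result is a flexible, load-sensitive assignment rather than a fixed discrete-event queue discipline.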
Science Gateways, Scientific Workflows and Open Community Software
NASA Astrophysics Data System (ADS)
Pierce, M. E.; Marru, S.
2014-12-01
Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakeSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing
Effects of Decentralization on School Resources
ERIC Educational Resources Information Center
Ahlin, Asa; Mork, Eva
2008-01-01
Sweden has undertaken major national reforms of its school sector, which, consequently, has been classified as one of the most decentralized ones in the OECD. This paper investigates whether local tax base, grants, and preferences affected local school resources differently as decentralization took place. We find that municipal tax base affects…
DEWEY: the DICOM-enabled workflow engine system.
Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L
2014-06-01
Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.
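A workflow engine of the kind described can be pictured as a rule table mapping incoming events (such as a DICOM study arrival) to the next processing step. The event names and handlers below are invented for illustration, not DEWEY's actual interfaces:

```python
class WorkflowEngine:
    """Minimal sketch of an event-driven workflow engine: each registered
    rule maps an event type to the next step, and every dispatch is logged
    so completion can be audited (e.g. 'examinations prepared in time')."""
    def __init__(self):
        self.rules = {}  # event type -> handler producing the next step
        self.log = []

    def on(self, event_type, handler):
        self.rules[event_type] = handler

    def dispatch(self, event_type, payload):
        step = self.rules[event_type](payload)
        self.log.append((event_type, step))
        return step

engine = WorkflowEngine()
engine.on("study_received", lambda s: f"prefetch_priors({s})")
engine.on("priors_ready", lambda s: f"notify_radiologist({s})")
print(engine.dispatch("study_received", "CT-123"))  # prefetch_priors(CT-123)
```

The audit log is the piece a manual workflow lacks: it is what allows the before/after measurement of how many examinations were prepared in time.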
Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J
2012-01-01
Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable, enabling a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Software has been developed for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images have necessitated the inclusion of automated systems within the image workflow.
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Hillyer, T. N.; Wilkins, J.
2012-12-01
The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
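Since I/O is the contended resource in the cluster described above, a throttling workload manager amounts to capping how many jobs may touch the shared disk at once. A hedged sketch of that idea (the job names and cap are illustrative, not CERES's configuration):

```python
import threading

class ThrottledRunner:
    """Sketch of an I/O-throttling job manager: at most max_io jobs may hold
    the shared 'disk' simultaneously; further jobs block until a slot frees."""
    def __init__(self, max_io):
        self.io_slots = threading.Semaphore(max_io)
        self.completed = []
        self.lock = threading.Lock()

    def run_job(self, name, work):
        with self.io_slots:          # wait for an I/O slot before doing work
            result = work()
        with self.lock:              # record completion thread-safely
            self.completed.append((name, result))

    def run_all(self, jobs):
        threads = [threading.Thread(target=self.run_job, args=(n, w))
                   for n, w in jobs]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return dict(self.completed)

runner = ThrottledRunner(max_io=2)
out = runner.run_all([("granule1", lambda: 1),
                      ("granule2", lambda: 2),
                      ("granule3", lambda: 3)])
print(sorted(out))  # ['granule1', 'granule2', 'granule3']
```

Raising `max_io` until throughput stops improving is the practical way such a manager maximizes system load without letting I/O contention degrade it.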
NASA Astrophysics Data System (ADS)
Clempner, Julio B.
2017-01-01
This paper presents a novel analytical method for soundness verification of workflow nets and reset workflow nets, using the well-known Lyapunov stability results for Petri nets. We also prove that the soundness property is decidable for workflow nets and reset workflow nets. In addition, we provide evidence of several outcomes related to properties such as boundedness, liveness, reversibility and blocking using stability. Our approach is validated theoretically and by a numerical example related to traffic signal-control synchronisation.
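For intuition about what soundness of a workflow net means (not the paper's Lyapunov method), the classical definition can be checked by exhaustive search on a small bounded net: from the initial marking, every reachable marking must still be able to reach the final marking, and a token on the output place must mean the net is empty otherwise:

```python
from collections import deque

def fire(marking, pre, post):
    """Return the successor marking if the transition is enabled, else None."""
    if all(marking.get(p, 0) >= n for p, n in pre.items()):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return {p: n for p, n in m.items() if n}
    return None

def reachable(m0, transitions):
    seen, frontier, out = {frozenset(m0.items())}, deque([m0]), [m0]
    while frontier:
        m = frontier.popleft()
        for pre, post in transitions:
            m2 = fire(m, pre, post)
            if m2 is not None and frozenset(m2.items()) not in seen:
                seen.add(frozenset(m2.items()))
                frontier.append(m2)
                out.append(m2)
    return out

def is_sound(transitions, start="i", end="o"):
    """Option-to-complete and proper completion, by brute force.
    Only suitable for small bounded nets -- an illustration, not the
    analytical method of the paper above."""
    final = {end: 1}
    for m in reachable({start: 1}, transitions):
        if final not in reachable(m, transitions):
            return False              # this marking can never complete
        if m.get(end, 0) >= 1 and m != final:
            return False              # completed with tokens left behind
    return True

# A tiny sequential workflow net: i -> t1 -> p -> t2 -> o
net = [({"i": 1}, {"p": 1}), ({"p": 1}, {"o": 1})]
print(is_sound(net))  # True
```

Dropping `t2` leaves a marking that can never reach the output place, so the check reports the net unsound; the paper's contribution is deciding this without enumeration, via stability arguments.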
Intelligent Decentralized Control In Large Distributed Computer Systems
1988-04-01
decentralized. The goal is to find a way for the agents to coordinate their actions to maximize some index of system performance. (Our main...shown in Figure 4.13. The controller observes the environment through sensors, and then may issue a command (i.e., take action) to affect the...the Hypothesis Generator and the Belief Manager, and finally actions are issued by the Action Generator, the Experiment Generator, or the Reflex
Task Delegation Based Access Control Models for Workflow Systems
NASA Astrophysics Data System (ADS)
Gaaloul, Khaled; Charoy, François
Processes in e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined, strict workflow modelling towards approaches supporting flexibility at the organisational level. One specific approach is task delegation, a mechanism that supports organisational flexibility and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We reason about tasks from both organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.
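The gap the abstract identifies (RBAC alone cannot grant permissions dynamically during execution) is exactly what task-scoped delegation fills. The sketch below is loosely in the spirit of such a model; the class, rules, and names are illustrative, not the authors' formal TAC model:

```python
class TaskAccessControl:
    """RBAC plus task-scoped delegation: a user may execute a task either
    through a role permission or through a delegation recorded at runtime."""
    def __init__(self):
        self.role_perms = {}    # role -> set of task names
        self.user_roles = {}    # user -> set of roles
        self.delegations = {}   # (delegatee, task) -> delegator

    def grant(self, role, task):
        self.role_perms.setdefault(role, set()).add(task)

    def assign(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def can_execute(self, user, task):
        by_role = any(task in self.role_perms.get(r, set())
                      for r in self.user_roles.get(user, set()))
        return by_role or (user, task) in self.delegations

    def delegate(self, delegator, delegatee, task):
        # Delegation of authority: only a user who already holds the
        # permission may pass it on, and the grant covers this task only.
        assert self.can_execute(delegator, task), "cannot delegate what you lack"
        self.delegations[(delegatee, task)] = delegator

ac = TaskAccessControl()
ac.grant("clerk", "file_claim")
ac.grant("officer", "approve_claim")
ac.assign("alice", "officer")
ac.assign("bob", "clerk")
ac.delegate("alice", "bob", "approve_claim")   # runtime, task-scoped grant
print(ac.can_execute("bob", "approve_claim"))  # True
```

Recording the delegator with each grant keeps an audit trail, so authority passed on at runtime remains traceable, one of the constraints such models are meant to specify.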
Comparative Perspectives on Educational Decentralization: An Exercise in Contradiction?
ERIC Educational Resources Information Center
Weiler, Hans N.
1990-01-01
It is argued that policies decentralizing the governance of educational systems, although appealing in the abstract, tend to be fundamentally ambivalent and in conflict with powerful forces favoring centralization. Tensions surrounding the issue of decentralization are discussed, with emphasis on the relationship between decentralization and…
The snow system: A decentralized medical data processing system.
Bellika, Johan Gustav; Henriksen, Torje Starbo; Yigzaw, Kassaye Yitbarek
2015-01-01
Systems for large-scale reuse of electronic health record data are claimed to have the potential to transform the current healthcare delivery system. In principle, three alternative solutions for reuse exist: centralized, data warehouse, and decentralized solutions. This chapter focuses on the decentralized alternative. Decentralized systems may be categorized into approaches that move data to enable computation and approaches that move computation to where the data are located. We describe a system of the latter kind. Only this kind of decentralized solution has the capability to become an ideal system for reuse, as it enables computation on and reuse of electronic health record data without moving or exposing the information to outsiders. This chapter describes the Snow system, a decentralized medical data processing system, its components, and how it has been used. It also describes the requirements such systems need to support to become sustainable and successful in recruiting voluntary participation from health institutions.
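The "move the computation to the data" idea can be sketched as a federated query in which only site-level aggregates ever leave an institution. The function names, the suppression threshold, and the records below are invented for illustration, not the Snow system's actual protocol:

```python
def local_count(records, predicate):
    """Runs *inside* each institution; only the aggregate leaves the site."""
    return sum(1 for r in records if predicate(r))

def federated_count(sites, predicate, min_cell=5):
    """Each site returns a count; small cells are suppressed before
    aggregation (a common disclosure-control safeguard -- the threshold
    here is illustrative)."""
    counts = [local_count(records, predicate) for records in sites.values()]
    if any(0 < c < min_cell for c in counts):
        raise ValueError("a site-level count is too small to disclose")
    return sum(counts)

# Hypothetical per-site record stores (never pooled centrally):
sites = {
    "clinic_a": [{"dx": "J10"}, {"dx": "E11"}] * 5,  # 10 records
    "clinic_b": [{"dx": "J10"}] * 7,
}
print(federated_count(sites, lambda r: r["dx"] == "J10"))  # 12
```

Because the predicate travels to the data rather than the reverse, no patient-level record is moved or exposed to outsiders, which is the property the chapter argues makes decentralized reuse viable.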
An ontological knowledge framework for adaptive medical workflow.
Dang, Jiangbo; Hedayati, Amir; Hampel, Ken; Toklu, Candemir
2008-10-01
As emerging technologies, the semantic Web and service-oriented architecture (SOA) allow a business process management system (BPMS) to automate business processes that can be described as services, which in turn can be used to wrap existing enterprise applications. A BPMS provides tools and methodologies to compose Web services that can be executed as business processes and monitored by BPM consoles. Ontologies are formal, declarative knowledge representation models. They provide a foundation upon which machine-understandable knowledge can be obtained and, as a result, make machine intelligence possible. Healthcare systems can adopt these technologies to become ubiquitous, adaptive, and intelligent, and thereby serve patients better. This paper presents an ontological knowledge framework that covers the healthcare domains a hospital encompasses, from medical and administrative tasks to hospital assets, medical insurance, patient records, drugs, and regulations. Our ontology therefore makes our vision of personalized healthcare possible by capturing all the knowledge necessary for a complex personalized healthcare scenario involving patient care, insurance policies, drug prescriptions, and compliance. For example, the ontology enables a workflow management system to allow users, from physicians to administrative assistants, to manage and even create context-aware medical workflows and execute them on the fly.
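How an ontology can make a workflow "context-aware" is easiest to see with a toy example: a workflow guard that consults subject-predicate-object triples before letting a prescription task proceed. The drug names, relations, and rules below are invented, not drawn from the paper's ontology:

```python
# Toy ontology as a set of (subject, predicate, object) triples.
TRIPLES = {
    ("drugA", "interacts_with", "drugB"),
    ("drugB", "contraindicated_for", "renal_impairment"),
}

def query(s=None, p=None, o=None, triples=TRIPLES):
    """Pattern match over the triples; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

def may_prescribe(drug, current_meds, conditions):
    """Workflow guard: block the prescription step if the ontology records
    an interaction or a contraindication for this patient's context."""
    for med in current_meds:
        if query(drug, "interacts_with", med) or query(med, "interacts_with", drug):
            return False
    for cond in conditions:
        if query(drug, "contraindicated_for", cond):
            return False
    return True

print(may_prescribe("drugA", ["drugB"], []))              # False
print(may_prescribe("drugA", [], ["renal_impairment"]))   # True
```

The workflow engine stays generic; adding a new rule is a matter of adding triples to the knowledge base, not modifying process code, which is the adaptivity the framework is after.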
Daigger, Glen T
2009-08-01
Population growth and improving standards of living, coupled with dramatically increased urbanization, are placing increased pressures on available water resources, necessitating new approaches to urban water management. The traditional linear "take, make, waste" approach to managing water increasingly is proving to be unsustainable, as it is leading to water stress (insufficient water supplies), unsustainable resource (energy and chemicals) consumption, the dispersion of nutrients into the aquatic environment (especially phosphorus), and financially unstable utilities. Different approaches are needed to achieve economic, environmental, and social sustainability. Fortunately, a toolkit consisting of stormwater management/rainwater harvesting, water conservation, water reclamation and reuse, energy management, nutrient recovery, and source separation is available to allow more closed-loop urban water and resource management systems to be developed and implemented. Water conservation and water reclamation and reuse (multiple uses) are becoming commonplace in numerous water-short locations. Decentralization, enabled by new, high-performance treatment technologies and distributed stormwater management/rainwater harvesting, is furthering this transition. Likewise, traditional approaches to residuals management are evolving, as higher levels of energy recovery are desired, and nutrient recovery and reuse is to be enhanced. A variety of factors affect selection of the optimum approach for a particular urban area, including local hydrology, available water supplies, water demands, local energy and nutrient-management situations, existing infrastructure, and utility governance structure. A proper approach to economic analysis is critical to determine the most sustainable solutions. Stove piping (i.e., separate management of drinking, storm, and waste water) within the urban water and resource management profession must be eliminated. Adoption of these new approaches to urban
Improving data collection, documentation, and workflow in a dementia screening study.
Read, Kevin B; LaPolla, Fred Willie Zametkin; Tolea, Magdalena I; Galvin, James E; Surkis, Alisa
2017-04-01
A clinical study team performing three multicultural dementia screening studies identified the need to improve data management practices and facilitate data sharing. A collaboration was initiated with librarians as part of the National Library of Medicine (NLM) informationist supplement program. The librarians identified areas for improvement in the studies' data collection, entry, and processing workflows. The librarians' role in this project was to meet needs expressed by the study team around improving data collection and processing workflows to increase study efficiency and ensure data quality. The librarians addressed the data collection, entry, and processing weaknesses through standardizing and renaming variables, creating an electronic data capture system using REDCap, and developing well-documented, reproducible data processing workflows. NLM informationist supplements provide librarians with valuable experience in collaborating with study teams to address their data needs. For this project, the librarians gained skills in project management, REDCap, and understanding of the challenges and specifics of a clinical research study. However, the time and effort required to provide targeted and intensive support for one study team was not scalable to the library's broader user community.
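The variable-standardization step the librarians carried out can be pictured as a single authoritative mapping applied to every record, with unmapped names rejected rather than silently passed through. The variable names below are invented for illustration, not those of the actual studies:

```python
# Hypothetical mapping from ad hoc spreadsheet names to standardized names.
RENAME = {
    "PtAge": "age_years",
    "MMSE_tot": "mmse_total",
    "EduYrs": "education_years",
}

def standardize(record, mapping=RENAME):
    """Rename every variable in a record; fail loudly on unknown names so
    inconsistencies surface during processing instead of at analysis time."""
    unknown = set(record) - set(mapping)
    if unknown:
        raise KeyError(f"unmapped variables: {sorted(unknown)}")
    return {mapping[k]: v for k, v in record.items()}

row = {"PtAge": 74, "MMSE_tot": 27, "EduYrs": 16}
print(standardize(row))
```

Centralizing the mapping in one place (rather than renaming ad hoc in each script) is what makes the processing workflow reproducible and the resulting data shareable.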
Decentralized Control of Autonomous Vehicles
2003-01-01
Baras, John S.; Tan, Xiaobo; Hovareshti, Pedram
CSHCN TR 2003-8 (ISR TR 2003-14)
Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H.; Preston, Kenzie L.
2009-01-01
Issues: A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with provision of seamless methods for exporting, mining, and querying the data. Approach: We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialized applications: the Automated Contingency Management (ACM) system for delivery of behavioral interventions, the Transactional Electronic Diary (TED) system for management of behavioral assessments, and the Protocol Workflow System (PWS) for computerized workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorized staff. Key Findings: ACM and TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity having an annual average of 18,000 patient-visits and 7,300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarize participant-safety data for research oversight. Implications and Conclusion: When developed in consultation with end users, automation in treatment-research clinics can enable more efficient operations, better communication among staff, and expansions in research methods. PMID:19320669
Decentralization in Indonesia: lessons from cost recovery rate of district hospitals.
Maharani, Asri; Femina, Devi; Tampubolon, Gindo
2015-07-01
In 1991, Indonesia began a process of decentralization in the health sector which had implications for the country's public hospitals. The public hospitals were given greater authority to manage their own personnel, finance and procurement, with which they were allowed to operate commercial sections in addition to offering public services. These public services are subsidized by the government, although patients still pay a certain proportion of the fees. The main objectives of health sector decentralization are to increase the ability of public hospitals to cover their costs and to reduce government subsidies. This study investigates the consequences of decentralization on the cost recovery rate of public hospitals at the district level. We examine five service units (inpatient, outpatient, operating room, laboratory and radiology) in three public hospitals. We find that after 20 years of decentralization, district hospitals still depend on government subsidies, demonstrated by the fact that the cost recovery rate of most service units is less than one. The commercial sections fail to play their role as revenue generators as they are still subsidized by the government. We also find that the bulk of costs are made up of staff salaries and incentives in all units except radiology. As this study constitutes exploratory research, further investigation is needed to find out the reasons behind these results. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.
Educational Decentralization, Public Spending, and Social Justice in Nigeria
ERIC Educational Resources Information Center
Geo-Jaja, Macleans A.
2006-01-01
This study situates the process of educational decentralization in the narrower context of social justice. Its main object, however, is to analyze the implications of decentralization for strategies of equity and social justice in Nigeria. It starts from the premise that the early optimism that supported decentralization as an efficient and…
Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J
2012-01-01
Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow. PMID:22859881
Bledsoe, Sarah; Van Buskirk, Alex; Falconer, R James; Hollon, Andrew; Hoebing, Wendy; Jokic, Sladan
2018-02-01
This study evaluated the effectiveness of barcode-assisted medication preparation (BCMP) technology in detecting oral liquid dose preparation errors. From June 1, 2013, through May 31, 2014, a total of 178,344 oral doses were processed at Children's Mercy, a 301-bed pediatric hospital, through an automated workflow management system. Doses containing errors detected by the system's barcode scanning or classified as rejected by the pharmacist were further reviewed. Errors intercepted by the barcode-scanning system were classified as (1) expired product, (2) incorrect drug, (3) incorrect concentration, and (4) technological error. Pharmacist-rejected doses were categorized into 6 categories based on the root cause of the preparation error: (1) expired product, (2) incorrect concentration, (3) incorrect drug, (4) incorrect volume, (5) preparation error, and (6) other. Of the 178,344 doses examined, 3,812 (2.1%) errors were detected by either the barcode-assisted scanning system (1.8%, n = 3,291) or a pharmacist (0.3%, n = 521). The 3,291 errors prevented by the barcode-assisted system were classified most commonly as technological error and incorrect drug, followed by incorrect concentration and expired product. The 521 errors detected by pharmacists were most often classified as incorrect volume, preparation error, expired product, other, incorrect drug, and incorrect concentration. BCMP technology detected errors in 1.8% of pediatric oral liquid medication doses prepared in an automated workflow management system, with errors most commonly attributed to technological problems or incorrect drugs. Pharmacists rejected an additional 0.3% of studied doses. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Decentralization: Another Perspective
ERIC Educational Resources Information Center
Chapman, Robin
1973-01-01
This paper attempts to pursue the centralization-decentralization dilemma. A setting for this discussion is provided by noting some of the uses of terminology, followed by a consideration of inherent difficulties in conceptualizing. (Author)
Metaworkflows and Workflow Interoperability for Heliophysics
NASA Astrophysics Data System (ADS)
Pierantoni, Gabriele; Carley, Eoin P.
2014-06-01
Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited for workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows in three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows usually developed in TAVERNA. They
PGen: large-scale genomic variations analysis workflow and browser in SoyKB.
Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti
2016-10-06
With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version in GitHub ( https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow ) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB), ( http://soykb.org/Pegasus/index.php ). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage. 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects adding to 500+ soybean germplasm lines in total have been integrated. These SNPs are being utilized for trait improvement using genotype to phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser ( http://soykb.org/NGS_Resequence/NGS_index.php ) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. PGen workflow has been optimized for the most
NASA Astrophysics Data System (ADS)
Schäfer, Benjamin; Matthiae, Moritz; Timme, Marc; Witthaut, Dirk
2015-01-01
Stable operation of complex flow and transportation networks requires balanced supply and demand. For the operation of electric power grids—due to their increasing fraction of renewable energy sources—a pressing challenge is to fit the fluctuations in decentralized supply to the distributed and temporally varying demands. To achieve this goal, common smart grid concepts suggest to collect consumer demand data, centrally evaluate them given current supply and send price information back to customers for them to decide about usage. Besides restrictions regarding cyber security, privacy protection and large required investments, it remains unclear how such central smart grid options guarantee overall stability. Here we propose a Decentral Smart Grid Control, where the price is directly linked to the local grid frequency at each customer. The grid frequency provides all necessary information about the current power balance such that it is sufficient to match supply and demand without the need for a centralized IT infrastructure. We analyze the performance and the dynamical stability of the power grid with such a control system. Our results suggest that the proposed Decentral Smart Grid Control is feasible independent of effective measurement delays, if frequencies are averaged over sufficiently large time intervals.
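The frequency-as-price feedback described above can be sketched as added damping in a one-node swing model. This is a deliberately minimal sketch with hypothetical parameter values, not the paper's network model:

```python
# Single aggregated consumer whose demand responds linearly to the local grid
# frequency deviation, as in Decentral Smart Grid Control where the price is
# tied to frequency. All parameter values are illustrative.
def simulate(p_imbalance=1.0, inertia=10.0, damping=0.5, gain=2.0,
             dt=0.01, steps=5000):
    """Euler-integrate M * dw/dt = P - (D + gain) * w.
    'gain' is the extra damping contributed by price-responsive demand."""
    w = 0.0  # frequency deviation from nominal (e.g., from 50 Hz)
    for _ in range(steps):
        w += dt * (p_imbalance - (damping + gain) * w) / inertia
    return w

without_control = simulate(gain=0.0)  # deviation heads toward P/D = 2.0
with_control = simulate(gain=2.0)     # settles near P/(D + gain) = 0.4
```

With the frequency-linked demand response active, the steady-state deviation for the same power imbalance shrinks by the ratio D/(D + gain), with no central coordination required.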
A virtual data language and system for scientific workflow management in data grid environments
NASA Astrophysics Data System (ADS)
Zhao, Yong
With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data- intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.
Fisher, Arielle M; Herbert, Mary I; Douglas, Gerald P
2016-02-19
The Birmingham Free Clinic (BFC) in Pittsburgh, Pennsylvania, USA is a free, walk-in clinic that serves medically uninsured populations through the use of volunteer health care providers and an on-site medication dispensary. The introduction of an electronic medical record (EMR) has improved several aspects of clinic workflow. However, pharmacists' tasks involving medication management and dispensing have become more challenging since EMR implementation due to its inability to support workflows between the medical and pharmaceutical services. To inform the design of a systematic intervention, we conducted a needs assessment study to identify workflow challenges and process inefficiencies in the dispensary. We used contextual inquiry to document the dispensary workflow and facilitate identification of critical aspects of intervention design specific to the user. Pharmacists were observed according to contextual inquiry guidelines. Graphical models were produced to aid data and process visualization. We created a list of themes describing workflow challenges and asked the pharmacists to rank them in order of significance to narrow the scope of intervention design. Three pharmacists were observed at the BFC. Observer notes were documented and analyzed to produce 13 themes outlining the primary challenges pharmacists encounter during dispensation at the BFC. The dispensary workflow is labor intensive, redundant, and inefficient when integrated with the clinical service. Observations identified inefficiencies that may benefit from the introduction of informatics interventions including: medication labeling, insufficient process notification, triple documentation, and inventory control. We propose a system for Prescription Management and General Inventory Control (RxMAGIC). RxMAGIC is a framework designed to mitigate workflow challenges and improve the processes of medication management and inventory control. While RxMAGIC is described in the context of the BFC
Decentralized Planning for Autonomous Agents Cooperating in Complex Missions
2010-09-01
Workflows for Full Waveform Inversions
NASA Astrophysics Data System (ADS)
Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas
2017-04-01
Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.
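The graph-based orchestration idea can be illustrated in a few lines of Python. The task names and the graphlib-based runner here are our illustration, not the actual Salvus workflow API:

```python
# Miniature graph-based workflow runner: tasks declare their dependencies and
# are executed in topological order, each receiving its predecessors' results.
from graphlib import TopologicalSorter

results = {}

def run(graph, actions):
    for name in TopologicalSorter(graph).static_order():
        deps = {d: results[d] for d in graph.get(name, ())}
        results[name] = actions[name](deps)

# Hypothetical stages of one inversion iteration.
graph = {"forward": {"mesh"}, "adjoint": {"forward"}, "update": {"adjoint"}}
actions = {
    "mesh":    lambda deps: "mesh-ok",
    "forward": lambda deps: f"synthetics({deps['mesh']})",
    "adjoint": lambda deps: "gradients",
    "update":  lambda deps: "new-model",
}
run(graph, actions)
```

A real orchestration framework adds scheduling, retries, and provenance on top, but the dependency-graph core is the same.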
Decentralized Patrolling Under Constraints in Dynamic Environments.
Shaofei Chen; Feng Wu; Lincheng Shen; Jing Chen; Ramchurn, Sarvapali D
2016-12-01
We investigate a decentralized patrolling problem for dynamic environments where information is distributed alongside threats. In this problem, agents obtain information at a location but may suffer attacks from the threat at that location. In a decentralized fashion, each agent patrols in a designated area of the environment and interacts with a limited number of agents. The goal of these agents is therefore to coordinate to gather as much information as possible while limiting the damage incurred. Hence, we model this class of problems as a transition-decoupled partially observable Markov decision process with health constraints. Furthermore, we propose scalable decentralized online algorithms based on Monte Carlo tree search and a factored belief vector. We empirically evaluate our algorithms on decentralized patrolling problems and benchmark them against the state-of-the-art online planning solver. The results show that our approach outperforms the state of the art by more than 56% on six-agent patrolling problems and can scale up to 24 agents in reasonable time.
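A drastically simplified, single-agent stand-in for this approach: one patroller on a ring of locations evaluates each move with flat Monte Carlo rollouts under a hard expected-damage budget. All numbers are hypothetical, and this is plain rollout evaluation with a crude damage cap, not the paper's full tree search or health-constraint formulation:

```python
import random

N = 5
INFO_RATE = [0.2, 1.0, 0.2, 0.2, 0.2]  # information accrual per location per step
THREAT = [0.0, 0.3, 0.0, 0.0, 0.0]     # expected damage per visit

def rollout(pos, info, horizon, rng):
    """Simulate a random future patrol; return (information gathered, damage)."""
    info = list(info)
    total, damage = 0.0, 0.0
    for _ in range(horizon):
        pos = (pos + rng.choice([-1, 1])) % N
        total += info[pos]
        damage += THREAT[pos]
        info[pos] = 0.0                  # information at a visited cell is collected
        for i in range(N):
            info[i] += INFO_RATE[i]      # and keeps accruing everywhere
    return total, damage

def best_move(pos, info, horizon=8, samples=200, damage_budget=1.5, seed=0):
    rng = random.Random(seed)
    scores = {}
    for step in (-1, 1):
        nxt = (pos + step) % N
        info2 = list(info)
        immediate, info2[nxt] = info2[nxt], 0.0
        outcomes = [rollout(nxt, info2, horizon, rng) for _ in range(samples)]
        gain = immediate + sum(t for t, _ in outcomes) / samples
        risk = THREAT[nxt] + sum(d for _, d in outcomes) / samples
        # Prune moves whose expected damage exceeds the budget.
        scores[step] = gain if risk <= damage_budget else float("-inf")
    return max(scores, key=scores.get)
```

From location 0, the agent heads toward the information-rich but threatened location 1 as long as the expected damage stays within budget; tightening `damage_budget` flips that choice, which is the trade-off the constrained formulation captures.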
Decentralized indirect methods for learning automata games.
Tilak, Omkar; Martin, Ryan; Mukhopadhyay, Snehasis
2011-10-01
We discuss the application of indirect learning methods in zero-sum and identical payoff learning automata games. We propose a novel decentralized version of the well-known pursuit learning algorithm. Such a decentralized algorithm has significant computational advantages over its centralized counterpart. The theoretical study of such a decentralized algorithm requires the analysis to be carried out in a nonstationary environment. We use a novel bootstrapping argument to prove the convergence of the algorithm. To our knowledge, this is the first time that such analysis has been carried out for zero-sum and identical payoff games. Extensive simulation studies are reported, which demonstrate the proposed algorithm's fast and accurate convergence in a variety of game scenarios. We also introduce the framework of partial communication in the context of identical payoff games of learning automata. In such games, the automata may not communicate with each other or may communicate selectively. This comprehensive framework has the capability to model both centralized and decentralized games discussed in this paper.
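A minimal decentralized pursuit learner for a 2x2 identical-payoff game. The payoff matrix and learning rate are hypothetical, and this follows the standard pursuit scheme, not necessarily the exact decentralized variant analyzed in the paper; each automaton sees only its own action and the common binary reward, with no communication:

```python
import random

PAYOFF = [[0.9, 0.2],   # probability of reward 1 for joint action (i, j)
          [0.3, 0.6]]

class PursuitAutomaton:
    def __init__(self, n_actions, rate=0.01):
        self.p = [1.0 / n_actions] * n_actions   # action probabilities
        self.est = [0.0] * n_actions             # running reward estimates
        self.counts = [0] * n_actions
        self.rate = rate

    def act(self, rng):
        return rng.choices(range(len(self.p)), weights=self.p)[0]

    def update(self, action, reward):
        self.counts[action] += 1
        self.est[action] += (reward - self.est[action]) / self.counts[action]
        best = max(range(len(self.p)), key=lambda i: self.est[i])
        for i in range(len(self.p)):           # pursue the estimated-best action
            target = 1.0 if i == best else 0.0
            self.p[i] += self.rate * (target - self.p[i])

rng = random.Random(1)
a, b = PursuitAutomaton(2), PursuitAutomaton(2)
for _ in range(20000):
    i, j = a.act(rng), b.act(rng)
    r = 1.0 if rng.random() < PAYOFF[i][j] else 0.0
    a.update(i, r)
    b.update(j, r)
```

Each learner's environment is nonstationary because the other learner is adapting at the same time, which is exactly why the convergence analysis in the paper requires the bootstrapping argument; in this toy run both automata concentrate on the jointly optimal action pair (0, 0).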
Extended Decentralized Linear-Quadratic-Gaussian Control
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
2000-01-01
A straightforward extension of a solution to the decentralized linear-quadratic-Gaussian problem is proposed that allows its use for commonly encountered classes of problems that are currently solved with the extended Kalman filter. This extension allows the system to be partitioned in such a way as to exclude the nonlinearities from the essential algebraic relationships that allow the estimation and control to be optimally decentralized.
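Decentralized LQG control rests on decentralized estimation. The following scalar information-filter fusion step uses standard textbook formulas, not the paper's extended algorithm, to show how two nodes' independent estimates of the same state combine:

```python
def fuse(x1, var1, x2, var2):
    """Information-weighted fusion of two independent scalar estimates."""
    info = 1.0 / var1 + 1.0 / var2       # information (inverse variance) adds
    x = (x1 / var1 + x2 / var2) / info   # precision-weighted mean
    return x, 1.0 / info

# Hypothetical local estimates: node 1 is noisy, node 2 is accurate.
x, var = fuse(10.0, 4.0, 12.0, 1.0)
```

The fused estimate (11.6 with variance 0.8) lands closer to the more confident node and is tighter than either input, which is the algebraic relationship a decentralized estimator must preserve across partitions.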
Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng
2015-05-08
Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post-treatment DIPS data were analyzed to determine if an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state-of-the-art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secured user access and roles management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies, such as jQuery, jQuery plug-ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logics, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
Decentralization and primary health care: some negative implications in developing countries.
Collins, C; Green, A
1994-01-01
Decentralization is a highly popular concept, being a key element of Primary Health Care policies. There are, however, certain negative implications of decentralization that must be taken into account. These are analyzed in this article with particular reference to developing countries. The authors criticize the tendency for decentralization to be associated with state limitations, and discuss the dilemma of relating decentralization, which is the enhancement of the different, to equity, which is the promotion of equivalence. Those situations in which decentralization can strengthen political domination are described. The authors conclude by setting out a checklist of warning questions and issues to be taken into account to ensure that decentralization genuinely facilitates the Primary Health Care orientation of health policy.
Dynamic reusable workflows for ocean science
Signell, Richard; Fernandez, Filipe; Wilcox, Kyle
2016-01-01
Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic
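The skill-assessment step at the heart of this workflow reduces to comparing model and observed time series against a reference prediction. A toy version follows; the Murphy-style skill definition and all numbers are our illustrative choices, not necessarily the exact metric used in the IOOS notebooks:

```python
def skill_score(obs, model, reference):
    """Murphy skill: 1 - MSE(model)/MSE(reference). A score of 1 is perfect,
    0 matches the reference (e.g., climatology), and negative is worse."""
    mse = lambda pred: sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)
    return 1.0 - mse(model) / mse(reference)

obs         = [14.0, 15.0, 16.0, 15.5]  # observed water temperature, deg C
forecast    = [14.2, 15.1, 15.8, 15.6]  # one model's forecast at the same times
climatology = [15.0, 15.0, 15.0, 15.0]  # reference prediction
score = skill_score(obs, forecast, climatology)
```

Computing this score for each candidate forecast model against each sensor, after the catalog and data-access steps have assembled matched time series, is what lets the workflow rank models and flag the faulty forecast products mentioned above.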
Transition-Independent Decentralized Markov Decision Processes
NASA Technical Reports Server (NTRS)
Becker, Raphen; Silberstein, Shlomo; Lesser, Victor; Goldman, Claudia V.; Morris, Robert (Technical Monitor)
2003-01-01
There has been substantial progress with formal models for sequential decision making by individual agents using the Markov decision process (MDP). However, similar treatment of multi-agent systems is lacking. A recent complexity result, showing that solving decentralized MDPs is NEXP-hard, provides a partial explanation. To overcome this complexity barrier, we identify a general class of transition-independent decentralized MDPs that is widely applicable. The class consists of independent collaborating agents that are tied together by a global reward function that depends on both of their histories. We present a novel algorithm for solving this class of problems and examine its properties. The result is the first effective technique to optimally solve a class of decentralized MDPs. This lays the foundation for further work in this area on both exact and approximate solutions.
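The coupling described here can be seen in a deliberately tiny, horizon-one example with hypothetical numbers: transitions are independent, so each agent's goal probability is computed separately, and the agents interact only through a joint-reward bonus.

```python
from itertools import product

# Two agents with independent 2-state chains; the global reward pays a bonus
# only when BOTH reach their goal, coupling otherwise-independent problems.
P = {0: [0.9, 0.9], 1: [0.2, 0.2]}  # P[action][state] = prob. of reaching goal
COST = {0: 0.3, 1: 0.0}             # action 0 'work' costs effort, 1 'idle' is free
BONUS = 1.0                         # paid iff both agents are at the goal

def value(action_a, action_b):
    # Transition independence: evaluate each agent's goal probability alone,
    # then combine the two only inside the joint reward term.
    pa, pb = P[action_a][0], P[action_b][0]  # both start in state 0
    return -COST[action_a] - COST[action_b] + BONUS * pa * pb

best = max(product((0, 1), repeat=2), key=lambda ab: value(*ab))
```

Here value(0, 0) = 0.21 is optimal, yet if one agent idles the other's best response is also to idle (0.04 versus -0.12): independent greedy optimization misses the optimum, and resolving exactly this coupling is what the Dec-MDP algorithm must do.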
A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.
Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary
2017-12-01
Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that improves programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching, the HTGS implementation achieves performance similar to that of the manually coded hybrid workflow. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k matrices, respectively.
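The core task-graph idea, stages connected by queues so that they execute concurrently and I/O overlaps with computation, can be sketched in a few lines. This is a generic producer/consumer sketch in Python; the names are ours, not the HTGS C++ API:

```python
import queue
import threading

def task(fn, inbox, outbox):
    """One task-graph node: consume from inbox, apply fn, emit to outbox."""
    def run():
        while True:
            item = inbox.get()
            if item is None:       # poison pill shuts the stage down
                outbox.put(None)
                break
            outbox.put(fn(item))
    t = threading.Thread(target=run)
    t.start()
    return t

q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
stages = [task(lambda x: x * 2, q_in, q_mid),   # stand-in for a 'load' stage
          task(lambda x: x + 1, q_mid, q_out)]  # stand-in for a 'compute' stage

for item in [1, 2, 3]:
    q_in.put(item)
q_in.put(None)

results = []
while (item := q_out.get()) is not None:
    results.append(item)
for t in stages:
    t.join()
```

Because each stage runs in its own thread, the first stage can work on the next item while the second processes the previous one; HTGS generalizes this pattern with explicit memory edges, GPU-aware tasks, and scheduling.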
From the desktop to the grid: scalable bioinformatics via workflow conversion.
de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver
2016-03-12
Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. Several engines give users the ability to design and execute workflows. Each engine was created to address the problems of a specific community, so each has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with substantial user communities: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users.
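The core idea of a Common Tool Descriptor is an engine-neutral, structured description of a command-line tool that any workflow engine can render into a concrete invocation. Real CTDs are XML documents; the dict-based descriptor below, the tool name, and its flags are all hypothetical, used only to sketch the render step.

```python
# Hypothetical, minimal tool descriptor (real CTDs are richer XML).
descriptor = {
    "name": "aligner",            # hypothetical tool
    "executable": "aligner",
    "parameters": [
        {"name": "threads", "flag": "-t", "type": int, "default": 4},
        {"name": "input",   "flag": "-i", "type": str, "required": True},
        {"name": "output",  "flag": "-o", "type": str, "required": True},
    ],
}

def render_command(desc, values):
    """Validate values against the descriptor and build an argv list."""
    argv = [desc["executable"]]
    for p in desc["parameters"]:
        if p["name"] in values:
            value = p["type"](values[p["name"]])
        elif "default" in p:
            value = p["default"]
        elif p.get("required"):
            raise ValueError("missing required parameter: " + p["name"])
        else:
            continue
        argv += [p["flag"], str(value)]
    return argv

cmd = render_command(descriptor, {"input": "reads.fq", "output": "hits.sam"})
print(cmd)  # ['aligner', '-t', '4', '-i', 'reads.fq', '-o', 'hits.sam']
```

Because the descriptor, not the engine, owns the tool's interface, the same node can be generated for KNIME, gUSE, or any other engine, which is the interoperability the paper builds on.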
Visualizing the Collective Learner through Decentralized Networks
ERIC Educational Resources Information Center
Castro, Juan Carlos
2015-01-01
Understandings of decentralized networks are increasingly used to describe ways to structure curriculum and pedagogy. The decentralized network is often understood as a structural model for organizing pedagogical and curricular relationships in which there is no center. While this is important, it also bears introducing into the discourse that decentralized networks are…
Taking stock of decentralized disaster risk reduction in Indonesia
NASA Astrophysics Data System (ADS)
Grady, Anthony; Gersonius, Berry; Makarigakis, Alexandros
2016-09-01
The Sendai Framework, which outlines the global course on disaster risk reduction until 2030, places strong importance on the role of local government in disaster risk reduction. An aim of decentralization is to increase the influence and authority of local government in decision making. Yet, there is limited empirical evidence of the extent, character and effects of decentralization in current disaster risk reduction implementation, and of the barriers that are most critical to this. This paper evaluates decentralization in relation to disaster risk reduction in Indonesia, chosen for its recent actions to decentralize governance of DRR coupled with a high level of disaster risk. An analytical framework was developed to evaluate the various dimensions of decentralized disaster risk reduction, drawing on a desk study, semi-structured interviews and a gap analysis. Key barriers to implementation in Indonesia included capacity gaps at lower institutional levels, low compliance with legislation, disconnected policies, issues in communication and coordination, and inadequate resourcing. However, none of these barriers is unique to disaster risk reduction, and similar barriers have been observed for decentralization in other public sectors in other developing countries.
[Decentralization of psychiatric health service].
Dabrowski, S
1996-01-01
The article discusses two stages of de-centralization of psychiatric hospitals: the first consists of further division into sub-districts; the second involves the successive establishment of psychiatric wards in general hospitals. As their number grows, these wards are to take over more and more general psychiatric tasks from the specialized psychiatric hospitals. These wards will not substitute for psychiatric hospitals completely. The hospitals, though decreasing in size and number, will remain a necessary element of de-centralized and versatile psychiatric care for a long time to come.
Balancing Officer Community Manpower through Decentralization: Granular Programming Revisited (1REV)
2017-08-01
supply-demand imbalances. Economic theory identifies costs and benefits associated with decentralization. On the benefits side, decentralized decision... patterns rather than costs. Granular programming as a decentralized, market-based initiative: the costs and benefits of decentralized (instead of... paygrade-specific rates were based on average MPN costs by paygrade. The benefits of this approach to granular programming are that it is conceptually...
Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses
Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T
2014-01-01
With the upcoming deluge of genome data, the need to store and process large-scale genome data, provide easy access to biomedical analysis tools, and share and retrieve data efficiently presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600
Peeling the Onion: Why Centralized Control / Decentralized Execution Works
2014-04-01
March–April 2014, Air & Space Power Journal, Feature: Peeling the Onion: Why Centralized Control / Decentralized Execution Works, Lt Col Alan Docauer. What Is Centralized Control / Decentralized Execution? Emerging in the aftermath of...
Fuzzy Adaptive Decentralized Optimal Control for Strict Feedback Nonlinear Large-Scale Systems.
Sun, Kangkang; Sui, Shuai; Tong, Shaocheng
2018-04-01
This paper considers the optimal decentralized fuzzy adaptive control design problem for a class of interconnected large-scale nonlinear systems in strict feedback form with unknown nonlinear functions. Fuzzy logic systems are introduced to learn the unknown dynamics and cost functions, and a state estimator is developed. By applying the state estimator and the backstepping recursive design algorithm, a decentralized feedforward controller is established. The backstepping decentralized feedforward control scheme transforms the considered interconnected large-scale nonlinear system in strict feedback form into an equivalent affine large-scale nonlinear system. Subsequently, an optimal decentralized fuzzy adaptive control scheme is constructed. The complete optimal decentralized fuzzy adaptive controller is composed of a decentralized feedforward control and an optimal decentralized control. It is proved that the developed optimal decentralized controller ensures that all the variables of the control system are uniformly ultimately bounded and that the cost functions are minimized. Two simulation examples are provided to illustrate the validity of the developed optimal decentralized fuzzy adaptive control scheme.
Improving data collection, documentation, and workflow in a dementia screening study
Read, Kevin B.; LaPolla, Fred Willie Zametkin; Tolea, Magdalena I.; Galvin, James E.; Surkis, Alisa
2017-01-01
Background A clinical study team performing three multicultural dementia screening studies identified the need to improve data management practices and facilitate data sharing. A collaboration was initiated with librarians as part of the National Library of Medicine (NLM) informationist supplement program. The librarians identified areas for improvement in the studies’ data collection, entry, and processing workflows. Case Presentation The librarians’ role in this project was to meet needs expressed by the study team around improving data collection and processing workflows to increase study efficiency and ensure data quality. The librarians addressed the data collection, entry, and processing weaknesses through standardizing and renaming variables, creating an electronic data capture system using REDCap, and developing well-documented, reproducible data processing workflows. Conclusions NLM informationist supplements provide librarians with valuable experience in collaborating with study teams to address their data needs. For this project, the librarians gained skills in project management, REDCap, and understanding of the challenges and specifics of a clinical research study. However, the time and effort required to provide targeted and intensive support for one study team was not scalable to the library’s broader user community. PMID:28377680
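A minimal sketch of the variable-standardization step the librarians describe: one documented rename map applied to every incoming file, so that inconsistently named columns from the three studies land on the same standardized variables. The column names and the mapping below are hypothetical, not the study's actual data dictionary.

```python
import csv
import io

# Hypothetical rename map: raw headers from different study files on the
# left, the single standardized variable name on the right.
RENAME_MAP = {
    "pt_age":   "age_years",
    "AgeYrs":   "age_years",
    "moca_tot": "moca_total",
    "MOCA":     "moca_total",
}

def standardize_header(raw_csv):
    """Parse a CSV string and rewrite its header row via the rename map."""
    rows = list(csv.reader(io.StringIO(raw_csv)))
    rows[0] = [RENAME_MAP.get(col, col) for col in rows[0]]
    return rows

rows = standardize_header("pt_age,MOCA\n71,26\n")
print(rows[0])  # ['age_years', 'moca_total']
```

Keeping the map in code (rather than renaming by hand) is what makes the processing workflow reproducible and auditable.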
Zhou, Peng; Zuo, Decheng; Hou, Kun-Mean; Zhang, Zhan
2017-01-01
Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare the reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against the complexity of decision and random failure. This paper also shows a way to simplify the evaluation for dynamic system by improving the composability and compositionality of the subsystem. PMID:29120357
Structuring Clinical Workflows for Diabetes Care
Lasierra, N.; Oberbichler, S.; Toma, I.; Fensel, A.; Hoerbst, A.
2014-01-01
Background Electronic health records (EHRs) play an important role in the treatment of chronic diseases such as diabetes mellitus. Although the interoperability and selected functionality of EHRs are already addressed by a number of standards and best practices, such as IHE or HL7, the majority of these systems are still monolithic from a user-functionality perspective. The purpose of the OntoHealth project is to foster a functionally flexible, standards-based use of EHRs to support clinical routine task execution by means of workflow patterns and to shift present EHR usage toward a more comprehensive integration of complete clinical workflows. Objectives The goal of this paper is, first, to introduce the basic architecture of the proposed OntoHealth project and, second, to present selected functional needs and a functional categorization regarding workflow-based interactions with EHRs in the domain of diabetes. Methods A systematic literature review regarding attributes of workflows in the domain of diabetes was conducted. Eligible references were gathered and analyzed using a qualitative content analysis. Subsequently, a functional workflow categorization was derived from diabetes-specific raw data together with existing general workflow patterns. Results This paper presents the design of the architecture as well as a categorization model which makes it possible to describe the components or building blocks within clinical workflows. The results of our study led us to identify basic building blocks, named actions, decisions, and data elements, which allow the composition of clinical workflows within five identified contexts. Conclusions The categorization model allows for a description of the components or building blocks of clinical workflows from a functional view. PMID:25024765
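The three building blocks named in the abstract (actions, decisions, and data elements) can be sketched as a tiny interpretable workflow. The step names and the HbA1c threshold below are hypothetical illustrations, not clinical guidance or the paper's actual categorization.

```python
# Three building blocks: actions mutate shared data elements, decisions
# branch on them. (All names and thresholds are hypothetical.)

def action(name, fn):
    return {"kind": "action", "name": name, "run": fn}

def decision(name, predicate, if_true, if_false):
    return {"kind": "decision", "name": name, "predicate": predicate,
            "if_true": if_true, "if_false": if_false}

def run(step, data):
    """Interpret one workflow step against the shared data elements."""
    if step["kind"] == "action":
        step["run"](data)
        return
    branch = step["if_true"] if step["predicate"](data) else step["if_false"]
    for sub in branch:
        run(sub, data)

workflow = [
    action("fetch_hba1c", lambda d: d.setdefault("hba1c", 8.2)),
    decision(
        "check_control",
        lambda d: d["hba1c"] > 7.0,   # hypothetical threshold
        if_true=[action("flag_review", lambda d: d.update(review=True))],
        if_false=[action("routine", lambda d: d.update(review=False))],
    ),
]

data = {}  # the workflow's data elements
for step in workflow:
    run(step, data)
print(data["review"])  # True for the sample value 8.2
```

The separation matters: the same generic interpreter can execute any workflow assembled from the blocks, which is the functional-categorization point of the paper.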
[Mechanisms for allocating financial resources after decentralization in the state of Jalisco].
Pérez-Núñez, Ricardo; Arredondo-López, Armando; Pelcastre, Blanca
2006-01-01
To analyze, from the decision maker's perspective, the financial resource allocation process of the health services of the state of Jalisco (SSJ, per its abbreviation in Spanish), within the context of decentralization. Through a qualitative approach using semi-structured individual interviews with key personnel in managerial positions as the method for compiling information, the experience of the SSJ in financial resource allocation was documented. From September to November 2003, the perception of managers and administrators regarding their level of autonomy in decision-making was explored, as well as the process they follow for the allocation of financial resources, in order to identify the criteria they use and their justifications. From the point of view of decision-makers, the autonomy of the SSJ has increased considerably since decentralization was implemented, although the degree of decision-making freedom remains limited, due mainly to high administrative costs associated with salaries. In this sense, the implications of labor arrangements that remain centralized are evident. Some innovative systems for financial resource allocation have been established in the SSJ for the sanitary regions and hospitals, based upon administrative-managerial and productivity incentives. Adjustments were also made for degree of marginalization and population lag, under the equity criterion. General work conditions and the decision-making autonomy of the sanitary regions are outstanding aspects still pending decentralization. Although decentralization has granted more autonomy to the SSJ, the freedom to allocate financial resources remains held within the highest hierarchical levels.
Impacts of a Large Decentralized Telepathology Network in Canada.
Pare, Guy; Meyer, Julien; Trudel, Marie-Claude; Tetu, Bernard
2016-03-01
Telepathology is a fast-growing segment of the telemedicine field. To date, no prior research has investigated the impacts of large decentralized telepathology projects on patients, clinicians, and healthcare systems. This study aims to fill this gap. We report a benefits evaluation study of a large decentralized telepathology project deployed in Eastern Quebec, Canada, whose main objective is to provide continuous coverage of intraoperative consultations in remote hospitals without pathologists on-site. The project involves 18 hospitals, making it one of the largest telepathology networks in the world. We conducted 43 semi-structured interviews with telepathology users and hospital managers. Archival data on the impacts of the telepathology project (e.g., number of service disruptions, average time between initial diagnosis and surgery) were also extracted and analyzed. Our findings show that no service disruptions were recorded in hospitals without pathologists following the deployment of telepathology. Surgeons noted that the use of intraoperative consultations enabled by telepathology helped avoid second surgeries and improved the accessibility of care services. Telepathology was also perceived by our respondents as having positive impacts on the remote hospitals' ability to retain and recruit surgeons. The observed benefits should not leave the impression that implementing telepathology is a trivial matter; indeed, many technical, human, and organizational challenges may be encountered. Telepathology can be highly useful in regional hospitals that do not have a pathologist on-site. More research is needed to investigate the challenges and benefits associated with large decentralized telepathology networks.
Disturbance decoupling, decentralized control and the Riccati equation
NASA Technical Reports Server (NTRS)
Garzia, M. R.; Loparo, K. A.; Martin, C. F.
1981-01-01
The disturbance decoupling and optimal decentralized control problems are examined using identical mathematical techniques. A statement of the problems and the development of their solution approach are presented. Preliminary results are given for the optimal decentralized control problem.
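The abstract ties optimal control to the Riccati equation. As a minimal illustration under stated assumptions (a scalar, discrete-time LQ problem, which is not the paper's continuous multivariable setting), the Riccati recursion can be iterated to its steady state and the optimal feedback gain read off:

```python
# Scalar discrete-time LQ problem: x' = A*x + B*u, cost sum of Q*x^2 + R*u^2.
# The Riccati recursion P <- Q + A*P*A - (A*P*B)^2 / (R + B*P*B) is iterated
# to its fixed point; with A = B = Q = R = 1 that fixed point solves
# P^2 - P - 1 = 0, i.e. the golden ratio.
A, B, Q, R = 1.0, 1.0, 1.0, 1.0

P = Q
for _ in range(100):  # iterate until the steady-state solution is reached
    P = Q + A * P * A - (A * P * B) ** 2 / (R + B * P * B)

K = B * P * A / (R + B * P * B)  # optimal state feedback u = -K * x
print(round(P, 6))  # 1.618034
```

The multivariable version replaces the scalars by matrices and the division by a matrix inverse; the paper's contribution is relating this machinery to disturbance decoupling in the decentralized setting.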
Clinic Workflow Simulations using Secondary EHR Data
Hribar, Michelle R.; Biermann, David; Read-Brown, Sarah; Reznick, Leah; Lombardi, Lorinna; Parikh, Mansi; Chamberlain, Winston; Yackel, Thomas R.; Chiang, Michael F.
2016-01-01
Clinicians today face increased patient loads, decreased reimbursements, and potential negative productivity impacts of using electronic health records (EHR), but have little guidance on how to improve clinic efficiency. Discrete event simulation models are powerful tools for evaluating clinical workflow and improving efficiency, particularly when they are built from secondary EHR timing data. The purpose of this study is to demonstrate that these simulation models can be used for resource allocation decision making as well as for evaluating novel scheduling strategies in outpatient ophthalmology clinics. Key findings from this study are that (1) secondary use of EHR timestamp data in simulation models represents clinic workflow, (2) simulations provide insight into the best allocation of resources in a clinic, (3) simulations provide critical information for schedule creation and decision making by clinic managers, and (4) simulation models built from EHR data are potentially generalizable. PMID:28269861
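A sketch of the discrete-event idea with made-up numbers: patients arrive over time, compete for a fixed number of exam rooms, and the simulation reports waits under a given resource allocation. In the study, the arrival and service times would come from secondary EHR timestamp data rather than the constants used here.

```python
import heapq

# Toy clinic discrete-event simulation (all timings are illustrative).
arrivals = [0, 5, 10, 15, 20, 25]  # patient arrival times (minutes)
service_time = 12                  # exam duration per patient
rooms = 2                          # the resource being allocated

free_at = [0.0] * rooms            # when each exam room next becomes free
heapq.heapify(free_at)
waits = []
for t in arrivals:
    room_free = heapq.heappop(free_at)
    start = max(t, room_free)      # patient waits if no room is open yet
    waits.append(start - t)
    heapq.heappush(free_at, start + service_time)

print(sum(waits) / len(waits))     # average wait under this allocation: 2.0
```

Re-running the loop with `rooms = 3` or a different schedule is exactly the kind of what-if comparison the paper uses for resource allocation and scheduling decisions.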
Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin; ...
2016-10-06
The increasing volume of scientific data and the limited scalability and performance of storage systems currently present a significant limitation for the productivity of scientific workflows running on both high-performance computing (HPC) and cloud platforms. Better integration of storage systems and workflow engines is clearly needed to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules—an in-memory data store—with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. The experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.
Sotiropoulos, A; Vourka, I; Erotokritou, A; Novakovic, J; Panaretou, V; Vakalis, S; Thanos, T; Moustakas, K; Malamis, D
2016-06-01
This research presents the results of demonstrating an innovative household biowaste management and treatment scheme, established in two Greek municipalities, for the production of lignocellulosic ethanol using dehydrated household biowaste as a substrate. This is the first time that biowaste drying has been tested at a decentralized level for the production of ethanol using the Simultaneous Saccharification and Fermentation (SSF) process at pilot scale in Greece. The decentralized biowaste drying method proved that household biowaste mass and volume reduction may reach 80% through the dehydration process used. The chemical characteristics related to lignocellulosic ethanol production proved to differ substantially between seasons; thus, special attention should be given to the process applied for ethanol production, mainly regarding the enzyme quality and quantity used during the pretreatment stage. The maximum ethanol production achieved was 29.12 g/L, approximately 60% of the maximum theoretical yield based on the substrate's sugar content. The use of decentralized waste drying as an alternative approach to household biowaste minimization and the production of second-generation ethanol is considered a promising approach for efficient biowaste management and treatment in the future.
Linear decentralized learning control
NASA Technical Reports Server (NTRS)
Lee, Soo C.; Longman, Richard W.; Phan, Minh
1992-01-01
The new field of learning control develops controllers that learn to improve their performance at executing a given task, based on experience performing this task. The simplest forms of learning control are based on the same concept as integral control, but operating in the domain of the repetitions of the task. This paper studies the use of such controllers in a decentralized system, such as a robot with the controller for each link acting independently. The paper's basic result shows that stability of the learning controllers for all subsystems, when the coupling between subsystems is turned off, assures stability of the decentralized learning in the coupled system, provided that the sample time in the digital learning controller is sufficiently short.
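The "integral control in the repetition domain" idea can be sketched on a single subsystem: after each trial, the feedforward input is corrected by a gain times the previous trial's tracking error, so the error contracts from one repetition to the next. The plant gain, learning gain, and reference below are illustrative assumptions, not the paper's robot model.

```python
# Iterative learning update u_{k+1}(t) = u_k(t) + gamma * e_k(t) on a
# static scalar plant y(t) = g * u(t); error shrinks by |1 - gamma*g|
# per repetition (0.5 here). All numbers are illustrative.
g = 0.5                      # plant gain
gamma = 1.0                  # learning gain; converges when |1 - gamma*g| < 1
desired = [1.0, 2.0, 3.0]    # reference trajectory over one trial

u = [0.0] * len(desired)     # feedforward input, refined across trials
for trial in range(20):
    y = [g * ui for ui in u]                        # run the trial
    e = [yd - yi for yd, yi in zip(desired, y)]     # tracking error
    u = [ui + gamma * ei for ui, ei in zip(u, e)]   # integral in trial index

final_err = max(abs(yd - g * ui) for yd, ui in zip(desired, u))
print(final_err)  # tiny: 3 * 0.5**20
```

The decentralized question the paper answers is what happens when many such loops, one per robot link, learn simultaneously while the links are dynamically coupled.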
COINSTAC: Decentralizing the future of brain imaging analysis
Ming, Jing; Verner, Eric; Sarwate, Anand; Kelly, Ross; Reed, Cory; Kahleck, Torran; Silva, Rogers; Panta, Sandeep; Turner, Jessica; Plis, Sergey; Calhoun, Vince
2017-01-01
In the era of Big Data, sharing neuroimaging data across multiple sites has become increasingly important. However, researchers who want to engage in centralized, large-scale data sharing and analysis must often contend with problems such as high database cost, long data transfer time, extensive manual effort, and privacy issues for sensitive data. To remove these barriers to enable easier data sharing and analysis, we introduced a new, decentralized, privacy-enabled infrastructure model for brain imaging data called COINSTAC in 2016. We have continued development of COINSTAC since this model was first introduced. One of the challenges with such a model is adapting the required algorithms to function within a decentralized framework. In this paper, we report on how we are solving this problem, along with our progress on several fronts, including additional decentralized algorithms implementation, user interface enhancement, decentralized regression statistic calculation, and complete pipeline specifications. PMID:29123643
SMITH: a LIMS for handling next-generation sequencing workflows
2014-01-01
The workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine, which performs de-multiplexing, quality control, alignments, etc. Conclusions SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis. PMID:25471934
SMITH: a LIMS for handling next-generation sequencing workflows.
Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko
2014-01-01
Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality, and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, maintain a high quality standard, and reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. It was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available
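The attribute-value annotation described above can be sketched as free key-value pairs per sample that make the store searchable without schema changes. SMITH backs this with a MySQL table; the in-memory dictionaries, sample ids, and keys below are hypothetical stand-ins.

```python
# Toy attribute-value store: arbitrary metadata per sample, no fixed schema.
annotations = {
    "sample_01": {"assay": "ChIP-Seq", "antibody": "H3K4me3"},
    "sample_02": {"assay": "RNA-Seq", "tissue": "liver"},
    "sample_03": {"assay": "ChIP-Seq", "antibody": "H3K27ac"},
}

def search(store, **criteria):
    """Return sample ids whose annotations match every key=value given."""
    return sorted(
        sid for sid, attrs in store.items()
        if all(attrs.get(k) == v for k, v in criteria.items())
    )

print(search(annotations, assay="ChIP-Seq"))  # ['sample_01', 'sample_03']
```

The flexibility is the point: a new experimental attribute needs no migration, only a new key, which is what lets the metadata drive both file search and later meta-analysis.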
Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat
2016-11-28
At present, coding sequences (CDS) continue to be discovered, and ever larger CDS are being revealed. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; the SOAP and Java Web Service (JWS) deployments provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.
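The bootstrap replication the workflow parameterizes works by resampling alignment columns with replacement; each resampled alignment is then fed to the tree-inference step, and branch support is read off the resulting tree set. The toy sequences below are made up, and in the actual pipeline PHYLIP tools perform this at scale.

```python
import random

# Toy alignment: same length for every sequence (columns stay aligned).
alignment = {
    "seqA": "ACGTACGT",
    "seqB": "ACGAACGT",
    "seqC": "ACTTACGA",
}

def bootstrap_replicate(aln, rng):
    """Resample alignment columns with replacement, keeping rows aligned."""
    n = len(next(iter(aln.values())))
    cols = [rng.randrange(n) for _ in range(n)]
    return {name: "".join(seq[c] for c in cols) for name, seq in aln.items()}

rng = random.Random(42)  # seeded so the sketch is repeatable
replicates = [bootstrap_replicate(alignment, rng) for _ in range(1000)]
print(len(replicates))   # 1000, the low end of the supported 1,000-20,000
```

Because every replicate picks the same columns for all sequences, each one remains a valid alignment of the same taxa, which is what the downstream PARS/DIST-NJ/ML steps require.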
A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Michelle M.; Wu, Chase Q.
2013-11-07
Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge, and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components (resource discovery, data movement, and status monitoring) and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.
Educational decentralization, public spending, and social justice in Nigeria
NASA Astrophysics Data System (ADS)
Geo-Jaja, Macleans A.
2007-01-01
This study situates the process of educational decentralization in the narrower context of social justice. Its main object, however, is to analyze the implications of decentralization for strategies of equity and social justice in Nigeria. It starts from the premise that the early optimism that supported decentralization as an efficient and effective educational reform tool has been disappointed. The author maintains that decentralization — on its own — cannot improve education service delivery, the capacities of subordinate governments, or the integration of social policy in broader development goals. If the desired goals are to be met, public spending must be increased, greater tax revenues must be secured, and macro-economic stabilization must be achieved without re-instituting the welfare state.
Blockchain Based Decentralized Management of Demand Response Programs in Smart Energy Grids.
Pop, Claudia; Cioara, Tudor; Antal, Marcel; Anghel, Ionut; Salomie, Ioan; Bertoncini, Massimo
2018-01-09
In this paper, we investigate the use of decentralized blockchain mechanisms for delivering transparent, secure, reliable, and timely energy flexibility, in the form of adapting the energy demand profiles of Distributed Energy Prosumers, to all the stakeholders involved in the flexibility markets (primarily Distribution System Operators, as well as retailers, aggregators, etc.). In our approach, a blockchain-based distributed ledger stores, in a tamper-proof manner, the energy prosumption information collected from Internet of Things smart metering devices, while self-enforcing smart contracts programmatically define the expected energy flexibility at the level of each prosumer, the associated rewards or penalties, and the rules for balancing energy demand with energy production at the grid level. Consensus-based validation is used to validate demand response programs and to activate the appropriate financial settlement for the flexibility providers. The approach was validated using a prototype implemented on the Ethereum platform, using energy consumption and production traces of several buildings from literature data sets. The results show that our blockchain-based distributed demand-side management can be used to match energy demand and production at the smart grid level; the demand response signal is followed with high accuracy, while the amount of energy flexibility needed for convergence is reduced.
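The reward-or-penalty settlement that such a smart contract would enforce can be sketched in plain Python. This is our own illustrative logic, not the authors' contract code; the rates and tolerance are assumed values.

```python
# Hypothetical settlement rule for a demand-response flexibility contract:
# reward delivered flexibility, penalize shortfall beyond a tolerance band.
def settle(expected_kwh, delivered_kwh, reward_rate=0.10,
           penalty_rate=0.15, tolerance=0.05):
    shortfall = max(0.0, expected_kwh - delivered_kwh)
    if shortfall <= tolerance * expected_kwh:
        return round(delivered_kwh * reward_rate, 4)   # full reward
    return round(-shortfall * penalty_rate, 4)         # penalty for shortfall

settle(100, 98)   # within tolerance: rewarded
settle(100, 80)   # 20 kWh shortfall: penalized
```

On a blockchain this function would run inside a self-enforcing contract, so the payout follows automatically from the metered prosumption data.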
Blockchain Based Decentralized Management of Demand Response Programs in Smart Energy Grids
Pop, Claudia; Cioara, Tudor; Antal, Marcel; Anghel, Ionut; Salomie, Ioan; Bertoncini, Massimo
2018-01-01
In this paper, we investigate the use of decentralized blockchain mechanisms for delivering transparent, secure, reliable, and timely energy flexibility, in the form of adapting the energy demand profiles of Distributed Energy Prosumers, to all the stakeholders involved in the flexibility markets (primarily Distribution System Operators, as well as retailers, aggregators, etc.). In our approach, a blockchain-based distributed ledger stores, in a tamper-proof manner, the energy prosumption information collected from Internet of Things smart metering devices, while self-enforcing smart contracts programmatically define the expected energy flexibility at the level of each prosumer, the associated rewards or penalties, and the rules for balancing energy demand with energy production at the grid level. Consensus-based validation is used to validate demand response programs and to activate the appropriate financial settlement for the flexibility providers. The approach was validated using a prototype implemented on the Ethereum platform, using energy consumption and production traces of several buildings from literature data sets. The results show that our blockchain-based distributed demand-side management can be used to match energy demand and production at the smart grid level; the demand response signal is followed with high accuracy, while the amount of energy flexibility needed for convergence is reduced. PMID:29315250
Routine Digital Pathology Workflow: The Catania Experience.
Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana
2017-01-01
Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized their sticking to scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory.
Improved compliance by BPM-driven workflow automation.
Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin
2014-12-01
Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. The model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. BPMN 2.0, as an automation language to control every kind of activity or subprocess, is directed to complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). That means that, with the BPM standard, a communication method for sharing process knowledge of laboratories is also available. © 2014 Society for Laboratory Automation and Screening.
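The core model-driven idea (a process expressed as data, executed step by step with automatic documentation) can be loosely illustrated outside BPMN. This is not the authors' system, and the step names below are invented.

```python
# Minimal sketch: a laboratory procedure as an explicit process model,
# executed with process-controlled documentation (a step-by-step log).
def run_process(model, context):
    log = []
    for step in model:
        result = step(context)
        log.append((step.__name__, result))   # real-time documentation
    return log

def weigh_sample(ctx):
    return f"weighed {ctx['sample']}"

def run_assay(ctx):
    return f"assayed {ctx['sample']}"

log = run_process([weigh_sample, run_assay], {"sample": "S-42"})
```

A BPMN engine generalizes this beyond a simple sequence to gateways, events, and subprocesses, while keeping the same separation of model from executing code.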
Centralized versus decentralized decision-making for recycled material flows.
Hong, I-Hsuan; Ammons, Jane C; Realff, Matthew J
2008-02-15
A reverse logistics system is a network of transportation logistics and processing functions that collect, consolidate, refurbish, and demanufacture end-of-life products. This paper examines centralized and decentralized models of decision-making for material flows and associated transaction prices in reverse logistics networks. We compare the application of a centralized model for planning reverse production systems, where a single planner is acquainted with all of the system information and has the authority to determine decision variables for the entire system, to a decentralized approach. In the decentralized approach, the entities coordinate between tiers of the system using a parametrized flow function and compete within tiers based on reaching a price equilibrium. We numerically demonstrate the increase in the total net profit of the centralized system relative to the decentralized one. This implies that one may overestimate the system material flows and profit if the system planner utilizes a centralized view to predict behaviors of independent entities in the system, and that decentralized contract mechanisms will require careful design to avoid losses in the efficiency and scope of these systems.
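The centralized-versus-decentralized profit gap can be seen in a toy two-tier example. The linear model below (classic double marginalization between a collector and a processor) is our own assumed illustration, not the paper's network formulation.

```python
# Toy reverse supply chain: a collector sells recovered material to a
# processor facing inverse demand p = a - b*q; unit collection cost is c.
a, b, c = 10.0, 1.0, 2.0

# Centralized: one planner picks q to maximize total profit (a - b*q - c)*q.
q_cent = (a - c) / (2 * b)
profit_cent = (a - b * q_cent - c) * q_cent

# Decentralized: the collector sets a wholesale price w, then the
# processor best-responds with its own profit-maximizing quantity.
w = (a + c) / 2                 # collector's optimal wholesale price
q_dec = (a - w) / (2 * b)       # processor's best response
profit_dec = (a - b * q_dec - w) * q_dec + (w - c) * q_dec

assert q_dec < q_cent and profit_dec < profit_cent
```

Here the decentralized equilibrium moves half the material and earns three quarters of the centralized profit, mirroring the paper's point that a centralized view overestimates flows and profit in a system of independent entities.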
Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.
Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T
2014-06-01
With the coming deluge of genome data, the need to store and process large-scale genome data, provide easy access to biomedical analysis tools, and enable efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic, and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tool integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.
Flexible workflow sharing and execution services for e-scientists
NASA Astrophysics Data System (ADS)
Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely
2013-04-01
The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data- and/or compute-intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration, and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs, and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: A database where workflows and meta-data about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: A web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: A desktop environment that provides access capabilities similar to those of the SHIWA Portal but runs on the user's desktop or laptop instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already supported.
Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum; Field, Aaron; Wiegmann, Douglas; Yu, John-Paul J
2017-04-01
The study aimed to assess perceptions of reading room workflow and the impact that separating image-interpretive and nonimage-interpretive task workflows can have on radiologist perceptions of workplace disruptions, workload, and overall satisfaction. A 14-question survey instrument was developed to measure radiologist perceptions of workplace interruptions, satisfaction, and workload prior to and following implementation of separate image-interpretive and nonimage-interpretive reading room workflows. The results were collected over the 2 weeks preceding the intervention and the 2 weeks following the end of the intervention. The results were anonymized and analyzed using univariate analysis. A total of 18 people responded to the preintervention survey: 6 neuroradiology fellows and 12 attending neuroradiologists. Fifteen people who were then present for the 1-month intervention period responded to the postintervention survey. Perceptions of workplace disruptions, image interpretation, quality of trainee education, ability to perform nonimage-interpretive tasks, and quality of consultations (P < 0.0001) all improved following the intervention. Mental effort and workload also improved across all assessment domains, as did satisfaction with quality of image interpretation and consultative work. Implementation of parallel dedicated image-interpretive and nonimage-interpretive workflows may improve markers of radiologist perceptions of workplace satisfaction. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Towards a Decentralized Magnetic Indoor Positioning System
Kasmi, Zakaria; Norrdine, Abdelmoumen; Blankenbach, Jörg
2015-01-01
Decentralized magnetic indoor localization is a sophisticated method for processing sampled magnetic data directly on a mobile station (MS), thereby decreasing or even avoiding the need for communication with the base station. In contrast to central-oriented positioning systems, which transmit raw data to a base station, decentralized indoor localization pushes application-level knowledge into the MS. A decentralized position solution thus has strong potential to increase energy efficiency and to prolong the lifetime of the MS. In this article, we present a complete architecture and an implementation for a decentralized positioning system. Furthermore, we introduce a technique for synchronizing the magnetic field observed on the MS with the artificially generated magnetic field from the coils. Based on real-time clocks (RTCs) and a preemptive operating system, this method allows stand-alone control of the coils and a proper assignment of the measured magnetic fields on the MS. Stand-alone control and synchronization of the coils and the MS offer exceptional potential for implementing a positioning system without the need for wired or wireless communication, enabling applications in rescue scenarios such as the localization of miners or firefighters. PMID:26690145
Towards a Decentralized Magnetic Indoor Positioning System.
Kasmi, Zakaria; Norrdine, Abdelmoumen; Blankenbach, Jörg
2015-12-04
Decentralized magnetic indoor localization is a sophisticated method for processing sampled magnetic data directly on a mobile station (MS), thereby decreasing or even avoiding the need for communication with the base station. In contrast to central-oriented positioning systems, which transmit raw data to a base station, decentralized indoor localization pushes application-level knowledge into the MS. A decentralized position solution thus has strong potential to increase energy efficiency and to prolong the lifetime of the MS. In this article, we present a complete architecture and an implementation for a decentralized positioning system. Furthermore, we introduce a technique for synchronizing the magnetic field observed on the MS with the artificially generated magnetic field from the coils. Based on real-time clocks (RTCs) and a preemptive operating system, this method allows stand-alone control of the coils and a proper assignment of the measured magnetic fields on the MS. Stand-alone control and synchronization of the coils and the MS offer exceptional potential for implementing a positioning system without the need for wired or wireless communication, enabling applications in rescue scenarios such as the localization of miners or firefighters.
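The RTC-based assignment of field samples to coil activations can be sketched as a shared time schedule. The slot length, coil count, and schedule layout below are illustrative assumptions, not the authors' actual parameters.

```python
# Hedged sketch: with synchronized RTCs and an agreed epoch, the MS can
# infer which coil was active at each sample time without any radio link.
COIL_PERIOD_MS = 500   # assumed: each coil is driven for 500 ms in turn
NUM_COILS = 4

def coil_for_timestamp(t_ms, epoch_ms=0):
    """Index of the coil active at RTC time t_ms, given a shared epoch."""
    slot = ((t_ms - epoch_ms) // COIL_PERIOD_MS) % NUM_COILS
    return int(slot)

# Magnetic samples as (RTC timestamp in ms, field magnitude).
samples = [(120, 43.1), (620, 40.7), (1130, 39.9), (1640, 44.2)]
by_coil = {}
for t, field in samples:
    by_coil.setdefault(coil_for_timestamp(t), []).append(field)
```

Each coil's grouped measurements can then feed the position solver locally on the MS, which is what makes the fully communication-free operation described above possible.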
Decentralized Multisensory Information Integration in Neural Systems.
Zhang, Wen-Hao; Chen, Aihua; Rasch, Malte J; Wu, Si
2016-01-13
How multiple sensory cues are integrated in neural circuitry remains a challenge. The common hypothesis is that information integration might be accomplished in a dedicated multisensory integration area receiving feedforward inputs from the modalities. However, recent experimental evidence suggests that it is not a single multisensory brain area, but rather many multisensory brain areas that are simultaneously involved in the integration of information. Why many mutually connected areas should be needed for information integration is puzzling. Here, we investigated theoretically how information integration could be achieved in a distributed fashion within a network of interconnected multisensory areas. Using biologically realistic neural network models, we developed a decentralized information integration system that comprises multiple interconnected integration areas. Studying an example of combining visual and vestibular cues to infer heading direction, we show that such a decentralized system is in good agreement with anatomical evidence and experimental observations. In particular, we show that this decentralized system can integrate information optimally. The decentralized system predicts that optimally integrated information should emerge locally from the dynamics of the communication between brain areas and sheds new light on the interpretation of the connectivity between multisensory brain areas. To extract information reliably from ambiguous environments, the brain integrates multiple sensory cues, which provide different aspects of information about the same entity of interest. Here, we propose a decentralized architecture for multisensory integration. In such a system, no processor is in the center of the network topology and information integration is achieved in a distributed manner through reciprocally connected local processors. Through studying the inference of heading direction with visual and vestibular cues, we show that the decentralized system
Decentralized Multisensory Information Integration in Neural Systems
Zhang, Wen-hao; Chen, Aihua
2016-01-01
How multiple sensory cues are integrated in neural circuitry remains a challenge. The common hypothesis is that information integration might be accomplished in a dedicated multisensory integration area receiving feedforward inputs from the modalities. However, recent experimental evidence suggests that it is not a single multisensory brain area, but rather many multisensory brain areas that are simultaneously involved in the integration of information. Why many mutually connected areas should be needed for information integration is puzzling. Here, we investigated theoretically how information integration could be achieved in a distributed fashion within a network of interconnected multisensory areas. Using biologically realistic neural network models, we developed a decentralized information integration system that comprises multiple interconnected integration areas. Studying an example of combining visual and vestibular cues to infer heading direction, we show that such a decentralized system is in good agreement with anatomical evidence and experimental observations. In particular, we show that this decentralized system can integrate information optimally. The decentralized system predicts that optimally integrated information should emerge locally from the dynamics of the communication between brain areas and sheds new light on the interpretation of the connectivity between multisensory brain areas. SIGNIFICANCE STATEMENT To extract information reliably from ambiguous environments, the brain integrates multiple sensory cues, which provide different aspects of information about the same entity of interest. Here, we propose a decentralized architecture for multisensory integration. In such a system, no processor is in the center of the network topology and information integration is achieved in a distributed manner through reciprocally connected local processors. Through studying the inference of heading direction with visual and vestibular cues, we show that
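The claim that reciprocally connected local processors can reach the statistically optimal integrated estimate without a central integrator can be checked numerically. The sketch below is our own minimal construction (two areas exchanging precision-weighted statistics), not the paper's neural network model.

```python
# Two "areas" each observe one cue as (mean, variance): e.g. visual and
# vestibular estimates of heading direction.
cues = [(10.0, 4.0), (14.0, 1.0)]

# Each area keeps only local sufficient statistics:
# [precision-weighted sum, total precision].
state = [[m / v, 1.0 / v] for m, v in cues]

# Repeated reciprocal message exchange: symmetric averaging of statistics.
# (With two nodes this converges after one exchange; the loop stands in
# for iterative communication in a larger network.)
for _ in range(50):
    avg = [(state[0][k] + state[1][k]) / 2 for k in (0, 1)]
    state = [avg[:], avg[:]]

local_estimates = [s[0] / s[1] for s in state]

# Centralized inverse-variance-weighted fusion, for comparison.
opt = sum(m / v for m, v in cues) / sum(1.0 / v for m, v in cues)
```

Every node ends up holding the same estimate as the central fusion rule, echoing the paper's point that optimal integration can emerge locally from inter-area communication.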
It's All About the Data: Workflow Systems and Weather
NASA Astrophysics Data System (ADS)
Plale, B.
2009-05-01
under high-volume conditions, or the searchability and manageability of the resulting data products is disappointingly low. The provenance of a data product is a record of its lineage, or trace of the execution history that resulted in the product. The provenance of a forecast model result, e.g., captures information about the executable version of the model, configuration parameters, input data products, execution environment, and owner. Provenance enables data to be properly attributed and captures critical parameters about the model run so the quality of the result can be ascertained. Proper provenance is essential to providing reproducible scientific computing results. Workflow languages used in science discovery are complete programming languages, and in theory can support any logic expressible by a programming language. The execution environments supporting the workflow engines, on the other hand, are subject to constraints on physical resources, and hence in practice the workflow task graphs used in science utilize relatively few of the cataloged workflow patterns. It is important to note that these workflows are executed on demand, and are executed once. Into this context is introduced the need for science discovery that is responsive to real time information. If we can use simple programming models and abstractions to make scientific discovery involving real-time data accessible to specialists who share and utilize data across scientific domains, we bring science one step closer to solving the largest of human problems.
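A provenance record of the kind described (executable version, configuration parameters, input products, execution environment, owner) can be sketched as a simple data structure. The field names follow the text; the schema itself is our illustrative assumption, not a specific provenance standard.

```python
from dataclasses import dataclass

@dataclass
class Provenance:
    """Lineage record for one derived data product (illustrative schema)."""
    product_id: str
    model_version: str      # executable version of the model
    config: dict            # configuration parameters of the run
    inputs: list            # identifiers of input data products
    execution_env: str      # where the run executed
    owner: str              # for attribution

p = Provenance(
    product_id="forecast-2009-05-01",
    model_version="WRF-3.0.1",                      # hypothetical values
    config={"grid_km": 4, "steps": 48},
    inputs=["radar-vol-0430", "sounding-0430"],
    execution_env="teragrid/bigred",
    owner="plale",
)
```

Storing such a record alongside each product is what makes a forecast result attributable and, in principle, reproducible.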
Building asynchronous geospatial processing workflows with web services
NASA Astrophysics Data System (ADS)
Zhao, Peisheng; Di, Liping; Yu, Genong
2012-02-01
Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to, and architectures of, workflow code that support asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
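The asynchronous pattern described (initiate a request, resume local processing, collect the response later) can be sketched with asyncio. The service call is simulated and the job names are invented placeholders; a real client would talk to a WPS endpoint instead.

```python
import asyncio

async def geoprocess(job):
    """Stand-in for a long-running geospatial Web service call."""
    await asyncio.sleep(0.1)          # simulated processing delay
    return f"{job}:done"

async def client():
    # Initiate the request without blocking on it.
    task = asyncio.create_task(geoprocess("reproject-landsat"))
    # Resume other work immediately while the service runs.
    local = "prepared-styling"
    # Pick up the response when it is ready.
    result = await task
    return local, result

print(asyncio.run(client()))
```

The same shape generalizes to WS-BPEL-style orchestration: the engine correlates the eventual response (or callback) with the request rather than holding a connection open for the whole run.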
U-Form vs. M-Form: How to Understand Decision Autonomy Under Healthcare Decentralization?
Bustamante, Arturo Vargas
2016-01-01
For more than three decades healthcare decentralization has been promoted in developing countries as a way of improving the financing and delivery of public healthcare. Decision autonomy under healthcare decentralization would determine the role and scope of responsibility of local authorities. Jalal Mohammed, Nicola North, and Toni Ashton analyze decision autonomy within decentralized services in Fiji. They conclude that the narrow decision space allowed to local entities might have limited the benefits of decentralization on users and providers. To discuss the costs and benefits of healthcare decentralization this paper uses the U-form and M-form typology to further illustrate the role of decision autonomy under healthcare decentralization. This paper argues that when evaluating healthcare decentralization, it is important to determine whether the benefits from decentralization are greater than its costs. The U-form and M-form framework is proposed as a useful typology to evaluate different types of institutional arrangements under healthcare decentralization. Under this model, the more decentralized organizational form (M-form) is superior if the benefits from flexibility exceed the costs of duplication and the more centralized organizational form (U-form) is superior if the savings from economies of scale outweigh the costly decision-making process from the center to the regions. Budgetary and financial autonomy and effective mechanisms to maintain local governments accountable for their spending behavior are key decision autonomy variables that could sway the cost-benefit analysis of healthcare decentralization. PMID:27694684
A comparison of decentralized, distributed, and centralized vibro-acoustic control.
Frampton, Kenneth D; Baumann, Oliver N; Gardonio, Paolo
2010-11-01
Direct velocity feedback control of structures is well known to increase structural damping and thus reduce vibration. In multi-channel systems the way in which the velocity signals are used to inform the actuators ranges from decentralized control, through distributed or clustered control to fully centralized control. The objective of distributed controllers is to exploit the anticipated performance advantage of the centralized control while maintaining the scalability, ease of implementation, and robustness of decentralized control. However, and in seeming contradiction, some investigations have concluded that decentralized control performs as well as distributed and centralized control, while other results have indicated that distributed control has significant performance advantages over decentralized control. The purpose of this work is to explain this seeming contradiction in results, to explore the effectiveness of decentralized, distributed, and centralized vibro-acoustic control, and to expand the concept of distributed control to include the distribution of the optimization process and the cost function employed.
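The damping effect of direct velocity feedback is easiest to see in a single degree-of-freedom simplification. This back-of-envelope model is our own illustration, not the paper's multi-channel vibro-acoustic setup.

```python
import math

def damping_ratio(m, k, c, gain=0.0):
    """Damping ratio of m*x'' + c*x' + k*x = -gain*x' (velocity feedback).

    The feedback force simply adds to the damping coefficient, giving
    zeta = (c + gain) / (2 * sqrt(k * m)).
    """
    return (c + gain) / (2 * math.sqrt(k * m))

m, k, c = 1.0, 100.0, 2.0          # assumed mass, stiffness, damping
open_loop = damping_ratio(m, k, c)           # lightly damped
with_dvf = damping_ratio(m, k, c, gain=4.0)  # feedback raises damping
```

In the multi-channel case studied by the paper, the question is how the sensor signals are shared among actuators (decentralized, clustered, or fully centralized), not whether feedback damps a single mode.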
After Decentralization: Delimitations and Possibilities within New Fields
ERIC Educational Resources Information Center
Wahlstrom, Ninni
2008-01-01
The shift from a centralized to a decentralized school system can be seen as a solution to an uncertain problem. Through analysing the displacements in the concept of equivalence within Sweden's decentralized school system, this study illustrates how the meaning of the concept of equivalence shifts over time, from a more collective target…
Anokbonggo, W W; Ogwal-Okeng, J W; Ross-Degnan, D; Aupont, O
2004-02-01
In Uganda, the decentralization of administrative functions, management, and responsibility for health care to districts, which began in 1994, resulted in fundamental changes in health care delivery. Since the introduction of the policy in Uganda, little information has been available on stakeholders' perceptions about the benefits of the policy and how decentralization affected health care delivery. To identify the perceptions and beliefs of key stakeholders on the impact and process of decentralization and on the operations of health services in two districts in Uganda, and to report their suggestions to improve future implementation of similar policies. We used qualitative research methods that included focus group discussions with 90 stakeholders from both study districts. The sample population comprised of 12 health workers from the two hospitals, 11 district health administrators, and 67 Local Council Leaders. Perceptions and concerns of stakeholders on the impact of decentralization on district health services. There was a general consensus that decentralization empowered local administrative and political decision-making. Among stakeholders, the policy was perceived to have created a sense of ownership and responsibility. Major problems that were said to be associated with decentralization included political harassment of civil servants, increased nepotism, inadequate financial resources, and mismanagement of resources. This study elicited perceptions about critical factors upon which successful implementation of the decentralization policy depended. These included: appreciation of the role of all stakeholders by district politicians; adequate availability and efficient utilization of resources; reasonably developed infrastructure prior to the policy change; appropriate sensitisation and training of those implementing policies; and the good will and active involvement of the local community. In the absence of these factors, implementation of
On decentralized design: Rationale, dynamics, and effects on decision-making
NASA Astrophysics Data System (ADS)
Chanron, Vincent
The focus of this dissertation is the design of complex systems, including engineering systems such as cars, airplanes, and satellites. Companies that design these systems are under constant pressure to design better products that meet customer expectations, and competition forces them to develop these products faster. One of the responses of the industry to these conflicting challenges has been the decentralization of design responsibilities. The current lack of understanding of the dynamics of decentralized design processes is the main motivation for this research, and places value on the descriptive base. This research identifies the main reasons and the true benefits for companies to decentralize the design of their products. It also demonstrates the limitations of this approach by listing the relevant issues and problems created by the decentralization of decisions. Based on these observations, a game-theoretic approach to decentralized design is proposed to model the decisions made during the design process. The dynamics are modeled using mathematical formulations inspired by control theory. Building upon this formalism, the issue of convergence in decentralized design is analyzed: the equilibrium points of the design space are identified, and convergent and divergent patterns are recognized. This rigorous investigation of the design process provides motivation and support for proposing new approaches to decentralized design problems. Two methods are developed, which aim at improving the design process in two ways: decreasing the product development time, and increasing the optimality of the final design. These methods are inspired by eigenstructure decomposition and set-based design, respectively. The value of the research detailed within this dissertation is in the proposed methods, which are built upon the sound mathematical formalism developed. The contribution of this work is twofold: rigorous investigation of the design process, and practical support to
Decentralized control of large flexible structures by joint decoupling
NASA Technical Reports Server (NTRS)
Su, Tzu-Jeng; Juang, Jer-Nan
1994-01-01
This paper presents a novel method to design decentralized controllers for large complex flexible structures by using the idea of joint decoupling. Decoupling of joint degrees of freedom from the interior degrees of freedom is achieved by setting the joint actuator commands to cancel the internal forces exerted on the joint degrees of freedom. By doing so, the interactions between substructures are eliminated. The global structure control design problem is then decomposed into several substructure control design problems. Control commands for interior actuators are set to be localized state feedback using decentralized observers for state estimation. The proposed decentralized controllers can operate successfully at the individual substructure level as well as at the global structure level. Not only control design but also control implementation is decentralized. A two-component mass-spring-damper system is used as an example to demonstrate the proposed method.
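The decoupling idea can be sketched numerically. Below is a minimal, illustrative simulation (all masses, stiffnesses, and gains are assumed values, not taken from the paper): two mass-spring substructures share a joint mass, the joint actuator cancels the internal spring forces acting on the joint, and each substructure is stabilized by purely local PD state feedback.

```python
# Joint-decoupling sketch: two substructures (m1, m2) coupled to a
# joint mass (mj) through springs k1, k2. The joint actuator cancels
# the internal forces on the joint DOF, so the substructure control
# loops can be designed and run independently.

def simulate(t_end=20.0, dt=1e-3):
    m1 = m2 = mj = 1.0          # substructure and joint masses (assumed)
    k1 = k2 = 4.0               # spring stiffnesses (assumed)
    x1, v1 = 1.0, 0.0           # substructure 1 state
    x2, v2 = -0.5, 0.0          # substructure 2 state
    xj, vj = 0.0, 0.0           # joint state
    kp, kd = 4.0, 4.0           # local PD feedback gains (assumed)
    for _ in range(int(t_end / dt)):
        f1 = k1 * (x1 - xj)     # internal force on joint from side 1
        f2 = k2 * (x2 - xj)     # internal force on joint from side 2
        # joint actuator: cancel internal forces, then local PD control
        uj = -(f1 + f2) - kp * xj - kd * vj
        # interior actuators: localized state feedback only
        u1 = -kp * x1 - kd * v1
        u2 = -kp * x2 - kd * v2
        a1 = (-f1 + u1) / m1
        a2 = (-f2 + u2) / m2
        aj = (f1 + f2 + uj) / mj
        x1, v1 = x1 + dt * v1, v1 + dt * a1
        x2, v2 = x2 + dt * v2, v2 + dt * a2
        xj, vj = xj + dt * vj, vj + dt * aj
    return x1, x2, xj

x1, x2, xj = simulate()
```

With the cancellation in place, the joint dynamics decouple from the substructures, so all states settle to zero under purely local feedback.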
Formation Flying With Decentralized Control in Libration Point Orbits
NASA Technical Reports Server (NTRS)
Folta, David; Carpenter, J. Russell; Wagner, Christoph
2000-01-01
A decentralized control framework is investigated for applicability of formation flying control in libration orbits. The decentralized approach, being non-hierarchical, processes only direct measurement data, in parallel with the other spacecraft. Control is accomplished via linearization about a reference libration orbit with standard control using a Linear Quadratic Regulator (LQR) or the GSFC control algorithm. Both are linearized about the current state estimate as with the extended Kalman filter. Based on this preliminary work, the decentralized approach appears to be feasible for upcoming libration missions using distributed spacecraft.
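As a toy illustration of the LQR element of this approach (the scalar deviation model and all numbers below are assumptions for illustration, not the GSFC algorithm or the actual libration-point dynamics), the algebraic Riccati equation can be solved in closed form for a one-dimensional model that each spacecraft could close locally on its own state estimate:

```python
import math

def lqr_gain_scalar(a, b, q, r):
    """Scalar continuous-time LQR for dx/dt = a*x + b*u with cost
    integral of q*x**2 + r*u**2: solve the algebraic Riccati equation
    2*a*p - (b*p)**2 / r + q = 0 for p > 0 and return k = b*p/r."""
    p = r * (a + math.sqrt(a * a + q * b * b / r)) / (b * b)
    return b * p / r

# Deviation from a reference orbit modeled (very crudely) as one
# unstable scalar mode; each spacecraft regulates it independently.
a, b, q, r = 0.5, 1.0, 1.0, 1.0       # assumed values
k = lqr_gain_scalar(a, b, q, r)
closed_loop = a - b * k               # closed-loop pole, strictly negative
```

The closed-loop pole works out to -sqrt(a**2 + q*b**2/r), so the regulated deviation decays regardless of the open-loop instability a.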
Routine Digital Pathology Workflow: The Catania Experience
Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana
2017-01-01
Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized their sticking to scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory. PMID:29416914
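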
Struchiner, Miriam; Roschke, Maria Alice; Ricciardi, Regina Maria Vieira
2002-03-01
This paper describes the Course on Decentralized Management of Human Resources in Health Care, which is an Internet-based distance learning program to train and provide continuing education for health care professionals. The program is an initiative of the Pan American Health Organization, and it was organized in response to the growing need for self-reliant professionals who can constantly upgrade their knowledge without having to leave their place of work. The proposed model promotes an educational process that brings together theory and practice in realistic and relevant contexts and that maximizes the participation of students, both individually and in groups. The program has been evaluated in pilot studies in Brazil, Chile, and Peru. Following these assessments, the course has been adapted to facilitate its implementation and to adjust its contents to fit each country's circumstances.
Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J
2015-01-01
Imaging based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, software development of custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging based clinical trials. The system enables a project coordinator to build a data collection and management system specifically related to study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging related studies. The system was initially evaluated in an imaging based rehabilitation clinical trial. The evaluation shows that the cost of developing such a system can be much reduced compared to a custom-built system. By providing a solution to customize a system and automate the workflow, the system will save on development time and reduce errors, especially for imaging clinical trials. PMID:25870169
The equivalency between logic Petri workflow nets and workflow nets.
Wang, Jing; Yu, ShuXia; Du, YuYue
2015-01-01
Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between modeling and analysis of data mining and business process. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. The online shop in electronic commerce in this paper is modeled to prove the equivalence between LPWNs and WF-nets, and advantages of LPWNs are presented. PMID:25821845
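The WF-net firing rule the abstract builds on is easy to make concrete. The sketch below plays the standard token game of a workflow net with source place `i` and sink place `o`; the online-shop places and transitions are illustrative, not the model from the paper.

```python
# Minimal workflow-net (WF-net) token game: a transition is enabled
# when every input place holds a token; firing consumes one token
# from each input place and produces one in each output place.

class WFNet:
    def __init__(self, transitions):
        # transitions: name -> (list of input places, list of output places)
        self.transitions = transitions

    def enabled(self, marking, t):
        ins, _ = self.transitions[t]
        return all(marking.get(p, 0) >= 1 for p in ins)

    def fire(self, marking, t):
        assert self.enabled(marking, t), f"{t} not enabled"
        ins, outs = self.transitions[t]
        m = dict(marking)
        for p in ins:
            m[p] -= 1
        for p in outs:
            m[p] = m.get(p, 0) + 1
        return m

# Hypothetical online-shop process: order, pay, ship.
shop = WFNet({
    "receive_order": (["i"], ["p1"]),
    "pay":           (["p1"], ["p2"]),
    "ship":          (["p2"], ["o"]),
})

m = {"i": 1}                        # initial marking: one token in the source
for t in ["receive_order", "pay", "ship"]:
    m = shop.fire(m, t)
# proper completion leaves a single token in the sink place 'o'
```

A sound WF-net run started with one token in `i` terminates with exactly one token in `o` and none left elsewhere, as this trace does.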
Provenance-Powered Automatic Workflow Generation and Composition
NASA Astrophysics Data System (ADS)
Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.
2015-12-01
In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototype system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from recorded user behavior, as well as to support the adaptability and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance to further facilitate collaboration in the science community. We have also established a Petri net-based verification instrument for provenance-based automatic workflow generation and recommendation.
The impact of using an intravenous workflow management system (IVWMS) on cost and patient safety.
Lin, Alex C; Deng, Yihong; Thaibah, Hilal; Hingl, John; Penm, Jonathan; Ivey, Marianne F; Thomas, Mark
2018-07-01
The aim of this study was to determine the financial costs associated with wasted and missing doses before and after the implementation of an intravenous workflow management system (IVWMS) and to quantify the number and the rate of detected intravenous (IV) preparation errors. A retrospective analysis of the sample hospital information system database was conducted using three months of data before and after the implementation of an IVWMS (DoseEdge®), which uses barcode scanning and photographic technologies to track and verify each step of the preparation process. The financial impact associated with wasted and missing IV doses was determined by combining drug acquisition, labor, accessory, and disposal costs. The intercepted error reports and pharmacist detected error reports were drawn from the IVWMS to quantify the number of errors by defined error categories. The total numbers of IV doses prepared before and after the implementation of the IVWMS were 110,963 and 101,765 doses, respectively. The adoption of the IVWMS significantly reduced the amount of wasted and missing IV doses by 14,176 and 2268 doses, respectively (p < 0.001). The overall cost savings of using the system was $144,019 over 3 months. The total number of errors detected was 1160 (1.14%) after using the IVWMS. The implementation of the IVWMS facilitated workflow changes that led to a positive impact on cost and patient safety. The implementation of the IVWMS increased patient safety by enforcing standard operating procedures and bar code verifications. Published by Elsevier B.V.
Structured recording of intraoperative surgical workflows
NASA Astrophysics Data System (ADS)
Neumuth, T.; Durstewitz, N.; Fischer, M.; Strauss, G.; Dietz, A.; Meixensberger, J.; Jannin, P.; Cleary, K.; Lemke, H. U.; Burgert, O.
2006-03-01
Surgical Workflows are used for the methodical and scientific analysis of surgical interventions. The approach described here is a step towards developing surgical assist systems based on Surgical Workflows and integrated control systems for the operating room of the future. This paper describes concepts and technologies for the acquisition of Surgical Workflows by monitoring surgical interventions and their presentation. Establishing systems which support the Surgical Workflow in operating rooms requires a multi-staged development process beginning with the description of these workflows. A formalized description of surgical interventions is needed to create a Surgical Workflow. This description can be used to analyze and evaluate surgical interventions in detail. We discuss the subdivision of surgical interventions into work steps regarding different levels of granularity and propose a recording scheme for the acquisition of manual surgical work steps from running interventions. To support the recording process during the intervention, we introduce a new software architecture. The core of the architecture is our Surgical Workflow editor, which is intended to deal with the manifold, complex, and concurrent relations during an intervention. Furthermore, we present a method for the automatic generation of graphs that display the recorded surgical work steps of the interventions. Finally, we conclude with considerations about extensions of our recording scheme to close the gap to S-PACS systems. The approach was used to record 83 surgical interventions from 6 intervention types from 3 different surgical disciplines: ENT surgery, neurosurgery, and interventional radiology. The interventions were recorded at the University Hospital Leipzig, Germany, and at the Georgetown University Hospital, Washington, D.C., USA.
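A recording scheme of the kind described, work steps annotated with actor, action, instrument, and timing, plus automatic generation of a sequence graph, might look like the following sketch. The field names and the example steps are hypothetical, not the authors' actual schema.

```python
from dataclasses import dataclass

# Hypothetical recording scheme for manual surgical work steps:
# each step notes who did what, with which instrument, and when.

@dataclass
class WorkStep:
    actor: str
    action: str
    instrument: str
    start: float   # seconds from intervention start
    stop: float

def sequence_edges(steps):
    """Edges of a simple sequential workflow graph, ordered by start time."""
    ordered = sorted(steps, key=lambda s: s.start)
    return [(a.action, b.action) for a, b in zip(ordered, ordered[1:])]

# Illustrative recording of three work steps from one intervention.
steps = [
    WorkStep("surgeon", "incision", "scalpel", 0.0, 40.0),
    WorkStep("surgeon", "dissection", "forceps", 40.0, 200.0),
    WorkStep("assistant", "suction", "suction tube", 60.0, 180.0),
]
edges = sequence_edges(steps)
```

Richer granularity levels (phases, steps, activities) could be layered on the same record by nesting such step lists.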
Modeling Complex Workflow in Molecular Diagnostics
Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan
2010-01-01
One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844
Centralization vs. Decentralization: A Location Analysis Approach for Librarians
ERIC Educational Resources Information Center
Raffel, Jeffrey; Shishko, Robert
1972-01-01
An application of location theory to the question of centralized versus decentralized library facilities for a university, with relevance for special libraries is presented. The analysis provides models for a single library, for two or more libraries, or for decentralized facilities. (6 references) (Author/NH)
The Rhetoric of Decentralization
ERIC Educational Resources Information Center
Ravitch, Diane
1974-01-01
Questions the rationale for and possible consequences of political decentralization of New York City. Suggests that the disadvantages--reduced level of professionalism, increased expense in multiple government operation, "stabilization" of residential segregation, necessity for budget negotiations because of public disclosure of tax…
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications, and different services), facilitating knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry, and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.
Federalism and decentralization: impact on international and Brazilian health policies.
Leite, Valéria Rodrigues; de Vasconcelos, Cipriano Maia; Lima, Kenio Costa
2011-01-01
This article discusses the implications of decentralization in the light of international and Brazilian federalism, and its effects on public health policy. In a comparative analysis among countries, the authors find there is no single model; rather, each country has a unique structure of institutions and norms that have important implications for the operation of its health system. Brazil shares some similarities with other countries that have adopted a decentralized system and is assuming features ever closer to U.S. federalism, with a complex web of relationships. The degree of inequality among Brazilian municipalities and states, along with the budgetary imbalances caused by the minimal levels of resource utilization, undermines Brazil's constitutional principles and, consequently, its federalism. To ensure the constitutional mandate in Brazil, it is essential, as in other countries, to create a stable source of funds and increase the volume and efficiency of spending. Also important are investing in the training of managers, improving information systems, strengthening the principles of autonomy and interdependence, and defining patterns of cooperation within the federation.
Decentralized Interleaving of Paralleled Dc-Dc Buck Converters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian B; Rodriguez, Miguel; Sinha, Mohit
We present a decentralized control strategy that yields switch interleaving among parallel-connected dc-dc buck converters. The proposed method is based on the digital implementation of the dynamics of a nonlinear oscillator circuit as the controller. Each controller is fully decentralized, i.e., it only requires the locally measured output current to synthesize the pulse width modulation (PWM) carrier waveform, and no communication between different controllers is needed. By virtue of the intrinsic electrical coupling between converters, the nonlinear oscillator-based controllers converge to an interleaved state with uniform phase-spacing across PWM carriers. To the knowledge of the authors, this work presents the first fully decentralized strategy for switch interleaving in paralleled dc-dc buck converters.
Decentralized Bayesian search using approximate dynamic programming methods.
Zhao, Yijia; Patek, Stephen D; Beling, Peter A
2008-08-01
We consider decentralized Bayesian search problems that involve a team of multiple autonomous agents searching for targets on a network of search points operating under the following constraints: 1) interagent communication is limited; 2) the agents do not have the opportunity to agree in advance on how to resolve equivalent but incompatible strategies; and 3) each agent lacks the ability to control or predict with certainty the actions of the other agents. We formulate the multiagent search-path-planning problem as a decentralized optimal control problem and introduce approximate dynamic programming heuristics that can be implemented in a decentralized fashion. After establishing some analytical properties of the heuristics, we present computational results for a search problem involving two agents on a 5 x 5 grid.
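A heavily simplified version of the decentralized search loop can be sketched as follows. The miss-update model, detection probability, and greedy move rule below are assumptions for illustration; the paper's approximate dynamic programming heuristics are more sophisticated. Each agent would run this loop independently on its own belief, with no communication.

```python
# One agent's decentralized Bayesian search step on a 5x5 grid:
# search the current cell, apply a Bayes miss-update with detection
# probability PD, then greedily move to the neighboring cell with
# the highest posterior target probability.

PD = 0.8   # probability a search detects a target that is present (assumed)
N = 5      # grid side length

def neighbors(cell):
    r, c = cell
    cands = [(r, c), (r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return [(i, j) for i, j in cands if 0 <= i < N and 0 <= j < N]

def miss_update(belief, cell):
    """Bayes update of the target-location belief after an
    unsuccessful search of `cell` (no false alarms assumed)."""
    b = dict(belief)
    b[cell] *= (1.0 - PD)
    z = sum(b.values())
    return {k: v / z for k, v in b.items()}

def step(belief, cell):
    belief = miss_update(belief, cell)
    # greedy move; deterministic tie-break by cell coordinates
    nxt = max(neighbors(cell), key=lambda c: (belief[c], c))
    return belief, nxt

uniform = {(i, j): 1.0 / (N * N) for i in range(N) for j in range(N)}
belief, cell = uniform, (0, 0)
for _ in range(10):
    belief, cell = step(belief, cell)
```

Because every agent updates only from its own observations, beliefs across agents can diverge, which is exactly the coordination difficulty the constraints in the abstract describe.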
Decentralized Hypothesis Testing in Energy Harvesting Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Tarighati, Alla; Gross, James; Jalden, Joakim
2017-09-01
We consider the problem of decentralized hypothesis testing in a network of energy harvesting sensors, where sensors make noisy observations of a phenomenon and send quantized information about the phenomenon towards a fusion center. The fusion center makes a decision about the present hypothesis using the aggregate received data during a time interval. We explicitly consider a scenario under which the messages are sent through parallel access channels towards the fusion center. To avoid limited lifetime issues, we assume each sensor is capable of harvesting all the energy it needs for the communication from the environment. Each sensor has an energy buffer (battery) to save its harvested energy for use in other time intervals. Our key contribution is to formulate the problem of decentralized detection in a sensor network with energy harvesting devices. Our analysis is based on a queuing-theoretic model for the battery and we propose a sensor decision design method by considering long term energy management at the sensors. We show how the performance of the system changes for different battery capacities. We then numerically show how our findings can be used in the design of sensor networks with energy harvesting sensors.
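The battery buffer can be illustrated with a toy Monte Carlo simulation (the harvest distribution, transmission cost, and capacities are assumed values, not the paper's model): the fraction of intervals in which the sensor can afford to send a message grows with buffer capacity, which is the kind of capacity effect the paper analyzes with queuing theory.

```python
import random

# Toy queuing model of an energy-harvesting sensor battery: each
# interval the sensor harvests a random amount of energy, stores it
# in a finite buffer, and transmits one quantized message toward the
# fusion center if the buffer holds at least TX_COST units.

TX_COST = 2   # energy units needed per transmission (assumed)

def transmit_fraction(capacity, intervals=10000, seed=7):
    rng = random.Random(seed)          # same harvest stream for all runs
    battery, sent = 0, 0
    for _ in range(intervals):
        harvest = rng.choice([0, 1, 2, 3])        # harvested energy
        battery = min(capacity, battery + harvest)  # finite buffer: overflow lost
        if battery >= TX_COST:                    # enough for one message
            battery -= TX_COST
            sent += 1
    return sent / intervals

small = transmit_fraction(capacity=2)
large = transmit_fraction(capacity=10)
```

With mean harvest below the transmission cost, neither sensor transmits every interval, but the larger buffer wastes less harvested energy to overflow and so transmits more often.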
The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)
NASA Astrophysics Data System (ADS)
Stephan, E.
2010-12-01
The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these include Kepler and Taverna, or similar products like MeDICI, developed at PNNL, which is standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term provenir (“to come from”), is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as the DublinCore began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have risen from different communities and operate independently, their mutual
Taming instabilities in power grid networks by decentralized control
NASA Astrophysics Data System (ADS)
Schäfer, B.; Grabow, C.; Auer, S.; Kurths, J.; Witthaut, D.; Timme, M.
2016-05-01
Renewables will soon dominate energy production in our electric power system. And yet, how to integrate renewable energy into the grid and the market is still a subject of major debate. Decentral Smart Grid Control (DSGC) was recently proposed as a robust and decentralized approach to balance supply and demand and to guarantee a grid operation that is both economically and dynamically feasible. Here, we analyze the impact of network topology by assessing the stability of essential network motifs using both linear stability analysis and basin volume for delay systems. Our results indicate that if frequency measurements are averaged over sufficiently large time intervals, DSGC enhances the stability of extended power grid systems. We further investigate whether DSGC supports centralized and/or decentralized power production and find it to be applicable to both. However, our results on cycle-like systems suggest that DSGC favors systems with decentralized production. Here, lower line capacities and lower averaging times are required compared to those with centralized production.
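The underlying grid dynamics can be illustrated with a two-node swing-equation toy model (all parameters are assumed; this sketches only the physical layer, not the DSGC price-feedback controller or its measurement averaging): a generator and a consumer with balanced power injections relax to zero frequency deviation whenever the line capacity K exceeds the transferred power.

```python
import math

# Two-node swing equation: one generator (P = +1) and one consumer
# (P = -1) coupled by a transmission line of capacity K, with unit
# inertia and damping gamma at each node.

def simulate(K=2.0, gamma=1.0, dt=0.01, t_end=50.0):
    th = [0.0, 0.0]   # phase angles
    w = [0.0, 0.0]    # frequency deviations
    P = [1.0, -1.0]   # net power injections (balanced overall)
    for _ in range(int(t_end / dt)):
        a0 = P[0] - gamma * w[0] + K * math.sin(th[1] - th[0])
        a1 = P[1] - gamma * w[1] + K * math.sin(th[0] - th[1])
        th = [th[0] + dt * w[0], th[1] + dt * w[1]]
        w = [w[0] + dt * a0, w[1] + dt * a1]
    return w, th

w, th = simulate()
```

At the phase-locked equilibrium the frequency deviations vanish and the line carries sin(theta_0 - theta_1) = P/K of its capacity; for K below the transferred power no such fixed point exists, which is the kind of instability the DSGC scheme is meant to tame.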
Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation
Campbell, Robert James; Gantt, Laura; Congdon, Tamara
2009-01-01
This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533
Workflow Optimization in Vertebrobasilar Occlusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamper, Lars, E-mail: lars.kamper@helios-kliniken.de; Meyn, Hannes; Rybacki, Konrad
2012-06-15
Objective: In vertebrobasilar occlusion, rapid recanalization is the only substantial means to improve the prognosis. We introduced a standard operating procedure (SOP) for interventional therapy to analyze the effects on interdisciplinary time management. Methods: Intrahospital time periods between hospital admission and neuroradiological intervention were retrospectively analyzed, together with the patients' outcome, before (n = 18) and after (n = 20) implementation of the SOP. Results: After implementation of the SOP, we observed statistically significant improvement of postinterventional patient neurological status (p = 0.017). In addition, we found a decrease of 5:33 h for the mean time period from hospital admission until neuroradiological intervention. The recanalization rate increased from 72.2% to 80% after implementation of the SOP. Conclusion: Our results underscore the relevance of SOP implementation and analysis of time management for clinical workflow optimization. Both may trigger awareness for the need of efficient interdisciplinary time management. This could be an explanation for the decreased time periods and improved postinterventional patient status after SOP implementation.
The political economy of decentralization of health and social services in Canada.
Tsalikis, G
1989-01-01
A trend to decentralization in Canada's 'welfare state' has received support from the Left and from the Right. Some social critics of the Left expect decentralization to result in holistic services adjusted to local needs. Others, moreover, feel we are in the dawn of a new epoch in which major economic transformations are to bring about, through new class alliances and conflict, decentralization of power and a better quality of life in communities. These assumptions and their theoretical pitfalls are discussed here following an historical overview of the centralization/decentralization issue in Canadian social policy. It is argued that recent proposals of decentralization are a continuation of reactionary tendencies to constrain social expenditures, but not a path to better quality of life.
Ho, Jonhan; Aridor, Orly; Parwani, Anil V.
2012-01-01
Background: For decades, the anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users’ needs and is utilized for collecting, interpreting, and aggregating in-detail aspects of work. Objective: Contextual inquiry was utilized to document current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists’ needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. (1998). Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchical organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories, including technology, communication, synthesis/preparation, organization, and workflow. Current AP workflow was labor intensive and lacked scalability. A large number of processes that
Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support
2012-01-01
Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of the two systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system
Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.
Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa
2012-05-04
Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of the two systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a
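The hierarchical-workflow idea behind Tavaxy, where an imported sub-workflow appears as a single node of an enclosing workflow, can be sketched abstractly. This is a minimal illustration, not Tavaxy's actual API: the sub-workflows are stubbed as plain string transforms, and the `sequence` pattern stands in for the richer pattern set the paper describes.

```python
def sequence(*steps):
    """Workflow pattern: run steps left to right, piping the result along."""
    def run(data):
        for step in steps:
            data = step(data)
        return data
    return run

# Two "imported" sub-workflows, stubbed here as simple string transforms.
taverna_sub = sequence(str.strip, str.upper)
galaxy_sub = sequence(lambda s: s.replace("U", "T"))

# Hybrid workflow: each sub-workflow is embedded as one hierarchical node.
hybrid = sequence(taverna_sub, galaxy_sub)
print(hybrid("  augc  "))  # -> ATGC
```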
Using aerial images for establishing a workflow for the quantification of water management measures
NASA Astrophysics Data System (ADS)
Leuschner, Annette; Merz, Christoph; van Gasselt, Stephan; Steidl, Jörg
2017-04-01
Quantified landscape characteristics, such as morphology, land use or hydrological conditions, play an important role in hydrological investigations, as landscape parameters directly control the overall water balance. Assimilation and geospatial analysis of remote sensing datasets in combination with hydrological modeling allow landscape parameters and water balances to be quantified efficiently. This study focuses on the development of a workflow to extract hydrologically relevant data from aerial image datasets and derived products in order to allow an effective parametrization of a hydrological model. Consistent and self-contained data sources are indispensable for achieving reasonable modeling results. In order to minimize uncertainties and inconsistencies, input parameters for modeling should, where possible, be extracted mainly from one remote-sensing dataset. Here, aerial images were chosen because their high spatial and spectral resolution permits the extraction of various model-relevant parameters, such as morphology, land use or artificial drainage systems. The methodological repertoire for extracting environmental parameters ranges from analyses of digital terrain models, through multispectral classification and segmentation of land-use distribution maps, to mapping of artificial drainage systems based on spectral and visual inspection. The workflow was tested for a mesoscale catchment area which forms a characteristic hydrological system of a young moraine landscape in the state of Brandenburg, Germany. These datasets were used as input for multi-temporal hydrological modeling of water balances to detect and quantify anthropogenic and meteorological impacts. ArcSWAT, a GIS-implemented extension and graphical user interface for the Soil and Water Assessment Tool (SWAT), was chosen. The results of this modeling approach provide the basis for anticipating future development of the hydrological system, and regarding system changes for
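One extraction step such a workflow chains together, classifying land use per pixel from multispectral bands, can be sketched with a vegetation-index threshold. This is a hedged illustration: the band values and threshold are invented, and the study's actual method combines classification, segmentation, and visual mapping rather than a single index.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel."""
    return (nir - red) / (nir + red)

def classify(pixels, threshold=0.3):
    """Label pixels 'vegetation' or 'other' from (nir, red) reflectances."""
    return ["vegetation" if ndvi(n, r) > threshold else "other"
            for n, r in pixels]

# Invented reflectance pairs: one vegetated pixel, one bare pixel.
print(classify([(0.6, 0.2), (0.3, 0.3)]))  # ['vegetation', 'other']
```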
Centralization Versus Decentralization: A Location Analysis Approach for Librarians.
ERIC Educational Resources Information Center
Shishko, Robert; Raffel, Jeffrey
One of the questions that seems to perplex many university and special librarians is whether to move in the direction of centralizing or decentralizing the library's collections and facilities. Presented is a theoretical approach, employing location theory, to the library centralization-decentralization question. Location theory allows the analyst…
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...
2016-07-21
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc: they involve an iterative development process in which users compose and test their workflows on desktops, then scale up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
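The template style of composition the abstract names (sequence, parallel, split, merge) can be sketched as plain functions. This is a hedged illustration in the spirit of those templates; the function names and signatures are invented and do not reproduce the actual Tigres API.

```python
from concurrent.futures import ThreadPoolExecutor

def sequence(tasks, data):
    """Template: run tasks one after another, feeding each the previous result."""
    for task in tasks:
        data = task(data)
    return data

def parallel(task, inputs):
    """Template: apply one task to many inputs concurrently (a split/merge pair)."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(task, inputs))

# A tiny pipeline: square four inputs in parallel, then reduce with a sequence.
squares = parallel(lambda x: x * x, range(4))
total = sequence([sum], squares)
print(total)  # 0 + 1 + 4 + 9 = 14
```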
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad hoc: they involve an iterative development process in which users compose and test their workflows on desktops, then scale up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (i.e., sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows, showing that Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
Survey of decentralized control methods. [for large scale dynamic systems
NASA Technical Reports Server (NTRS)
Athans, M.
1975-01-01
An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.
On Deciding How to Decide: To Centralize or Decentralize.
ERIC Educational Resources Information Center
Chaffee, Ellen Earle
Issues concerning whether to centralize or decentralize decision-making are addressed, with applications for colleges. Centralization/decentralization (C/D) must be analyzed with reference to a particular decision. Three components of C/D are locus of authority, breadth of participation, and relative contribution by the decision-maker's staff. C/D…
Decentralization and equity of resource allocation: evidence from Colombia and Chile.
Bossert, Thomas J.; Larrañaga, Osvaldo; Giedion, Ursula; Arbelaez, José Jesus; Bowser, Diana M.
2003-01-01
OBJECTIVE: To investigate the relation between decentralization and equity of resource allocation in Colombia and Chile. METHODS: The "decision space" approach and analysis of expenditures and utilization rates were used to provide a comparative analysis of decentralization of the health systems of Colombia and Chile. FINDINGS: Evidence from Colombia and Chile suggests that decentralization, under certain conditions and with some specific policy mechanisms, can improve equity of resource allocation. In these countries, equitable levels of per capita financial allocations at the municipal level were achieved through different forms of decentralization--the use of allocation formulae, adequate local funding choices and horizontal equity funds. Findings on equity of utilization of services were less consistent, but they did show that increased levels of funding were associated with increased utilization. This suggests that improved equity of funding over time might reduce inequities of service utilization. CONCLUSION: Decentralization can contribute to, or at least maintain, equitable allocation of health resources among municipalities of different incomes. PMID:12751417
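The two mechanisms the study credits with equalizing per-capita allocations, a population-based allocation formula plus a horizontal equity fund that tops up municipalities falling below a floor, can be illustrated numerically. All numbers below are invented for illustration; they are not from the Colombian or Chilean data.

```python
# Two hypothetical municipalities with equal population but unequal revenue.
municipalities = {"A": {"pop": 10_000, "own_revenue": 50_000},
                  "B": {"pop": 10_000, "own_revenue": 10_000}}

def allocate(munis, central_budget, floor_per_capita):
    """Per-capita funding after a population formula plus an equity top-up."""
    total_pop = sum(m["pop"] for m in munis.values())
    out = {}
    for name, m in munis.items():
        formula_share = central_budget * m["pop"] / total_pop
        per_capita = (formula_share + m["own_revenue"]) / m["pop"]
        # Horizontal equity fund: lift any municipality up to the floor.
        top_up = max(0.0, floor_per_capita - per_capita) * m["pop"]
        out[name] = per_capita + top_up / m["pop"]
    return out

print(allocate(municipalities, central_budget=100_000, floor_per_capita=8.0))
# {'A': 10.0, 'B': 8.0}
```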
Machado, A I; Beretta, M; Fragoso, R; Duarte, E
2017-02-01
Conventional wastewater treatment plants (WWTPs) commonly require large capital investments as well as operation and maintenance costs. Constructed wetlands (CWs) appear as a cost-effective treatment, since they can remove a broad range of contaminants by a combination of physical, chemical and biological processes at low cost. Therefore, CWs can be successfully applied for decentralized wastewater treatment in regions with low population density and/or with large land availability, such as Brazil. The present work provides a review of thirty-nine studies of CWs implemented in Brazil to remove wastewater contaminants. Brazil's current sanitation data are also considered to evaluate the potential role of CWs in decentralized wastewater treatment. Performance of CWs was evaluated according to (i) type of wetland system, (ii) different support matrices, (iii) vegetation species and (iv) removal efficiency of chemical oxygen demand (COD), biological oxygen demand (BOD5), nitrogen (N), and phosphorus (P). The reviewed CWs overall presented good efficiencies: H-CWs achieved the highest removals for P, while the best results for N were attained in VF-CWs and for COD and BOD5 in HF-CWs. Therefore, it was concluded that CWs are an interesting solution for decentralized wastewater treatment in Brazil, since it has warm temperatures, extensive radiation hours and available land. Additionally, the low percentage of the population with access to the sewage network in the North and Northeast regions makes these systems especially suitable. Hence, the further implementation of CWs is encouraged by the authors in regions with characteristics similar to Brazil's. Copyright © 2016 Elsevier Ltd. All rights reserved.
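The removal-efficiency figure the review compares across wetlands is simply the percent reduction of a contaminant between wetland inflow and outflow. The concentrations below are invented example values, not data from the reviewed studies.

```python
def removal_efficiency(c_in, c_out):
    """Percent removal from influent to effluent concentration (mg/L)."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical COD concentrations at a wetland inlet and outlet.
print(round(removal_efficiency(c_in=300.0, c_out=60.0), 1))  # 80.0
```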
Leveraging workflow control patterns in the domain of clinical practice guidelines.
Kaiser, Katharina; Marcos, Mar
2016-02-10
Clinical practice guidelines (CPGs) include recommendations describing appropriate care for the management of patients with a specific clinical condition. A number of representation languages have been developed to support executable CPGs, with associated authoring/editing tools. Even with tool assistance, authoring of CPG models is a labor-intensive task. We aim to facilitate the early stages of the CPG modeling task. In this context, we propose to support the authoring of CPG models based on a set of suitable procedural patterns described in an implementation-independent notation that can then be semi-automatically transformed into one of the alternative executable CPG languages. We have started with the workflow control patterns which have been identified in the fields of workflow systems and business process management. We have analyzed the suitability of these patterns by means of a qualitative analysis of CPG texts. Following our analysis, we have implemented a selection of workflow patterns in the Asbru and PROforma CPG languages. As the implementation-independent notation for the description of patterns we have chosen BPMN 2.0. Finally, we have developed XSLT transformations to convert the BPMN 2.0 version of the patterns into the Asbru and PROforma languages. We showed that although a significant number of workflow control patterns are suitable for describing CPG procedural knowledge, not all of them are applicable in the context of CPGs due to their focus on single-patient care. Moreover, CPGs may require additional patterns not included in the set of workflow control patterns. We also showed that nearly all the CPG-suitable patterns can be conveniently implemented in the Asbru and PROforma languages. Finally, we demonstrated that individual patterns can be semi-automatically transformed from a process specification in BPMN 2.0 to executable implementations in these languages. We propose a pattern- and transformation-based approach for the development of CPG models
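The transformation step can be sketched as walking a BPMN 2.0 fragment and mapping pattern elements onto target-language constructs. This is a hedged stand-in: the paper's actual transformations are XSLT and far richer, and the one-entry mapping table below is illustrative only.

```python
import xml.etree.ElementTree as ET

# A tiny BPMN 2.0 fragment for the "exclusive choice" workflow pattern.
BPMN = """<process xmlns:b="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <b:exclusiveGateway id="g1" name="fever?"/>
</process>"""

# Illustrative mapping: BPMN element -> (Asbru construct, PROforma construct).
PATTERN_MAP = {
    "exclusiveGateway": ("if-then-else", "decision"),
}

def translate(bpmn_xml, target):
    """Collect target-language construct names for mapped BPMN elements."""
    root = ET.fromstring(bpmn_xml)
    ns = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"
    out = []
    for elem in root.iter():
        tag = elem.tag.replace(ns, "")
        if tag in PATTERN_MAP:
            asbru, proforma = PATTERN_MAP[tag]
            out.append(asbru if target == "asbru" else proforma)
    return out

print(translate(BPMN, "asbru"))     # ['if-then-else']
print(translate(BPMN, "proforma"))  # ['decision']
```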
Camfield, Carol S; Joseph, Marissa; Hurley, Teresa; Campbell, Karen; Sanderson, Susan; Camfield, Peter R
2004-07-01
To compare phenylketonuria (PKU) management by a centralized, expert team in the Province of Nova Scotia (NS) with the decentralized approach in New Brunswick (NB). Retrospective chart review documented frequency of outpatient visits, phenylalanine (Phe) concentration, and medical formula use. Structured telephone interviews with the 8 regional NB dietitians (NB-D) documented their knowledge and support in PKU management. Patients with PKU (n=108; age, birth to 42 years) reside in NB (n=69) and NS (n=39). More were lost to contact in NB than in NS (9/69 vs 1/39) and more were completely off diet in NB than in NS (24/60 vs 1/38, P=.05). All 15 children <2 years old followed by a PKU team in either NS or Saint John, NB had optimal Phe levels. Children 2 to 12 years of age in NS had better Phe control and more medical visits than in NB (P <.01). Older patients had more episodes of elevated Phe levels (P=.01). Formula was dispensed in appropriate yearly amounts to 52% in NB and >95% in NS. Mental handicap or borderline intelligence was common in both NB (44%) and NS (42%). All NB-D wished additional specialized medical, nursing, or social work assistance. PKU management appears to be more effective with an expert, coordinated team approach.
Widening the adoption of workflows to include human and human-machine scientific processes
NASA Astrophysics Data System (ADS)
Salayandia, L.; Pinheiro da Silva, P.; Gates, A. Q.
2010-12-01
Scientific workflows capture knowledge in the form of technical recipes to access and manipulate data, helping scientists manage and reuse established expertise to conduct their work. Libraries of scientific workflows are being created in particular fields, e.g., bioinformatics, where, combined with cyber-infrastructure environments that provide on-demand access to data and tools, they result in powerful workbenches for scientists of those communities. The focus in these particular fields, however, has been more on automating than on documenting scientific processes. As a result, technical barriers have impeded a wider adoption of scientific workflows by scientific communities that do not rely as heavily on cyber-infrastructure and computing environments. Semantic Abstract Workflows (SAWs) are introduced to widen the applicability of workflows as a tool to document scientific recipes or processes. SAWs intend to capture a scientist's perspective on the process of how she or he would collect, filter, curate, and manipulate data to create the artifacts that are relevant to her/his work. In contrast, scientific workflows describe the process from the point of view of how technical methods and tools are used to conduct the work. By focusing on a higher level of abstraction that is closer to a scientist's understanding, SAWs effectively capture the controlled vocabularies that reflect a particular scientific community, as well as the types of datasets and methods used in a particular domain. From there on, SAWs provide the flexibility to adapt to different environments in which to carry out the recipes or processes. These environments range from manual fieldwork to highly technical cyber-infrastructure environments, such as those already supported by scientific workflows. Two cases, one from Environmental Science and another from Geophysics, are presented as illustrative examples.
Integrated workflows for spiking neuronal network simulations
Antolík, Ján; Davison, Andrew P.
2013-01-01
The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID
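The declarative, hierarchically organized configuration idea the abstract describes can be sketched as a recursive merge of parameter files: a base configuration overridden by an experiment-specific one. The keys below are invented for illustration and are not Mozaik's actual parameter schema.

```python
def merge(base, override):
    """Recursively merge override into base, returning a new dict."""
    out = dict(base)
    for key, val in override.items():
        if isinstance(val, dict) and isinstance(out.get(key), dict):
            out[key] = merge(out[key], val)
        else:
            out[key] = val
    return out

# Hypothetical base and experiment-level configuration fragments.
base = {"model": {"n_neurons": 1000, "dt": 0.1}, "recording": {"vm": False}}
experiment = {"model": {"n_neurons": 5000}, "recording": {"vm": True}}

cfg = merge(base, experiment)
print(cfg["model"])      # {'n_neurons': 5000, 'dt': 0.1}
print(cfg["recording"])  # {'vm': True}
```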
Integrated workflows for spiking neuronal network simulations.
Antolík, Ján; Davison, Andrew P
2013-01-01
The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.
Decentralization in Education: Technical Demands as a Critical Ingredient.
ERIC Educational Resources Information Center
Hannaway, Jane
The implications of decentralization reform on the amount of serious attention and effort that teachers give to teaching and learning activities are explored in this paper. The discussion is informed by the results of two case studies of school districts recognized as exemplary cases of decentralization. The first section describes limitations of…
Shabo, Amnon; Peleg, Mor; Parimbelli, Enea; Quaglini, Silvana; Napolitano, Carlo
2016-12-07
Implementing a decision-support system within a healthcare organization requires integration of clinical domain knowledge with resource constraints. Computer-interpretable guidelines (CIGs) are excellent instruments for addressing clinical aspects, while business process management (BPM) languages and workflow (Wf) engines manage the logistic organizational constraints. Our objective is the orchestration of all the relevant factors needed for successful execution of patients' care pathways, especially when spanning the continuum of care, from acute to community or home care. We considered three strategies for integrating CIGs with organizational workflows: extending the CIG or BPM languages and their engines, or creating an interplay between them. We used the interplay approach to implement a set of use cases arising from a CIG implementation in the domain of atrial fibrillation. To provide a more scalable and standards-based solution, we explored the use of the Cross-Enterprise Document Workflow Integration Profile. We describe our proof-of-concept implementation of five use cases. We utilized the Personal Health Record of the MobiGuide project to implement a loosely-coupled approach between the Activiti BPM engine and the Picard CIG engine. Changes in the PHR were detected by polling. IHE profiles were used to develop workflow documents that orchestrate cross-enterprise execution of cardioversion. Interplay between CIG and BPM engines can support orchestration of care flows within organizational settings.
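The loose coupling via polling can be sketched with a shared record and a polling pass that forwards unseen entries to a guideline-side handler. This is a hypothetical stand-in: the record fields and handler are invented, whereas the project itself coupled the Activiti and Picard engines through a real Personal Health Record.

```python
# A stand-in for the shared personal health record.
phr = {"entries": []}
seen = 0

def cig_handle(entry):
    """Stub for the CIG engine reacting to new clinical data."""
    return f"recommendation for {entry}"

def poll_once():
    """One polling pass: forward any unseen PHR entries to the CIG engine."""
    global seen
    new = phr["entries"][seen:]
    seen = len(phr["entries"])
    return [cig_handle(e) for e in new]

phr["entries"].append("ECG: atrial fibrillation")
print(poll_once())  # ['recommendation for ECG: atrial fibrillation']
print(poll_once())  # [] - nothing new since the last pass
```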
Optical performance of toric intraocular lenses in the presence of decentration.
Zhang, Bin; Ma, Jin-Xue; Liu, Dan-Yan; Du, Ying-Hua; Guo, Cong-Rong; Cui, Yue-Xian
2015-01-01
To evaluate the optical performance of toric intraocular lenses (IOLs) after decentration and with different pupil diameters, but with the IOL astigmatic axis aligned. The optical performance of toric (T5) and spherical (SN60AT) IOLs after decentration was tested on a theoretical pseudophakic model eye based on the Hwey-Lan Liou schematic eye using the Zemax ray-tracing program. Changes in optical performance were analyzed in model eyes with 3-mm, 4-mm, and 5-mm pupil diameters and decentration from 0.25 mm to 0.75 mm, at intervals of 5° in the meridian direction from 0° to 90°. The ratio of the modulation transfer function (MTF) between a decentered and a centered IOL (MTFDecentration/MTFCentration) was calculated to analyze the decrease in optical performance. Optical performance of the toric IOL remained unchanged when IOLs were decentered in any meridian direction. The MTFs of the two IOLs decreased, whereas optical performance remained equivalent after decentration. The MTFDecentration/MTFCentration ratios of the IOLs at decentrations from 0.25 mm to 0.75 mm were comparable for the toric and SN60AT IOLs. After decentration, MTF decreased further, with the MTF of the toric IOL being slightly lower than that of the SN60AT IOL. The imaging quality of the two IOLs decreased as pupil diameter and degree of decentration increased, but the decrease was similar for the toric and spherical IOLs. Toric IOLs were comparable to spherical IOLs in terms of tolerance to decentration at the correct axial position.
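The study's figure of merit is the ratio of modulation transfer function values for a decentered versus a centered IOL at a given spatial frequency. The MTF numbers below are invented for illustration; they are not from the study's Zemax model.

```python
def mtf_ratio(mtf_decentered, mtf_centered):
    """MTF_Decentration / MTF_Centration; 1.0 means no optical penalty."""
    return mtf_decentered / mtf_centered

# Hypothetical example: a 0.5 mm decentration drops MTF from 0.42 to 0.38
# at one spatial frequency.
print(round(mtf_ratio(0.38, 0.42), 3))  # 0.905
```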
Menon, Sonia S; Rossi, Rodolfo; Nshimyumukiza, Leon; Zinszer, Kate
2016-01-01
Human migration and concomitant HIV infections are likely to bring about major changes in the epidemiology of some parasitic infections in Brazil. Human visceral leishmaniasis (HVL) control is particularly fraught with intricacies. It is against a backdrop of decentralized health care that the complex HVL control initiatives are brought to bear. This comprehensive review aims to explore the obstacles facing decentralized HVL control in urban endemic areas in Brazil. A literature search was carried out in December 2015 by means of three databases: MEDLINE, Google Scholar, and Web of Science. Although there have been many strides that have been made in elucidating the eco-epidemiology of Leishmania infantum, which forms the underpinnings of the national control program, transmission risk factors for HVL are still insufficiently elucidated in urban settings. Decentralized HVL epidemiological surveillance and control for animal reservoirs and vectors may compromise sustainability. In addition, it may hamper timely human HVL case management. With the burgeoning of the HIV-HVL co-infection, the potential human transmission may be underestimated. HVL is a disease with focal transmission at a critical juncture, which warrants that the bottlenecks facing the control program within contexts of decentralized healthcare systems be taken into account. In addition, HIV-driven HVL epidemics may substantially increase the transmission potential of the human reservoir. Calculating the basic reproductive number to fine-tune interventions will have to take into consideration the specific socio-economic development context.
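The basic reproduction number the review suggests calculating to fine-tune interventions can be illustrated with a Ross-Macdonald-style formula for a vector-borne infection. This is a hedged sketch: all parameter values below are invented, and real HVL models must additionally account for the canine reservoir and, as the review stresses, the HIV-driven human reservoir.

```python
import math

def r0(m, a, b, c, mu, n, r):
    """
    Ross-Macdonald-style basic reproduction number.
    m: vectors per host; a: bites per vector per day; b, c: transmission
    probabilities (vector->host, host->vector); mu: vector mortality rate;
    n: extrinsic incubation period (days); r: host recovery rate.
    """
    return (m * a**2 * b * c * math.exp(-mu * n)) / (mu * r)

# Invented parameter values purely for illustration.
print(round(r0(m=2.0, a=0.3, b=0.5, c=0.5, mu=0.1, n=8, r=0.05), 2))  # 4.04
```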
Novak, Laurie L; Johnson, Kevin B; Lorenzi, Nancy M
2010-01-01
The objective of this review was to describe methods used to study and model workflow. The authors included studies set in a variety of industries using qualitative, quantitative and mixed methods. Of the 6221 matching abstracts, 127 articles were included in the final corpus. The authors collected data from each article on researcher perspective, study type, methods type, specific methods, approaches to evaluating quality of results, definition of workflow and dependent variables. Ethnographic observation and interviews were the most frequently used methods. Long study durations revealed the large time commitment required for descriptive workflow research. The most frequently discussed technique for evaluating quality of study results was triangulation. The definition of the term “workflow” and choice of methods for studying workflow varied widely across research areas and researcher perspectives. The authors developed a conceptual framework of workflow-related terminology for use in future research and present this model for use by other researchers. PMID:20442143
NASA Astrophysics Data System (ADS)
Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut
2017-04-01
Scientific communities generate complex simulations through orchestration of semi-structured analysis pipelines, which involves execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case: a workflow which requires the execution of a continuum ice flow model and a discrete-element-based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of ice flow and calving by means of transitions through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks, including data management and transfer, becomes more complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. In the course of application usage, as more scripts or modifications were introduced to meet user requirements, debugging and validation of results became more cumbersome. To address these problems, we identified the need for a high-level scientific workflow tool through which all the above-mentioned processes can be achieved in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high
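The iterative coupling the workflow automates can be sketched as a loop: a continuum ice-flow step feeds a discrete calving step, with a format-conversion task in between. The solvers below are trivial stubs with invented numbers; the real workflow launches Elmer/Ice and HiDEM as HPC jobs under UNICORE.

```python
def ice_flow_step(geometry):
    """Stub for the Elmer/Ice continuum job: the front advances 100 m."""
    return {"front": geometry["front"] + 100.0}

def to_particles(flow_result):
    """Stub for the data format conversion task between the two models."""
    return {"front": flow_result["front"]}

def calving_step(particles):
    """Stub for the HiDEM discrete-element job: calving retreats 40 m."""
    return {"front": particles["front"] - 40.0}

geometry = {"front": 0.0}
for cycle in range(3):  # three coupling iterations
    geometry = calving_step(to_particles(ice_flow_step(geometry)))
print(geometry["front"])  # 3 * (100 - 40) = 180.0
```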
Decentralization, stabilization, and estimation of large-scale linear systems
NASA Technical Reports Server (NTRS)
Siljak, D. D.; Vukcevic, M. B.
1976-01-01
In this short paper we consider three closely related aspects of large-scale systems: decentralization, stabilization, and estimation. A method is proposed to decompose a large linear system into a number of interconnected subsystems with decentralized (scalar) inputs or outputs. The procedure is preliminary to the hierarchic stabilization and estimation of linear systems and is performed on the subsystem level. A multilevel control scheme based upon the decomposition-aggregation method is developed for stabilization of input-decentralized linear systems. Local linear feedback controllers are used to stabilize each decoupled subsystem, while global linear feedback controllers are utilized to minimize the coupling effect among the subsystems. Systems stabilized by the method tolerate a wide class of nonlinearities in subsystem coupling and exhibit high reliability with respect to structural perturbations. The proposed output-decentralization and stabilization schemes can be used directly to construct asymptotic state estimators for large linear systems on the subsystem level. The problem of dimensionality is resolved by constructing a number of low-order estimators, thus avoiding the design of a single estimator for the overall system.
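The decomposition idea above can be illustrated numerically: choose local feedback for each decoupled subsystem, then check that the interconnected closed loop stays stable despite the coupling terms ignored during design. The subsystem matrices, gains, and coupling strength below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Two unstable scalar subsystems with decentralized (scalar) inputs.
A1, B1 = np.array([[1.0]]), np.array([[1.0]])
A2, B2 = np.array([[0.5]]), np.array([[1.0]])

# Local feedback gains chosen to stabilize each decoupled subsystem.
K1 = np.array([[3.0]])
K2 = np.array([[2.5]])

# Closed-loop subsystem matrices under u_i = -K_i x_i.
A1c = A1 - B1 @ K1          # eigenvalue -2
A2c = A2 - B2 @ K2          # eigenvalue -2

# Weak interconnection terms that local design ignored.
eps = 0.1
A_cl = np.block([[A1c, eps * np.ones((1, 1))],
                 [eps * np.ones((1, 1)), A2c]])

# Aggregation-style check: the overall closed loop remains stable.
stable = np.all(np.linalg.eigvals(A_cl).real < 0)
print(stable)
```

The stability margin of each local loop determines how much interconnection strength the scheme can tolerate, which is the point of the aggregation step.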
Decentralized Formation Flying Control in a Multiple-Team Hierarchy
NASA Technical Reports Server (NTRS)
Mueller, Joseph; Thomas, Stephanie J.
2005-01-01
This paper presents the prototype of a system that addresses these objectives: a decentralized guidance and control system that is distributed across spacecraft using a multiple-team framework. The objective is to divide large clusters into teams of manageable size, so that the communication and computational demands driven by N decentralized units are related to the number of satellites in a team rather than the entire cluster. The system is designed to provide a high level of autonomy, to support clusters with large numbers of satellites, to enable the number of spacecraft in the cluster to change post-launch, and to provide for on-orbit software modification. The distributed guidance and control system will be implemented in an object-oriented style using MANTA (Messaging Architecture for Networking and Threaded Applications). In this architecture, tasks may be remotely added, removed, or replaced post-launch to increase mission flexibility and robustness. This built-in adaptability will allow software modifications to be made on-orbit in a robust manner. The prototype system, which is implemented in MATLAB, emulates the object-oriented and message-passing features of the MANTA software. In this paper, the multiple-team organization of the cluster is described, and the modular software architecture is presented. The relative dynamics in eccentric reference orbits are reviewed, and families of periodic relative trajectories are identified and expressed as sets of static geometric parameters. The guidance law design is presented, and an example reconfiguration scenario is used to illustrate the distributed process of assigning geometric goals to the cluster. Next, a decentralized maneuver planning approach is presented that utilizes linear-programming methods to enact reconfiguration and coarse formation-keeping maneuvers. Finally, a method for performing online collision avoidance is discussed, and an example is provided to gauge its performance.
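The linear-programming flavor of maneuver planning mentioned above can be sketched with a minimal fuel-minimization problem: choose two impulses that produce a required net position change with zero net velocity change, minimizing total delta-v. The 1-D free-drift dynamics and all numbers are invented for illustration; the paper's planner works with relative orbital dynamics.

```python
import numpy as np
from scipy.optimize import linprog

# Impulses u1 (at t=0) and u2 (at t=5 s) on a 1-D drifting particle over a
# 10 s horizon; an impulse u contributes u * (time remaining) to position.
# Split each impulse into nonnegative parts to make |u1| + |u2| linear.
c = [1, 1, 1, 1]                       # minimize |u1| + |u2|  (total delta-v)
A_eq = [[1, -1, 1, -1],                # net velocity change = 0
        [10, -10, 5, -5]]              # net position change = +4 m
b_eq = [0.0, 4.0]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
u1 = res.x[0] - res.x[1]
u2 = res.x[2] - res.x[3]
print(round(u1, 3), round(u2, 3))      # an accelerate/brake impulse pair
```

In a decentralized setting each team would solve such an LP for its own members, which is what keeps the computation scaled to team size rather than cluster size.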
Chao, Tian-Jy; Kim, Younghun
2015-02-10
An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
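The pattern in this abstract (a data definition language creates a table schema for the data model, with data management services layered on top) can be sketched with an in-memory database. The tables, columns, and service functions below are invented illustrations, not the patent's actual BIM data model.

```python
import sqlite3

# A minimal BIM-style data model: entities plus an entity relationship,
# expressed as DDL that creates the table schema.
ddl = """
CREATE TABLE building (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE simulation (
    id INTEGER PRIMARY KEY,
    building_id INTEGER NOT NULL REFERENCES building(id),
    kind TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# Thin "data management service" functions over the schema.
def add_building(conn, name):
    cur = conn.execute("INSERT INTO building (name) VALUES (?)", (name,))
    return cur.lastrowid

def add_simulation(conn, building_id, kind):
    conn.execute("INSERT INTO simulation (building_id, kind) VALUES (?, ?)",
                 (building_id, kind))

bid = add_building(conn, "HQ Tower")
add_simulation(conn, bid, "energy")
row = conn.execute("SELECT b.name, s.kind FROM simulation s "
                   "JOIN building b ON b.id = s.building_id").fetchone()
print(row)
```

Web services and user interfaces would then call such service functions rather than touching the schema directly, which is the interoperability point the abstract makes.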
Knowledge Annotations in Scientific Workflows: An Implementation in Kepler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gandara, Aida G.; Chin, George; Pinheiro Da Silva, Paulo
2011-07-20
Abstract. Scientific research products are the result of long-term collaborations between teams. Scientific workflows are capable of helping scientists in many ways, including the collection of information as to how research was conducted; e.g., scientific workflow tools often collect and manage information about datasets used and data transformations. However, knowledge about why data was collected is rarely documented in scientific workflows. In this paper we describe a prototype system built to support the collection of scientific expertise that influences scientific analysis. Through evaluating a scientific research effort underway at Pacific Northwest National Laboratory, we identified features that would most benefit PNNL scientists in documenting how and why they conduct their research, making this information available to the entire team. The prototype system was built by enhancing the Kepler Scientific Workflow System to create knowledge-annotated scientific workflows and to publish them as semantic annotations.
Decentralized Interleaving of Paralleled Dc-Dc Buck Converters: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian B; Rodriguez, Miguel; Sinha, Mohit
We present a decentralized control strategy that yields switch interleaving among parallel connected dc-dc buck converters without communication. The proposed method is based on the digital implementation of the dynamics of a nonlinear oscillator circuit as the controller. Each controller is fully decentralized, i.e., it only requires the locally measured output current to synthesize the pulse width modulation (PWM) carrier waveform. By virtue of the intrinsic electrical coupling between converters, the nonlinear oscillator-based controllers converge to an interleaved state with uniform phase-spacing across PWM carriers. To the knowledge of the authors, this work represents the first fully decentralized strategy for switch interleaving of paralleled dc-dc buck converters.
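The convergence to uniform phase-spacing can be illustrated with a toy model: two phase oscillators with repulsive coupling settle into anti-phase, which is the interleaved state for two converters (180 degrees apart). The coupling form and gains below are invented for illustration; the paper's controllers are realized through the converters' electrical coupling, not an explicit communication term.

```python
import math

omega = 2 * math.pi          # common carrier frequency, rad/s
K = 1.0                      # repulsive coupling gain (invented)
dt = 0.001
theta = [0.0, 0.3]           # nearly aligned initial carrier phases

for _ in range(20000):       # 20 s of Euler integration
    d0 = omega + K * math.sin(theta[0] - theta[1])   # repulsive coupling
    d1 = omega + K * math.sin(theta[1] - theta[0])
    theta[0] += d0 * dt
    theta[1] += d1 * dt

# Phase spacing settles at pi (180 deg), i.e., interleaved carriers.
spacing = (theta[0] - theta[1]) % (2 * math.pi)
print(round(spacing / math.pi, 2))
```

With N converters, the same mechanism generalizes to uniform 2*pi/N spacing, which is what spreads the PWM switching instants evenly.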
Optimization of business processes in banks through flexible workflow
NASA Astrophysics Data System (ADS)
Postolache, V.
2017-08-01
This article describes an integrated business model of a commercial bank. Examples of its components include models of business processes, strategic goals, organizational structure, system architecture, operational and marketing risk models, etc. Practice has shown that the development and implementation of an integrated business model significantly increases the bank's operating efficiency and manageability and ensures stable organizational and technological development. Considering the evolution of business processes in the banking sector, their common characteristics should be analysed. From the author's point of view, a business process is a set of various activities of a commercial bank in which the "input" is one or more financial and material resources and the "output" is the banking product created by this activity, which has some value to the consumer. Using workflow technology, managing business process efficiency becomes a matter of managing the integration of resources and the sequence of actions aimed at achieving the goal. In turn, this implies managing the interaction of jobs or functions, synchronizing assignment periods, reducing delays in the transmission of results, etc. Workflow technology is very important for managers at all levels, as they can use it to strengthen control over what is happening in a particular unit and in the bank as a whole. The manager is able to plan, implement rules, and interact within the framework of the company's procedures; tasks entrusted to the system are distributed and their execution controlled, with alerts on implementation and statistical data on the effectiveness of operating procedures. Development and active use of the integrated bank business model is one of the key success factors that contribute to long-term and stable development of the bank, increase employee efficiency and business processes, implement the
Decentralization can help reduce deforestation when user groups engage with local government.
Wright, Glenn D; Andersson, Krister P; Gibson, Clark C; Evans, Tom P
2016-12-27
Policy makers around the world tout decentralization as an effective tool in the governance of natural resources. Despite the popularity of these reforms, there is limited scientific evidence on the environmental effects of decentralization, especially in tropical biomes. This study presents evidence on the institutional conditions under which decentralization is likely to be successful in sustaining forests. We draw on common-pool resource theory to argue that the environmental impact of decentralization hinges on the ability of reforms to engage local forest users in the governance of forests. Using matching techniques, we analyze longitudinal field observations on both social and biophysical characteristics in a large number of local government territories in Bolivia (a country with a decentralized forestry policy) and Peru (a country with a much more centralized forestry policy). We find that territories with a decentralized forest governance structure have more stable forest cover, but only when local forest user groups actively engage with the local government officials. We provide evidence in support of a possible causal process behind these results: When user groups engage with the decentralized units, it creates a more enabling environment for effective local governance of forests, including more local government-led forest governance activities, fora for the resolution of forest-related conflicts, intermunicipal cooperation in the forestry sector, and stronger technical capabilities of the local government staff.
On decentralized control of large-scale systems
NASA Technical Reports Server (NTRS)
Siljak, D. D.
1978-01-01
A scheme is presented for decentralized control of large-scale linear systems which are composed of a number of interconnected subsystems. By ignoring the interconnections, local feedback controls are chosen to optimize each decoupled subsystem. Conditions are provided to establish compatibility of the individual local controllers and achieve stability of the overall system. Besides computational simplifications, the scheme is attractive because of its structural features and the fact that it produces a robust decentralized regulator for large dynamic systems, which can tolerate a wide range of nonlinearities and perturbations among the subsystems.
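The scheme above (optimize each decoupled subsystem locally, then verify compatibility of the local controllers on the interconnected system) can be sketched with per-subsystem LQR design. The scalar subsystem data and coupling strength below are invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def local_lqr(A, B, Q, R):
    """Optimal local feedback for one decoupled subsystem: K = R^{-1} B' P."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

# Two unstable scalar subsystems, each optimized in isolation.
A1, B1 = np.array([[1.0]]), np.array([[1.0]])
A2, B2 = np.array([[2.0]]), np.array([[1.0]])
Q, R = np.array([[1.0]]), np.array([[1.0]])

K1 = local_lqr(A1, B1, Q, R)
K2 = local_lqr(A2, B2, Q, R)

# Reintroduce the interconnections ignored during local design and
# check stability of the overall decentralized closed loop.
eps = 0.2
A = np.array([[1.0, eps], [eps, 2.0]])
K = np.block([[K1, np.zeros((1, 1))], [np.zeros((1, 1)), K2]])
eigs = np.linalg.eigvals(A - np.eye(2) @ K)
print(np.all(eigs.real < 0))
```

The stability margins that LQR provides for each local loop are what lets the interconnected system tolerate the coupling, which is the robustness property the abstract emphasizes.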
Decentralized digital adaptive control of robot motion
NASA Technical Reports Server (NTRS)
Tarokh, M.
1990-01-01
A decentralized model reference adaptive scheme is developed for digital control of robot manipulators. The adaptation laws are derived using hyperstability theory, which guarantees asymptotic trajectory tracking despite gross robot parameter variations. The control scheme has a decentralized structure in the sense that each local controller receives only its joint angle measurement to produce its joint torque. The independent joint controllers have simple structures and can be programmed using a very simple and computationally fast algorithm. As a result, the scheme is suitable for real-time motion control.
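The decentralized structure described above, where each joint controller uses only its local measurement to track a reference model, can be sketched for one joint. A first-order joint model and a gradient (MIT-rule-style) adaptation law are used here purely for illustration; the paper derives its laws from hyperstability theory, and all numbers are invented.

```python
def simulate_joint(a_true, n_steps=4000, dt=0.01, gamma=5.0):
    """Adapt a local feedforward gain so one joint tracks its reference model."""
    a_ref = -2.0            # reference model: dx_m/dt = a_ref * x_m + r
    x, x_m, k = 0.0, 0.0, 0.0
    r = 1.0                 # constant reference input
    for _ in range(n_steps):
        u = k * r                             # local control, local gain
        x += dt * (a_true * x + u)            # joint dynamics (unknown a_true)
        x_m += dt * (a_ref * x_m + r)         # reference model
        e = x - x_m                           # local tracking error
        k -= dt * gamma * e * r               # gradient adaptation update
    return x, x_m

# The controller never uses a_true directly; adaptation drives e -> 0.
x, x_m = simulate_joint(a_true=-1.0)
print(round(abs(x - x_m), 3))
```

Because each joint runs this loop independently on its own angle measurement, the computation per joint is small and fixed, which is what makes the scheme suitable for real-time control.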
Two controller design approaches for decentralized systems
NASA Technical Reports Server (NTRS)
Ozguner, U.; Khorrami, F.; Iftar, A.
1988-01-01
Two different philosophies for designing the controllers of decentralized systems are considered within a quadratic regulator framework which is generalized to admit decentralized frequency weighting. In the first approach, the total system model is examined, and the feedback strategy for each channel or subsystem is determined. In the second approach, separate, possibly overlapping, and uncoupled models are analyzed for each channel, and the results can be combined to study the original system. The two methods are applied to the example of a model of the NASA COFS Mast Flight System.
Decentralization of health care systems and health outcomes: Evidence from a natural experiment.
Jiménez-Rubio, Dolores; García-Gómez, Pilar
2017-09-01
While many countries worldwide are shifting responsibilities for their health systems to local levels of government, there is to date insufficient evidence about the potential impact of these policy reforms. We estimate the impact of decentralization of the health services on infant and neonatal mortality using a natural experiment: the devolution of health care decision making powers to Spanish regions. The devolution was implemented gradually and asymmetrically over a twenty-year period (1981-2002). The order in which the regions were decentralized was driven by political factors and hence can be considered exogenous to health outcomes. In addition, we exploit the dynamic effect of decentralization of health services and allow for heterogeneous effects by the two main types of decentralization implemented across regions: full decentralization (political and fiscal powers) versus political decentralization only. Our difference in differences results based on a panel dataset for the 50 Spanish provinces over the period 1980 to 2010 show that the lasting benefit of decentralization accrues only to regions which enjoy almost full fiscal and political powers and which are also among the richest regions. Copyright © 2017 Elsevier Ltd. All rights reserved.
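The difference-in-differences strategy described above can be made concrete with synthetic data: treated regions decentralize, control regions do not, and the estimator differences out both the group-level gap and the common time trend. All numbers below are fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

treated = rng.integers(0, 2, n)          # decentralized-region indicator
post = rng.integers(0, 2, n)             # after-reform period indicator
effect = -1.5                            # true effect (e.g., on mortality)

outcome = (10.0 + 2.0 * treated          # fixed gap between groups
           - 1.0 * post                  # common time trend
           + effect * treated * post     # treatment effect
           + rng.normal(0, 0.5, n))      # noise

def did(outcome, treated, post):
    """(treated post - treated pre) - (control post - control pre)."""
    m = lambda t, p: outcome[(treated == t) & (post == p)].mean()
    return (m(1, 1) - m(1, 0)) - (m(0, 1) - m(0, 0))

print(did(outcome, treated, post))       # close to the true effect of -1.5
```

The exogenous, politically driven timing of devolution is what justifies treating the reform as the "treatment" here; the heterogeneous-effects analysis in the paper amounts to running such comparisons separately by decentralization type.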
Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will
2016-01-01
With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.
NASA Technical Reports Server (NTRS)
Jones, Corey; Kapatos, Dennis; Skradski, Cory
2012-01-01
Do you have workflows with many manual tasks that slow down your business? Or do you scale back workflows because there are simply too many manual tasks? Basic workflow robots can automate some common tasks, but not everything. This presentation will show how advanced robots called "expression robots" can be set up to perform everything from simple tasks, such as moving, creating folders, renaming, changing or creating an attribute, and revising, to more complex tasks, like creating a PDF or even launching a session of Creo Parametric and performing a specific modeling task. Expression robots are able to utilize the Java API and Info*Engine to do almost anything you can imagine! Best of all, these tools are supported by PTC and will work with later releases of Windchill. Limited knowledge of Java, Info*Engine, and XML is required. The attendee will learn what tasks expression robots are capable of performing. The attendee will learn what is involved in setting up an expression robot. The attendee will gain a basic understanding of simple Info*Engine tasks.
Research on a dynamic workflow access control model
NASA Astrophysics Data System (ADS)
Liu, Yiliang; Deng, Jinxia
2007-12-01
In recent years, access control technology has been researched widely in workflow systems; two typical approaches are the RBAC (Role-Based Access Control) and TBAC (Task-Based Access Control) models, which have been used successfully for role authorization and assignment to a certain extent. However, as a system's structure grows more complex, these two technologies cannot minimize privileges or separate duties, and they are inapplicable when users frequently request changes to the workflow's process. To avoid these weaknesses in practice, a variable-flow dynamic role-task-view fine-grained access control model (abbreviated DRTVBAC) is constructed on the basis of the existing models. An algorithm is constructed to satisfy users' application requirements and security needs under the fine-grained principles of least privilege and dynamic separation of duties. The DRTVBAC model is implemented in an actual system; the results show that the dynamic management of roles associated with tasks and the assignment of roles are more flexible with respect to authorization and revocation; the principle of least privilege is met by activating a role's permissions only for the specific task being performed; authority is separated across the duties completed in the workflow; sensitive information is prevented from disclosure through a concise and dynamic view interface; and the requirement of frequently varying task flows is satisfied.
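The two principles the model enforces, least privilege through role-task checks and dynamic separation of duties through per-user task history, can be sketched briefly. All class, role, and task names below are invented illustrations, not the DRTVBAC implementation.

```python
class WorkflowACL:
    def __init__(self, role_tasks, conflicting):
        self.role_tasks = role_tasks        # role -> tasks it may activate
        self.conflicting = conflicting      # set of frozenset task pairs
        self.history = {}                   # user -> tasks already performed

    def can_perform(self, user, role, task):
        if task not in self.role_tasks.get(role, set()):
            return False                    # role never authorizes this task
        for prev in self.history.get(user, set()):
            # Dynamic separation of duties: a user who performed one task of
            # a conflicting pair may not perform the other in this workflow.
            if frozenset({prev, task}) in self.conflicting:
                return False
        return True

    def perform(self, user, role, task):
        if not self.can_perform(user, role, task):
            raise PermissionError(f"{user} may not perform {task}")
        self.history.setdefault(user, set()).add(task)

acl = WorkflowACL(
    role_tasks={"clerk": {"prepare_payment", "approve_payment"}},
    conflicting={frozenset({"prepare_payment", "approve_payment"})},
)
acl.perform("alice", "clerk", "prepare_payment")
print(acl.can_perform("alice", "clerk", "approve_payment"))  # blocked by SoD
```

Note the role alone would permit both tasks; it is the dynamic history check that blocks the second one, which is the distinction between static RBAC and the dynamic model discussed here.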
The Impact of Human-Automation Collaboration in Decentralized Multiple Unmanned Vehicle Control
2011-01-01
Consensus-based decentralized auctions for robust task allocation (IEEE Trans. Robot., vol. 25, no. 4). Operators can aid such systems by bringing their knowledge-based reasoning and experience to bear. Given a decentralized task planner and a goal-based operator interface for a network of unmanned vehicles in a search, track,
Adaptive Decentralized Control
1985-04-01
and implementation of the decentralized controllers. It raises, however, many difficult questions regarding the conditions under which such a scheme ... adaptive controller, and a general form of the model reference adaptive controller [4]. We believe that this work represents a significant advance in the ... Comparing the adaptive system with the tuned system results in the development of a generic adaptive error system. Passivity theory was used to derive
Centralization or decentralization of facial structures in Korean young adults.
Yoo, Ja-Young; Kim, Jeong-Nam; Shin, Kang-Jae; Kim, Soon-Heum; Choi, Hyun-Gon; Jeon, Hyun-Soo; Koh, Ki-Seok; Song, Wu-Chul
2013-05-01
It is well known that facial beauty is dictated by facial type and harmony between the eyes, nose, and mouth. Furthermore, facial impression is judged according to the overall facial contour and the relationship between the facial structures. The aims of the present study were to determine the optimal criteria for the assessment of gathering or separation of the facial structures and to define standardized ratios for centralization or decentralization of the facial structures. Four different lengths were measured, and 2 indexes were calculated from standardized photographs of 551 volunteers. Centralization and decentralization were assessed using the width index (interpupillary distance / facial width) and height index (eyes-mouth distance / facial height). The mean ranges of the width index and height index were 42.0 to 45.0 and 36.0 to 39.0, respectively. The width index did not differ with sex, but vertically, males had more decentralized faces and females had more centralized faces. The incidence rate of decentralized faces among the men was 30.3%, and that of centralized faces among the women was 25.2%. The mean ranges in width and height indexes have been determined in a Korean population. Faces with width and height index scores under and over the median ranges are determined to be "centralized" and "decentralized," respectively.
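The two indexes and the classification rule above are simple arithmetic and can be written out directly. Given the reported mean ranges (42.0 to 45.0 and 36.0 to 39.0), the ratios are evidently expressed as percentages, so the sketch multiplies by 100; the sample measurements below are invented, not from the study.

```python
def face_indexes(interpupillary, face_width, eyes_mouth, face_height):
    """Width and height indexes as percentages, per the definitions above."""
    width_index = 100.0 * interpupillary / face_width
    height_index = 100.0 * eyes_mouth / face_height
    return width_index, height_index

def classify(index, low, high):
    """Under the range -> centralized; over it -> decentralized."""
    if index < low:
        return "centralized"
    if index > high:
        return "decentralized"
    return "average"

# Invented example measurements in millimetres.
wi, hi = face_indexes(interpupillary=62, face_width=140,
                      eyes_mouth=66, face_height=185)
print(round(wi, 1), classify(wi, 42.0, 45.0))
print(round(hi, 1), classify(hi, 36.0, 39.0))
```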
Panda, Bhuputra; Thakur, Harshad P
2016-10-31
One of the principal goals of any health care system is to improve health through the provision of clinical and public health services. Decentralization as a reform measure aims to improve inputs, management processes and health outcomes, and has political, administrative and financial connotations. It is argued that the robustness of a health system in achieving desirable outcomes is contingent upon the width and depth of 'decision space' at the local level. Studies have used different approaches to examine one or more facets of decentralization and its effect on health system functioning; however, the lack of consensus on an acceptable framework is a critical gap in determining its quantum and quality. Theorists have resorted to concepts of 'trust', 'convenience' and 'mutual benefits' to explain, define and measure components of governance in health. In the emerging 'continuum of health services' model, the challenge lies in identifying variables of performance (fiscal allocation, autonomy at local level, perception of key stakeholders, service delivery outputs, etc.) through the prism of decentralization in the first place, and in establishing directed relationships among them. This focused review paper conducted an extensive web-based literature search using the PubMed and Google Scholar search engines. After screening key words and study objectives, we retrieved 180 articles for the next round of screening. One hundred and four full articles (three working papers and 101 published papers) were reviewed in their entirety. We attempted to summarize the existing literature on decentralization and health systems performance, explain key concepts and essential variables, and develop a framework for further scientific scrutiny. Themes are presented in three separate segments: dimensions, difficulties and derivatives. Evaluation of local decision making and its effect on health system performance has been studied in a compartmentalized manner. There is sparse evidence about innovations
Optical performance of toric intraocular lenses in the presence of decentration
Zhang, Bin; Ma, Jin-Xue; Liu, Dan-Yan; Du, Ying-Hua; Guo, Cong-Rong; Cui, Yue-Xian
2015-01-01
AIM To evaluate the optical performance of toric intraocular lenses (IOLs) after decentration and with different pupil diameters, but with the IOL astigmatic axis aligned. METHODS Optical performances of toric T5 and SN60AT spherical IOLs after decentration were tested on a theoretical pseudophakic model eye based on the Hwey-Lan Liou schematic eye using the Zemax ray-tracing program. Changes in optical performance were analyzed in model eyes with 3-mm, 4-mm, and 5-mm pupil diameters and decentered from 0.25 mm to 0.75 mm with an interval of 5° at the meridian direction from 0° to 90°. The ratio of the modulation transfer function (MTF) between a decentered and a centered IOL (MTF_decentration / MTF_centration) was calculated to analyze the decrease in optical performance. RESULTS Optical performance of the toric IOL remained unchanged when IOLs were decentered in any meridian direction. The MTFs of the two IOLs decreased, whereas optical performance remained equivalent after decentration. The MTF_decentration / MTF_centration ratios of the IOLs at a decentration from 0.25 mm to 0.75 mm were comparable in the toric and SN60AT IOLs. After decentration, MTF decreased further, with the MTF of the toric IOL being slightly lower than that of the SN60AT IOL. Imaging qualities of the two IOLs decreased when the pupil diameter and the degree of decentration increased, but the decrease was similar in the toric and spherical IOLs. CONCLUSIONS Toric IOLs were comparable to spherical IOLs in terms of tolerance to decentration at the correct axial position. PMID:26309871
Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa
2010-08-21
Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that do not require transferring entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers of emerging areas where a standard exchange data format is not well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies.
A Review of Characteristics and Experiences of Decentralization of Education
ERIC Educational Resources Information Center
Mwinjuma, Juma Saidi; Kadir, Suhaida bte Abd.; Hamzah, Azimi; Basri, Ramli
2015-01-01
This paper scrutinizes decentralization of education with reference to some countries around the world. We consider discussion on decentralization to be a complex, critical and broad question in contemporary education planning, administration and the politics of education reforms. Even though the debate on and implementation of decentralization…
Effects of health care decentralization in Spain from a citizens' perspective.
Antón, José-Ignacio; Muñoz de Bustillo, Rafael; Fernández Macías, Enrique; Rivera, Jesús
2014-05-01
The aim of this article is to analyze the impact of the decentralization of the public national health system in Spain on citizens' satisfaction with different dimensions of primary and hospital care. Using micro-data from the Health Barometer 1996-2009 and exploiting the exogenous variation in the pace of decentralization across Spain with a difference-in-differences strategy, we find that, in general, decentralization has not improved citizens' satisfaction with different features of the health services. In our base model, we even find small negative effects on a subset of variables. Sensitivity analysis confirms that there is no empirical evidence to support the claim that decentralization has had a positive impact on citizens' satisfaction with health care. We outline several possible reasons for this.
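The difference-in-differences logic behind this kind of evaluation can be sketched in a few lines of Python; the satisfaction scores, group labels, and effect size below are invented for illustration, not the Health Barometer micro-data:

```python
def did_estimate(y, treated, post):
    """Difference-in-differences from four group means: the change in the
    treated group minus the change in the control group."""
    def mean(t, p):
        vals = [yi for yi, ti, pi in zip(y, treated, post) if ti == t and pi == p]
        return sum(vals) / len(vals)
    return (mean(1, 1) - mean(1, 0)) - (mean(0, 1) - mean(0, 0))

# Hypothetical satisfaction scores; treated = regions decentralized earlier
y       = [5.0, 5.1, 5.0, 4.9, 5.2, 5.3, 5.0, 5.1]
treated = [1,   1,   0,   0,   1,   1,   0,   0]
post    = [0,   0,   0,   0,   1,   1,   1,   1]
effect = did_estimate(y, treated, post)   # 0.10 by construction
```

Subtracting the control group's change nets out common time trends, which is what makes the staggered pace of decentralization usable as quasi-experimental variation.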
[Primary care: decentralization and efficiency].
Pinillos, M; Antoñanzas, F
2002-01-01
The purpose of this study was to evaluate whether the productive behavior of health centers in autonomous communities with devolved health competences is more efficient than that of centers belonging to the Spanish public health system (INSALUD). The technical efficiency of 66 health centers in Alava, Navarre and La Rioja was analyzed. Centers in autonomous communities that in 1997 had been granted complete authority from the central government to manage their healthcare services were compared with centers whose administration, in the same year, was still in the hands of INSALUD. The method used to measure and quantify the efficiency of these centers was data envelopment analysis. Nonparametric contrasts of the health centers' mean efficiency rates revealed no significant differences in the (in)efficiency of centers from La Rioja, Navarre and Alava. The results obtained from the efficiency measurement model used did not indicate that decentralization improves the productive efficiency of primary care centers.
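In its general CCR form, data envelopment analysis solves one linear program per health center; in the special case of a single input and a single output it reduces to comparing each center's output/input ratio with the best observed ratio. A minimal sketch under that simplifying assumption, with invented staffing and consultation figures:

```python
def dea_efficiency_single(inputs, outputs):
    """Input-oriented CCR efficiency for the special case of one input and
    one output per decision-making unit: each unit's output/input ratio is
    compared with the best observed ratio (the efficient frontier)."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical health centres: staff employed vs. consultations delivered
staff = [2.0, 4.0, 5.0]
consultations = [2.0, 2.0, 4.0]
scores = dea_efficiency_single(staff, consultations)
```

With multiple inputs and outputs, as in the actual study, each score instead comes from a small linear program that searches for the most favorable weighting of a center's inputs and outputs.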
ERIC Educational Resources Information Center
Fiszbein, Ariel, Ed.
This book is about education system reform in Central and Eastern Europe, with emphasis on decentralization and management. In the past, local authorities served as implementation arms of the central ministry, while finance and decision-making were controlled by the central government, leaving local communities with little influence. New education…
A Closed-Loop Hardware Simulation of Decentralized Satellite Formation Control
NASA Technical Reports Server (NTRS)
Ebimuma, Takuji; Lightsey, E. Glenn; Baur, Frank (Technical Monitor)
2002-01-01
In recent years, there has been significant interest in the use of formation flying spacecraft for a variety of earth and space science missions. Formation flying may provide smaller and cheaper satellites that, working together, have more capability than larger and more expensive satellites. Several decentralized architectures have been proposed for autonomous establishment and maintenance of satellite formations. In such architectures, each satellite cooperatively maintains the shape of the formation without a central supervisor, processing only local measurement information. Global Positioning System (GPS) sensors are ideally suited to provide such local position and velocity measurements to the individual satellites. An investigation of the feasibility of a decentralized approach to satellite formation flying was originally presented by Carpenter. He extended a decentralized linear-quadratic-Gaussian (LQG) framework proposed by Speyer in a fashion similar to an extended Kalman filter (EKF) which processed GPS position fix solutions. The new decentralized LQG architecture was demonstrated in a numerical simulation for a realistic scenario similar to missions that have been proposed by NASA and the U.S. Air Force. Another decentralized architecture was proposed by Park et al. using carrier differential-phase GPS (CDGPS). Recently, Busse et al. demonstrated the decentralized CDGPS architecture in a hardware-in-the-loop simulation on the Formation Flying TestBed (FFTB) at Goddard Space Flight Center (GSFC), which features two Spirent Cox 16-channel GPS signal generators. Although representing a step forward by utilizing GPS signal simulators for a spacecraft formation flying simulation, only open-loop performance, in which no maneuvers were executed based on the real-time state estimates, was considered. In this research, hardware experimentation has been extended to include closed-loop integrated guidance and navigation of multiple spacecraft
NASA Astrophysics Data System (ADS)
Nourifar, Raheleh; Mahdavi, Iraj; Mahdavi-Amiri, Nezam; Paydar, Mohammad Mahdi
2017-09-01
Decentralized supply chain management is found to be significantly relevant in today's competitive markets. Production and distribution planning is posed as an important optimization problem in supply chain networks. Here, we propose a multi-period decentralized supply chain network model with uncertainty. The imprecision related to uncertain parameters such as demand and price of the final product is represented with stochastic and fuzzy numbers. We provide a mathematical formulation of the problem as a bi-level mixed integer linear programming model. Due to the problem's complexity, a solution structure is developed that incorporates a novel heuristic algorithm based on the Kth-best algorithm, a fuzzy approach and a chance constraint approach. Ultimately, a numerical example is constructed and worked through to demonstrate the applicability of the optimization model. A sensitivity analysis is also performed.
Decentralization in Zambia: resource allocation and district performance.
Bossert, Thomas; Chitah, Mukosha Bona; Bowser, Diana
2003-12-01
Zambia implemented an ambitious process of health sector decentralization in the mid 1990s. This article presents an assessment of the degree of decentralization, called 'decision space', that was allowed to districts in Zambia, and an analysis of data on districts available at the national level to assess allocation choices made by local authorities and some indicators of the performance of the health systems under decentralization. The Zambian officials in health districts had a moderate range of choice over expenditures, user fees, contracting, targeting and governance. Their choices were quite limited over salaries and allowances and they did not have control over additional major sources of revenue, like local taxes. The study found that the formula for allocation of government funding which was based on population size and hospital beds resulted in relatively equal per capita expenditures among districts. Decentralization allowed the districts to make decisions on internal allocation of resources and on user fee levels and expenditures. General guidelines for the allocation of resources established a maximum and minimum percentage to be allocated to district offices, hospitals, health centres and communities. Districts tended to exceed the maximum for district offices, but the large urban districts and those without public district hospitals were not even reaching the minimum for hospital allocations. Wealthier and urban districts were more successful in raising revenue through user fees, although the proportion of total expenditures that came from user fees was low. An analysis of available indicators of performance, such as the utilization of health services, immunization coverage and family planning activities, found little variation during the period 1995-98 except for a decline in immunization coverage, which may have also been affected by changes in donor funding. These findings suggest that decentralization may not have had either a positive or
A step towards decentralized wastewater management in the Lower Jordan Rift Valley.
van Afferden, M; Cardona, J A; Rahman, K Z; Daoud, R; Headley, T; Kilani, Z; Subah, A; Mueller, R A
2010-01-01
In order to address serious concerns over public health, water scarcity and groundwater pollution in Jordan, the expansion of decentralized wastewater treatment and reuse (DWWT&R) systems to small communities is one of the goals defined by the Jordan government in the "Water Strategy 2009-2022". This paper evaluates the general potential of decentralized wastewater system solutions to be applied in a selected area of the Lower Jordan Rift Valley in Jordan. For the study area, the connection degree to sewer systems was calculated as 67% (5% in the rural sector and 75% in the urban sector). The annual wastewater production available for DWWT&R in the rural sector of the investigation area was calculated to be nearly 3.8 million m³ at the end of 2007. The future need for wastewater treatment and reuse facilities in the rural sector was estimated to be increasing by 0.11 million m³ year⁻¹, with an overall potential of new treatment capacity of nearly 15,500 population equivalents (pe) year⁻¹. The overall potential for implementing DWWT&R systems in the urban sector was estimated as nearly 25 million m³ of wastewater in 2007. The future need for wastewater treatment and reuse facilities required for the urban sector was estimated to be increasing at a rate of 0.12 million pe year⁻¹. Together with the decision makers and the stakeholders, a potential map with three regions has been defined: Region 1 with existing central wastewater infrastructure, Region 2 with already planned central infrastructure and Region 3 with the highest potential for implementing DWWT&R systems.
Parametric Workflow (BIM) for the Repair Construction of Traditional Historic Architecture in Taiwan
NASA Astrophysics Data System (ADS)
Ma, Y.-P.; Hsu, C. C.; Lin, M.-C.; Tsai, Z.-W.; Chen, J.-Y.
2015-08-01
In Taiwan, numerous existing traditional buildings are constructed with wooden, brick, and stone structures. This paper focuses on Taiwan's traditional historic architecture, targeting traditional wooden-structure buildings as the design proposition, and develops a BIM workflow for modeling complex wooden combination geometry, integrating it with traditional 2D documents, and visualizing repair-construction assumptions within the 3D model representation. The goal of this article is to explore the current problems to overcome in wooden historic building conservation, and to introduce BIM technology for conserving, documenting, managing, and creating full engineering drawings and information to effectively support historic conservation. Although BIM is mostly oriented to current construction practice, there have been some attempts to investigate its applicability in historic conservation projects. This article also illustrates the importance and advantages of using a BIM workflow in the repair construction process, compared with a generic workflow.
Reliable Decentralized Control of Fuzzy Discrete-Event Systems and a Test Algorithm.
Liu, Fuchun; Dziong, Zbigniew
2013-02-01
A framework for decentralized control of fuzzy discrete-event systems (FDESs) has recently been presented to guarantee the achievement of a given specification under the joint control of all local fuzzy supervisors. As a continuation, this paper addresses the reliable decentralized control of FDESs in the face of possible failures of some local fuzzy supervisors. Roughly speaking, for an FDES equipped with n local fuzzy supervisors, a decentralized supervisor is called k-reliable (1 ≤ k ≤ n) provided that the control performance is not degraded even when n - k local fuzzy supervisors fail. A necessary and sufficient condition for the existence of k-reliable decentralized supervisors of FDESs is proposed by introducing the notions of M̃uc-controllability and k-reliable coobservability of fuzzy languages. In particular, a polynomial-time algorithm to test k-reliable coobservability is developed by a constructive methodology, which indicates that the existence of k-reliable decentralized supervisors of FDESs can be checked with polynomial complexity.
Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.
Stockton, David B; Santamaria, Fidel
2017-10-01
We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
Fully decentralized estimation and control for a modular wheeled mobile robot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mutambara, A.G.O.; Durrant-Whyte, H.F.
2000-06-01
In this paper, the problem of fully decentralized data fusion and control for a modular wheeled mobile robot (WMR) is addressed. This is a vehicle system with nonlinear kinematics, distributed multiple sensors, and nonlinear sensor models. The problem is solved by applying fully decentralized estimation and control algorithms based on the extended information filter. This is achieved by deriving a modular, decentralized kinematic model by using plane motion kinematics to obtain the forward and inverse kinematics for a generalized simple wheeled vehicle. This model is then used in the decentralized estimation and control algorithms. WMR estimation and control are thus obtained locally using reduced-order models. When communication of information between nodes is carried out after every measurement (full-rate communication), the estimates and control signals obtained at each node are equivalent to those obtained by a corresponding centralized system. Transputer architecture is used as the basis for hardware and software design, as it supports the extensive communication and concurrency requirements that characterize modular and decentralized systems. The advantages of a modular WMR vehicle include scalability, application flexibility, low prototyping costs, and high reliability.
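A key property of the information filter underlying this decentralized scheme is that measurement contributions simply add. A minimal scalar, linear sketch (not the paper's nonlinear WMR model) shows two nodes' contributions accumulating into the same estimate that a centralized weighted least-squares fusion would give:

```python
def info_update(y, Y, z, h, r):
    """Information-filter measurement update for a scalar state: y and Y are
    the information vector and matrix (scalars here); a measurement
    z = h*x + noise with variance r contributes i = h*z/r and I = h*h/r,
    and contributions from different sensor nodes simply add."""
    return y + h * z / r, Y + h * h / r

# Two nodes observe the same scalar state and exchange their contributions.
y, Y = 0.0, 0.0                       # zero-information prior
for z, h, r in [(1.9, 1.0, 0.5), (2.2, 1.0, 0.25)]:   # hypothetical readings
    y, Y = info_update(y, Y, z, h, r)
x_hat = y / Y                         # fused estimate, equals centralized WLS
```

Because fusion is pure addition of (i, I) pairs, each node can run the same update on locally received contributions, which is what makes full-rate communication equivalent to a centralized filter.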
On decentralized adaptive full-order sliding mode control of multiple UAVs.
Xiang, Xianbo; Liu, Chao; Su, Housheng; Zhang, Qin
2017-11-01
In this study, a novel decentralized adaptive full-order sliding mode control framework is proposed for the robust synchronized formation motion of multiple unmanned aerial vehicles (UAVs) subject to system uncertainty. First, a full-order sliding mode surface in a decentralized manner is designed to incorporate both the individual position tracking error and the synchronized formation error while the UAV group is engaged in building a desired geometric pattern in three-dimensional space. Second, a decentralized virtual plant controller is constructed which allows the embedded low-pass filter to attain the chattering-free property of the sliding mode controller. In addition, a robust adaptive technique is integrated into the decentralized chattering-free sliding control design in order to handle unknown bounded uncertainties, without requiring a priori knowledge of bounds on the system uncertainties as assumed in conventional chattering-free control methods. Subsequently, system robustness as well as stability of the decentralized full-order sliding mode control of multiple UAVs is synthesized. Numerical simulation results illustrate the effectiveness of the proposed control framework in achieving robust 3D formation flight of the multi-UAV system. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Linear time-invariant controller design for two-channel decentralized control systems
NASA Technical Reports Server (NTRS)
Desoer, Charles A.; Gundes, A. Nazli
1987-01-01
This paper analyzes a linear time-invariant two-channel decentralized control system with a 2 x 2 strictly proper plant. It presents an algorithm for the algebraic design of a class of decentralized compensators which stabilize the given plant.
Polya's bees: A model of decentralized decision-making.
Golman, Russell; Hagmann, David; Miller, John H
2015-09-01
How do social systems make decisions with no single individual in control? We observe that a variety of natural systems, including colonies of ants and bees and perhaps even neurons in the human brain, make decentralized decisions using common processes involving information search with positive feedback and consensus choice through quorum sensing. We model this process with an urn scheme that runs until hitting a threshold, and we characterize an inherent tradeoff between the speed and the accuracy of a decision. The proposed common mechanism provides a robust and effective means by which a decentralized system can navigate the speed-accuracy tradeoff and make reasonably good, quick decisions in a variety of environments. Additionally, consensus choice exhibits systemic risk aversion even while individuals are idiosyncratically risk-neutral. This too is adaptive. The model illustrates how natural systems make decentralized decisions, illuminating a mechanism that engineers of social and artificial systems could imitate.
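The urn scheme with positive feedback and quorum-based stopping can be illustrated with a toy simulation; the reinforcement rule and the `bias` parameter below are illustrative assumptions rather than the authors' exact model:

```python
import random

def urn_decision(threshold, bias=0.55, seed=1):
    """Toy Polya-style urn with positive feedback: each draw reinforces the
    drawn option, and the process stops when one option reaches a quorum
    `threshold`. `bias` is the fraction of fresh environmental signal
    favouring option A (an illustrative assumption)."""
    rng = random.Random(seed)
    a, b = 2, 2                      # initial evidence for options A and B
    steps = 0
    while a < threshold and b < threshold:
        # mix social feedback (urn proportion) with independent search
        p_a = 0.5 * a / (a + b) + 0.5 * bias
        if rng.random() < p_a:
            a += 1
        else:
            b += 1
        steps += 1
    return ("A" if a >= threshold else "B"), steps

choice, steps = urn_decision(threshold=20)
```

Raising the threshold is the speed-accuracy dial: more draws are needed before quorum, so decisions take longer but are less likely to lock in on the worse option through early random reinforcement.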
Implementing bioinformatic workflows within the bioextract server
USDA-ARS?s Scientific Manuscript database
Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.
2015-12-01
The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and
An access control model with high security for distributed workflow and real-time application
NASA Astrophysics Data System (ADS)
Han, Ruo-Fei; Wang, Hou-Xiang
2007-11-01
The traditional mandatory access control policy (MAC) is regarded as a policy with strict regulation and poor flexibility. The security policy of MAC is so strict that few information systems would adopt it at the cost of convenience, except in particular cases with high security requirements such as military or government applications. However, with the increasing requirement for flexibility, even some access control systems in military applications have switched to role-based access control (RBAC), which is well known for its flexibility. Though RBAC can meet the demands for flexibility, it is weak in dynamic authorization and consequently cannot fit well into workflow management systems. Task-role-based access control (T-RBAC) was introduced to solve this problem; it combines the advantages of RBAC and task-based access control (TBAC), which uses tasks to manage permissions dynamically. To satisfy the requirements of systems that are distributed, well defined with workflow processes, and critical about time accuracy, this paper analyzes the spirit of MAC and introduces it into an improved T&RBAC model based on T-RBAC. Finally, a conceptual task-role-based access control model with high security for distributed workflow and real-time applications (A_T&RBAC) is built, and its performance is briefly analyzed.
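The central idea of T-RBAC, permissions that are valid only while the workflow engine has activated the corresponding task, can be sketched as follows; the class, role, and permission names are hypothetical:

```python
class TaskRoleAC:
    """Toy task-role-based access control (T-RBAC) check: a permission is
    granted only while the user holds a role authorized for a currently
    active workflow task, so authorization follows workflow state."""
    def __init__(self):
        self.role_tasks = {}       # role -> tasks it may perform
        self.task_perms = {}       # task -> permissions valid while active
        self.active_tasks = set()  # tasks the workflow engine has enabled

    def allow(self, role, task, perm):
        self.role_tasks.setdefault(role, set()).add(task)
        self.task_perms.setdefault(task, set()).add(perm)

    def check(self, role, task, perm):
        return (task in self.active_tasks
                and task in self.role_tasks.get(role, set())
                and perm in self.task_perms.get(task, set()))

ac = TaskRoleAC()
ac.allow("clerk", "approve_order", "order:sign")
# permission is denied until the workflow engine activates the task
before = ac.check("clerk", "approve_order", "order:sign")  # False
ac.active_tasks.add("approve_order")
after = ac.check("clerk", "approve_order", "order:sign")   # True
```

This is the dynamic-authorization property the abstract contrasts with plain RBAC: revoking access requires no role change, only the workflow engine deactivating the task.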
Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.
Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor
2016-01-01
In Quantum Chemistry, many tasks recur frequently, e.g. geometry optimizations, benchmarking series, etc. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can only be used for particular scientific experiments. Hence, their modification is not straightforward, which makes them almost impossible to share. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.
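Orchestrating atomic workflows into a meta-workflow is, at its core, function composition; a minimal sketch in which plain Python functions stand in for real Quantum Chemistry jobs (the step names and the energy formula are invented):

```python
def make_meta_workflow(steps):
    """Compose atomic workflows (plain functions here) into a meta-workflow
    that pipes each step's output into the next step's input."""
    def meta(data):
        for step in steps:
            data = step(data)
        return data
    return meta

# Hypothetical atomic workflows for a geometry-optimization series
def build_input(mol):   return {"molecule": mol, "input": f"opt {mol}"}
def run_optimizer(job): return {**job, "energy": -1.0 * len(job["molecule"])}
def extract(job):       return (job["molecule"], job["energy"])

optimize = make_meta_workflow([build_input, run_optimizer, extract])
result = optimize("H2O")
```

Because each atomic step has a well-defined input and output, the same steps can be recombined into other meta-workflows or shared through a repository without modification.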
Vargas Bustamante, Arturo
2010-09-01
This study investigates the effectiveness of centralized and decentralized health care providers in rural Mexico. It compares provider performance since both centralized and decentralized providers co-exist in rural areas of the country. The data are drawn from the 2003 household survey of Oportunidades, a comprehensive study of rural families from seven states in Mexico. The analyses compare out-of-pocket health care expenditures and utilization of preventive care among rural households with access to either centralized or decentralized health care providers. This study benefits from differences in timing of health care decentralization and from a quasi-random distribution of providers. Results show that overall centralized providers perform better. Households served by this organization report less regressive out-of-pocket health care expenditures (32% lower), and observe higher utilization of preventive services (3.6% more). Decentralized providers that were devolved to state governments in the early 1980s observe a slightly better performance than providers that were decentralized in the mid-1990s. These findings are robust to decentralization timing, heterogeneity in per capita government health expenditures, state and health infrastructure effects, and other confounders. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.
2017-12-01
This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: accepting the data package from the data providers and ensuring the full integrity of the data files; identifying and addressing data quality issues; assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; setting up data access mechanisms; setting up the data in data tools and services for improved data dissemination and user experience; registering the dataset in online search and discovery catalogues; and preserving the data location through Digital Object Identifiers (DOIs). We describe the steps taken to automate and realize efficiencies in the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps toward the creation of a common software framework.
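The publication steps above lend themselves to a simple state machine that tracks each dataset through curation; the step names and the ordering rule below are illustrative assumptions, not the DAACs' actual implementation:

```python
from dataclasses import dataclass, field

CURATION_STEPS = [            # mirrors the publication steps listed above
    "integrity_check", "quality_review", "metadata_assembly",
    "access_setup", "tool_registration", "catalogue_registration",
    "doi_assignment",
]

@dataclass
class Dataset:
    name: str
    completed: list = field(default_factory=list)

    def advance(self, step):
        """Record a finished curation step, enforcing the defined order."""
        expected = CURATION_STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected {expected!r}, got {step!r}")
        self.completed.append(step)

    @property
    def published(self):
        return self.completed == CURATION_STEPS

ds = Dataset("soil_moisture_v1")      # hypothetical dataset name
for s in CURATION_STEPS:
    ds.advance(s)
```

Tracking each dataset's `completed` list is what enables the per-dataset status reporting and the time-to-publish metrics the abstract describes.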
AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, Yingssu; Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080; McPhillips, Scott E.
New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and
Jealousy Graphs: Structure and Complexity of Decentralized Stable Matching
2013-01-01
Using this structure, we are able to provide a finer analysis of the complexity of a subclass of decentralized matching markets.
Reduced modeling of flexible structures for decentralized control
NASA Technical Reports Server (NTRS)
Yousuff, A.; Tan, T. M.; Bahar, L. Y.; Konstantinidis, M. F.
1986-01-01
Based upon the modified finite element-transfer matrix method, this paper presents a technique for reduced modeling of flexible structures for decentralized control. The modeling decisions are carried out at (finite-) element level, and are dictated by control objectives. A simply supported beam with two sets of actuators and sensors (linear force actuator and linear position and velocity sensors) is considered for illustration. In this case, it is conjectured that the decentrally controlled closed loop system is guaranteed to be at least marginally stable.
Workflows for microarray data processing in the Kepler environment.
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
2012-05-17
Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R
Improving adherence to the Epic Beacon ambulatory workflow.
Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana
2017-06-01
Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System of computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance with the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance with this workflow in the ambulatory oncology setting. This was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate with the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, which was education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multi-disciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study supports the conclusion that a pharmacist-initiated educational intervention can improve compliance with an ambulatory oncology infusion workflow.
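The reported significance (38% vs. 83%, p < 0.001) can be sanity-checked with a two-proportion z-test; the encounter counts below are assumptions, since the abstract reports only the rates.

```python
import math

# Hypothetical encounter counts: the abstract gives only the rates
# (38% before, 83% after) and p < 0.001, not the denominators.
n_pre, n_post = 100, 100
p_pre, p_post = 0.38, 0.83

# pooled two-proportion z-test
pooled = (p_pre * n_pre + p_post * n_post) / (n_pre + n_post)
se = math.sqrt(pooled * (1 - pooled) * (1 / n_pre + 1 / n_post))
z = (p_post - p_pre) / se
# |z| > 3.29 corresponds to a two-sided p < 0.001
```

Even at these modest assumed sample sizes, the difference clears the p < 0.001 threshold comfortably.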
ERIC Educational Resources Information Center
Khanal, Mukunda Mani
2016-01-01
The literature reviewed for this study revealed that the movement toward decentralizing responsibility of school governance to communities has become a global policy in the contemporary world. With the aim of enhancing greater community participation and retaining students in public schools, the Government of Nepal introduced two different…
The standard-based open workflow system in GeoBrain (Invited)
NASA Astrophysics Data System (ADS)
Di, L.; Yu, G.; Zhao, P.; Deng, M.
2013-12-01
GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels: conceptual and concrete. At the conceptual level, workflows exist in the form of data and service types defined by ontologies. Workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate complete, ISO 19115-compliant product provenance metadata before and after the workflow execution. Generating provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, transparent, translucent, and opaque, are all available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models. Automated workflow composition has been experimented with successfully based on ontologies and artificial
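The conceptual-to-concrete instantiation step can be sketched in miniature; the virtual product type and service names below are illustrative, not entries from GeoBrain's actual catalog.

```python
# Sketch of instantiating a conceptual workflow (virtual product type)
# into a concrete, executable step list; all names are hypothetical.
GEOPROCESSING_MODELS = {
    # virtual product type -> ordered chain of service types (the
    # conceptual workflow, normally defined against ontologies)
    "vegetation_index": ["subset_service", "ndvi_service", "reproject_service"],
}

def instantiate(virtual_product, dataset):
    # bind each abstract service type to the user's concrete dataset
    chain = GEOPROCESSING_MODELS[virtual_product]
    return [(service, dataset) for service in chain]

steps = instantiate("vegetation_index", "MODIS_tile_h09v05")
# steps is now a concrete workflow ready to be handed to an engine
```

In GeoBrain itself, both levels are encoded in BPEL and the concrete workflow is executed by the BPELPower engine; the dictionary here simply stands in for the ontology-backed catalog.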
Mannion, Russell; Goddard, Maria; Kuhn, Michael; Bate, Angela
2005-01-01
This article examines the incentive effects of delegating operational and financial decision making from central government to local healthcare providers. It addresses the economic consequences of a contemporary policy initiative in the English National Health Service (NHS): earned autonomy. This policy entails awarding operational autonomy to 'front-line' organisations that are assessed to be meeting national performance targets. In doing so, it introduces new types of incentives into the healthcare system, changes the nature of established agency relationships and represents a novel approach to performance management. Theoretical elements of a principal-agent model are used to examine the impact of decentralization in the context of the results of an empirical study that elicited the perceptions of senior hospital managers regarding the incentive effects of earned autonomy. A multi-method approach was adopted. In order to capture the breadth of policy impact, we conducted a national postal questionnaire survey of all Chief Executives in acute-care hospital Trusts in England (n = 173). To provide added depth and richness to our understanding of the impact and incentive effects of earned autonomy at an organisational level, we interviewed senior managers in a purposeful sample of eight acute-care hospital Trusts. This theoretical framework and our empirical work suggest that some aspects of earned autonomy as currently implemented in the NHS serve to weaken the potential incentive effect of decentralization. In particular, the nature of the freedoms is such that many senior managers do not view autonomy as a particularly valuable prize. This suggests that incentives associated with the policy will be insufficiently powerful to motivate providers to deliver better performance. We also found that principal commitment may be a problem in the NHS. Some hospital managers reported that they already enjoyed a large degree of autonomy, regardless of their current
NASA Astrophysics Data System (ADS)
Parker, Pete; Thapa, Brijesh
2012-02-01
Kanchenjunga Conservation Area Project (KCAP) in Nepal is among the first protected areas in the world to institute a completely decentralized system of conservation and development. Proponents of decentralized conservation claim that it increases management efficiency, enhances the responsiveness to local needs, and promotes greater equity among local residents. This study assessed local equity by evaluating the levels of dependencies on natural resources among households and the factors affecting that dependency. Data were collected via detailed surveys among 205 randomly selected households within the KCAP. Natural resource dependency was evaluated by comparing the ratio of total household income to income derived from access to natural resources. Economic, social, and access-related variables were employed to determine potential significant predictors of dependency. Overall, households were heavily dependent on natural resources for their income, especially households at higher elevations and those with more adult members. The households that received remittances were most able to supplement their income and, therefore, drastically reduced their reliance on the access to natural resources. Socio-economic variables, such as land holdings, education, caste, and ethnicity, failed to predict dependency. Household participation in KCAP-sponsored training programs also failed to affect household dependency; however, fewer than 20% of the households had any form of direct contact with KCAP personnel within the past year. The success of the KCAP as a decentralized conservation program is contingent on project capacity-building via social mobilization, training programs, and participatory inclusion in decision making to help alleviate the dependency on natural resources.
Deploying and sharing U-Compare workflows as web services.
Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia
2013-02-18
U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.
Deploying and sharing U-Compare workflows as web services
2013-01-01
Background U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare’s components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. Results We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. Conclusions The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform. PMID:23419017
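The platform decoupling that the REST/SOAP interfaces provide can be sketched as a minimal JSON contract; the handler, payload shape, and `tokenize` component below are hypothetical, not U-Compare's actual API.

```python
import json

def tokenize(text):
    # stand-in for a text mining component; hypothetical, not an
    # actual U-Compare component
    return text.split()

def handle_request(body):
    # minimal REST-style contract: JSON request in, JSON response out;
    # the payload shape is an assumption for illustration only
    payload = json.loads(body)
    return json.dumps({"tokens": tokenize(payload["text"])})

response = handle_request(json.dumps({"text": "text mining workflow"}))
# any client that speaks JSON over HTTP can consume this, independently
# of the platform that produced the workflow
```

This is the sense in which an exported workflow web service can be imported into a different platform such as Taverna: the consumer depends only on the open-standard contract, not on U-Compare internals.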
Music Libraries: Centralization versus Decentralization.
ERIC Educational Resources Information Center
Kuyper-Rushing, Lois
2002-01-01
Considers the decision that branch libraries, music libraries in particular, have struggled with concerning a centralized location in the main library versus a decentralized collection. Reports on a study of the Association of Research Libraries that investigated the location of music libraries, motivation for the location, degrees offered,…
Decentralized state estimation for a large-scale spatially interconnected system.
Liu, Huabo; Yu, Haisheng
2018-03-01
A decentralized state estimator is derived for the spatially interconnected systems composed of many subsystems with arbitrary connection relations. An optimization problem on the basis of linear matrix inequality (LMI) is constructed for the computations of improved subsystem parameter matrices. Several computationally effective approaches are derived which efficiently utilize the block-diagonal characteristic of system parameter matrices and the sparseness of subsystem connection matrix. Moreover, this decentralized state estimator is proved to converge to a stable system and obtain a bounded covariance matrix of estimation errors under certain conditions. Numerical simulations show that the obtained decentralized state estimator is attractive in the synthesis of a large-scale networked system. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
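A toy version of the idea, in which each subsystem runs a local observer using only its own measurement while the unknown interconnection acts as a disturbance, can be sketched as follows; the dynamics, coupling strengths, and observer gains are illustrative, not values from the paper.

```python
# Two coupled scalar subsystems; each local estimator uses only its own
# measurement and local model, treating the (unknown) interconnection
# as a disturbance. All numerical values are invented for illustration.
a11, a22 = 0.5, 0.4   # local dynamics
c12, c21 = 0.1, 0.1   # interconnection terms, unknown to the estimators
l1, l2 = 0.4, 0.4     # local observer gains

x1, x2 = 1.0, -1.0    # true states
h1, h2 = 0.0, 0.0     # decentralized estimates
for _ in range(50):
    y1, y2 = x1, x2                                    # local measurements
    x1, x2 = a11 * x1 + c12 * x2, c21 * x1 + a22 * x2  # true system step
    h1 = a11 * h1 + l1 * (y1 - h1)                     # subsystem-1 update
    h2 = a22 * h2 + l2 * (y2 - h2)                     # subsystem-2 update
# both estimation errors decay despite the ignored coupling
```

The paper's contribution goes further, computing improved subsystem gain matrices via LMI optimization and proving bounded error covariance; this sketch only shows the decentralized structure itself.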
A new approach to implementing decentralized wastewater treatment concepts.
van Afferden, Manfred; Cardona, Jaime A; Lee, Mi-Yong; Subah, Ali; Müller, Roland A
2015-01-01
Planners and decision-makers in the wastewater sector are often confronted with the problem of identifying adequate development strategies and the most suitable finance schemes for decentralized wastewater infrastructure. This research focuses on providing an approach in support of such decision-making. It is based on basic principles that stand for an integrated perspective on sustainable wastewater management. We operationalize these principles by means of a geographic information system (GIS)-based approach, 'Assessment of Local Lowest-Cost Wastewater Solutions' (ALLOWS). The main product of ALLOWS is the identification of cost-effective local wastewater management solutions for any given demographic and physical context. By using universally available input data, the tool allows decision-makers to compare different wastewater solutions for any given wastewater situation. This paper introduces the ALLOWS-GIS tool. Its application and functionality are illustrated by assessing different wastewater solutions for two neighboring communities in rural Jordan.
Decentralized control of sound radiation using iterative loop recovery.
Schiller, Noah H; Cabell, Randolph H; Fuller, Chris R
2010-10-01
A decentralized model-based control strategy is designed to reduce low-frequency sound radiation from periodically stiffened panels. While decentralized control systems tend to be scalable, performance can be limited due to modeling error introduced by the unmodeled interaction between neighboring control units. Since bounds on modeling error are not known in advance, it is difficult to ensure the decentralized control system will be robust without making the controller overly conservative. Therefore an iterative approach is suggested, which utilizes frequency-shaped loop recovery. The approach accounts for modeling error introduced by neighboring control loops, requires no communication between subsystems, and is relatively simple. The control strategy is evaluated numerically using a model of a stiffened aluminum panel that is representative of the sidewall of an aircraft. Simulations demonstrate that the iterative approach can achieve significant reductions in radiated sound power from the stiffened panel without destabilizing neighboring control units.
Decentralized Control of Sound Radiation Using Iterative Loop Recovery
NASA Technical Reports Server (NTRS)
Schiller, Noah H.; Cabell, Randolph H.; Fuller, Chris R.
2009-01-01
A decentralized model-based control strategy is designed to reduce low-frequency sound radiation from periodically stiffened panels. While decentralized control systems tend to be scalable, performance can be limited due to modeling error introduced by the unmodeled interaction between neighboring control units. Since bounds on modeling error are not known in advance, it is difficult to ensure the decentralized control system will be robust without making the controller overly conservative. Therefore an iterative approach is suggested, which utilizes frequency-shaped loop recovery. The approach accounts for modeling error introduced by neighboring control loops, requires no communication between subsystems, and is relatively simple. The control strategy is evaluated numerically using a model of a stiffened aluminum panel that is representative of the sidewall of an aircraft. Simulations demonstrate that the iterative approach can achieve significant reductions in radiated sound power from the stiffened panel without destabilizing neighboring control units.
Barriers to Decentralized Teacher Education.
ERIC Educational Resources Information Center
Stuhr, Christian
In an effort to meet the demand for off-campus postsecondary education at the degree, diploma, or certificate levels, this report examines the barriers against and reasons for offering decentralized teacher education programs from universities to colleges in rural Canadian provinces. Several reasons exist for the demand for off-campus…
Centralized vs. Decentralized Child Mental Health Services
Adams, Milton S.
1977-01-01
One of the basic tenets of the Community Mental Health Center movement is that services should be provided in the consumers' community. Various centers across the country have attempted to do this in either a centralized or decentralized fashion. Historically, most health services have been provided centrally, a good example being the traditional general hospital with its centralized medical services. Over the years, some of these services have become decentralized to take the form of local health centers, health maintenance organizations, community clinics, etc, and now various large mental health centers are also being broken down into smaller community units. An example of each type of mental health facility is delineated here. PMID:904014
Centralized vs. decentralized child mental health services.
Adams, M S
1977-09-01
One of the basic tenets of the Community Mental Health Center movement is that services should be provided in the consumers' community. Various centers across the country have attempted to do this in either a centralized or decentralized fashion. Historically, most health services have been provided centrally, a good example being the traditional general hospital with its centralized medical services. Over the years, some of these services have become decentralized to take the form of local health centers, health maintenance organizations, community clinics, etc, and now various large mental health centers are also being broken down into smaller community units. An example of each type of mental health facility is delineated here.
Decentralized care for multidrug-resistant tuberculosis: a systematic review and meta-analysis.
Ho, Jennifer; Byrne, Anthony L; Linh, Nguyen N; Jaramillo, Ernesto; Fox, Greg J
2017-08-01
To assess the effectiveness of decentralized treatment and care for patients with multidrug-resistant (MDR) tuberculosis, in comparison with centralized approaches. We searched ClinicalTrials.gov, the Cochrane library, Embase®, Google Scholar, LILACS, PubMed®, Web of Science and the World Health Organization's portal of clinical trials for studies reporting treatment outcomes for decentralized and centralized care of MDR tuberculosis. The primary outcome was treatment success. When possible, we also evaluated death, loss to follow-up, treatment adherence and health-system costs. To obtain pooled relative risk (RR) estimates, we performed random-effects meta-analyses. Eight studies met the eligibility criteria for review inclusion. Six cohort studies, with 4026 participants in total, reported on treatment outcomes. The pooled RR estimate for decentralized versus centralized care for treatment success was 1.13 (95% CI: 1.01-1.27). The corresponding estimates were RR: 0.66 (95% CI: 0.38-1.13) for loss to follow-up, RR: 1.01 (95% CI: 0.67-1.52) for death, and RR: 1.07 (95% CI: 0.48-2.40) for treatment failure. Two of three studies evaluating health-care costs reported lower costs for the decentralized models of care than for the centralized models. Treatment success was more likely among patients with MDR tuberculosis treated using a decentralized approach. Further studies are required to explore the effectiveness of decentralized MDR tuberculosis care in a range of different settings.
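The random-effects pooling step can be sketched with the DerSimonian-Laird estimator; the study-level relative risks and standard errors below are hypothetical, since the abstract reports only the pooled values.

```python
import math

# Hypothetical study-level inputs: (RR, SE of log RR) per study.
# These are invented; the abstract gives only pooled results.
studies = [(1.05, 0.10), (1.20, 0.08), (1.10, 0.12), (1.25, 0.15)]

y = [math.log(rr) for rr, _ in studies]       # log relative risks
w = [1 / se**2 for _, se in studies]          # inverse-variance weights

ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)   # fixed-effect mean
q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))   # Cochran's Q
# DerSimonian-Laird between-study variance (floored at zero)
tau2 = max(0.0, (q - (len(studies) - 1)) /
           (sum(w) - sum(wi**2 for wi in w) / sum(w)))
# random-effects weights fold tau-squared into each study's variance
w_re = [1 / (se**2 + tau2) for _, se in studies]
rr_pooled = math.exp(sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re))
```

When between-study heterogeneity (tau-squared) is zero, the random-effects estimate collapses to the fixed-effect one; otherwise the weights shift toward equality across studies.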
Impact of digital radiography on clinical workflow.
May, G A; Deer, D D; Dackiewicz, D
2000-05-01
It is commonly accepted that digital radiography (DR) improves workflow and patient throughput compared with traditional film radiography or computed radiography (CR). DR eliminates the film development step and the time to acquire the image from a CR reader. In addition, the wide dynamic range of DR is such that the technologist can perform the quality-control (QC) step directly at the modality in a few seconds, rather than having to transport the newly acquired image to a centralized QC station for review. Furthermore, additional workflow efficiencies can be achieved with DR by employing tight radiology information system (RIS) integration. In the DR imaging environment, this provides for patient demographic information to be automatically downloaded from the RIS to populate the DR Digital Imaging and Communications in Medicine (DICOM) image header. To learn more about this workflow efficiency improvement, we performed a comparative study of workflow steps under three different conditions: traditional film/screen x-ray, DR without RIS integration (ie, manual entry of patient demographics), and DR with RIS integration. This study was performed at the Cleveland Clinic Foundation (Cleveland, OH) using a newly acquired amorphous silicon flat-panel DR system from Canon Medical Systems (Irvine, CA). Our data show that DR without RIS results in substantial workflow savings over traditional film/screen practice. There is an additional 30% reduction in total examination time using DR with RIS integration.
Conventions and workflows for using Situs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wriggers, Willy, E-mail: wriggers@biomachina.org
2012-04-01
Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed.
Decentralized adaptive control
NASA Technical Reports Server (NTRS)
Oh, B. J.; Jamshidi, M.; Seraji, H.
1988-01-01
A decentralized adaptive control scheme is proposed to stabilize and track nonlinear, interconnected subsystems with unknown parameters. The adaptation of the controller gain is derived using model reference adaptive control theory based on Lyapunov's direct method. The adaptive gains consist of a sigma, proportional, and integral combination of the measured and reference values of the corresponding subsystem. The proposed control is applied to the joint control of a two-link robot manipulator, and the performance in computer simulation corresponds to what is expected from the theoretical development.
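The sigma/proportional/integral adaptation structure can be sketched for a single scalar subsystem; the plant parameter, reference model, and gains below are illustrative values, not numbers from the paper.

```python
# Minimal model-reference adaptive control sketch for one scalar
# subsystem (Euler integration); all numerical values are invented.
dt, T = 0.001, 20.0
a = 1.0                         # unknown plant parameter: dx/dt = a*x + u
am, bm, r = 2.0, 2.0, 1.0       # reference model: dxm/dt = -am*xm + bm*r
gi, gp, sigma = 5.0, 1.0, 0.01  # integral, proportional, sigma-leakage gains

x = xm = zx = zr = 0.0
for _ in range(int(T / dt)):
    e = x - xm                   # tracking error vs. reference model
    thx = zx - gp * e * x        # adaptive gain: integral + proportional parts
    thr = zr - gp * e * r
    u = thx * x + thr * r        # control law with adaptive gains
    x += dt * (a * x + u)
    xm += dt * (-am * xm + bm * r)
    zx += dt * (-gi * e * x - sigma * zx)  # sigma-modified integral update
    zr += dt * (-gi * e * r - sigma * zr)
```

The sigma-leakage term keeps the integral gains bounded at the cost of a small residual tracking error; with sigma set to zero this reduces to the standard Lyapunov-based MRAC update.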
Algorithms for output feedback, multiple-model, and decentralized control problems
NASA Technical Reports Server (NTRS)
Halyo, N.; Broussard, J. R.
1984-01-01
The optimal stochastic output feedback, multiple-model, and decentralized control problems with dynamic compensation are formulated and discussed. Algorithms for each problem are presented, and their relationship to a basic output feedback algorithm is discussed. An aircraft control design problem is posed as a combined decentralized, multiple-model, output feedback problem. A control design is obtained using the combined algorithm. An analysis of the design is presented.
Standardizing clinical trials workflow representation in UML for international site comparison.
de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo
2010-11-09
With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials
Standardizing Clinical Trials Workflow Representation in UML for International Site Comparison
de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M. O.; Rodrigues, Maria J.; Shah, Jatin; Loures, Marco R.; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo
2010-01-01
Background With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Methods Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Results Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. Conclusions This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative
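The link the methods draw between time-motion data and discrete event simulation can be illustrated with a toy single-resource simulation; the arrival times and service duration below are invented for illustration.

```python
import heapq

# Toy discrete-event simulation of one clinic step with a single
# resource (e.g., one nurse); times are hypothetical, echoing how
# time-motion data feeds event-driven simulation software.
arrivals = [0, 10, 20]      # patient arrival times (minutes)
service = 15                # minutes of nurse time per patient

events = [(t, i) for i, t in enumerate(arrivals)]
heapq.heapify(events)       # process arrivals in time order
free_at = 0                 # when the nurse next becomes available
finish = {}
while events:
    t, i = heapq.heappop(events)
    start = max(t, free_at)         # wait if the nurse is busy
    free_at = start + service
    finish[i] = free_at             # completion time per patient
# finish → {0: 15, 1: 30, 2: 45}: queueing delay accumulates
```

Even this tiny model reproduces the bottleneck pattern the study observed ethnographically: when one role's activities dominate the workflow, waiting time compounds down the queue.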
Chan, Adrienne K; Mateyu, Gabriel; Jahn, Andreas; Schouten, Erik; Arora, Paul; Mlotha, William; Kambanji, Marion; van Lettow, Monique
2010-06-01
To assess the effect of decentralization (DC) of antiretroviral therapy (ART) provision in a rural district of Malawi using an integrated primary care model. Between October 2004 and December 2008, 8093 patients (63% women) were registered for ART. Of these, 3440 (43%) were decentralized to health centres for follow-up ART care. We applied multivariate regression analysis that adjusted for sex, age, clinical stage at initiation, type of regimen, presence of side effects because of ART, and duration of treatment and follow-up at site of analysis. Patients managed at health centres had lower mortality [adjusted OR 0.19 (95% C.I. 0.15-0.25)] and lower loss to follow-up (defaulted from treatment) [adjusted OR 0.48 (95% C.I. 0.40-0.58)]. During the first 10 months of follow-up, those decentralized to health centres were approximately 60% less likely to default than those not decentralized; after 10 months of follow-up, they were 40% less likely to default. DC was significantly associated with a reduced risk of death from 0 to 25 months of follow-up. The lower mortality may be explained by the selection of stable patients for DC, and by the mentorship and supportive supervision of lower-cadre health workers to identify and refer complicated cases. Decentralization of follow-up ART care to rural health facilities, using an integrated primary care model, appears to be a safe and effective way to rapidly scale up ART, improving both geographical equity in access to HIV-related services and adherence to ART.
ERIC Educational Resources Information Center
Stinnette, Lynn J.
Administrators are looking at decentralization as a solution to issues troubling schools, teachers, and students. The notion of decentralization is accompanied by two assumptions. First, decentralization will produce an improvement in education because classroom decision making will be more responsive to the specific needs of a school. Second, in…
Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J
To assess the impact of separate non-image interpretive task (NIT) and image-interpretive task (IIT) workflows in an academic neuroradiology practice. A prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording the frequency and duration of tasks performed. Tasks were categorized into separate image-interpretive and non-image interpretive workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant (CA) responsible for non-image interpretive tasks. Pre- and post-intervention data were compared. Following separation of image-interpretive and non-image interpretive workflows, time spent on image-interpretive tasks by the primary fellow increased from 53.8% to 73.2%, while time spent on non-image interpretive tasks decreased from 20.4% to 4.4%. Mean duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific non-image interpretive tasks, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The consult assistant experienced 29.4 task-switching events (TSEs) per hour. Rates of specific non-image interpretive tasks for the CA were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into NIT and IIT workflows substantially increased image interpretation time and decreased TSEs for the primary fellow. Consolidation of NITs into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.
A standard-enabled workflow for synthetic biology.
Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach
2017-06-15
A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools from a diversity of sources to be connected. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language (SBML) to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce many types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.
Mominah, Maher; Yunus, Faisel; Househ, Mowafa S
2013-01-01
Computerized provider order entry (CPOE) is a health informatics system that helps health care providers create and manage orders for medications and other health care services. Through the automation of the ordering process, CPOE has improved the overall efficiency of hospital processes and workflow. In Saudi Arabia, CPOE has been used for years, with only a few studies evaluating its impact on clinical workflow. In this paper, we discuss the experience of a local hospital with the use of CPOE and its impacts on clinical workflow. Results show that there are many issues related to the implementation and use of CPOE within Saudi Arabia that must be addressed, including design, training, medication errors, alert fatigue, and system dependence. Recommendations for improving CPOE use within Saudi Arabia are also discussed.
CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.
2013-12-01
As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors (SGTs), and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map for that region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over
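The hazard-curve combination step described above, folding per-rupture probabilities from an earthquake rupture forecast together with simulated ground-motion intensity measures, can be sketched in miniature. This is an illustrative toy under assumed data shapes, not SCEC's CyberShake code; the rupture probabilities and intensity values below are invented.

```python
def hazard_curve(im_levels, ruptures):
    """Toy PSHA combination: for each intensity-measure level x, combine each
    rupture's annual probability with the fraction of its simulated seismogram
    intensity measures exceeding x, yielding P(exceedance) per level."""
    curve = []
    for x in im_levels:
        p_no_exceed = 1.0
        for prob, ims in ruptures:  # (annual probability, simulated IM values)
            frac = sum(1 for im in ims if im > x) / len(ims)
            p_no_exceed *= (1.0 - prob * frac)
        curve.append((x, 1.0 - p_no_exceed))
    return curve

# Two hypothetical ruptures, each with a few simulated intensity measures.
ruptures = [(0.01, [0.2, 0.35, 0.5]), (0.002, [0.6, 0.8, 1.1])]
curve = hazard_curve([0.1, 0.3, 0.5, 0.7], ruptures)
```

As expected for a hazard curve, the exceedance probability is non-increasing as the intensity level rises.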
Scientific workflows as productivity tools for drug discovery.
Shon, John; Ohkawa, Hitomi; Hammer, Juergen
2008-05-01
Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike most approaches to scientific data handling and application integration, scientific workflows are applied by researchers to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.
Polya’s bees: A model of decentralized decision-making
Golman, Russell; Hagmann, David; Miller, John H.
2015-01-01
How do social systems make decisions with no single individual in control? We observe that a variety of natural systems, including colonies of ants and bees and perhaps even neurons in the human brain, make decentralized decisions using common processes involving information search with positive feedback and consensus choice through quorum sensing. We model this process with an urn scheme that runs until hitting a threshold, and we characterize an inherent tradeoff between the speed and the accuracy of a decision. The proposed common mechanism provides a robust and effective means by which a decentralized system can navigate the speed-accuracy tradeoff and make reasonably good, quick decisions in a variety of environments. Additionally, consensus choice exhibits systemic risk aversion even while individuals are idiosyncratically risk-neutral. This too is adaptive. The model illustrates how natural systems make decentralized decisions, illuminating a mechanism that engineers of social and artificial systems could imitate. PMID:26601255
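The urn scheme described above can be rendered as a small simulation. This is a hypothetical sketch, not the authors' exact model: here option quality biases each draw multiplicatively, reinforcement adds one ball of the drawn color, and a run stops when either count reaches a quorum threshold, exposing the speed-accuracy tradeoff as the threshold varies.

```python
import random

def urn_decision(quality=(1.0, 0.8), threshold=50, seed=None):
    """One run of a quality-biased Polya-style urn: positive-feedback
    reinforcement until one option's count reaches a quorum threshold.
    Returns (chosen option index, number of draws taken)."""
    rng = random.Random(seed)
    counts = [1, 1]  # one initial 'ball' per option
    steps = 0
    while max(counts) < threshold:
        # draw an option with probability proportional to count * quality
        w0 = counts[0] * quality[0]
        w1 = counts[1] * quality[1]
        i = 0 if rng.random() < w0 / (w0 + w1) else 1
        counts[i] += 1  # reinforce the drawn option
        steps += 1
    return counts.index(max(counts)), steps

# Higher thresholds trade speed for accuracy (option 0 is the better one).
for T in (10, 50, 200):
    runs = [urn_decision(threshold=T, seed=s) for s in range(500)]
    accuracy = sum(1 for choice, _ in runs if choice == 0) / len(runs)
    mean_steps = sum(steps for _, steps in runs) / len(runs)
    print(f"threshold={T:3d}  accuracy={accuracy:.2f}  mean steps={mean_steps:.0f}")
```

Raising the threshold mimics a stricter quorum: decisions take more draws but early random fluctuations are less likely to lock in the inferior option.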
Kolawole, Grace O.; Gilbert, Hannah N.; Dadem, Nancin Y.; Genberg, Becky L.; Agaba, Patricia A.; Okonkwo, Prosper; Agbaji, Oche O.; Ware, Norma C.
2017-01-01
Background. Decentralization of care and treatment for HIV infection in Africa makes services available in local health facilities. Decentralization has been associated with improved retention and comparable or superior treatment outcomes, but patient experiences are not well understood. Methods. We conducted a qualitative study of patient experiences in decentralized HIV care in Plateau State, north central Nigeria. Five decentralized care sites in the Plateau State Decentralization Initiative were purposefully selected. Ninety-three patients and 16 providers at these sites participated in individual interviews and focus groups. Data collection activities were audio-recorded and transcribed. Transcripts were inductively content analyzed to derive descriptive categories representing patient experiences of decentralized care. Results. Patient participants in this study experienced the transition to decentralized care as a series of “trade-offs.” Advantages cited included saving time and money on travel to clinic visits, avoiding dangers on the road, and the “family-like atmosphere” found in some decentralized clinics. Disadvantages were loss of access to ancillary services, reduced opportunities for interaction with providers, and increased risk of disclosure. Participants preferred decentralized services overall. Conclusion. Difficulty and cost of travel remain a fundamental barrier to accessing HIV care outside urban centers, suggesting increased availability of community-based services will be enthusiastically received. PMID:28331636
Workflow Challenges of Enterprise Imaging: HIMSS-SIIM Collaborative White Paper.
Towbin, Alexander J; Roth, Christopher J; Bronkalla, Mark; Cram, Dawn
2016-10-01
With the advent of digital cameras, there has been an explosion in the number of medical specialties using images to diagnose or document disease and guide interventions. In many specialties, these images are not added to the patient's electronic medical record and are not distributed so that other providers caring for the patient can view them. As hospitals begin to develop enterprise imaging strategies, they have found that there are multiple challenges preventing the implementation of systems to manage image capture, image upload, and image management. This HIMSS-SIIM white paper will describe the key workflow challenges related to enterprise imaging and offer suggestions for potential solutions to these challenges.
Decentralized care for multidrug-resistant tuberculosis: a systematic review and meta-analysis
Byrne, Anthony L; Linh, Nguyen N; Jaramillo, Ernesto; Fox, Greg J
2017-01-01
Abstract Objective To assess the effectiveness of decentralized treatment and care for patients with multidrug-resistant (MDR) tuberculosis, in comparison with centralized approaches. Methods We searched ClinicalTrials.gov, the Cochrane library, Embase®, Google Scholar, LILACS, PubMed®, Web of Science and the World Health Organization’s portal of clinical trials for studies reporting treatment outcomes for decentralized and centralized care of MDR tuberculosis. The primary outcome was treatment success. When possible, we also evaluated death, loss to follow-up, treatment adherence and health-system costs. To obtain pooled relative risk (RR) estimates, we performed random-effects meta-analyses. Findings Eight studies met the eligibility criteria for review inclusion. Six cohort studies, with 4026 participants in total, reported on treatment outcomes. The pooled RR estimate for decentralized versus centralized care for treatment success was 1.13 (95% CI: 1.01–1.27). The corresponding estimates were RR: 0.66 (95% CI: 0.38–1.13) for loss to follow-up, RR: 1.01 (95% CI: 0.67–1.52) for death and RR: 1.07 (95% CI: 0.48–2.40) for treatment failure. Two of three studies evaluating health-care costs reported lower costs for the decentralized models of care than for the centralized models. Conclusion Treatment success was more likely among patients with MDR tuberculosis treated using a decentralized approach. Further studies are required to explore the effectiveness of decentralized MDR tuberculosis care in a range of different settings. PMID:28804170
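Random-effects pooling of relative risks, as used for estimates like the RR of 1.13 above, is commonly done with the DerSimonian-Laird method. A minimal sketch, assuming study effects arrive as log relative risks with standard errors; the study numbers below are invented, not the review's data.

```python
import math

def dersimonian_laird(log_rr, se):
    """DerSimonian-Laird random-effects pooling of log relative risks.
    Returns the pooled RR and its 95% confidence interval."""
    k = len(log_rr)
    w = [1 / s ** 2 for s in se]                     # fixed-effect weights
    ybar = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - ybar) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)               # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in se]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    lo = math.exp(pooled - 1.96 * se_pooled)
    hi = math.exp(pooled + 1.96 * se_pooled)
    return math.exp(pooled), (lo, hi)

# Three hypothetical study estimates (log RR, standard error).
rr, ci = dersimonian_laird([0.10, 0.15, 0.08], [0.05, 0.07, 0.06])
```

When Cochran's Q is below its degrees of freedom, tau-squared is truncated at zero and the estimate reduces to the fixed-effect result.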
CamBAfx: Workflow Design, Implementation and Application for Neuroimaging
Ooi, Cinly; Bullmore, Edward T.; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John
2009-01-01
CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing, designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows, since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations such as programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism, designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modifying it. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs. PMID:19826470
ERIC Educational Resources Information Center
Welsh, Thomas; McGinn, Noel F.
Decentralization is arguably one of the most important phenomena to come on to the educational planning agenda in the last 15 years. Why a country should decentralize its educational decision-making process and which decisions should be decentralized are two questions that many decision-makers raise. This booklet is intended to provide educational…
A software tool to analyze clinical workflows from direct observations.
Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander
2015-01-01
Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for recording observations fail to support systematic data collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the OntoHealth project to build, store and analyze observations of diabetes routine consultations.
Decentralized Modular Systems Versus Centralized Systems.
ERIC Educational Resources Information Center
Crossey, R. E.
Building design, planning, and construction programing for modular decentralized mechanical building systems are outlined in terms of costs, performance, expansion and flexibility. Design strategy, approach, and guidelines for implementing such systems for buildings are suggested, with emphasis on mechanical equipment and building element…
Satellite Power System (SPS) centralization/decentralization
NASA Technical Reports Server (NTRS)
Naisbitt, J.
1978-01-01
The decentralization of government in the United States of America is described, and its effect on the solution of energy problems is given. The human response to the introduction of new technologies is considered, as well as the behavioral aspects of multiple options.
Northeastern Illinois RTA Decentralized Paratransit Brokerage Program
DOT National Transportation Integrated Search
1982-09-01
This document presents a review and assessment of the Northeastern Illinois Regional Transportation Authority's (RTA) Paratransit Brokerage Demonstration Program which involved six projects implemented by local governments under RTA's decentralized b...
Identifying impact of software dependencies on replicability of biomedical workflows.
Miksa, Tomasz; Rauber, Andreas; Mina, Eleni
2016-12-01
Complex data-driven experiments form the basis of biomedical research. Recent findings warn that the context in which the software is run, that is, the infrastructure and third-party dependencies, can have a crucial impact on the final results delivered by a computational experiment. This implies that in order to replicate the same result, not only must the same data be used, but the experiment must also be run on an equivalent software stack. In this paper we present the VFramework, which enables assessing the replicability of workflows. It identifies whether any differences in software dependencies exist between two executions of the same workflow and whether they have an impact on the produced results. We also conduct a case study in which we investigate the impact of software dependencies on the replicability of Taverna workflows used in biomedical research of Huntington's disease. We re-execute the analysed workflows in environments differing in operating system distribution and configuration. The results show that the VFramework can be used to identify the impact of software dependencies on the replicability of biomedical workflows. Furthermore, we observe that despite the fact that the workflows are executed in a controlled environment, they still depend on specific tools installed in the environment. The context model used by the VFramework improves on the deficiencies of provenance traces and also documents such tools. Based on our findings we define guidelines for workflow owners that enable them to improve the replicability of their workflows. Copyright © 2016 Elsevier Inc. All rights reserved.
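The core comparison such a framework performs, checking whether two executions ran on equivalent software stacks, can be illustrated as a diff over captured package-version mappings. This is a sketch under assumed data shapes, not the VFramework's actual interface; the package names are illustrative.

```python
def dependency_diff(env_a, env_b):
    """Compare package->version mappings captured from two executions of the
    same workflow; report packages whose versions differ or that are missing
    from one environment (None marks a missing package)."""
    names = set(env_a) | set(env_b)
    diff = {}
    for name in sorted(names):
        va, vb = env_a.get(name), env_b.get(name)
        if va != vb:
            diff[name] = (va, vb)
    return diff

run1 = {"python": "3.8.10", "numpy": "1.19.5", "taverna-engine": "2.5"}
run2 = {"python": "3.8.10", "numpy": "1.21.0"}
# → {'numpy': ('1.19.5', '1.21.0'), 'taverna-engine': ('2.5', None)}
print(dependency_diff(run1, run2))
```

An empty diff is necessary but not sufficient for replicability: as the study notes, results can also depend on tools present in the environment that no dependency manifest records.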
Workflow and Electronic Health Records in Small Medical Practices
Ramaiah, Mala; Subrahmanian, Eswaran; Sriram, Ram D; Lide, Bettijoyce B
2012-01-01
This paper analyzes the workflow and implementation of electronic health record (EHR) systems across different functions in small physician offices. We characterize the differences in the offices based on the levels of computerization in terms of workflow, sources of time delay, and barriers to using EHR systems to support the entire workflow. The study was based on a combination of questionnaires, interviews, in situ observations, and data collection efforts. This study was not intended to be a full-scale time-and-motion study with precise measurements but was intended to provide an overview of the potential sources of delays while performing office tasks. The study follows an interpretive model of case studies rather than a large-sample statistical survey of practices. To identify time-consuming tasks, workflow maps were created based on the aggregated data from the offices. The results from the study show that specialty physicians are more favorable toward adopting EHR systems than primary care physicians are. The barriers to adoption of EHR systems by primary care physicians can be attributed to the complex workflows that exist in primary care physician offices, leading to nonstandardized workflow structures and practices. Also, primary care physicians would benefit more from EHR systems if the systems could interact with external entities. PMID:22737096
NASA Astrophysics Data System (ADS)
Yang, Xin; He, Zhen-yu; Jiang, Xiao-bo; Lin, Mao-sheng; Zhong, Ning-shan; Hu, Jiang; Qi, Zhen-yu; Bao, Yong; Li, Qiao-qiao; Li, Bao-yue; Hu, Lian-ying; Lin, Cheng-guang; Gao, Yuan-hong; Liu, Hui; Huang, Xiao-yan; Deng, Xiao-wu; Xia, Yun-fei; Liu, Meng-zhong; Sun, Ying
2017-03-01
To meet the special demands in China and the particular needs of the radiotherapy department, a MOSAIQ Integration Platform CHN (MIP), based on the radiation therapy (RT) workflow, has been developed as a supplementary system to Elekta MOSAIQ. The MIP adopts a client-server (C/S) architecture, and its database, based on the Treatment Planning System (TPS) and MOSAIQ SQL Server 2008, runs on the hospital's local network. Five network servers form the core hardware, supplying data storage and network services built on cloud services. The core software is developed in C# on the Microsoft Visual Studio platform. The MIP server can provide network services, including entry, query, statistics and printing of information, to about 200 workstations at the same time. The MIP has been implemented over the past one and a half years, and several practical patient-oriented functions have been developed; it now covers almost the whole radiation therapy workflow. There are 15 function modules, such as Notice, Appointment, Billing, Document Management (application/execution) and System Management. By June 2016, the data recorded in the MIP were as follows: 13,546 patients, 13,533 plan applications, 15,475 RT records, 14,656 RT summaries, 567,048 billing records and 506,612 workload records. The MIP, based on the RT workflow, has been successfully developed and clinically implemented with real-time performance, data security and stable operation; it has proven user-friendly and has significantly improved the efficiency of the department. It is key to facilitating information sharing and department management. More functions can be added or modified to further enhance its potential in research and clinical practice.
Modeling and stability of segmented reflector telescopes - A decentralized approach
NASA Technical Reports Server (NTRS)
Ryaciotaki-Boussalis, Helen A.; Ih, Che-Hang Charles
1990-01-01
The decentralized control of a segmented reflector telescope, based on a finite-element model of its structure, is considered at the panel level. Each panel is first treated as an isolated subsystem, so that controller design is performed independently at the local level and then applied to the composite system for stability analysis. The panel-level control laws were designed by means of pole placement using local output feedback. Simulation results show better than 1000:1 vibration attenuation in panel position compared to the open-loop system. It is shown that the overall closed-loop system is exponentially stable provided that certain conditions are met. The advantage of the decentralized approach is that the design is performed in terms of low-dimensionality subsystems, drastically reducing the computational complexity of the design.
Patterson, Emily S.; Lowry, Svetlana Z.; Ramaiah, Mala; Gibbons, Michael C.; Brick, David; Calco, Robert; Matton, Greg; Miller, Anne; Makar, Ellen; Ferrer, Jorge A.
2015-01-01
NIST recommendations to improve workflow in ambulatory care using an EHR provide a first step in moving from a billing-centered perspective on how to maintain accurate, comprehensive, and up-to-date information about a group of patients to a clinician-centered perspective. These recommendations point the way towards a “patient visit management system,” which incorporates broader notions of supporting workload management, supporting flexible flow of patients and tasks, enabling accountable distributed work across members of the clinical team, and supporting dynamic tracking of steps in tasks that have longer time distributions. PMID:26290887
Decaf: Decoupled Dataflows for In Situ High-Performance Workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dreher, M.; Peterka, T.
Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components has been demonstrated in situ on HPC systems.
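A workflow graph of coupled tasks with resource-allocated dataflow links, in the spirit of the Python description layer mentioned above, might be sketched as follows. The class and argument names here are invented for illustration and do not reproduce Decaf's real API.

```python
# Illustrative only: a minimal workflow-graph description in the spirit of a
# dataflow system like Decaf, where links between tasks are themselves
# allocated processes and can run user code (e.g. data redistribution).
class Workflow:
    def __init__(self):
        self.nodes = {}
        self.links = []

    def add_node(self, name, start_proc, nprocs, func):
        """Register a task with its MPI process range and entry function."""
        self.nodes[name] = dict(start=start_proc, nprocs=nprocs, func=func)

    def add_link(self, source, target, start_proc, nprocs, redistribute="count"):
        """Register a dataflow link with its own resources; 'redistribute'
        names the transformation applied in transit."""
        self.links.append(dict(src=source, dst=target, start=start_proc,
                               nprocs=nprocs, redist=redistribute))

wf = Workflow()
wf.add_node("simulation", start_proc=0, nprocs=4, func="md_run")
wf.add_node("analysis", start_proc=6, nprocs=2, func="density_estimate")
wf.add_link("simulation", "analysis", start_proc=4, nprocs=2)
```

The key design point mirrored here is that a link is not just an edge: it owns processes of its own, which is what lets the dataflow perform transformations between producer and consumer rather than merely forwarding bytes.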
The Effect of Fiscal Decentralization on Under-five Mortality in Iran: A Panel Data Analysis.
Samadi, Ali Hussein; Keshtkaran, Ali; Kavosi, Zahra; Vahedi, Sajad
2013-11-01
Fiscal Decentralization (FD) is in many cases encouraged as a strong means of improving efficiency and equity in the provision of public goods, such as healthcare services. This has led researchers to empirically examine the relationship between fiscal decentralization indicators and health outcomes. In this study we examine the effect of Fiscal Decentralization in Medical Universities (FDMU) and Fiscal Decentralization in Provincial Revenues (FDPR) on the Under-Five Mortality Rate (U5M) in provinces of Iran over the period between 2007 and 2010. We employed panel data methods in this article. The results of the Pesaran CD test demonstrated that most of the variables used in the analysis were cross-sectionally dependent. The Hausman test results suggested that a fixed-effects model was more appropriate for our data. We estimated the fixed-effects model using Driscoll-Kraay standard errors as a remedy for cross-sectional dependency. According to the findings of this research, fiscal decentralization in the health sector had a negative impact on U5M. On the other hand, fiscal decentralization in provincial revenues had a positive impact on U5M. In addition, U5M had a negative association with the density of physicians, hospital beds, and provincial GDP per capita, but a positive relationship with the Gini coefficient and unemployment. The findings of our study indicate that fiscal decentralization should be emphasized in the health sector. The results suggest the need for caution in the implementation of fiscal decentralization in provincial revenues.
The MPO system for automatic workflow documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abla, G.; Coviello, E. N.; Flanagan, S. M.
Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This article presents the Metadata, Provenance, and Ontology (MPO) System, software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and an Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.
The Effect of Political Decentralization on School Leadership in German Vocational Schools
ERIC Educational Resources Information Center
Gessler, Michael; Ashmawy, Iman K.
2016-01-01
In this explorative qualitative study the effect of political decentralization on vocational school leadership is investigated. Through conducting structural interviews with 15 school principals in the states of Bremen and Lower Saxony in Germany, the study was able to conclude that political decentralization entails the creation of elected bodies…
Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M
2008-08-08
Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation is therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept
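The moving-median window statistic at the core of SigWin-detector can be illustrated with a minimal sketch; this is plain Python, not the WS-VLAM implementation, and the window size and threshold here are arbitrary:

```python
import bisect

def moving_medians(values, w):
    """Median of every length-w window, maintaining a sorted window so
    each step costs one binary-search delete and one binary-search insert."""
    window = sorted(values[:w])
    med = lambda s: s[w // 2] if w % 2 else 0.5 * (s[w // 2 - 1] + s[w // 2])
    out = [med(window)]
    for i in range(w, len(values)):
        window.pop(bisect.bisect_left(window, values[i - w]))   # drop oldest
        bisect.insort(window, values[i])                        # add newest
        out.append(med(window))
    return out

def enriched_windows(values, w, threshold):
    """Indices of windows whose median exceeds a significance threshold."""
    return [i for i, m in enumerate(moving_medians(values, w)) if m > threshold]

# Toy expression profile along a chromosome: two elevated regions
expr = [1, 1, 9, 9, 9, 1, 1, 9, 9, 9, 9, 1]
print(enriched_windows(expr, 3, 5))
```

In the real tool the threshold comes from the analytical null distribution mentioned above; here it is a fixed cut-off purely for illustration.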
Decentralized Grid Scheduling with Evolutionary Fuzzy Systems
NASA Astrophysics Data System (ADS)
Fölling, Alexander; Grimme, Christian; Lepping, Joachim; Papaspyrou, Alexander
In this paper, we address the problem of finding workload exchange policies for decentralized Computational Grids using an Evolutionary Fuzzy System. To this end, we establish a non-invasive collaboration model on the Grid layer which requires minimal information about the participating High Performance and High Throughput Computing (HPC/HTC) centers and which leaves the local resource managers completely untouched. In this environment of fully autonomous sites, independent users are assumed to submit their jobs to the Grid middleware layer of their local site, which in turn decides on the delegation and execution either on the local system or on remote sites in a situation-dependent, adaptive way. We find for different scenarios that the exchange policies show good performance characteristics not only with respect to traditional metrics such as average weighted response time and utilization, but also in terms of robustness and stability in changing environments.
MO-D-213-01: Workflow Monitoring for a High Volume Radiation Oncology Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laub, S; Dunn, M; Galbreath, G
2015-06-15
Purpose: Implement a center-wide communication system that increases interdepartmental transparency and accountability while decreasing redundant work and treatment delays by actively monitoring treatment planning workflow. Methods: Intake Management System (IMS), a program developed by ProCure Treatment Centers Inc., is a multi-function database that stores treatment planning process information. It was devised to work with the oncology information system (Mosaiq) to streamline interdepartmental workflow. Each step in the treatment planning process is visually represented and timelines for completion of individual tasks are established within the software. The currently active step of each patient’s planning process is highlighted either red or green according to whether the initially allocated amount of time has passed for the given process. This information is displayed as a Treatment Planning Process Monitor (TPPM), which is shown on screens in the relevant departments throughout the center. This display also includes the individuals who are responsible for each task. IMS is driven by Mosaiq’s quality checklist (QCL) functionality. Each step in the workflow is initiated by a Mosaiq user sending the responsible party a QCL assignment. IMS is connected to Mosaiq and the sending or completing of a QCL updates the associated field in the TPPM to the appropriate status. Results: Approximately one patient a week is identified during the workflow process as needing to have his/her treatment start date modified or resources re-allocated to address the most urgent cases. Being able to identify a realistic timeline for planning each patient and having multiple departments communicate their limitations and time constraints allows for quality plans to be developed and implemented without overburdening any one department. Conclusion: Monitoring the progression of the treatment planning process has increased transparency between departments, which enables efficient
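The red/green timeline logic of the TPPM display can be sketched as a tiny status monitor; the task names and allotted durations below are hypothetical, not ProCure's IMS configuration:

```python
from datetime import datetime, timedelta

# Allotted time per planning step (hypothetical values)
ALLOTTED = {
    "simulation": timedelta(days=1),
    "contouring": timedelta(days=2),
    "plan_review": timedelta(days=1),
}

def step_status(step, started_at, now):
    """GREEN while within the allotted window for the step, RED once exceeded."""
    return "GREEN" if now - started_at <= ALLOTTED[step] else "RED"

now = datetime(2015, 6, 15, 12, 0)
print(step_status("contouring", now - timedelta(days=1), now))   # within 2 days
print(step_status("simulation", now - timedelta(days=3), now))   # overdue
```

In the actual system the `started_at` events would arrive from Mosaiq QCL assignments rather than being passed in directly.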
Text mining meets workflow: linking U-Compare with Taverna
Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia
2010-01-01
Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690
Mohammed, Abrar Juhar; Inoue, Makoto
2014-06-15
This paper posits a Modified Actor-Power-Accountability Framework (MAPAF) that makes three major improvements on the Actor-Power-Accountability Framework (APAF) developed by Agrawal and Ribot (1999). These improvements emphasize the nature of decentralized property rights, linking the outputs of decentralization with its outcomes and the inclusion of contextual factors. Applying MAPAF to analyze outputs and outcomes from two major decentralized forest policies in Ethiopia, i.e., delegation and devolution, has demonstrated the following strengths of the framework. First, by incorporating vital bundles of property rights into APAF, MAPAF creates a common ground for exploring and comparing the extent of democratization achieved by different decentralizing reforms. Second, the inclusion of social and environmental outcomes in MAPAF makes it possible to link the output of decentralization with local level outcomes. Finally, the addition of contextual factors enhances MAPAF's explanatory power by providing room for investigating exogenous factors other than democratization that contribute to the outcomes of decentralization reforms. Copyright © 2014 Elsevier Ltd. All rights reserved.
Integrating prediction, provenance, and optimization into high energy workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schram, M.; Bansal, V.; Friese, R. D.
We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.
A review on full-scale decentralized wastewater treatment systems: techno-economical approach.
Singh, Nitin Kumar; Kazmi, A A; Starkl, M
2015-01-01
As a solution to the shortcomings of centralized systems, over the last two decades large numbers of decentralized wastewater treatment plants of different technology types have been installed all over the world. This paper aims at deriving lessons learned from existing decentralized wastewater treatment plants that are relevant for smaller towns (and peri-urban areas) as well as rural communities in developing countries, such as India. Only full-scale implemented decentralized wastewater treatment systems are reviewed in terms of performance, land area requirement, capital cost, and operation and maintenance costs. The results are presented in tables comparing different technology types with respect to those parameters.
A discrete decentralized variable structure robotic controller
NASA Technical Reports Server (NTRS)
Tumeh, Zuheir S.
1989-01-01
A decentralized trajectory controller for robotic manipulators is designed and tested using a multiprocessor architecture and a PUMA 560 robot arm. The controller is made up of a nominal model-based component and a correction component based on a variable structure suction control approach. The second control component is designed using bounds on the difference between the used and actual values of the model parameters. Since the continuous manipulator system is digitally controlled along a trajectory, a discretized equivalent model of the manipulator is used to derive the controller. The motivation for decentralized control is that the derived algorithms can be executed in parallel using a distributed, relatively inexpensive, architecture where each joint is assigned a microprocessor. Nonlinear interaction and coupling between joints is treated as a disturbance torque that is estimated and compensated for.
WRF4SG: A Scientific Gateway for climate experiment workflows
NASA Astrophysics Data System (ADS)
Blanco, Carlos; Cofino, Antonio S.; Fernandez-Quiruelas, Valvanuz
2013-04-01
The Weather Research and Forecasting model (WRF) is a community-driven and public domain model widely used by the weather and climate communities. In contrast to other application-oriented models, WRF provides a flexible and computationally efficient framework which allows solving a variety of problems for different time-scales, from weather forecast to climate change projection. Furthermore, WRF is also widely used as a research tool in modeling physics, dynamics, and data assimilation by the research community. Climate experiment workflows based on WRF are nowadays among the most cutting-edge applications. These workflows are complex due to both their large storage requirements and the huge number of simulations executed. In order to manage this, we have developed a scientific gateway (SG) called WRF for Scientific Gateway (WRF4SG), based on the WS-PGRADE/gUSE and WRF4G frameworks, to help meet WRF users' needs (see [1] and [2]). WRF4SG provides services for different use cases that describe the interactions between WRF users and the WRF4SG interface in order to show how to run a climate experiment. As WS-PGRADE/gUSE uses portlets (see [1]) to interact with users, its portlets will support these use cases. A typical experiment to be carried out by a WRF user consists of a high-resolution regional re-forecast. These re-forecasts are common experiments used as input data for wind power energy and natural hazard applications (wind and precipitation fields). In the cases below, the user is able to access different resources such as the Grid, due to the fact that WRF needs a huge amount of computing resources in order to generate useful simulations: * Resource configuration and user authentication: The first step is to authenticate on users' Grid resources by virtual organizations. After login, the user is able to select which virtual organization is going to be used by the experiment. * Data assimilation: In order to assimilate the data sources
Decentralized control of Markovian decision processes: Existence of sigma-admissible policies
NASA Technical Reports Server (NTRS)
Greenland, A.
1980-01-01
The problem of formulating and analyzing Markov decision models having decentralized information and decision patterns is examined. Included are basic examples as well as the mathematical preliminaries needed to understand Markov decision models and, further, to superimpose decentralized decision structures on them. The notion of a variance admissible policy for the model is introduced and it is proved that there exist (possibly nondeterministic) optimal policies within the class of variance admissible policies. Directions for further research are explored.
DQM: Decentralized Quadratically Approximated Alternating Direction Method of Multipliers
NASA Astrophysics Data System (ADS)
Mokhtari, Aryan; Shi, Wei; Ling, Qing; Ribeiro, Alejandro
2016-10-01
This paper considers decentralized consensus optimization problems where nodes of a network have access to different summands of a global objective function. Nodes cooperate to minimize the global objective by exchanging information with neighbors only. A decentralized version of the alternating directions method of multipliers (DADMM) is a common method for solving this category of problems. DADMM exhibits linear convergence rate to the optimal objective but its implementation requires solving a convex optimization problem at each iteration. This can be computationally costly and may result in large overall convergence times. The decentralized quadratically approximated ADMM algorithm (DQM), which minimizes a quadratic approximation of the objective function that DADMM minimizes at each iteration, is proposed here. The consequent reduction in computational time is shown to have minimal effect on convergence properties. Convergence still proceeds at a linear rate with a guaranteed constant that is asymptotically equivalent to the DADMM linear convergence rate constant. Numerical results demonstrate advantages of DQM relative to DADMM and other alternatives in a logistic regression problem.
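The neighbor-only cooperation described above can be illustrated with a simple decentralized gradient sketch on a ring network; this is a generic stand-in, not the DQM or DADMM algorithm, and the topology, mixing weights, and step size are arbitrary choices:

```python
import numpy as np

def decentralized_gd(a, steps=2000, alpha=0.01):
    """Each node i holds the local objective f_i(x) = (x - a[i])^2 and can
    only see its two ring neighbors. Nodes average neighbor states with
    doubly stochastic weights, then take a local gradient step; the states
    settle near mean(a), the minimizer of the global sum of objectives."""
    x = np.array(a, dtype=float)            # start at each local minimizer
    for _ in range(steps):
        mixed = 0.5 * x + 0.25 * np.roll(x, 1) + 0.25 * np.roll(x, -1)
        x = mixed - alpha * 2.0 * (x - a)   # gradient of (x - a_i)^2 is 2(x - a_i)
    return x

a = np.array([1.0, 5.0, 3.0, 7.0])
x = decentralized_gd(a)
print(np.round(x, 2))   # all entries close to mean(a) = 4.0
```

With a constant step size this scheme only reaches a neighborhood of consensus; the point of DQM and DADMM is to attain exact linear convergence while keeping per-iteration cost low.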
An empirical examination of the impacts of decentralized nursing unit design.
Pati, Debajyoti; Harvey, Thomas E; Redden, Pamela; Summers, Barbara; Pati, Sipra
2015-01-01
The objective of the study was to examine the impact of decentralization on operational efficiency, staff well-being, and teamwork on three inpatient units. Decentralized unit operations and the corresponding physical design solution were hypothesized to positively affect several concerns: productive use of nursing time, staff stress, walking distances, and teamwork, among others. With a wide adoption of the concept, empirical evidence on the impact of decentralization was warranted. A multimethod, before-and-after, quasi-experimental design was adopted for the study, focusing on five issues, namely, (1) how nurses spend their time, (2) walking distance, (3) acute stress, (4) productivity, and (5) teamwork. Data on all five issues were collected on three older units with a centralized operational model (before the move). The same set of data, with identical tools and measures, were collected on the same units after moving into new physical units with a decentralized operational model. Data were collected during spring and fall of 2011. Documentation, nurse station use, medication room use, and supplies room use showed consistent change across the three units. Walking distance increased (statistically significant) on two of the three units. Self-reported level of collaboration decreased, although assessment of the physical facility for collaboration increased. Decentralized nursing and physical design models potentially result in quality of work improvements associated with documentation, medication, and supplies. However, there are unexpected consequences associated with walking, and staff collaboration and teamwork. The solution to the unexpected consequences may lie in operational interventions and greater emphasis on culture change. © The Author(s) 2015.
Prototype of Kepler Processing Workflows For Microscopy And Neuroinformatics
Astakhov, V.; Bandrowski, A.; Gupta, A.; Kulungowski, A.W.; Grethe, J.S.; Bouwer, J.; Molina, T.; Rowley, V.; Penticoff, S.; Terada, M.; Wong, W.; Hakozaki, H.; Kwon, O.; Martone, M.E.; Ellisman, M.
2016-01-01
We report on progress of employing the Kepler workflow engine to prototype “end-to-end” application integration workflows that concern data coming from microscopes deployed at the National Center for Microscopy Imaging Research (NCMIR). This system is built upon the mature code base of the Cell Centered Database (CCDB) and integrated rule-oriented data system (IRODS) for distributed storage. It provides integration with external projects such as the Whole Brain Catalog (WBC) and Neuroscience Information Framework (NIF), which benefit from NCMIR data. We also report on specific workflows which spawn from main workflows and perform data fusion and orchestration of Web services specific for the NIF project. This “Brain data flow” presents a user with categorized information about sources that have information on various brain regions. PMID:28479932
Theoretical Perspectives on School District Decentralization.
ERIC Educational Resources Information Center
O'Shea, David
Drawing largely on data from Los Angeles, but with reference to other cities where appropriate, this paper attempts to clarify the distinctive positions taken by advocates of community control as opposed to proponents of administrative decentralization. While community control is essentially a political demand, oriented toward citizens influencing…
Brady, Anne-Marie; Byrne, Gobnait; Quirke, Mary Brigid; Lynch, Aine; Ennis, Shauna; Bhangu, Jaspreet; Prendergast, Meabh
2017-11-01
This study aimed to evaluate the nature and type of communication and workflow arrangements between nurses and doctors out-of-hours (OOH). Effective communication and workflow arrangements between nurses and doctors are essential to minimize risk in hospital settings, particularly in the out-of-hours period. Timely patient flow is a priority for all healthcare organizations and the quality of communication and workflow arrangements influences patient safety. Qualitative descriptive design and data collection methods included focus groups and individual interviews. A 500-bed tertiary referral acute hospital in Ireland. Junior and senior Non-Consultant Hospital Doctors, staff nurses and nurse managers. Both nurses and doctors acknowledged the importance of good interdisciplinary communication and collaborative working, in sustaining effective workflow and enabling a supportive working environment and patient safety. Indeed, issues of safety and missed care OOH were found to be primarily due to difficulties of communication and workflow. Medical workflow OOH is often dependent on cues and communication to/from nursing. However, communication systems and, in particular the bleep system, considered central to the process of communication between doctors and nurses OOH, can contribute to workflow challenges and increased staff stress. It was reported as commonplace for routine work, that should be completed during normal hours, to fall into OOH when resources were most limited, further compounding risk to patient safety. Enhancement of communication strategies between nurses and doctors has the potential to remove barriers to effective decision-making and patient flow. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
de Castro, Alberto; Rosales, Patricia; Marcos, Susana
2007-03-01
To measure tilt and decentration of intraocular lenses (IOLs) with Scheimpflug and Purkinje imaging systems, in physical model eyes with known amounts of tilt and decentration and in patients. Instituto de Optica Daza de Valdés, Consejo Superior de Investigaciones Científicas, Madrid, Spain. Measurements of IOL tilt and decentration were obtained using a commercial Scheimpflug system (Pentacam, Oculus), custom algorithms, and a custom-built Purkinje imaging apparatus. Twenty-five Scheimpflug images of the anterior segment of the eye were obtained at different meridians. Custom algorithms were used to process the images (correction of geometrical distortion, edge detection, and curve fittings). Intraocular lens tilt and decentration were estimated by fitting sinusoidal functions to the projections of the pupillary axis and IOL axis in each image. The Purkinje imaging system captures pupil images showing reflections of light from the anterior corneal surface and anterior and posterior lens surfaces. Custom algorithms were used to detect the Purkinje image locations and estimate IOL tilt and decentration based on a linear system equation and computer eye models with individual biometry. Both methods were validated with a physical model eye in which IOL tilt and decentration can be set nominally. Twenty-one eyes of 12 patients with IOLs were measured with both systems. Measurements of the physical model eye showed an absolute discrepancy between nominal and measured values of 0.279 degree (Purkinje) and 0.243 degree (Scheimpflug) for tilt and 0.094 mm (Purkinje) and 0.228 mm (Scheimpflug) for decentration. In patients, the mean tilt was less than 2.6 degrees and the mean decentration less than 0.4 mm. Both techniques showed mirror symmetry between right eyes and left eyes for tilt around the vertical axis and for decentration in the horizontal axis. Both systems showed high reproducibility. Validation experiments on physical model eyes showed slightly higher accuracy
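The sinusoidal-fitting step used to recover tilt from multi-meridian Scheimpflug projections can be sketched with linear least squares on synthetic data; this is an illustration of the general technique, not the authors' custom algorithms:

```python
import math

import numpy as np

def fit_sinusoid(theta, y):
    """Least-squares fit of y = A*cos(theta) + B*sin(theta) + C.
    The amplitude sqrt(A^2 + B^2) and phase atan2(B, A) give the
    magnitude and axis of the projected tilt across meridians."""
    X = np.column_stack([np.cos(theta), np.sin(theta), np.ones_like(theta)])
    (A, B, C), *_ = np.linalg.lstsq(X, y, rcond=None)
    return math.hypot(A, B), math.atan2(B, A), C

# Synthetic projections at 25 meridians, matching the 25-image protocol
theta = np.linspace(0.0, 2.0 * math.pi, 25, endpoint=False)
true_amp, true_phase, offset = 0.8, 0.3, 2.0
y = true_amp * np.cos(theta - true_phase) + offset
amp, phase, c = fit_sinusoid(theta, y)
print(round(amp, 3), round(phase, 3), round(c, 3))
```

Because the model is linear in (A, B, C), no iterative nonlinear optimizer is needed; on noisy clinical projections the same fit returns the best-fitting sinusoid in the least-squares sense.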
Bossert, Thomas J; Bowser, Diana M; Amenyah, Johnnie K
2007-03-01
Efficient logistics systems move essential medicines down the supply chain to the service delivery point, and then to the end user. Experts on logistics systems tend to see the supply chain as requiring centralized control to be most effective. However, many health reforms have involved decentralization, which experts fear has disrupted the supply chain and made systems less effective. There is no consensus on an appropriate methodology for assessing the effectiveness of decentralization in general, and only a few studies have attempted to address decentralization of logistics systems. This paper sets out a framework and methodology of a pioneering exploratory study that examines the experiences of decentralization in two countries, Guatemala and Ghana, and presents suggestive results of how decentralization affected the performance of their logistics systems. The analytical approach assessed decentralization using the principal author's 'decision space' approach, which defines decentralization as the degree of choice that local officials have over different health system functions. In this case the approach focused on 15 different logistics functions and measured the relationship between the degree of choice and indicators of performance for each of the functions. The results of both studies indicate that less choice (i.e. more centralized) was associated with better performance for two key functions (inventory control and information systems), while more choice (i.e. more decentralized) over planning and budgeting was associated with better performance. With different systems of procurement in Ghana and Guatemala, we found that a system with some elements of procurement that are centralized (selection of firms and prices fixed by national tender) was positively related in Guatemala but negatively related in Ghana, where a system of 'cash and carry' cost recovery allowed more local choice. The authors conclude that logistics systems can be effectively
Quantitative workflow based on NN for weighting criteria in landfill suitability mapping
NASA Astrophysics Data System (ADS)
Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul
2017-10-01
Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new workflow is presented to form suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network was employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final learned network was used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training and testing datasets, respectively. The workflow was found to be capable of reducing human interference to generate highly reliable maps. The proposed workflow reveals the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
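One common way to read criteria weights out of a trained network is to normalize the magnitudes of the learned input weights. The sketch below uses a single-layer logistic model as a stand-in; the single layer, the toy three-criteria data, and this extraction scheme are assumptions, and the paper's 34-criteria feed-forward procedure may differ:

```python
import numpy as np

def criterion_weights(X, y, epochs=500, lr=0.1):
    """Fit a single-layer logistic model by gradient descent, then
    normalize the absolute input weights so they sum to 1, giving a
    simple data-driven importance score per criterion."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # sigmoid activations
        w -= lr * (X.T @ (p - y)) / len(y)         # cross-entropy gradient
        b -= lr * float(np.mean(p - y))
    mags = np.abs(w)
    return mags / mags.sum()

# Toy data: suitability depends strongly on criterion 0, weakly on 1, not on 2
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 3))
y = (2.0 * X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
weights = criterion_weights(X, y)
print(np.round(weights, 2))
```

The recovered weight vector concentrates on the criteria that actually drive the labels, which is the behavior the MCDA weighting stage needs from the learned network.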
Ertan, Aylin; Karacal, Humeyra
2008-10-01
To compare accuracy of LASIK flap and INTACS centration following femtosecond laser application in normal and keratoconic eyes. This is a retrospective case series comprising 133 eyes of 128 patients referred for refractive surgery. All eyes were divided into two groups according to preoperative diagnosis: group 1 (LASIK group) comprised 74 normal eyes of 72 patients undergoing LASIK with a femtosecond laser (IntraLase), and group 2 (INTACS group) consisted of 59 eyes of 39 patients with keratoconus for whom INTACS were implanted using a femtosecond laser (IntraLase). Decentration of the LASIK flap and INTACS was analyzed using Pentacam. Temporal decentration was 612.56 +/- 384.24 microm (range: 30 to 2120 microm) in the LASIK group and 788.33 +/- 500.34 microm (range: 30 to 2450 microm) in the INTACS group. A statistically significant difference was noted between the groups in terms of decentration (P < .05). Regression analysis showed that the amount of decentration of the LASIK flap and INTACS correlated with the central corneal thickness in the LASIK group and preoperative sphere and cylinder in the INTACS group, respectively. Decentration with the IntraLase occurred in most cases, especially in keratoconic eyes. The applanation performed for centralization during IntraLase application may flatten and shift the pupil center, and thus cause decentration of the LASIK flap and INTACS. Central corneal thickness in the LASIK group and preoperative sphere and cylinder in the INTACS group proved to be statistically significant parameters associated with decentration.
A Community-Driven Workflow Recommendations and Reuse Infrastructure
NASA Astrophysics Data System (ADS)
Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.
2013-12-01
Aiming to connect the Earth science community to accelerate the rate of discovery, NASA Earth Exchange (NEX) has established an online repository and platform, so that researchers can publish and share their tools and models with colleagues. In recent years, workflow has become a popular technique at NEX for Earth scientists to define executable multi-step procedures for data processing and analysis. The ability to discover and reuse knowledge (sharable workflows or workflow) is critical to the future advancement of science. However, as reported in our earlier study, the reusability of scientific artifacts is currently very low. Scientists often do not feel confident in using other researchers' tools and utilities. One major reason is that researchers are often unaware of the existence of others' data preprocessing processes. Meanwhile, researchers often do not have time to fully document the processes and expose them to others in a standard way. These issues cannot be overcome by the existing workflow search technologies used in NEX and other data projects. Therefore, this project aims to develop a proactive recommendation technology based on collective NEX user behaviors. In this way, we aim to promote and encourage process and workflow reuse within NEX. Particularly, we focus on leveraging peer scientists' best practices to support the recommendation of artifacts developed by others. Our underlying theoretical foundation is rooted in the social cognitive theory, which declares that people learn by watching what others do. Our fundamental hypothesis is that sharable artifacts have network properties, much like humans in social networks. More generally, reusable artifacts form various types of social relationships (ties), and may be viewed as forming what organizational sociologists who use network analysis to study human interactions call a 'knowledge network.' In particular, we will tackle two research questions: R1: What hidden knowledge may be extracted from
Empowerment or Impediment? School Governance in the School-Based Management Era in Hong Kong
ERIC Educational Resources Information Center
Kwan, Paula; Li, Benjamin Yuet-man
2015-01-01
Following the international trend in education towards democracy and decentralization, the Hong Kong government introduced a school-based management (SBM) system about two decades ago. It is widely recognized in the literature that decentralization, empowering school level management and marginalizing the influence of the intermediate level of…
A network approach to decentralized coordination of energy production-consumption grids
Omodei, Elisa; Arenas, Alex
2018-01-01
Energy grids are facing a relatively new paradigm consisting in the formation of local distributed energy sources and loads that can operate in parallel independently from the main power grid (usually called microgrids). One of the main challenges in microgrid-like networks management is that of self-adapting to the production and demands in a decentralized coordinated way. Here, we propose a stylized model that allows to analytically predict the coordination of the elements in the network, depending on the network topology. Surprisingly, almost global coordination is attained when users interact locally, with a small neighborhood, instead of the obvious but more costly all-to-all coordination. We compute analytically the optimal value of coordinated users in random homogeneous networks. The methodology proposed opens a new way of confronting the analysis of energy demand-side management in networked systems. PMID:29364962
Morisawa, Hiraku; Hirota, Mikako; Toda, Tosifusa
2006-01-01
Background: In the post-genome era, most research scientists working in the field of proteomics are confronted with difficulties in the management of large volumes of data, which they are required to keep in formats suitable for subsequent data mining. Therefore, a well-developed open source laboratory information management system (LIMS) should be available for their proteomics research studies. Results: We developed an open source LIMS appropriately customized for a 2-D gel electrophoresis-based proteomics workflow. The main features of its design are compactness, flexibility and connectivity to public databases. It supports the handling of data imported from mass spectrometry software and 2-D gel image analysis software. The LIMS is equipped with the same input interface for 2-D gel information as a clickable map on public 2DPAGE databases. The LIMS allows researchers to follow their own experimental procedures by reviewing the illustrations of 2-D gel maps and well layouts on the digestion plates and MS sample plates. Conclusion: Our new open source LIMS is now available as a basic model for proteome informatics, and is accessible for further improvement. We hope that many research scientists working in the field of proteomics will evaluate our LIMS and suggest ways in which it can be improved. PMID:17018156
Facilitating hydrological data analysis workflows in R: the RHydro package
NASA Astrophysics Data System (ADS)
Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik
2015-04-01
The advent of new technologies such as web services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows that integrate more numerous, and potentially more heterogeneous, data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with minimal risk of errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited to the handling of complex and potentially large datasets. In this paper, we explore how the handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, an HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges
PARTICIPATORY STORM WATER MANAGEMENT AND SUSTAINABILITY – WHAT ARE THE CONNECTIONS?
Urban stormwater is typically conveyed to centralized infrastructure, and there is great potential for reducing stormwater runoff quantity through decentralization. For areas which are already developed, decentralization of stormwater management involves private property and poss...
Tediosi, Fabrizio; Gabriele, Stefania; Longo, Francesco
2009-05-01
In many European countries, since World War II there has been a trend towards decentralization of health policy to lower levels of government, while more recently there have been re-centralization processes. Whether re-centralization will become the new paradigm of European health policy is difficult to say. In the Italian National Health Service (SSN), decentralization raised two related questions that may be of interest for the international debate on decentralization in health care: (a) what sort of regulatory framework and institutional balances are required to govern decentralization in health care in a heterogeneous country under tough budget constraints? (b) how can it be ensured that the most advanced parts of the country remain committed to solidarity, supporting the weakest ones? To address these questions, this article describes the recent trends in SSN funding and expenditure, reviews the strategy adopted by the Italian government for governing the decentralization process, and discusses the findings to draw policy conclusions. The main lessons emerging from this experience are that: (1) when the differences in administrative and policy skills, in socio-economic standards and in social capital are wide, decentralization may lead to undesirable divergent evolution paths; (2) even in decentralized systems, the role of the Central government can be very important in containing health expenditure; (3) strong governance by the Central government may help, not hinder, the enforcement of decentralization; and (4) supporting the weakest Regions and maintaining inter-regional solidarity is hard but possible. In Italy, despite an increasing role of the Central government in steering the SSN, the pattern of regional decentralization of health-sector decision making does not seem at risk. Nevertheless, the Italian case confirms the complexity of decentralization and re-centralization processes, which can sometimes paradoxically reinforce each other.
The VERCE platform: Enabling Computational Seismology via Streaming Workflows and Science Gateways
NASA Astrophysics Data System (ADS)
Spinuso, Alessandro; Filgueira, Rosa; Krause, Amrey; Matser, Jonas; Casarotti, Emanuele; Magnoni, Federica; Gemund, Andre; Frobert, Laurent; Krischer, Lion; Atkinson, Malcolm
2015-04-01
, the system collects provenance data adopting the W3C-PROV data model. Provenance recordings can be explored and analysed at run time for rapid diagnostic and workflow steering, or later for further validation and comparisons across runs. We will illustrate the interactive services of the gateway and the capabilities of the produced metadata, coupled with the VERCE data management layer based on iRODS. The Cross-Correlation workflow was evaluated on SuperMUC, a supercomputing cluster at the Leibniz Supercomputing Centre in Munich, with 155,656 processor cores in 9400 compute nodes. SuperMUC is based on the Intel Xeon architecture consisting of 18 Thin Node Islands and one Fat Node Island. This work has only had access to the Thin Node Islands, which contain Sandy Bridge nodes, each having 16 cores and 32 GB of memory. In the evaluations we used 1000 stations, and we applied two types of methods (whiten and non-whiten) for pre-processing the data. The workflow was tested on a varying number of cores (16, 32, 64, 128, and 256 cores) using the MPI mapping of Dispel4Py. The results show that Dispel4Py is able to improve the performance by increasing the number of cores without changing the description of the workflow.
Decentralized DC Microgrid Monitoring and Optimization via Primary Control Perturbations
NASA Astrophysics Data System (ADS)
Angjelichinoski, Marko; Scaglione, Anna; Popovski, Petar; Stefanovic, Cedomir
2018-06-01
We treat the emerging power systems with direct current (DC) MicroGrids, characterized by high penetration of power electronic converters. We rely on the power electronics to propose a decentralized solution for autonomous learning of, and adaptation to, the operating conditions of the DC MicroGrids; the goal is to eliminate the need to rely on an external communication system for this purpose. The solution works within the primary droop control loops and uses only local bus voltage measurements. Each controller is able to estimate (i) the generation capacities of power sources, (ii) the load demands, and (iii) the conductances of the distribution lines. To define a well-conditioned estimation problem, we employ a decentralized strategy where the primary droop controllers temporarily switch between operating points in a coordinated manner, following amplitude-modulated training sequences. We study the use of the estimator in a decentralized solution of the Optimal Economic Dispatch problem. The evaluations confirm the usefulness of the proposed solution for autonomous MicroGrid operation.
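The estimation idea, fitting an electrical parameter from deliberate set-point perturbations, can be sketched in miniature as a one-parameter least-squares fit of a line conductance. All numbers here are hypothetical, and this is far simpler than the paper's amplitude-modulated training-sequence scheme over the droop loops:

```python
import random

random.seed(7)
g_true = 2.5  # line conductance in siemens (hypothetical ground truth)

# a training sequence of known set-point perturbations produces a range of
# voltage differences across the line; the local current is measured noisily
dv = [0.1 * k for k in range(1, 21)]
i_meas = [g_true * v + random.gauss(0.0, 0.01) for v in dv]

# one-parameter least squares for the model i = g * dv
g_hat = sum(v * i for v, i in zip(dv, i_meas)) / sum(v * v for v in dv)
```

Sweeping the operating point over a known sequence is what makes the regression well conditioned; a single operating point would give only one (noisy) ratio to work from.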
Web-video-mining-supported workflow modeling for laparoscopic surgeries.
Liu, Rui; Zhang, Xiaoli; Zhang, Hao
2016-11-01
As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the surgical workflow model (SWM) to support intuitive cooperation with surgeons. Generating a robust and reliable SWM requires a large amount of training data. However, training data collected by physically recording surgical operations is often limited, and such data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost, labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model from the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy. The generated workflow was evaluated on 4 web-retrieved videos and 4 operating-room-recorded videos. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) demonstrated the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. The satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
Sheriff, R; Banks, A
2001-01-01
Organization change efforts have led to critically examining the structure of education and development departments within hospitals. This qualitative study evaluated an education and development model in an academic health sciences center. The model combines centralization and decentralization. The study results can be used by staff development educators and administrators when organization structure is questioned. This particular model maximizes the benefits and minimizes the limitations of centralized and decentralized structures.
Cost analysis of a managed care decentralized outpatient pharmacy anticoagulation service.
Anderson, Robert J
2004-01-01
To determine the per-patient-per-month (PPPM) cost of a decentralized outpatient pharmacy anticoagulation service (OPAS) in patients with chronic atrial fibrillation (AF) who were maintained on warfarin sodium therapy in a managed care setting, to compare the annual costs versus the risk for stroke, and to assess the quality of the anticoagulant management. Data were collected retrospectively from clinical, research, and administrative claims databases. Patient demographic data were stratified to include age and risk factors for stroke. Inclusion criteria for the study were adult patients (>18 years) who were maintained on chronic warfarin therapy with a diagnosis of AF (diagnosis code 427.31) and continuously enrolled during calendar year 2000. The cost analysis included the personnel cost of clinical pharmacy specialists, direct and indirect cost of laboratory tests for international normalized ratios (INR), and anticoagulant (warfarin plus bridge therapy with a low molecular weight heparin) drug cost and dispensing fee. The percentage of INR values within or near target was used to evaluate the effectiveness of the service. A total of 97 patients on chronic warfarin therapy for AF were identified for cost analysis. The demographics for these patients included the following: 71% were male, with 32% of the patients over the age of 75 years, and 60% had 1 or more identifiable risk factors for stroke. Utilizing established criteria, 80.4% of the sample was considered to be at high risk for ischemic stroke. A majority of the patients (94.8%) had nonvalvular disease, with an INR goal in the range of 2 to 3 in 91.8% of the cases. The PPPM cost for the OPAS monitoring service was $51.25, distributed as $13.78 (27%) in personnel costs for monitoring pharmacists, $18.38 (36%) for lab tests, and $19.09 (37%) for anticoagulant drug costs. These costs did not significantly differ among patient groups with various risks for ischemic stroke. For nonvalvular AF patients, the
Decentralized stabilization of semi-active vibrating structures
NASA Astrophysics Data System (ADS)
Pisarski, Dominik
2018-02-01
A novel method of decentralized structural vibration control is presented. The control is assumed to be realized by a semi-active device. The objective is to stabilize a vibrating system with optimal rates of energy decrease. The controller relies on an easily implemented decentralized switched state-feedback control law. It uses a set of communication channels to exchange state information between neighboring subcontrollers. The performance of the designed method is validated by means of numerical experiments performed for a double cantilever system equipped with a set of elastomers with controlled viscoelastic properties. In terms of the assumed objectives, the proposed control strategy significantly outperforms the passive damping cases and is competitive with a standard centralized control. The presented methodology can be applied to a class of bilinear control systems concerned with smart structural elements.
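A scalar toy version of switched semi-active damping shows the basic effect the abstract reports: switching a damper between a low and a high coefficient by a state-feedback rule can remove energy faster than leaving it at the low passive setting. This is not the paper's decentralized multi-subcontroller law; the mass, stiffness, damping levels, and switching rule below are hypothetical:

```python
def residual_energy(c_law, m=1.0, k=4.0, x0=1.0, v0=0.0, dt=1e-3, steps=5000):
    # semi-implicit Euler integration of m*x'' + c(x, v)*x' + k*x = 0
    x, v = x0, v0
    for _ in range(steps):
        v += (-(c_law(x, v) * v + k * x) / m) * dt
        x += v * dt
    return 0.5 * m * v * v + 0.5 * k * x * x  # mechanical energy left after 5 s

passive = residual_energy(lambda x, v: 0.2)  # constant low damping
# switched law: engage high damping while the mass moves away from equilibrium
switched = residual_energy(lambda x, v: 2.0 if x * v > 0 else 0.2)
```

In the full structural problem the switching decision at each subcontroller also uses neighbor states exchanged over communication channels, which is what makes the scheme decentralized rather than merely switched.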
Optimizing high performance computing workflow for protein functional annotation.
Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene
2014-09-10
Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool, the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.
Using Kepler for Tool Integration in Microarray Analysis Workflows.
Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C
Increasing numbers of genomic technologies are generating massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
Vishwanath, Arun; Singh, Sandeep Rajan; Winkelstein, Peter
2010-11-01
The promise of the electronic medical record (EMR) lies in its ability to reduce the costs of health care delivery and improve the overall quality of care, a promise that is realized through major changes in workflows within the health care organization. Yet little systematic information exists about the workflow effects of EMRs. Moreover, some of the research to date points to reduced satisfaction among physicians after implementation of the EMR and increased time, i.e., negative workflow effects. A better understanding of the impact of the EMR on workflows is, hence, vital to understanding what the technology really does offer that is new and unique. (i) To empirically develop a physician-centric conceptual model of the workflow effects of EMRs; (ii) to use the model to understand the antecedents of physicians' workflow expectations from the new EMR; (iii) to track physicians' satisfaction over time, 3 months and 20 months after implementation of the EMR; (iv) to explore the impact of technology learning curves on physicians' reported satisfaction levels. The current research uses the mixed-method technique of concept mapping to empirically develop the conceptual model of an EMR's workflow effects. The model is then used within a controlled study to track physician expectations from a new EMR system as well as their assessments of the EMR's performance 3 months and 20 months after implementation. The research tracks the actual implementation of a new EMR within the outpatient clinics of a large northeastern research hospital. The pre-implementation survey netted 20 physician responses; the post-implementation Time 1 survey netted 22 responses, and the Time 2 survey netted 26 physician responses. The implementation of the actual EMR served as the intervention. Since the study was conducted within the same setting and tracked a homogeneous group of respondents, the overall study design ensured against extraneous influences on the results. Outcome measures were derived
Applications of process improvement techniques to improve workflow in abdominal imaging.
Tamm, Eric Peter
2016-03-01
Major changes in the management and funding of healthcare are underway that will markedly change the way radiology studies will be reimbursed. The result will be the need to deliver radiology services in a highly efficient manner while maintaining quality. The science of process improvement provides a practical approach to improve the processes utilized in radiology. This article will address in a step-by-step manner how to implement process improvement techniques to improve workflow in abdominal imaging.
NASA Astrophysics Data System (ADS)
Heydari, Jafar; Norouzinasab, Yousef
2015-12-01
In this paper, a discount model is proposed to coordinate pricing and ordering decisions in a two-echelon supply chain (SC). Demand is stochastic and price sensitive, while lead times are fixed. Decentralized decision making, where the downstream member decides on the selling price and order size, is investigated first. Then, joint pricing and ordering decisions are derived for the case where both members act as a single entity aiming to maximize whole-chain profit. Finally, a coordination mechanism based on quantity discounts is proposed to coordinate both pricing and ordering decisions simultaneously. The proposed two-level discount policy can be characterized from two aspects: (1) from a marketing viewpoint, a retail price discount to increase demand; and (2) from an operations management viewpoint, a wholesale price discount to induce the retailer to adjust its order quantity and selling price jointly. Results of numerical experiments demonstrate that the proposed policy is suitable for coordinating the SC and improves the profitability of the SC as well as all its members in comparison with decentralized decision making.
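The gap between decentralized and joint decision making that such a discount scheme closes is the classic double-marginalization effect. A deterministic linear-demand sketch makes it concrete; the parameters are hypothetical, and the paper itself treats stochastic, price-sensitive demand rather than this simplified form:

```python
a, b = 100.0, 2.0   # linear demand D(p) = a - b*p (hypothetical parameters)
c, w = 10.0, 20.0   # production cost and wholesale price (hypothetical)

# decentralized: the retailer picks p to maximize its own margin (p - w)*D(p)
p_dec = (a + b * w) / (2 * b)                        # retailer's optimal price, 35.0
d_dec = a - b * p_dec                                # resulting demand, 30.0
chain_dec = (p_dec - w) * d_dec + (w - c) * d_dec    # retailer + supplier profit, 750.0

# coordinated: price is chosen against the true production cost c
p_joint = (a + b * c) / (2 * b)                      # 30.0
chain_joint = (p_joint - c) * (a - b * p_joint)      # whole-chain profit, 800.0
```

Because the retailer prices against the wholesale price rather than the true production cost, the decentralized chain earns 750 instead of 800 here; a wholesale discount that nudges the retailer toward the joint price is exactly the kind of lever the proposed policy uses.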
Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics
Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A.; Caron, Christophe
2015-01-01
Summary: The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated whatever the language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. Availability and implementation: http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). Contact: contact@workflow4metabolomics.org PMID:25527831
A practical workflow for making anatomical atlases for biological research.
Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles
2012-01-01
The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.
The Paradox of Decentralizing Schools: Lessons from Business, Government, and the Catholic Church.
ERIC Educational Resources Information Center
Murphy, Jerome T.
1989-01-01
By the year 2000, school decentralization could become another unfortunate, ineffectual pendulum swing. According to this article, a dynamic, ever-changing system of decentralization and centralization balances the benefits of local administrative autonomy with the pursuit of unified goals and helps each leadership level understand its…
Project management training : final report.
DOT National Transportation Integrated Search
2011-01-01
In 2005 the Indiana Department of Transportation (INDOT) went through a complete reorganization of its operations going from centralized to decentralized (District) management. This reorganization gave Districts autonomy to manage construction projec...
Project management training : [technical summary].
DOT National Transportation Integrated Search
2011-01-01
In 2005, the Indiana Department of Transportation (INDOT) went through a complete reorganization of its operations going from centralized to decentralized (District) management. This reorganization gave Districts autonomy to manage construction proje...
Text mining for the biocuration workflow.
Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G
2012-01-01
Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.
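The two biocurator priorities identified, entity-indexed full text and document prioritization, can be served even by very simple triage heuristics. The sketch below ranks documents for curation by counting mentions from an entity lexicon; the lexicon and scoring are hypothetical, and real pipelines use trained named-entity recognizers rather than exact token lookup:

```python
def triage_rank(docs, lexicon):
    """Order documents for curation by how many known entity mentions they contain."""
    def score(text):
        # tokenize crudely, strip common punctuation, count lexicon hits
        return sum(tok.strip('.,;()') in lexicon for tok in text.lower().split())
    return sorted(docs, key=score, reverse=True)

gene_lexicon = {"brca1", "tp53", "egfr"}  # toy gene lexicon (hypothetical)
docs = [
    "Cell cycle review with no curatable mentions.",
    "BRCA1 and TP53 interact; EGFR signaling modulates TP53.",
    "EGFR expression was measured.",
]
ranked = triage_rank(docs, gene_lexicon)
# entity-dense documents float to the top of the curation queue
```

Even this crude scorer reflects the workflow commonality the interviews surfaced: select documents, index them with biologically relevant entities, then spend detailed curation effort where the entity density is highest.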
Case mix management education in a Canadian hospital.
Moffat, M; Prociw, M
1992-01-01
The Sunnybrook Health Science Centre's matrix organization model includes a traditional departmental structure, a strategic program-based structure and a case management-based structure--the Clinical Unit structure. The Clinical Unit structure allows the centre to give responsibility for the management of case mix and volume to decentralized Clinical Unit teams, each of which manages its own budget. To train physicians and nurses in their respective roles of Medical Unit directors and Nursing Unit directors, Sunnybrook designed unique short courses on financial management and budgeting, and case-costing and case mix management. This paper discusses how these courses were organized, details their contents and explains how they fit into Sunnybrook's program of decentralized management.
Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Logan Timothy; Hackenberg, Robert Errol
These are slides from a presentation summarizing a graduate research associate's summer project. The following topics are covered in these slides: data challenges in materials, aging in U-Nb Alloys, Building an Aging Model, Different Phase Trans. in U-Nb, the Challenge, Storing Materials Data, Example Data Source, Organizing Data: What is a Schema?, What does a "XML Schema" look like?, Our Data Schema: Nice and Simple, Storing Data: Materials Data Curation System (MDCS), Problem with MDCS: Slow Data Entry, Getting Literature into MDCS, Staging Data in Excel Document, Final Result: MDCS Records, Analyzing Image Data, Process for Making TTT Diagram, Bottleneck Number 1: Image Analysis, Fitting a TTP Boundary, Fitting a TTP Curve: Comparable Results, How Does it Compare to Our Data?, Image Analysis Workflow, Curating Hardness Records, Hardness Data: Two Key Decisions, Before Peak Age? - Automation, Interactive Viz, Which Transformation?, Microstructure-Informed Model, Tracking the Entire Process, General Problem with Property Models, Pinyon: Toolkit for Managing Model Creation, Tracking Individual Decisions, Jupyter: Docs and Code in One File, Hardness Analysis Workflow, Workflow for Aging Models, and conclusions.
A scientific workflow framework for (13)C metabolic flux analysis.
Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina
2016-08-20
Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. Copyright © 2015 Elsevier B.V. All rights reserved.
Decentralized control experiments on the JPL flexible spacecraft
NASA Technical Reports Server (NTRS)
Ozguner, U.; Ossman, K.; Donne, J.; Boesch, M.; Ahmed, A.
1990-01-01
Decentralized control experiments were successfully demonstrated for the JPL/AFAL Flexible Structure. A simulation package using MATRIXx showed strong correlation between the simulations and experimental results, while providing a means to test and debug the various control strategies. Implementation was simplified by a modular software design that was easily transported from the simulation environment to the experimental environment. The control designs worked well for suppression of the dominant modes of the structure. Static decentralized output feedback dampened the excited modes of the structure, but sometimes excited higher-order modes upon startup of the controller. A second-order frequency-shaping controller helped to eliminate excitation of the higher-order modes by attenuating high frequencies in the control effort; however, it also resulted in slightly longer settling times.
Decentralized regulation of dynamic systems. [for controlling large scale linear systems
NASA Technical Reports Server (NTRS)
Chu, K. C.
1975-01-01
A special class of decentralized control problem is discussed in which the objectives of the control agents are to steer the state of the system to desired levels. Each agent is concerned about certain aspects of the state of the entire system. The state and control equations are given for linear time-invariant systems. Stability and coordination, and the optimization of decentralized control are analyzed, and the information structure design is presented.
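The setting described in this abstract can be written compactly. As an illustrative sketch only (the symbols and the static output-feedback form below are assumptions, not taken from the paper):

```latex
\dot{x}(t) = A\,x(t) + \sum_{i=1}^{N} B_i\,u_i(t), \qquad
y_i(t) = C_i\,x(t), \qquad
u_i(t) = -K_i\,y_i(t)
```

Here agent $i$ measures only its own output $y_i(t)$ and applies only its own input $u_i(t)$; stability and coordination then concern the closed-loop matrix $A - \sum_{i} B_i K_i C_i$, which no single agent controls alone.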
Real-Time Electronic Dashboard Technology and Its Use to Improve Pediatric Radiology Workflow.
Shailam, Randheer; Botwin, Ariel; Stout, Markus; Gee, Michael S
The purpose of our study was to create a real-time electronic dashboard in the pediatric radiology reading room providing a visual display of updated information regarding scheduled and in-progress radiology examinations that could help radiologists to improve clinical workflow and efficiency. To accomplish this, a script was set up to automatically send real-time HL7 messages from the radiology information system (Epic Systems, Verona, WI) to an Iguana Interface engine, with relevant data regarding examinations stored in an SQL Server database for visual display on the dashboard. Implementation of an electronic dashboard in the reading room of a pediatric radiology academic practice has led to several improvements in clinical workflow, including decreasing the time interval for radiologist protocol entry for computed tomography or magnetic resonance imaging examinations as well as fewer telephone calls related to unprotocoled examinations. Other advantages include enhanced ability of radiologists to anticipate and attend to examinations requiring radiologist monitoring or scanning, as well as to work with technologists and operations managers to optimize scheduling in radiology resources. We foresee increased utilization of electronic dashboard technology in the future as a method to improve radiology workflow and quality of patient care. Copyright © 2017 Elsevier Inc. All rights reserved.
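As a rough illustration of the pipeline this abstract describes (HL7 feed, interface engine, database, dashboard query), the sketch below parses a drastically simplified pipe-delimited message and stores it for a dashboard view. The segment layout, field positions, and table schema are invented for illustration and do not reflect Epic's or Iguana's actual formats.

```python
# Hypothetical sketch of an HL7-to-dashboard ingestion path. Segment names are
# real HL7 conventions (PID, OBR), but the field positions and schema here are
# simplified placeholders.
import sqlite3

def parse_hl7_order(message: str) -> dict:
    """Extract exam info from a simplified, pipe-delimited HL7 message."""
    record = {}
    for segment in message.strip().split("\n"):
        fields = segment.split("|")
        if fields[0] == "PID":
            record["patient_id"] = fields[1]
        elif fields[0] == "OBR":
            record["exam"] = fields[1]
            record["status"] = fields[2]
    return record

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE exams (patient_id TEXT, exam TEXT, status TEXT)")

msg = "MSH|^~\\&|RIS\nPID|12345\nOBR|MRI BRAIN|IN PROGRESS"
rec = parse_hl7_order(msg)
conn.execute("INSERT INTO exams VALUES (?, ?, ?)",
             (rec["patient_id"], rec["exam"], rec["status"]))

# The dashboard view: all examinations currently in progress.
rows = conn.execute(
    "SELECT exam FROM exams WHERE status = 'IN PROGRESS'").fetchall()
```

In a production setting the interface engine (rather than ad hoc parsing) would handle message routing and acknowledgment; the point here is only the shape of the data flow.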
Decentralized Estimation and Control for Preserving the Strong Connectivity of Directed Graphs.
Sabattini, Lorenzo; Secchi, Cristian; Chopra, Nikhil
2015-10-01
In order to accomplish cooperative tasks, decentralized systems are required to communicate among each other. Thus, maintaining the connectivity of the communication graph is a fundamental issue. Connectivity maintenance has been extensively studied in the last few years, but generally considering undirected communication graphs. In this paper, we introduce a decentralized control and estimation strategy to maintain the strong connectivity property of directed communication graphs. In particular, we introduce a hierarchical estimation procedure that implements power iteration in a decentralized manner, exploiting an algorithm for balancing strongly connected directed graphs. The output of the estimation system is then utilized for guaranteeing preservation of the strong connectivity property. The control strategy is validated by means of analytical proofs and simulation results.
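The estimation procedure in this abstract builds on power iteration. Below is a minimal, centralized sketch of plain power iteration on a small strongly connected digraph; the paper's actual contribution, a decentralized hierarchical implementation with graph balancing, is not reproduced here, and the example graph is hypothetical.

```python
# Centralized power iteration: approximates the dominant (Perron) eigenvector
# of a nonnegative adjacency matrix. The cited work performs this estimation
# in a decentralized manner across the agents themselves.

def power_iteration(A, iters=200):
    """Return an L1-normalized approximation of A's dominant eigenvector."""
    n = len(A)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(abs(x) for x in w)
        v = [x / norm for x in w]
    return v

# A strongly connected directed graph on 3 nodes (hypothetical example):
A = [[0, 1, 1],
     [1, 0, 0],
     [0, 1, 0]]
v = power_iteration(A)
```

For strongly connected (hence irreducible) nonnegative matrices, the Perron-Frobenius theorem guarantees a positive dominant eigenvector, which is what makes connectivity estimates of this kind well defined.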
Exploring Dental Providers’ Workflow in an Electronic Dental Record Environment
Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N.; Ye, Zhan
2016-01-01
Background: A workflow is defined as a predefined set of work steps and a partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at the point of care to assess breakdowns in the workflow, which could contribute to better technology designs. Objective: The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. Methods: A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Results: Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). Conclusions: On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR. PMID:27437058
Public Managers Should Be Proactive
ERIC Educational Resources Information Center
Carlson, Thomas S.
1976-01-01
Future public managers should be proactive by creating management processes before problems arise. Planning prevents reactive or crisis managing. Future managers should also be prepared to meet dilemmas and paradoxes such as centralization versus decentralization of decision-making and work processes, politics versus administration dichotomy, and…
Oleribe, Obinna Ositadimma; Oladipo, Olabisi Abiodun; Ezieme, Iheaka Paul; Crossey, Mary Margaret Elizabeth; Taylor-Robinson, Simon David
2016-01-01
Access to quality care is essential for improved health outcomes. Decentralization improves access to healthcare services at lower levels of care, but it does not dismantle the structural, funding, and programming restrictions to access, resulting in inequity and inequality in population health. Unlike decentralization, the Commonization Model of care reduces health inequalities and inequity and dismantles structural, funding, and other program-related obstacles to population health. Excellence and Friends Management Care Center (EFMC), using the Commonization Model (CM), fully integrated HIV services into core health services in 121 supported facilities. This initiative improved access to care, treatment, and support services; reduced stigmatization and discrimination; and improved uptake of HTC. We call on governments to adequately finance CM for health systems restructuring towards better health outcomes.
A Workflow to Investigate Exposure and Pharmacokinetic ...
Background: Adverse outcome pathways (AOPs) link adverse effects in individuals or populations to a molecular initiating event (MIE) that can be quantified using in vitro methods. Practical application of AOPs in chemical-specific risk assessment requires incorporation of knowledge on exposure, along with absorption, distribution, metabolism, and excretion (ADME) properties of chemicals.Objectives: We developed a conceptual workflow to examine exposure and ADME properties in relation to an MIE. The utility of this workflow was evaluated using a previously established AOP, acetylcholinesterase (AChE) inhibition.Methods: Thirty chemicals found to inhibit human AChE in the ToxCast™ assay were examined with respect to their exposure, absorption potential, and ability to cross the blood–brain barrier (BBB). Structures of active chemicals were compared against structures of 1,029 inactive chemicals to detect possible parent compounds that might have active metabolites.Results: Application of the workflow screened 10 “low-priority” chemicals of 30 active chemicals. Fifty-two of the 1,029 inactive chemicals exhibited a similarity threshold of ≥ 75% with their nearest active neighbors. Of these 52 compounds, 30 were excluded due to poor absorption or distribution. The remaining 22 compounds may inhibit AChE in vivo either directly or as a result of metabolic activation.Conclusions: The incorporation of exposure and ADME properties into the conceptual workflow e
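The screening logic this abstract outlines (keep an inactive chemical only if it is structurally similar to an active neighbor and also passes absorption/distribution filters) can be sketched as a simple filter. The similarity scores, boolean ADME flags, and compound names below are invented placeholders, not ToxCast data.

```python
# Hedged sketch of the prioritization filter described in the workflow above.
# The 0.75 threshold mirrors the abstract; all compound data are hypothetical.

SIMILARITY_THRESHOLD = 0.75

def prioritize(candidates):
    """Return inactive chemicals flagged for possible in vivo AChE inhibition."""
    flagged = []
    for chem in candidates:
        if chem["similarity_to_active"] < SIMILARITY_THRESHOLD:
            continue  # not structurally similar to any active chemical
        if not (chem["absorbed"] and chem["crosses_bbb"]):
            continue  # poor absorption or distribution: excluded
        flagged.append(chem["name"])
    return flagged

candidates = [
    {"name": "cmpd_A", "similarity_to_active": 0.82, "absorbed": True,  "crosses_bbb": True},
    {"name": "cmpd_B", "similarity_to_active": 0.90, "absorbed": False, "crosses_bbb": True},
    {"name": "cmpd_C", "similarity_to_active": 0.40, "absorbed": True,  "crosses_bbb": True},
]
flagged = prioritize(candidates)  # only cmpd_A passes both filters
```

In the actual workflow the similarity step would use chemical fingerprints (e.g. Tanimoto similarity) and the ADME flags would come from property models, but the gating structure is the same.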
Centralization vs. decentralization in medical school libraries.
Crawford, H
1966-07-01
Does the medical school library in the United States operate more commonly under the university library or the medical school administration? University-connected medical school libraries were asked to indicate (a) the source of their budgets, whether from the central library or the medical school, and (b) the responsibility for their acquisitions and cataloging. Returns received from sixty-eight of the seventy eligible institutions showed decentralization to be by far the most common arrangement: 71 percent of the libraries are funded by their medical schools; 79 percent are responsible for their own acquisitions and processing. The factor most often associated with centralization of both budget and operation is public ownership. Decentralization is associated with service to one or two rather than three or more professional schools. Location of the medical school in a different city from the university is highly favorable to autonomy. Other factors associated with these trends are discussed.
Hayashi, K.; Hayashi, H.; Nakao, F.; Hayashi, F.
2001-01-01
AIM: To prospectively investigate changes in the area of the anterior capsule opening, and intraocular lens (IOL) decentration and tilt, after implantation of a hydrogel IOL. METHODS: 100 patients underwent implantation of a hydrogel IOL in one eye and of an acrylic IOL in the opposite eye. The area of the anterior capsule opening, and the degree of IOL decentration and tilt, were measured using the Scheimpflug videophotography system at 3 days, and at 1, 3, and 6 months postoperatively. RESULTS: The mean anterior capsule opening area decreased significantly in both groups. At 6 months postoperatively, the area in the hydrogel group was significantly smaller than that in the acrylic group. The mean percentage area reduction in the hydrogel group was also significantly greater than that in the acrylic group: 16.9% in the hydrogel group versus 8.8% in the acrylic group. In contrast, IOL decentration and tilt did not progress in either group, and no significant differences were found in the degree of IOL decentration and tilt throughout the follow-up period. CONCLUSIONS: Contraction of the anterior capsule opening was more extensive with the hydrogel IOL than with the acrylic IOL, but the degrees of IOL decentration and tilt were similar for the two types of lenses studied. PMID:11673291
Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics.
Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A; Caron, Christophe
2015-05-01
The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated, whatever the language, evaluated on reference datasets, and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. The http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). Contact: contact@workflow4metabolomics.org. © The Author 2014. Published by Oxford University Press.
A note on decentralized integral controllability
NASA Technical Reports Server (NTRS)
Nwokah, O. D. I.; Frazho, A. E.; Le, D. K.
1993-01-01
A concept of decentralized integral controllability (DIC) defined on a given gain space Phi is clarified and related to the original definition given by Morari and Zafiriou (1989). This leads to a simple proof of the existence of DIC on Phi, from which existence conditions for DIC can be routinely deduced in the sense of Morari and Zafiriou.
Scientific Workflows and the Sensor Web for Virtual Environmental Observatories
NASA Astrophysics Data System (ADS)
Simonis, I.; Vahed, A.
2008-12-01
interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically follows in exploration, discovery and, ultimately, transformation of raw data to publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Such platforms support the entire process from capturing data, through sharing and integrating, to requesting additional observations. Multiple sites and organizations participate on single platforms, and scientists from different countries and organizations interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction free scientists from dealing with underlying data, processing, or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn identifies potentially related resources, schedules processing tasks, and assembles all parts into workflows that may satisfy the query.
Hamed, Kaveh Akbari; Gregg, Robert D
2016-07-01
This paper presents a systematic algorithm to design time-invariant decentralized feedback controllers to exponentially stabilize periodic orbits for a class of hybrid dynamical systems arising from bipedal walking. The algorithm assumes a class of parameterized and nonlinear decentralized feedback controllers which coordinate lower-dimensional hybrid subsystems based on a common phasing variable. The exponential stabilization problem is translated into an iterative sequence of optimization problems involving bilinear and linear matrix inequalities, which can be easily solved with available software packages. A set of sufficient conditions for the convergence of the iterative algorithm to a stabilizing decentralized feedback control solution is presented. The power of the algorithm is demonstrated by designing a set of local nonlinear controllers that cooperatively produce stable walking for a 3D autonomous biped with 9 degrees of freedom, 3 degrees of underactuation, and a decentralization scheme motivated by amputee locomotion with a transpelvic prosthetic leg.
Gaulke, L S; Borgford-Parnell, J L; Stensel, H D
2008-01-01
This paper reports on the design, implementation, and results of a course focused on decentralized and onsite wastewater treatment in global contexts. Problem-based learning was the primary pedagogical method, with which students tackled real-world problems and designed systems to meet the needs of diverse populations. Both learning and course evaluations demonstrated that the course was successful in fulfilling learning objectives, increasing student design skills, and raising awareness of global applications. Based on this experience, a list of recommendations was created for co-developing and team-teaching multidisciplinary design courses. These recommendations include ideas for aligning student and teacher goals, overcoming barriers to effective group work, and embedding continuous course assessments. Copyright IWA Publishing 2008.
Robust Decentralized Nonlinear Control for a Twin Rotor MIMO System
Belmonte, Lidia María; Morales, Rafael; Fernández-Caballero, Antonio; Somolinos, José Andrés
2016-01-01
This article presents the design of a novel decentralized nonlinear multivariate control scheme for an underactuated, nonlinear and multivariate laboratory helicopter denominated the twin rotor MIMO system (TRMS). The TRMS is characterized by a coupling effect between rotor dynamics and the body of the model, which is due to the action-reaction principle originated in the acceleration and deceleration of the motor-propeller groups. The proposed controller is composed of two nested loops that are utilized to achieve stabilization and precise trajectory tracking tasks for the controlled position of the generalized coordinates of the TRMS. The nonlinear internal loop is used to control the electrical dynamics of the platform, and the nonlinear external loop allows the platform to be perfectly stabilized and positioned in space. Finally, we illustrate the theoretical control developments with a set of experiments in order to verify the effectiveness of the proposed nonlinear decentralized feedback controller, in which a comparative study with other controllers is performed, illustrating the excellent performance of the proposed robust decentralized control scheme in both stabilization and trajectory tracking tasks. PMID:27472338